School Psychology Quarterly
PsyResearch
ψ   Psychology Research on the Web   



School Psychology - Vol 39, Iss 6

School Psychology Quarterly: The flagship scholarly journal in the field of school psychology, the journal publishes empirical studies, theoretical analyses, and literature reviews encompassing a full range of methodologies and orientations, including educational, cognitive, social, cognitive-behavioral, preventive, dynamic, multicultural, and organizational psychology. Focusing primarily on children, youth, and the adults who serve them, School Psychology Quarterly publishes information pertaining to populations across the life span.
Copyright 2024 American Psychological Association
  • Single-case design in school psychology: Issues of design, analysis, and training.
    Single-case design (SCD) is a methodology that provides researchers with a rigorous means of evaluating interventions, yet it is often underutilized in school psychology. This introduction to a special issue highlights critical barriers to the broad use of SCD, including insufficient training, unclear reporting standards, and debates surrounding data analysis methods, such as visual and quantitative approaches. It also addresses how methodological issues, such as observer bias and improper data presentation, affect research outcomes. The special issue encompasses articles that explore these themes, aiming to advance SCD by examining training gaps, improving analytic strategies, and refining methodological guidelines. Collectively, these contributions seek to promote the broader adoption of SCD as a scientifically rigorous and practical research methodology in school psychology. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • A review of visual analysis reporting procedures in the functional communication training literature.
    Few guidelines exist for conducting and reporting visual analysis procedures and results in single-case research. Previous research examining how authors describe their analytic procedures and results has found that authors use key terms such as level, trend, and variability infrequently. Additionally, in a previous review, the authors rarely agreed with the original study authors on their conclusions. The purpose of this study was to document single-case researchers’ analytic procedures, including use of key visual analysis terms; description of data features; within- and between-condition analysis; and inclusion of descriptive statistics, effect sizes, or inferential statistics in the literature on a common Tier 3 behavior intervention, functional communication training. We also compared our determinations about functional relations to the authors’ conclusions. Our results suggest that most authors describe level, but almost a third did not describe trend or variability. Agreement with study authors was better than in previous studies but still below minimally acceptable thresholds. We discuss areas for future research and implications for reporting the analysis and results of single-case research. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Evaluating the correspondence between expert visual analysis and quantitative methods.
    Visual analysis is the primary methodology used to determine treatment effects from graphed single-case design data. Previous studies have demonstrated mixed findings related to interrater agreement between both expert and novice visual analysts, which represents a critical limitation of visual analysis and supports calls for also presenting statistical analyses (i.e., measures of effect size). However, few single-case design studies include results of both visual and quantitative analyses for the same set of data. The present study investigated whether blind review of single-case graphs constructed using up-to-date recommendations by experts in visual analysis would demonstrate adequate interrater agreement and have correspondence with an effect size metric, log response ratio. Eleven experts (i.e., professors in school psychology and special education with visual analysis experience) analyzed 26 multiple-baseline graphs evaluating implementation planning, a fidelity support, on educators’ implementation and student outcomes, presented in a standardized format without indication of the variable being measured. Results suggest that there was strong correspondence between raters in their judgments of the presence or absence of treatment effects and meaningfulness of effects (particularly for graphs of adherence and quality). Additionally, a quadratic relationship was observed between aggregate results of expert visual analysis and effect size statistics. Implications for future research and limitations are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • The effect of the level of data presentation on visual analysts’ decisions.
    Single-case design (SCD) is frequently utilized in research and applied settings to evaluate the effect of an intervention over time. Once collected, single-case data are typically graphed and analyzed visually in both research and practice contexts. Despite the ubiquity of visual analysis in SCD, this analytic framework has often been critiqued due to findings of limited reliability across visual analysts. Recent research has identified that the way a graph is constructed may contribute to the limitations of visual analysis. The present study sought to evaluate the effect of visually representing multiple measurement occasions as a single data point (e.g., combining measurements taken daily into a weekly composite data point) on visual analysts’ decisions regarding the magnitude of an intervention effect. Eleven participants viewed identical data sets, plotted to show different numbers of measurement occasions combined as a single data point, and provided ratings regarding the magnitude of intervention effect depicted within the graph. Results indicated a significant main effect, with data sets with higher levels of data combination being rated as demonstrating significantly larger intervention effects. The results of the study provide additional support for standardization of data presentation and graph construction within SCD in both research and practice contexts. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Effects of portable interventions on school psychologists’ graph-rating inconsistency.
    The visual analysis of data presented in time-series graphs is common in single-case design (SCD) research and applied practice in school psychology. A growing body of research suggests that visual analysts’ ratings are often influenced by construct-irrelevant features including Y-axis truncation and compression of the number of data points per X- to Y-axis ratio. We developed and tested two brief interventions, based on research in cognitive and visual science, to reduce visual analysts’ inconsistency when viewing unstandardized graphs. Two hundred practicing school psychologists visually analyzed data presented on standardized graphs and the same data again on unstandardized graphs. Across all conditions, participants were more willing to identify meaningful effects on unstandardized graphs and rated the data as showing significantly larger effects than on the corresponding standardized graphs. However, participants who answered additional (task-relevant) questions about the level or trend of graphed data showed greater rating consistency across the types of graphs in comparison to participants who answered task-irrelevant but challenging questions or control participants. Our results replicated prior research demonstrating the impact of SCD graph construction on practicing school psychologists’ interpretations and provide initial support for an intervention to minimize the impact of construct-irrelevant factors. Limitations and future directions for research are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Single-case design effect-size distributions: Association with procedural parameters.
    Visual analysis is historically and conventionally used to draw conclusions about outcomes in single-case studies, but researchers are increasingly using effect sizes to supplement conclusions drawn about functional relations with additional information about magnitude of behavior change. However, there is limited information about the extent to which methodological choices (i.e., design type, measurement system) may impact the magnitude of behavior change. We conducted a systematic review of interventions conducted in elementary school classrooms to characterize effect sizes for engagement behaviors and challenging behaviors in those studies. We found that researchers most often used A-B-A-B and multiple baseline across-participants designs, that a variety of measurement systems were used for engagement but not challenging behavior, and that some variability in effect-size distributions can be explained by dependent variable type, design type, and measurement system. The empirically derived distributions from this study may be helpful for single-case researchers to contextualize past, ongoing, and future work related to engagement and challenging behavior. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Examining the impact of design-comparable effect size on the analysis of single-case design in special education.
    Initially excluded from many evaluations of education research, single-case designs have recently received wider acceptance within and beyond special education. The growing approval of single-case design has coincided with an increasing departure from convention, such as the visual analysis of results, and the emphasis on effect sizes comparable with those associated with group designs. The use of design-comparable effect sizes by the What Works Clearinghouse has potential implications for the experimental literature in special education, which is largely composed of single-case designs that may not meet the assumptions required for statistical analysis. This study examined the compatibility of single-case design studies appearing in 33 special education journals with the design-comparable effect sizes and related assumptions described by the What Works Clearinghouse. Of the 1,425 randomly selected single-case design articles published from 1999 to 2021, 59.88% did not satisfy assumptions related to design, number of participants, or treatment replications. The rejection rate varied based on journal emphasis, with publications dedicated to students with developmental disabilities losing the largest proportion of articles. A description of the results follows a discussion of the implications for the interpretation of the evidence base. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Fine-grained effect sizes.
    To make transparent individuals’ responses to intervention over time in the systematic review of single-case experimental designs, we developed a method of estimating and graphing fine-grained effect sizes. Fine-grained effect sizes are both case- and time-specific and thus provide more nuanced information than effect size estimates that average effects across time, across cases, or both. We demonstrate the method for estimating fine-grained effect sizes under three different baseline stability assumptions: outcome stability, level stability, and trend stability. We then use the method to graph individual effect trajectories from three single-case experimental design studies that examined the impact of self-management interventions on students identified with autism. We conclude by discussing limitations associated with estimating and graphing fine-grained effect sizes and directions for further development. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Meta-analysis of single-case design research: Application of multilevel modeling.
    This study describes the benefits and challenges of meta-analyses of single-case design research using multilevel modeling. The researchers illustrate procedures for conducting meta-analyses using four-level multilevel modeling through open-source R code. The demonstration uses data from multiple-baseline or multiple-probe across-participant single-case design studies (n = 21) on word problem instruction for students with learning disabilities published between 1975 and 2023. Researchers explore changes in levels and trends between adjacent phases (baseline vs. intervention and intervention vs. maintenance) using the sample data. The researchers conclude that word problem solving of students with learning disabilities varies based on the complexity of the word problem measures involving single-word problem, mixed-word problem, and generalization questions. These moderating effects differed across adjacent phases. These findings extend previous literature on meta-analysis methodology by describing how multilevel modeling can be used to compare the impacts of time-varying predictors within and across cases when analyzing single-case design studies. Future researchers may want to use this methodology to explore the roles of time-varying predictors as well as case- or study-level moderators. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Single-case design emphasis in American Psychological Association-accredited school psychology programs.
    Single-case design (SCD) is an underutilized research methodology in school psychology literature. Despite its relevance to practitioners and applied intervention researchers alike, the majority of intervention research disseminated in school psychology journals tends to involve group designs, such as the randomized controlled trial. Group designs are useful for answering a wide variety of research questions but may not always be relevant to practitioners seeking procedures that work for their students or researchers seeking the optimal design to answer their research questions. The relative dearth of SCD studies in school psychology literature, in conjunction with common values within the field regarding data-based decision making and the scientist–practitioner model of training, raises questions about the cause of this gap. The present study sought to review doctoral training programs in school psychology and evaluate their relative emphasis on SCD in training. Seventy-six American Psychological Association-accredited school psychology programs were reviewed, and results indicated that roughly two thirds of programs do not emphasize SCD during training. Implications regarding school psychology training and research are discussed and recommendations are provided on future directions for school psychology trainers and researchers. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Exploring threats to internal validity of direct assessment in single-case design research.
    Single-case design research studies have historically used external observers to collect time series data that may be used to evaluate intervention effectiveness; however, single-case interventions implemented in educational settings may use the person implementing the intervention (e.g., teacher) to collect data in order to maximize feasibility. The implementer’s knowledge of intervention goals and phase has the potential to influence assessment of dependent variables, particularly when ratings involving some degree of judgment (e.g., Direct Behavior Rating-Single Item Scales [DBR-SIS]) are used. Given the potential for rater effects and expectancy to influence data collection, this study sought to determine whether DBR-SIS measuring social skills collected in vivo by interventionists with full knowledge of intervention goals and phase were equivalent to data collected by external raters masked to intervention phase. Results indicated in vivo DBR-SIS differed from those completed by masked external raters, which has the potential to result in different conclusions regarding intervention effectiveness. The potential for negative effects resulting from sole reliance on in vivo ratings conducted by an interventionist may be mitigated by including additional data streams collected by external personnel masked to intervention phase or by using effect sizes that account for baseline trends. Implications for training and practice are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • What interventions are cost-effective? A systematic review of cost-effectiveness analyses of school-based programs from 2000 to 2020.
    The extent to which evidence-based practices (EBPs) are considered cost-effective influences educators’ adoption decisions. However, what it means to be cost-effective and how to interpret cost-effectiveness ratios may be unclear. This systematic review of cost-effectiveness analyses of EBPs in schools illuminates the many sources of variability in published estimates. Studies were limited to peer-reviewed, school-based studies conducted in the United States between 2000 and 2020. Seven studies examining eight programs were identified and then coded for program descriptions, outcomes, research designs, and cost-effectiveness methodology. Secondary analyses illustrated how published estimates can be adjusted to reduce methodological variability and increase the utility of comparisons. The small number of studies highlights the need for research to evaluate the cost-effectiveness of more EBPs. Implications for research and practice are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Student-centered instruction can build social–emotional skills and peer relations: Findings from a cluster-randomized trial of technology-supported cooperative learning.
    Given the uneven track record of adjunctive social–emotional learning (SEL) programs and waning effects by middle and high school, we propose a more integrative approach to SEL through cooperative learning (CL). CL has demonstrated social–emotional, behavioral, academic, and mental health benefits, but CL lessons are complex and thus can be difficult to design and consistently deliver with fidelity. The present study attempted to address this barrier by examining the effects of technology-assisted CL on five social–emotional competencies, as well as social and behavioral outcomes. Participants were 813 students (50.2% female, N = 408, and 70.7% White, N = 575) from 12 middle and high schools in the Pacific Northwest in a cluster-randomized design where six intervention schools implemented technology-assisted CL and six control schools conducted business as usual. Using multilevel modeling, intervention effects on all outcomes after 1 year were significant, with moderate to large effect sizes, inviting further evaluation of integrative approaches to SEL that are developmentally aligned with the needs of students in secondary education. Although there remains a dearth of universal school-based interventions with demonstrated impacts on social outcomes in middle and high school, the present study builds support for the use of integrative, relationship-based instructional approaches, supported by technology, to promote positive peer relations and social competencies for this age group. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • The importance of treatment integrity: Examining the relationship between dosage and writing intervention outcomes.
    This study explored the relationship between 391 third-grade students’ writing productivity and the amount of intervention dosage received over a 6-week period. In addition, the association between gender and the amount of intervention dosage received was examined. Results indicated that intervention dosage had a statistically significant relationship with students’ writing productivity at the conclusion of intervention implementation. In addition, there was not a statistically significant difference in the amount of intervention dose received between female and male students. Notably, less intervention dosage may be indicative of higher rates of school absenteeism, which is associated with adverse academic outcomes. Implications and future research directions are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Correlates of social justice values in school psychology graduate students.
    School psychologists are well-positioned to serve as advocates for marginalized students to address educational inequities and challenge systemic barriers to well-being. However, if they do not personally endorse social justice values, they may be unwilling to take personal and professional risks to engage in social justice work. The purpose of this study was to examine the extent to which personal characteristics and multicultural competence are associated with social justice values in school psychology graduate students. A sample of 108 graduate students completed the Social Justice Scale, School Psychology Multicultural Competence Scale, and Marlowe–Crowne Social Desirability Scale. Participants strongly endorsed having social justice values; however, they were less likely to report being in a context supportive of social justice work. There was no difference in reported social justice values based on gender or race; however, sexually marginalized students and those with very liberal political ideology reported more positive attitudes toward social justice. Students with very liberal political ideology also reported greater intent to engage in social justice actions in the future. Additionally, there were several positive correlations between social justice values and perceived multicultural competence. Multicultural competence accounted for most of the variance in participants’ perceived ability to engage in social justice actions and being in environments supportive of social justice, while personal characteristics explained most of the variance in participants’ intent to engage in future social justice actions. Notably, political ideology was the most consistent predictor of social justice values. Findings and implications for graduate education programs are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Reputable and affordable programs with a strong commitment to diversity: Factors influencing school psychology student admission decisions.
    A current issue in the field of school psychology is the extreme shortage of school psychologists, and this is likely to persist in the future. Effective recruitment into school psychology programs is one of the most important strategies to increase the number of school psychologists. Within the present study, researchers created the Graduate Enrollment Admissions Rating Scale (GEARS), a survey measuring several different factors that school psychology students consider when applying to graduate programs, to determine what factors contributed to school psychology students’ choice of program. The GEARS was sent via email to current school psychology graduate students. Overall, current students rated program quality, including faculty friendliness, as the most important factor influencing their decision. Second, considerations reflecting the program cost were influential. Diversity issues were the third most important factor in students choosing their school psychology programs. Costs and research/teaching opportunities were more important in the recruitment of doctoral students than specialist students, but specialist students valued convenience of a program more than doctoral students. Results of this study suggest that faculty members in charge of recruiting need to consider ways to manage tuition costs, develop relationships with future students, and strive toward high-quality programs as the best ways to increase the likelihood that students will attend their university. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Time spent on distance learning moderates changes in teachers’ work-related well-being one year after the first school closures.
    It is now well documented that school closures enforced at the beginning of the COVID-19 pandemic impaired teachers’ well-being. Yet, only a few studies tracked changes in teachers’ well-being during the subsequent phases of the pandemic, phases that were characterized by the discontinuous implementation of in-person teaching and distance learning. To fill this gap, we conducted a follow-up study at the end of the school year 2020–2021 (May–June 2021, T2), administering an online questionnaire to Italian teachers (N = 240) who had previously taken part in a data collection conducted at the end of the first school closures (May–June 2020, T1). Our first aim was to monitor changes in teachers’ psychological and work-related well-being between T1 and T2. Our second aim was to assess whether time spent on distance learning moderates these changes in psychological and work-related well-being. Results showed that teachers’ psychological well-being decreased between T1 and T2, whereas work-related well-being increased. What is more, time spent on distance learning moderated the general increase in work-related well-being observed at T2: The longer teachers implemented distance learning during the school year 2021, the less their work-related well-being increased. In conclusion, although it seems that teachers have adapted to the changes associated with the first school closures, this study showed that distance learning remains a possible risk factor for teachers’ well-being. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source

  • Characteristics of school psychology faculty in 2021.
    Although racially, ethnically, and linguistically minoritized school-aged students within the United States are increasing in population, school psychologists have historically been predominantly white, monolingual females. Diversity within the field of school psychology is important for improving students’ achievement and postsecondary success, particularly as it relates to underrepresented students. Research shows that the diversity of school psychology faculty is important for the recruitment and retention of minoritized graduate students. However, demographic information within school psychology has only been calculated within the context of memberships to psychological organizations (e.g., the National Association of School Psychologists), which could underestimate the actual diversity of school psychology faculty currently in the profession. The purpose of this study was to collect information on the demographic characteristics of school psychology faculty as of 2021. A total of 429 school psychology trainers completed a brief web-based survey in which they self-identified their employment characteristics, gender identity, sexual orientation, racial–ethnic identity, (dis)ability status, and languages spoken. At the time of the survey, most of the sample were employed as full professors (30.5%) or assistant professors (29.4%). Results demonstrated that the majority of the sample identified as white (78.6%), cisgender female (66.2%), heterosexual (87.2%), non(dis)abled (95.1%), and monolingual English speaking (83.9%). Faculty of color were more likely to report a higher percentage of time spent teaching as compared to white faculty. Implications of these findings and future directions are discussed. (PsycInfo Database Record (c) 2024 APA, all rights reserved)
    Citation link to source



Back to top