Combining Quasi-Experimental and Experimental Approaches to Improve Power & Precision in Social Work Intervention Research and Evaluation

Schedule:
Friday, January 16, 2015: 10:00 AM
La Galeries 1, Second Floor (New Orleans Marriott)
Roderick A. Rose, PhD, Research Assistant Professor, University of North Carolina at Chapel Hill, Chapel Hill, NC
Natasha Bowen, PhD, Professor, University of North Carolina at Chapel Hill, Chapel Hill, NC
Background and Significance. Social work researchers often confront daunting barriers to conducting experiments to test their interventions, especially when research involves treatment assignment at the group level. Onerous cost and sample size requirements often lead to underpowered studies, particularly in school and community research involving very large organizational clusters. However, as this study demonstrates, experimental and quasi-experimental methods for causal inference can be combined under the right conditions to produce statistically powerful designs with better-defined, more precise intervention effect estimates, without sacrificing the internal validity of randomization.

Methods. We use the evaluation of an in-school practice model to show how combining an experiment with two quasi-experimental techniques can improve precision. Five of 10 low-performing schools in one school district were randomly assigned to use the Elementary School Success Profile (ESSP) ecological assessment tool with low-performing students; the other five low-performing schools were randomly assigned to the control condition. School teams were given $1,500 to $2,000 to address threats to success shown in the assessment data. The small sample size was typical of studies conducted in schools; however, 18 higher-performing schools in the district did not meet the criteria for randomization. We used regression discontinuity design (RDD), a quasi-experimental approach that exactly models the assignment mechanism, to include these schools in the statistical analysis and improve power. In addition, we illustrate how a difference-in-difference-in-difference (DDD) estimate both adds precision and reduces dilution by estimating the effect only for the low-performing students for whom the program was intended and by adjusting for students’ initial conditions. The effect of school-level assignment (difference 1) was combined with the effect of student-level assignment (difference 2) in the context of improvement over time (difference 3). The third difference adjusted for inter-individual differences in performance; the second difference adjusted for improvements typical of higher-achieving students, focusing estimation on the relevant comparison with youth performing at the same level. An illustrative specification of the combined model appears below.
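To make the estimation strategy concrete, the combined design can be written as a single regression. This is a sketch of the general approach, not the exact specification estimated in the study: Y_{ist} denotes the outcome for student i in school s at time t; T_s indicates a school's randomized assignment to the ESSP condition; L_{is} indicates a student's low-performing status; P_t indicates the post-intervention period; and f(R_s) is a function of the school-level performance score that determined eligibility for randomization, the RDD component that allows the 18 higher-performing schools to contribute to estimation:

Y_{ist} = \beta_0 + \beta_1 T_s + \beta_2 L_{is} + \beta_3 P_t + \beta_4 (T_s \times L_{is}) + \beta_5 (T_s \times P_t) + \beta_6 (L_{is} \times P_t) + \beta_7 (T_s \times L_{is} \times P_t) + f(R_s) + \varepsilon_{ist}

Under this specification, \beta_7 is the DDD estimate: the intervention effect for low-performing students in treatment schools, net of baseline differences between low- and high-performing students and of gains over time common to students at the same performance level.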

Results. Adding the 18 schools using RDD reduced the standard errors of school-level predictors, indicating a higher-powered design. The DDD yielded effect estimates for low-performing students that also had lower standard errors and were undiluted by both prior educational disadvantages and learning differences between high- and low-performing students. For example, the DDD analysis removed baseline differences between low- and high-performing students, as well as learning gains common to all low performers, from the estimate of the program's impact on learning among the low-performing sample.

Implications. Novel combinations of experimental and quasi-experimental designs can benefit social work researchers, who typically must work with small samples but who may also benefit from pre-selecting candidates for randomization based on performance criteria. Using an RCT of an in-school practice intervention, we show how such criteria can be put to use at both the school and student levels to produce higher-powered designs with less effect dilution and lower standard errors. These methods represent opportunities to address long-standing power and precision problems in social work research conducted with communities, schools, and other large clusters.