Abstract: Preventing Placement in Foster Care: Evidence from a Quasi-Experimental Design (Society for Social Work and Research 25th Annual Conference - Social Work Science for Social Change)

Schedule:
Thursday, January 21, 2021
Scott Huhr, MA, Senior Researcher, University of Chicago, Chicago, IL
Background and Purpose: The Family First Prevention Services Act expands access to placement prevention services, provided states invest in evidence-based interventions. The emphasis on evidence-based interventions is cause for concern given the paucity of interventions that meet the well-supported standard needed to secure long-term federal support. Among other implications, the law places a premium on moving interventions up the evidence ladder from promising to well-supported. Implementing credible evaluation designs, however, is a challenge.

In this paper, we describe the evaluation of a placement prevention program. The intervention is an intensive service model implemented at broad scale in a medium-size state. To meet the methodological threshold set by reviewers, we adopted a novel approach that incorporates random effects to control for county differences, uses exact matching to adjust for selection effects, and adjusts for worker-level referral bias using each worker's Empirical Bayes estimate.
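
Schematically, these pieces can be combined in a single multilevel discrete-time hazard specification. The notation below is an illustrative sketch of how such a model is typically written, not a formula reproduced from the paper:

\[
\operatorname{logit}\, h_{ijk}(t) \;=\; \alpha_t \;+\; \beta\, T_{ijk} \;+\; \gamma' x_{ijk} \;+\; \delta\, \hat{u}_j \;+\; v_k, \qquad v_k \sim N(0, \sigma_v^2),
\]

where \(h_{ijk}(t)\) is the hazard of placement in interval \(t\) for child \(i\) served by worker \(j\) in county \(k\), \(\alpha_t\) are period effects, \(T_{ijk}\) indicates a program referral, \(x_{ijk}\) are the clinical covariates used in matching, \(\hat{u}_j\) is worker \(j\)'s Empirical Bayes referral residual, and \(v_k\) is the county random intercept; estimation is carried out on the exact-matched sample.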

Methods: Data for the study come from linked administrative records. The state administrative records track CPS investigations, service referrals, placement histories, and clinical acuity, and connect each child to the worker who would have made the referral decision. Program data were sourced from the provider. Together, those data allowed us to locate the timing of the referral relative to the child's placement history, if any. To capture referral tendencies, we used the Empirical Bayes residual, which measures a worker's referral rate relative to all other workers, adjusted for clinical differences in the cases encountered. Exact matching allowed us to achieve baseline equivalence in the matched sample. To address censoring and the nested structure of the data, we relied on a multilevel discrete-time hazard model.
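
To make the worker-level adjustment concrete, the sketch below illustrates one standard way to construct such an Empirical Bayes residual: a case-mix-adjusted, observed-minus-expected referral rate per worker, shrunk toward zero in proportion to caseload size. The Python code and column names (worker_id, referred, and the clinical covariates) are hypothetical illustrations, not the study's actual estimation code.

```python
import pandas as pd
import statsmodels.api as sm

def worker_eb_residuals(df: pd.DataFrame, covariates: list) -> pd.Series:
    """Illustrative Empirical Bayes referral residual per worker (sketch only)."""
    # 1. Case-mix model: probability a case is referred to the program given
    #    its clinical characteristics, ignoring worker identity.
    X = sm.add_constant(df[covariates])
    casemix = sm.Logit(df["referred"], X).fit(disp=0)
    df = df.assign(expected=casemix.predict(X))

    # 2. Raw worker residual: mean observed minus mean expected referral rate.
    grp = df.groupby("worker_id").agg(
        n=("referred", "size"),
        observed=("referred", "mean"),
        expected=("expected", "mean"),
    )
    grp["resid"] = grp["observed"] - grp["expected"]

    # 3. Shrinkage: residuals from small caseloads are pulled toward zero.
    #    tau2 is a method-of-moments estimate of the between-worker variance
    #    net of sampling noise.
    sampling_var = grp["expected"] * (1 - grp["expected"]) / grp["n"]
    tau2 = max(grp["resid"].var(ddof=1) - sampling_var.mean(), 0.0)
    return (tau2 / (tau2 + sampling_var)) * grp["resid"]
```

A residual of this kind can then enter the hazard model as a worker-level control; an equivalent random-intercept formulation would serve the same purpose.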

Results: Prior to matching, program referrals were more likely to enter placement, indicating that referrals came from a higher-risk sub-population. This assignment bias was confirmed with the assessment data. Caseworkers were a significant source of bias, and county placement rates were also an important source of variation. After exact matching on a substantial set of covariates capturing the clinical characteristics of the population, we found a significant average treatment effect in reducing placement. The treatment effect varies over time, with the largest effect in the 3 months following referral. We also tested for and found sustained effects 12 months post-intervention.

Conclusions and Implications: With the growing emphasis on evidence-based interventions and the limited opportunities within child welfare for conducting random-assignment evaluations, quasi-experimental designs are important. The willingness of forward-looking state leadership, private agency partners, and social scientists to collaborate paid important dividends. As part of the research team, we designed the study without interference. Methodologically, we found that worker referral tendencies are an important confound, yet one rarely, if ever, accounted for in QEDs. After a thorough review by the Evidence Clearinghouse, the intervention found its way onto the list of supported programs, which is, in the end, a tribute to the clinicians working directly with families.