Abstract: Using Observer Ratings to Predict Observation Rates and Implementation Quality: Implications for Improving Implementation Process (Society for Social Work and Research 20th Annual Conference - Grand Challenges for Social Work: Setting a Research Agenda for the Future)

Schedule:
Thursday, January 14, 2016: 1:30 PM
Meeting Room Level-Mount Vernon Square A (Renaissance Washington, DC Downtown Hotel)
B. K. Elizabeth Kim, PhD, Postdoctoral Scholar, University of California, Berkeley, Berkeley, CA
Jennifer L. Fleming, MS, Research Associate, Devereux Foundation, Villanova, PA
Paul A. LeBuffe, MA, Director, Devereux Foundation, Villanova, PA
Valerie B. Shapiro, PhD, Assistant Professor, University of California, Berkeley, Berkeley, CA
Background and Purpose: Monitoring implementation fidelity is essential to implementation success and intervention effectiveness. The Promoting Alternative THinking Strategies (PATHS) Curriculum is a school-based prevention program with robust evidence of effectiveness (Greenberg, Kusche, Cook, & Quamma, 1995; Kam, Greenberg, & Kusche, 2004). Although findings suggest that higher implementation quality is critical to achieving youth outcomes (Kam, Greenberg, & Walls, 2003), no research-informed guidance is available on how implementation quality should be monitored. Current implementation guidelines suggest hiring a PATHS technical assistance provider (TA) to observe 20% (8-10) of PATHS lessons. However, the implied monitoring of 8-10 lessons in every K-2 classroom is challenging in routine practice. This paper seeks to understand: 1) whether elements of initial observations predict completion rates of recommended observations; 2) which elements of observations predict concurrent overall implementation quality; and 3) which elements of initial observations predict sustained implementation quality.

Methods: Data come from the first year of the Allentown Social Emotional Learning Initiative, a district-wide implementation of the PATHS curriculum in grades K-2. Two TAs conducted observations in 170 classrooms across 15 schools. TAs rated teacher characteristics (e.g., teacher is patient with students), adherence (e.g., teacher uses PATHS techniques), participant responsiveness (e.g., students enjoy PATHS activities), and Overall Implementation Quality. Using multilevel modeling to account for clustering within schools, we examined the unique effect of each observation element on Overall Implementation Quality within and across time points. Differences between TAs were also examined.

Results:

Completion Rates - Observation completion rates declined from 83% at Time 1 to 32% at Time 8. The initial level of Overall Implementation Quality was not related to observation completion rates. However, teachers’ observed commitment to high-level implementation at Time 1 explained 13% of the variance (p<0.05) in completion rates.

Implementation Quality - The level of Overall Implementation Quality was high, although with significant differences between TAs at all time points. Within each time point, most measures of teacher characteristics, participant responsiveness, and adherence significantly predicted TAs’ ratings of Overall Implementation Quality, to varying extents. For example, 47% of the variance (p<0.001) in Overall Implementation Quality at Time 2 was explained by teachers’ preparedness for PATHS activities. Although this relationship held for both TAs, preparedness explained 46% of the variance (p<0.001) in one TA’s ratings of Overall Implementation Quality but accounted for only 13% of the variance (p=ns) for the other TA.

Over Time - All elements of initial observations significantly predicted sustained Overall Implementation Quality. For example, TAs’ ratings of how much students enjoyed PATHS activities at Time 1 explained 19% of the variance (p<0.01) in Overall Implementation Quality at Time 6.

Conclusions and Implications: Identifying predictors of completion rates and Implementation Quality could improve implementation in school settings. Preliminary findings suggest that initial buy-in and timely technical assistance may improve both the quality of implementation and implementation monitoring. Differences between TAs’ perceptions of Implementation Quality suggest that training and interrater reliability assessment are important in routine implementation.