Saturday, 14 January 2006 - 3:00 PM

Opening the “Black Box” of Implementation Fidelity in School-Based Interventions

Bridget E. Weller, MSW, University of North Carolina at Chapel Hill, Roderick A. Rose, MS, University of North Carolina at Chapel Hill, and Gary L. Bowen, PhD, ACSW, University of North Carolina at Chapel Hill.

Purpose: Meaning can be attributed to evaluation outcomes only when researchers understand program fidelity. Implementation evaluations measure the level of program fidelity by analyzing how programs are implemented, the degree to which programs deviate from prescribed protocols, and the sources and consequences of variation in implementation, thereby opening the so-called “black box” of evaluation research. This presentation reviews the challenges encountered and strategies used in understanding program implementation in 11 North Carolina schools implementing the School Success Profile-Intervention Package (SSP-IP). The SSP-IP consists of a structured planning process, Results-focused Planning, and two assessment instruments: a survey of youth ecology and a survey of school organizational culture. A detailed program manual defines the intervention through recommended guidelines to be followed during six planning meetings focused on survey findings. Teams of school employees, parents, and students conduct these meetings without direct input from the SSP-IP team.

Methods: Several data collection strategies were employed. First, each school's mission statement, demographics and resources, building condition, and primary improvement goals were recorded during site visits. Second, after three months of implementation, employees at each school were interviewed. Finally, a document review examined how closely schools followed the manual's guidelines (e.g., the number of meetings held during the school year and the length of each meeting). These reports were examined in the context of survey data collected with the School Success Profile-Learning Organization (SSP-LO), which was administered to employees to assess their capacity to value and use information and tacit knowledge. Data were coded in ATLAS.ti, and a holistic approach to data analysis was used.

Results: Although the manual was viewed as helpful in defining key terms and guidelines, the results demonstrated that schools varied in how closely they adhered to its protocols. “Specialty” schools, those undergoing a government-mandated reform, were less likely to follow SSP-IP protocols than traditional schools. Schools that failed to complete tasks outside the SSP-IP protocols were also less likely to complete SSP-IP requirements. Schools that spent more time preparing for training had higher program fidelity. Finally, low implementation fidelity was not related to low SSP-LO scores, suggesting that other factors, such as school resources or selection bias in survey response, were responsible.

Implications: These findings support the notion that a program may have the components necessary to be effective yet not be implemented with high fidelity. First, consistent with the concept of formative evaluation, the level of assistance provided in the first year should vary with the degree of program fidelity at each site. Second, the window for solving problems is limited; problems that arise should be addressed quickly. Third, regular weekly dialogue with each site encourages school employees to stay on track with the program. Fourth, “specialty” schools should receive more direct support from intervention teams. In addition, researchers must develop more rigorous methods of evaluating fidelity, including quantitative and mixed methods. Finally, they should examine the protocols themselves for troublesome aspects that may invite poor adherence to program guidelines.

