Methods. An experimental design was used to test the intervention. Youth were recruited from community-based neighborhood sites in Central New York; parental consent and youth assent were obtained before participation. Youth were then randomized into the treatment or control group. Pre-test surveys were administered during the first week of the program and post-tests at program end. The Core Instrument (DHHS Office of Adolescent Pregnancy Prevention Core Data Set) included demographic data, attitudes about teen pregnancy, and risk behaviors. Completed surveys were scanned into an SPSS database using SNAP Professional Edition software, and the data were converted for further analysis using STAT/Transfer 9.5. The statistical model was the classic linear model with post-test scores regressed on treatment condition and pre-test scores. An intensive process evaluation protocol was used to monitor fidelity to the program.
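The post-test-with-pre-test-regressor model described above can be sketched as an ordinary least squares regression of post-test scores on a treatment indicator plus pre-test scores. The sketch below is a minimal stdlib-only illustration with simulated data; the variable names, effect sizes, and OLS helper are assumptions for illustration, not the study's actual data or code.

```python
# Minimal sketch (not the study's code) of the linear post-test model:
# posttest ~ intercept + pretest + treatment.  All data are simulated.
import random

random.seed(0)
n = 351  # sample size matching the abstract, for illustration only
pretest = [random.gauss(50, 10) for _ in range(n)]
treat = [random.randint(0, 1) for _ in range(n)]  # 0 = control, 1 = treatment
# Simulated post-test: baseline carry-over plus an assumed treatment effect of 2.0
posttest = [5 + 0.8 * p + 2.0 * t + random.gauss(0, 5)
            for p, t in zip(pretest, treat)]

def ols(X, y):
    """Solve the normal equations (X'X)b = X'y by Gaussian elimination."""
    k = len(X[0])
    A = [[sum(X[i][a] * X[i][b] for i in range(len(y))) for b in range(k)]
         for a in range(k)]
    v = [sum(X[i][a] * y[i] for i in range(len(y))) for a in range(k)]
    for col in range(k):                      # forward elimination
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[col])]
            v[r] -= f * v[col]
    beta = [0.0] * k                          # back substitution
    for r in reversed(range(k)):
        beta[r] = (v[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return beta

X = [[1.0, p, t] for p, t in zip(pretest, treat)]
intercept, b_pretest, b_treat = ols(X, posttest)
print(f"estimated treatment effect: {b_treat:.2f}")
```

The treatment coefficient estimates the adjusted group difference at post-test while the pre-test covariate absorbs baseline variation, which is the rationale for this design over a simple post-test comparison.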
Results. The sample (N=351) was evenly divided between boys and girls. Thirty percent reported Hispanic or Latino backgrounds, and 48% were African-American. Most youth (88%) resided with at least one parent, and 37% lived with both parents. At post-test there were few statistically significant differences between the treatment and control groups on the outcome variables, which led to further examination of "intervention exposure" and its impact on outcomes. Attendance logs and other fidelity measures were examined to determine how well participants adhered to the protocol. This analysis revealed a lower than expected program completion rate (60%).
Conclusions and Implications.
Social workers in small, non-profit social service agencies are increasingly implementing rigorous evaluation designs, but resource limitations often prevent them from systematically assessing fidelity to protocols and thereby determining program efficacy. This presentation examined the centrality of these processes and the challenges of implementing them. The process evaluation methods were carefully designed, implemented, and incorporated into the organizational culture to ensure continuity with the program goals. They provided tangible markers for referencing progress, monitoring timelines, and implementing a high-quality research intervention. However, they were not flawless. Dosage (hours of exposure to the intervention) was low; had it been higher, treatment effects might have been larger. After-school youth programs with voluntary attendance must overcome these obstacles in order to compete effectively with other programs. Effective methods for implementing rigorous program monitoring activities will be discussed.