32P
Keeping Track of Adaptations: A Tool for Practitioners and Evaluators

Schedule:
Thursday, January 15, 2015
Bissonet, Third Floor (New Orleans Marriott)
Cynthia George, MSSW, Doctoral Candidate, Virginia Commonwealth University, Richmond, VA
Background and Purpose:

Evidence-based programs (EBPs) in the form of manualized education materials are used in schools across the country to address a range of health and safety issues. However, local practitioners often adapt these manuals to local conditions. Such adaptations occur organically and are rarely tracked systematically; norms of secrecy around adapting manualized programs may further discourage tracking. Making local adaptations should be an accepted practice, and local groups should routinely assess a manualized program for local relevance before investing resources in its implementation. Tracking adaptations is especially important when evaluation data are also being collected about the program. This project worked within a Community-Engaged Research (CEnR) framework to help solve a real-world problem driven by community needs: the evaluation of a program that had been locally adapted over time.

 

Methods:

Community partners requested a review of a manualized program on healthy relationship skills for teens. The program had been delivered over a ten-year span in an urban county in a southern state, reaching thousands of academically at-risk teens through peer mentors in a variety of school-based settings. Multiple forms of engagement were employed to build mutual evaluation goals. Documents dating from 1998 to 2013 were reviewed, drawing on the organizational and personal files of program leaders; these included grant contracts, the original program manual, successive iterations of the manual, meeting minutes, strategic plans, and evaluation reports. An audit trail of the documents reviewed was maintained, and an electronic archive was created. A coded list of common reasons practitioners adapt programs was developed as an analysis framework from the limited literature on this subject.
Findings:

Member checking with program leaders identified three major phases of adaptation, which coincided with major shifts in grant funding. The most common reasons for adaptation included: accommodating common group characteristics; use by peers/self-advocates; accommodating implementation by a team; school consultation/logistics; expert consultation; revision of “dated” information; philosophical/paradigmatic changes; and updates to instructional technology. Although the end product program leaders had created was superior to the original manual, none of the adaptations had been systematically tracked, and the learning objectives and evaluation measures were inconsistent.

Conclusion & Implications:

Tracking local adaptations and updating learning objectives and evaluation measures creates the opportunity for local evaluation data about an EBP to be meaningful for effectiveness trials. While the specific findings informed local decision-making, the evidence-informed analysis framework created here is expected to be useful to many practitioners and across a range of school-based health- and safety-related EBPs. This Track Your Adaptations Tool is a transferable coded list of common adaptations that can serve as: 1) a discussion guide for assessing new programs, 2) a review format for tracking adaptations to existing programs, and/or 3) a translation device between academic and community partners to track adaptations over time. The tool is available at: https://TrackYourAdaptations.WikiSpaces.com