Society for Social Work and Research 20th Annual Conference - Grand Challenges for Social Work: Setting a Research Agenda for the Future

Coping with Complexity: Attempting to Measure and Compare Implementation Processes and Capacity Across Sites

Schedule:
Friday, January 15, 2016: 6:15 PM
Meeting Room Level-Meeting Room 15 (Renaissance Washington, DC Downtown Hotel)
Megan E. Fitzgerald, PhD, Senior Research Specialist, James Bell Associates, Arlington, VA
Jill Sanclimenti, MBA, Expert Consultant, ICF International, Fairfax, VA
Mary I. Armstrong, PhD, Associate Professor, University of South Florida, Tampa, FL
Michelle Graef, PhD, Research Associate Professor, University of Nebraska-Lincoln, Lincoln, NE
Background
From 2009 to 2013, the Children’s Bureau funded five Implementation Centers (ICs) to provide intensive, multi-year technical assistance to help child welfare agencies foster organizational and systems change. The ICs worked with 25 Implementation Projects (IPs) that addressed diverse child welfare practice and systems issues. IC evaluators collaborated to develop common measures of implementation progress and capacity across projects. First, they developed a primarily quantitative instrument, the Implementation Process Measure (IPM), to measure implementation processes over time (Armstrong et al., 2014). They also created a common protocol, the Implementation Capacity Analysis (ICA), for a qualitative, retrospective examination of the development of implementation capacity.

Methods
Both the IPM and the ICA were designed to reflect work guided by the National Implementation Research Network (NIRN) framework, including implementation stages, activities, and drivers. The IPM included ratings of the stages and activities of implementation and of the salience and installation of implementation drivers. Ratings were reported at six-month intervals for most projects and entered into a web-based dataset. For the ICA, projects conducted focus groups with local implementation teams near the end of the IP project period. Methods for recording and reporting data from these measures varied across projects.

Results
IPM results showed that sites spent an average of 8.86 months on each stage, and most IPs (n=16) reached early design or initial implementation. Evaluation activities, such as identification of fidelity criteria, were the least likely to be initiated or put in place. The drivers reported as both most salient and most fully installed were leadership; shared vision, values, and mission; and stakeholder engagement.

Similarly, ICA results revealed the implementation capacities most frequently noted as important to the implementation process: leadership; training and coaching; shared vision, values, and mission; and decision support data systems. For example, one team member stated, “Leadership is crucial in establishing and promoting the vision for change, creating a sense of urgency around this vision, and creating buy-in for the change effort at all levels of the system.”

Implications
Using a mixed-methods approach increased knowledge about the length of each implementation stage and about the utility and salience of implementation drivers in supporting project implementation and capacity development in complex systems. For example, the longest reported stage of implementation was early design. Early design is a labor-intensive stage for large organizations, such as state child welfare agencies, because of the combined complexity of the suggested activities and the initial capacity of jurisdictions and ICs. This study revealed both the average length of that stage and the ways in which strong leadership and other drivers can support agencies through each stage and on to full implementation.