Society for Social Work and Research

Sixteenth Annual Conference Research That Makes A Difference: Advancing Practice and Shaping Public Policy
11-15 January 2012 | Grand Hyatt Washington | Washington, DC

196 Using Implementation Science Evaluation Frameworks to Guide Organizational Decision Making Related to Complex System Reforms In Public Child Welfare

Sunday, January 15, 2012: 10:45 AM-12:15 PM
Independence B (Grand Hyatt Washington)
Cluster: Organizations and Management
Speakers/Presenters:
Mary I. Armstrong, PhD, University of South Florida, Diane DePanfilis, PhD, MSW, University of Maryland at Baltimore, Julie S. McCrae, PhD, University of Denver, Brian Deakins, MSW, Children's Bureau, Cathryn C. Potter, University of Denver and Junqing Liu, PhD, University of Maryland at Baltimore
There is a growing body of knowledge from many disciplines about how to implement change in complex organizations and systems. Key features include expertise in implementation strategies, systemic intervention, and an understanding of the process of organizational and systems change. The field of implementation science offers a definition of implementation: "a specified set of activities designed to put into practice an activity or program of known dimensions" (Fixsen et al., 2005), as well as a set of implementation stages and drivers that together describe a framework for implementation. Kotter and colleagues at the Harvard Business School propose a process for implementing organizational change that includes creating a climate for change, engaging and enabling the whole organization, and implementing and sustaining change. Jeff Hiatt's ADKAR Model (awareness, desire, knowledge, ability, and reinforcement) presents another approach to change management.

Much less is known about policy research and evaluation methods that can inform organizational decision making related to the implementation of major system changes. In 2008, the DHHS Children's Bureau granted awards to five regionally based Child Welfare Implementation Centers (ICs) whose role is to assist states, tribes, and communities in implementing large-scale systems change. Currently, 23 projects across the ICs are engaged in system change. Each IC has an evaluation team charged with assessing the process and outcomes of each project. Across the ICs there are differences in conceptual orientations and theoretical frameworks about implementation and system change, as well as variations among the projects in systems, outcomes, interventions, scope, and size. This roundtable session will begin with brief presentations about the integration of evaluation into implementation science and the challenges of utility-focused evaluations of child welfare system change projects. The federal partner from the Children's Bureau will share the Bureau's vision for the ICs and how the project evaluations feed into knowledge building about complex system change. Presenters from three ICs will focus on the contextual, environmental, and political factors that affect evaluation questions, methodologies, data analysis and interpretation, and presentation of findings. The first presenter will discuss the evaluation challenges that arise when the lead project partner is a Tribal child welfare agency and the other key player is the state child welfare agency. These challenges include gaining trust, using data to confront historical and institutional racism, and changes in leadership. The next two presentations will discuss the challenges of evaluating system change to support implementing statewide practice models. One presentation will pose challenges in making causal connections between broadly visioned principles and values and changes in worker behavior.
The third presentation will reflect on the use of implementation drivers to increase fidelity of the intervention and the use of data by the management team to drive implementation decisions. These examples will be used to stimulate dialogue regarding the design of evaluation plans that can inform the implementation of change in complex systems. Participants will be invited to discuss alternatives for measuring key implementation process variables (e.g., organizational climate, leadership) and outcome variables over time in the context of complex political environments.
