The Society for Social Work and Research

2013 Annual Conference

January 16-20, 2013 | Sheraton San Diego Hotel and Marina | San Diego, CA

Measuring the Implementation of Social Work Interventions: Options and Examples

Friday, January 18, 2013: 2:30 PM-4:15 PM
Executive Center 2A (Sheraton San Diego Hotel & Marina)
Cluster: Research Design and Measurement
Speakers/Presenters:
Diane DePanfilis, PhD, MSW, University of Maryland at Baltimore, Becci A. Akin, PhD, University of Kansas, Charlotte Lyn Bright, PhD, MSW, University of Maryland at Baltimore, Pamela Clarkson Freeman, PhD, MSW, University of Maryland at Baltimore and Stephanie A. Bryson, PhD, MSW, University of Kansas School of Social Welfare
BACKGROUND AND PURPOSE. Intervention research in social work uses scientific methods to evaluate whether an intentional change strategy is both efficacious and effective (Fraser et al., 2009). Before the potential benefit of social work interventions can be evaluated, however, sound methods are needed for evaluating the process of implementation and the fidelity of interventions (Fixsen et al., 2009). In particular, we must consider alternative methods for measuring implementation outcomes, i.e., deliberate and purposive actions to implement new treatments, practices, and services (Proctor et al., 2011), including fidelity, the degree to which interventions are implemented as intended (Mowbray et al., 2003). This is particularly challenging in the context of social work interventions, which often address the complex circumstances that client systems face. The purpose of this workshop is to translate what is known, from published implementation and intervention research and from the authors' experiences, about alternatives for measuring implementation processes and outcomes.

SESSION FORMAT & CONTENT. Educational methods will include lecture, illustration, demonstration, discussion, and practice exercises. Core constructs to be addressed include: organizational culture and climate and readiness to change; implementation drivers (e.g., leadership, organizational, and competency drivers); fidelity; usability testing; technology to support the implementation process efficiently; and analytical methods for using implementation findings to guide coaching and implementation. Definitions, alternative methods for measuring these implementation outcomes, and illustrations from the authors' intervention research will be presented and discussed. For example, to assess fidelity, workshop leaders will contrast measuring structural criteria (e.g., timing and duration of service) with more complex process criteria (e.g., engagement, empowerment, and comprehensiveness of assessments). Intervention research examples are drawn from federally funded child welfare demonstration projects and Implementation Center initiatives, including Permanency Innovations Initiative projects funded by the USDHHS Children's Bureau in Kansas and Washoe County, Nevada; social work practice model implementation projects supported in federal Regions III and IV; and multi-site, community-based replications of a promising social work child maltreatment preventive intervention. Workshop leaders represent research teams from two graduate schools of social work. All participants will receive a resource and reference list and a copy of the methods illustrated in the workshop.

WORKSHOP OBJECTIVES. As a result of this session, participants will achieve the following objectives: (1) understand core implementation outcome domains that should be considered in intervention research designs; (2) understand alternative methods for defining fidelity criteria and implementing fidelity assessments; (3) obtain resources on using technology to support the collaborative collection of implementation data; and (4) consider alternative formats for presenting implementation data to guide ongoing support and decision-making regarding interventions.