Society for Social Work and Research 20th Annual Conference - Grand Challenges for Social Work: Setting a Research Agenda for the Future

Going Beyond Effectiveness: Methodological Approaches to Measuring Implementation Processes within a Randomized Controlled Trial

Schedule:
Friday, January 15, 2016: 6:15 PM
Meeting Room Level-Meeting Room 3 (Renaissance Washington, DC Downtown Hotel)
Victoria Stanhope, PhD, Associate Professor, New York University, New York, NY
Mimi Choy-Brown, MSW, Doctoral candidate, New York University, New York, NY
Purpose: The significant gap between what we know to be best practices and the care most people receive persists in part because of limited knowledge of the steps necessary to translate a practice to real-world settings. Barriers to adopting evidence-based practices (EBPs) have been well documented, demonstrating the need to account for the factors that determine how new practices are received in routine settings, in addition to measuring outcomes. In response to this critical need, implementation science has emerged to study the systematic uptake of research findings and EBPs (Nilsen, 2015). This presentation offers methodological approaches from a large NIMH-funded multi-site randomized controlled trial for capturing organizational context and the mechanisms underlying knowledge translation.

Methods: The study is an RCT of person-centered care planning (PCCP), a recovery-oriented practice designed to maximize self-determination for people with severe mental illnesses. The intervention involved two days of PCCP training and a year of bi-weekly technical assistance (TA) calls with supervisors and direct care staff. The study was set in 14 community mental health sites randomized to either the PCCP training or treatment-as-usual arm. The sample (N=280) comprised leadership, supervisors, and direct staff. Implementation processes were measured by: online surveys completed by leadership, direct staff, and supervisors (baseline, 12 months, 18 months); a post-training evaluation; recordings of TA calls (bi-monthly); trainer logs of TA calls; trainer ratings of sites (monthly); online surveys of supervisors (6 months); and consumer service plans.

Results: The complex interplay between contextual factors and the mechanisms underlying PCCP intervention delivery necessitated the use of multiple methodological approaches. Based on the literature and existing implementation frameworks, the following strategy was developed to capture the PCCP implementation process from multiple stakeholder perspectives.

  • Organizational Context
    • Survey data of organizational readiness, leadership, and recovery orientation
    • Survey data listing barriers to implementing PCCP
    • Observations of recorded TA calls to capture barriers to and strategies for PCCP uptake
    • Trainer logs of TA calls and site-specific ratings to capture progress toward PCCP uptake
  • Knowledge Translation
    • Survey data of PCCP knowledge and skills
    • Training evaluation survey on staff efficacy and motivation to implement PCCP
    • Observations of recorded TA calls for PCCP competency development
    • Survey of supervisors' PCCP coaching and implementation leadership
    • Ratings of service plans to capture fidelity to PCCP

Implications: The long pipeline from efficacy to implementation and the fluid, complex world of social service settings necessitate a pragmatic approach to intervention research. As we have demonstrated, documenting implementation processes within effectiveness studies is feasible and enhances the value of the evidence for end users. In addition to demonstrating whether a practice improves outcomes, we can provide knowledge about how it works and under what conditions, which informs local customization and increases the likelihood of adoption and sustainability (Glasgow et al., 2015). These methods can also be embedded in innovative research designs, such as hybrid effectiveness-implementation designs, realist evaluations, and learning evaluations, aimed at closing the gap between research and practice.