Motivational Interviewing (MI) has become a widely accepted and valued approach in social work settings such as child welfare, mental health, corrections, and healthcare, and considerable evidence supports its effectiveness across these domains (Lundahl et al., 2013). As social workers increasingly rely on MI, tools are needed to assess its implementation in both research and applied settings. Three research questions guided this study: (1) What measurement tools assess MI implementation? (2) What are the pragmatic and psychometric features of the identified tools? (3) What is the content validity of these tools, individually and collectively?
METHODS
A systematic review was conducted to identify MI implementation tools and the literature on their psychometric properties, following PRISMA guidelines throughout. Content analysis was used to identify adherence targets in relation to best-practice guidelines for MI. Fourteen electronic databases were searched using two term clusters, “motivational interviewing” and “adherence measures,” yielding 733 unique references. Inclusion criteria were: (a) studies that explicitly discuss the development or validation of MI adherence measures, (b) studies that mention specific MI adherence measures, and (c) studies that discuss adherence measures that may incorporate MI along with other therapeutic modalities. Two researchers independently reviewed all articles, with high interrater reliability, and identified seven MI adherence instruments; two of these had multiple versions, for a total of nine instruments. Content analysis was used to describe each instrument, including its goals, how data are captured, how scores are derived, and the targets of MI adherence.
RESULTS
With regard to targets of MI, 66% of the adherence tools assess use of “OARS” (open questions, affirmations, reflections, summaries), 78% measure change talk elicitation or reinforcement strategies, 33% examine use of empathy, 44% measure partnership efforts, 33% measure goal formation, and only a few assess barriers and ambivalence. Considerable variance was identified in how the tools assessed MI skills. For example, some instruments had as few as 11 MI targets, whereas others had as many as 36. Some tools examined only the provider’s behaviors, whereas others also examined the client’s. Most of the instruments relied on audio recordings of professional–client interactions; one relied on written questionnaires. Outputs also varied: some instruments produced a global “adherence score,” whereas others provided strength ratings on discrete MI skills or behavioral targets. Only one instrument used sequential coding, in which client and provider behaviors were evaluated in concert to examine the function of MI skills.
CONCLUSIONS AND IMPLICATIONS
Existing MI adherence tools vary in focus and in their methods of quantifying MI skills. Current measurement tools do not appear to align explicitly with the processes outlined in the most recent version of MI (Miller & Rollnick, 2012) or to target unique practice contexts, such as corrections versus healthcare. Further work is needed to solidify which aspects of MI are critical and how these can be efficiently represented through adherence tools in nuanced settings.