Abstract: Validity and Acceptability of a CBT Skills Assessment Tool for Simulated Training in the Treatment of Anxiety Disorders (Society for Social Work and Research 26th Annual Conference - Social Work Science for Racial, Social, and Political Justice)

565P Validity and Acceptability of a CBT Skills Assessment Tool for Simulated Training in the Treatment of Anxiety Disorders

Schedule:
Saturday, January 15, 2022
Marquis BR Salon 6, ML 2 (Marriott Marquis Washington, DC)
Lindsay A. Bornheimer, PhD, Assistant Professor, University of Michigan
Laura Humm, BA, COO, SIMmersion LLC
Michael A. Kallen, PhD, Research Associate Professor, Northwestern University
Meredith E. Coles, PhD, Professor, Binghamton University
Nadine Mastroleo, PhD, Associate Professor, Binghamton University
Shona Vas, PhD, Associate Professor, University of Chicago
Matthew J. Smith, PhD, MSW, LCSW, Associate Professor, University of Michigan, Ann Arbor, MI
Background and Purpose: Anxiety disorders are among the most prevalent mental health problems in the United States. Cognitive-behavioral therapy (CBT) is an effective, evidence-based approach to treating anxiety disorders, yet its delivery depends on skill acquisition. Innovative technology-assisted and simulation-based clinical trainings and evaluations are emerging in the field, and it is essential that these technologies evaluate skills for proficiency and fidelity to an evidence-based practice (EBP). This NIMH-funded study sought to develop and validate Virtual Client Sessions: Cognitive Behavior Therapy Skills Assessment (VCS-CBTSA), a series of standardized simulated anxiety disorder client sessions that are automatically coded using the Assessment of Core CBT Skills (ACCS). This paper presents the validity and acceptability of the tool, including participants' simulation experiences and comparisons of real-life versus computer-based simulation sessions.

Methods: A total of 102 participants (32 clinicians and 70 clinicians-in-training) completed two simulated sessions (i.e., both the cognitive and behavioral simulations for a single, randomly selected diagnosis) and completed an experience questionnaire. The questionnaire included quantitative items and open-ended qualitative questions exploring the acceptability and usability of the simulation and evaluation tool. Prior to using the simulations, 18 participants also submitted audio recordings of sessions with their own clients. All simulated and real sessions were transcribed and coded with the ACCS by human coders.

Results: Reliability of the human-coded scores for the 22 items of the social anxiety disorder exposure simulation (n=32) was high (Cronbach's α = 0.98), exceeding that reported by Muse et al. (2016). Differences between simulation-generated (automated) scores and human-coded scores averaged 0.85 scoring-metric points (median = 0.85; SD = 0.16). Our standard error of measurement (SEM) of 2.3 (95% CI = 4.6) was largely in line with three of the four SEM estimates reported by Muse et al. The intraclass correlation (ICC, consistency model, accounting for random error) between human-coded audio-recorded sessions and human-coded simulation sessions across anxiety disorders and treatment foci was moderate (ICC = 0.51). On a 5-point Likert scale, participants agreed that the cognitive restructuring (M = 3.91, SD = 0.79) and exposure (M = 3.95, SD = 0.83) simulation sessions aligned with their real-life treatment sessions. The majority of participants stated that they would recommend the simulation tool to other trainees (63.7%; M = 3.62, SD = 1.03) and that it improved their confidence in working with anxiety disorders (68.3%; M = 3.57, SD = 1.04). Themes from the open-ended qualitative questions included satisfaction with the realism of the simulated sessions, the interactivity of the tool and platform, the variety within the training experience, and the session structure.
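As a rough consistency check (our assumption about how the interval was derived; the abstract does not state the formula), the reported 95% CI value of 4.6 is consistent with a confidence band of approximately two SEMs around an observed ACCS score:

\[ 95\%\ \text{CI half-width} \approx 2 \times \mathrm{SEM} = 2 \times 2.3 = 4.6 \ \text{scoring-metric points.} \]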

Implications and Conclusions: Findings demonstrate the high internal consistency and acceptability of the evaluation tool, with participants agreeing that the simulations align with their real-life practice experiences. Given that providers and clinical trainees may not have sufficient face-to-face treatment opportunities with clients for CBT skill acquisition, particularly in the context of COVID-19, computerized simulations with automated assessment offer an accessible opportunity to practice skills and receive real-time feedback. Our future research will examine the effectiveness and implementation of this simulation tool across differing anxiety disorders and varying user (i.e., provider, trainee) skill levels.