Research That Matters (January 17 - 20, 2008)


Calvert Room (Omni Shoreham)

Exploring the Unique Contributions of Qualitative Methodology to Program Evaluation

Charlene Cook, MPA, University of Toronto, Faye Mishna, PhD, University of Toronto, and Peter A. Newman, PhD, University of Toronto.

Background and Purpose: Qualitative methodology is largely assigned a secondary role in evaluation research due to its perceived estrangement from evidence-based practice. Nevertheless, rigorous qualitative contributions may enhance evidence-based practice by incorporating client perspectives, relational complexity, and outcomes not accessible through quantitative measures. The purpose of this study is to explore the unique role of qualitative research as a primary methodology in program evaluation, using a qualitative evaluation of an ecologically informed, psychodynamic, school-based intervention for maltreated children as a case example.

Methods: The intervention comprised play therapy with maltreated children and frequent therapist meetings with each child's teacher and parents. In-depth, semi-structured interviews were conducted at six and 12 months with the parents, therapist, and teacher of all six child clients. Qualitative methodology was selected as the primary evaluative tool because of the exploratory nature of the intervention and previous research suggesting the value of qualitative methodology for assessing complex change in children (Jackson, Spreier-Rump, Ferguson, & Brown, 1999). Questions addressed the academic, social, and emotional development of the child, and the influence of the therapeutic intervention on the family system. Interviews were transcribed, coded, and thematically analyzed using the constant comparative method (Creswell, 1998). Measures to ensure trustworthiness included triangulation among parents, teachers, and therapists, prolonged engagement, and negative case analysis. Quantitative measures were also collected at baseline and six months into therapy using the Child Behaviour Checklist (Achenbach, 2001) and the Teacher Report Form (Achenbach, 2001a).

Results: Quantitative analysis was inconclusive regarding effectiveness, owing to the small sample size and to parent/teacher confusion over categorizing children's experiences. Facilitators of and barriers to the children's progress were thematically identified from the perspectives of parents, therapists, and teachers. Furthermore, qualitative analyses allowed us to identify outcomes that were not evident from the quantification of the frequency and intensity of behaviour. These included positive outcomes associated with challenges to authority by girls whose maltreatment had encouraged passivity, and children's newfound ability to understand and verbalize personal limits when under stress. Though positive, these outcomes could have been missed, or even perceived as negative, by the categorizations that typify quantitative behavioural assessments. Qualitative data were also used to inform meaningful changes to ongoing treatment, such as more extensive engagement with a parent who felt excluded from the therapist-child dyad and was considering ending treatment. Qualitative evaluation was likewise used to promote the therapist's liaison role between the child's parents and teacher, once interviews revealed that this role fostered a previously absent connection between family and school. Importantly, qualitative methodology centred perceptions of effectiveness on the experiences of marginalized parents/caregivers.

Conclusions and Implications: Findings contribute to emerging scholarship on the use of qualitative research as a primary methodology in exploratory program evaluation. Qualitative methods enhanced this evaluation by revealing the complexity and dynamism of both the intervention and the client population. As this complexity typifies much of social work program evaluation, qualitative methodologies may be well suited to add flexibility and depth to evaluation and to stand alongside quantitative methods in advancing social work's evidence-based orientation.