Background and Purpose:
The social work code of ethics mandates that “Social workers should critically examine and keep current with emerging knowledge relevant to social work and fully utilize evaluation and research evidence in their professional practice” (NASW, 1999, p. 23). Given this mandate, greater attention has been paid in recent years to translating empirical knowledge into clinical practice and to utilizing empirically based practice (EBP) technologies.
Despite important efforts toward more widespread use of EBP, little attention has been paid to the research report as an instrumental tool in the EBP process. This is surprising given that the research report is the traditional means of conveying research findings and thus represents an important opportunity to transmit knowledge to practice effectively. The present study evaluates a random sample of published quantitative research in order to gain a better understanding of the degree to which research methodology is reported in a clear and understandable manner (i.e., consumability).
Methods:
A content analysis (N = 95) was conducted using a cross-sectional research design and multi-stage cluster sampling. Studies were randomly selected from a sampling frame of over 500 studies in the social work literature, derived from journals indexed in Social Work Abstracts and listed in An Author’s Guide to Social Work Journals. The consumability of a study was assessed by examining the degree to which its research methods were explicitly stated. Seventeen aspects of each study were evaluated, including sampling method, research design, method of measurement, and whether the implications of these methods were discussed.
Results:
Results indicate that authors explicitly and effectively identified their study’s purpose, the hypotheses to be tested, and study limitations. However, authors frequently neglected to identify independent and dependent variables, sampling method, scale reliability and validity, practice and policy implications, and the implications of the methodology used. In this sample, 71.6% of studies did not discuss practice implications, 68% did not explore policy implications, and 70% did not identify their sampling method. Finally, when core social work journals (as defined by Social Work Abstracts) were compared with non-core journals, only 26% of studies in core journals discussed the reliability of measures, compared with 63% of studies in non-core journals.
Conclusions and Implications:
These results indicate that although some methods are clearly reported, others are not. This calls into question the degree to which social work research is doing all it can to share the knowledge it generates effectively and responsibly. Study methodology should be an integral part of a practitioner’s evaluation of research, and failure to report methodology clearly can serve as a barrier to that evaluation. Given the many demands placed on practitioners, it seems unnecessarily burdensome to ask them to devote precious time and effort to identifying a study’s methodology before they can effectively evaluate new research. Recommendations for reporting methodology are also discussed.