Purpose: Two studies were conducted to assess the methods used in published reviews of research on the effects of a prominent evidence-based program. The purpose was to describe the methods used to identify, analyze, and synthesize results of empirical research on intervention effects, and to determine whether published reviews are vulnerable to various sources and types of bias.
Methods: Study 1 extracted and analyzed information on the methods, sources, and conclusions of 37 published reviews of research on the effects of one model program. Study 2 compared the results of one published randomized controlled trial with the summaries of that trial that appeared in 13 published reviews. Both studies used descriptive statistics and content analysis.
Results: Study 1: Published reviews varied in the transparency of their inclusion criteria, their strategies for locating relevant published and unpublished data, the standards used to evaluate evidence, and the methods used to synthesize results across studies. Most reviews relied solely on narrative analysis of a convenience sample of published studies. None of the reviews used systematic methods to identify, analyze, and synthesize results. Study 2: When the results of a single study were traced from the original report to their summaries in published reviews, three patterns emerged: a complex set of results was simplified, non-significant results were ignored, and positive results were over-emphasized. Most reviews used a single positive statement to characterize the results of a study that were decidedly mixed. This suggests that the reviews were influenced by confirmation bias, the tendency to emphasize evidence that supports a hypothesis and to ignore evidence to the contrary.
Conclusions and implications: Published reviews may be vulnerable to biases that scientific methods of research synthesis were designed to address. This raises important questions about the validity of traditional sources of knowledge about “what works,” and suggests the need for a renewed commitment to using scientific methods to produce valid evidence for practice.