Identifying signals for updating systematic reviews : a comparison of two methods

Both methods, alone or in combination, may be considered appropriate tools. Future research should confirm these conclusions in a larger cohort of reviews and assess the predictive validity of the methods against actual updates.

Main Author: Shekelle, Paul G.
Corporate Authors: United States Agency for Healthcare Research and Quality, Southern California Evidence-Based Practice Center/RAND., Tufts Evidence-based Practice Center, University of Ottawa Evidence-based Practice Center
Format: eBook
Published: Rockville, MD : Agency for Healthcare Research and Quality, 2011
Series: Methods research report
Online Access:
Collection: National Center for Biotechnology Information - for collection details see MPG.ReNa
LEADER 04573nam a2200349 u 4500
001 EB000943152
003 EBX01000000000000000736742
005 00000000000000.0
007 tu|||||||||||||||||||||
008 150223 r ||| eng
100 1 |a Shekelle, Paul G. 
245 0 0 |a Identifying signals for updating systematic reviews  |h Electronic resource  |b a comparison of two methods  |c prepared for Agency for Healthcare Research and Quality, U.S. Department of Health and Human Services ; prepared by Paul G. Shekelle ... [et al.] 
260 |a Rockville, MD  |b Agency for Healthcare Research and Quality  |c [2011], 2011 
300 |a PDF file (various pagings)  |b ill 
505 0 |a Includes bibliographical references 
653 |a Systematic Reviews as Topic 
653 |a Meta-Analysis as Topic 
710 2 |a United States  |b Agency for Healthcare Research and Quality 
710 2 |a Southern California Evidence-Based Practice Center/RAND. 
710 2 |a Tufts Evidence-based Practice Center 
710 2 |a University of Ottawa Evidence-based Practice Center 
041 0 7 |a eng  |2 ISO 639-2 
989 |b NCBI  |a National Center for Biotechnology Information 
490 0 |a Methods research report 
500 |a "June 2011." 
856 |u  |3 Full text 
082 0 |a 610 
520 |a Both methods, alone or in combination, may be considered appropriate tools. Future research should confirm these conclusions in a larger cohort of reviews and assess the predictive validity of the methods against actual updates 
520 |a BACKGROUND: Methods of assessing the need for systematic reviews to be updated have been published, but agreement among them is unclear. OBJECTIVES: To compare two methods for assessing the need to update an evidence review, using three evidence reports on the effects of omega-3 fatty acids on cancer, cognition and aging, and cardiovascular diseases (with separate analyses for fish oil and alpha-linolenic acid). The RAND method combines a targeted literature search with the assessments of content experts. The Ottawa method relies on a quantitative and qualitative assessment of study results from a similar targeted search. DATA SOURCES: A MEDLINE search was conducted on a limited set of journals, including five pivotal general medical journals and a small number of specialty journals, covering the period from 1 year prior to the release of the original reports and using their original search strategies. METHODS: The search results were screened using the original eligibility criteria. 
520 |a Agreement between the RAND and Ottawa methods was assessed for each report with the kappa statistic. RESULTS: Overall agreement between the two methods ranged from "nonexistent" (kappa = 0.19, for fish oil and cardiovascular disease) to "almost perfect" (kappa = 1.0, for cognitive function). Many of the disagreements arose when the original review had a Key Question with no evidence and some evidence was identified in the update; in these situations, the RAND method produced a positive signal for updating and the Ottawa method produced a negative signal. A sensitivity analysis that reclassified these situations as agreement between the two methods yielded much better estimates: for three of the four conditions, agreement was "substantial" to "almost perfect," and overall agreement was "substantial." CONCLUSIONS: The RAND method and the modified Ottawa method agree reasonably well in their assessment of the need to update reviews. 
520 |a Study-level data and findings of existing systematic reviews, randomized controlled trials, and large observational studies addressing the original key questions were abstracted. Using the RAND method, we contacted experts--including members of the original technical expert panels and the original peer reviewers--and sought their opinions regarding the status of the original reports and any new references. The results of the literature reviews and expert opinions were combined to determine the need for updating based on predetermined criteria. Using a modification of the Ottawa method, new trial data were meta-analyzed with the original meta-analysis results. A quantitative signal for the need to update was based on statistical differences with the original meta-analyses. Qualitative signals, such as differences in characterizations of effectiveness, new information about harm, and caveats about the previously reported findings, were sought for outcomes without existing meta-analyses.
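The abstract's agreement measure, Cohen's kappa, compares observed agreement between two raters against the agreement expected by chance. A minimal sketch of the computation follows; the per-question "update needed?" labels below are invented for illustration and are not the report's data:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = sorted(set(rater_a) | set(rater_b))
    # Observed agreement: fraction of items where the two raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical per-question updating signals from two methods.
rand_signal   = ["yes", "yes", "no", "yes", "no", "no"]
ottawa_signal = ["yes", "no",  "no", "yes", "no", "yes"]
print(round(cohens_kappa(rand_signal, ottawa_signal), 3))  # → 0.333
```

Identical label lists yield kappa = 1.0, the "almost perfect" end of the scale reported for cognitive function; values near zero indicate little agreement beyond chance, as with the fish oil and cardiovascular disease comparison.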