Intersection of Systematic Review Methodology with the NIH Reproducibility Initiative
2014; National Institute of Environmental Health Sciences; Volume: 122; Issue: 7; Language: English
DOI: 10.1289/ehp.1408671
ISSN: 1552-9924
Authors: Kristina A. Thayer, Mary S. Wolfe, Andrew A. Rooney, Abee L. Boyles, John R. Bucher, Linda S. Birnbaum
Topic(s): Health and Medical Research Impacts
Vol. 122, No. 7; Editorial; Open Access; Published 1 July 2014; https://doi.org/10.1289/ehp.1408671

Kristina A. Thayer, Mary S. Wolfe, Andrew A. Rooney, Abee L. Boyles, and John R. Bucher (Division of the National Toxicology Program) and Linda S. Birnbaum (Office of the Director), National Institute of Environmental Health Sciences, National Institutes of Health, Department of Health and Human Services, Research Triangle Park, North Carolina, USA.

In a landmark 2005 paper published in PLoS Medicine, Ioannidis posited that "most current published research findings are false" (Ioannidis 2005). Consistent with this opinion are reports that drug development has been hindered and many clinical trials wasted because published findings from preclinical studies could not be reproduced despite further effort (Begley and Ellis 2012). The National Institutes of Health (NIH) recently outlined a sweeping set of initiatives to address the lack of reproducibility of research findings (Collins and Tabak 2014). In this editorial we touch on current efforts to address the research reproducibility problem and propose that systematic review methodologies, which are being developed to assess confidence in the quality of evidence used in reaching public health decisions, could also be used to improve the reproducibility of research.

Reports and editorials in the biomedical literature have increasingly drawn attention to a disturbing lack of reproducibility of published scientific findings. Although poor reporting of key aspects of study methodology clearly contributes to the problem, other factors, such as study conduct, may be equally or more important (Begley and Ellis 2012; Ioannidis 2005; Landis et al. 2012; Tsilidis et al. 2013). This situation has prompted actions by both the private and public sectors. These include the private Reproducibility Initiative, a collaboration between PLOS ONE (http://www.plosone.org/), Science Exchange (https://www.scienceexchange.com/), figshare (http://figshare.com/), and Mendeley (http://www.mendeley.com/) (Nice 2013; Wadman 2013), which among other projects is attempting to replicate key findings from the 50 most impactful studies published in the field of cancer biology between 2010 and 2012. A major public effort is the NIH Initiative to Enhance Reproducibility and Transparency of Research Findings, which seeks to increase community awareness of the reproducibility problem, enhance formal training of investigators in elements of proper study design, improve the review of grant applications, and increase funding stability for investigators to enable them to use more appropriate and complex study designs (Tabak 2013). One planned activity of the NIH initiative is to develop a pilot training module on research integrity as it relates to experimental biases and study design.
The intention is to provide specific guidance for researchers to improve the quality of their research publications by increasing their awareness of research practices that may affect the validity of their study findings. This guidance could also be used to improve both the grant proposal and journal peer-review stages to ensure more systematic and rigorous evaluation of both proposed and completed studies.

Hooijmans and Ritskes-Hoitinga (2013) recently published a progress report outlining a number of initiatives to address the reproducibility problem, specifically with respect to preclinical/experimental animal studies performed for translational research. Of course, experimental animal studies are critically important in many areas beyond drug development. Regulations to protect the public from harmful environmental exposures have historically relied heavily on the results of experimental animal studies. Within the larger area of environmental health sciences research, important evidence can also come from epidemiology studies of widely varying design, as well as from "mechanistic studies." The consistent and transparent integration of this evidence to reach public health decisions is of immense international importance.

Implementing remedies to improve the reporting of key aspects of study methodology is perhaps the easiest challenge to address, given that reporting quality checklists are available for clinical trials (Schulz et al. 2010), observational human studies (von Elm et al. 2008), animal studies (Hooijmans et al. 2010; Kilkenny et al. 2010), and in vitro studies (Schneider et al. 2009) (see also EQUATOR Network 2014). An increasing number of journals, including the Nature group, PLOS ONE, and Environmental Health Perspectives, are now providing more explicit guidance to authors on items that should be reported when submitting papers.

A cornerstone of systematic review is the application of transparent, rigorous, objective, and reproducible methodology in a literature-based evaluation to identify, select, assess, and synthesize the results of relevant studies. The application of systematic review methodology in an evaluation does not eliminate the need or the role for expert judgment. These methods do, however, offer a much-improved level of transparency for understanding which critical studies form the basis for decisions and the overall confidence in those decisions.

Establishing guidance to enable systematic assessment of the appropriateness of study design and conduct (or, more generally, study quality) is challenging. Although there is reasonable harmonization of the approaches used to assess internal validity (risk of bias) for human clinical trials (Higgins and Green 2011), there is currently no similar consensus on how to assess whether the findings and conclusions drawn from observational human, experimental animal, and in vitro studies are a true reflection of the outcome of the study. For these types of data, ongoing methods development in the field of systematic review can help.

Interest has been growing in the fields of toxicology and pharmacology (National Research Council 2009; Rooney et al. 2014; Sena et al. 2007; Woodruff and Sutton 2011) in extending systematic review methods beyond the traditional area of human clinical trials to consider other evidence streams (observational human, experimental animal, and in vitro studies).
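To illustrate what the transparent, reproducible identify/select/assess steps described above might look like in practice, the short Python sketch below applies a pre-specified set of inclusion criteria to candidate studies and records the reason for every screening decision. This is a minimal, hypothetical example; the study fields, criteria, and screen() function are illustrative assumptions, not any published protocol or tool.

```python
# Hypothetical sketch: applying pre-specified inclusion criteria so that
# study selection in a literature-based evaluation is transparent and repeatable.
from dataclasses import dataclass

@dataclass
class StudyRecord:
    study_id: str          # e.g., a PubMed ID or internal accession (illustrative)
    species: str           # "human", "rat", "in vitro", ...
    exposure_measured: bool
    outcome_reported: bool

# Criteria fixed in the review protocol before screening begins (assumed values).
INCLUSION_CRITERIA = {
    "species": {"human", "rat", "mouse", "in vitro"},
    "requires_exposure_measure": True,
    "requires_outcome_report": True,
}

def screen(study: StudyRecord) -> tuple[bool, list[str]]:
    """Apply each criterion and return the decision plus the reasons,
    so every exclusion is documented rather than left to memory."""
    reasons = []
    if study.species not in INCLUSION_CRITERIA["species"]:
        reasons.append(f"species '{study.species}' outside protocol scope")
    if INCLUSION_CRITERIA["requires_exposure_measure"] and not study.exposure_measured:
        reasons.append("no exposure measurement reported")
    if INCLUSION_CRITERIA["requires_outcome_report"] and not study.outcome_reported:
        reasons.append("outcome of interest not reported")
    return (len(reasons) == 0, reasons)

if __name__ == "__main__":
    candidates = [
        StudyRecord("PMID-0001", "rat", True, True),
        StudyRecord("PMID-0002", "zebrafish", True, False),
    ]
    for s in candidates:
        included, why = screen(s)
        print(s.study_id, "included" if included else f"excluded: {'; '.join(why)}")
```

Because the criteria are fixed before screening begins, a second reviewer can rerun the same step on the same literature set and reach the same selection, which is the sense in which the methodology itself becomes reproducible.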
For example, the National Toxicology Program (NTP) Office of Health Assessment and Translation (OHAT) has worked internationally to develop a formal approach for systematic review and evidence integration in literature-based evaluations through consultation with technical expert advisors, its scientific advisory committees, and other agencies or programs that conduct literature-based assessments, as well as through public comment by stakeholders (Rooney et al. 2014). The Navigation Guide Work Group has developed a similar framework, and recent case studies support the feasibility of applying systematic review methods to environmental health evaluations. Because a key aspect of conducting a systematic review is evaluating study quality, including internal validity or risk of bias (Higgins and Green 2011), work by the NTP, the Navigation Guide Work Group, and others is leading to the development of powerful risk-of-bias assessment tools applicable to a variety of human, animal, and mechanistic study designs. It is also leading to the development of methods for assessing and integrating data within and across multiple evidence streams. The systematic review methods currently under development differ in some respects but are substantively similar in approach. The flexible framework developed by OHAT (Rooney et al. 2014) allows evaluations to be tailored so that environmental health assessments can appropriately incorporate information derived from a diverse mix of study types and designs. The framework is envisioned to evolve continually, with refinements and improvements anticipated with use.
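As a rough illustration of how per-question risk-of-bias ratings might be captured for studies from different evidence streams and then tallied, consider the hypothetical Python sketch below. The questions, the four-tier rating scale, and the summary rule are assumptions made for illustration, loosely modeled on the kinds of tiered judgments such tools elicit; they are not the NTP/OHAT or Navigation Guide instruments themselves.

```python
# Hypothetical sketch: recording per-question risk-of-bias ratings per study
# and summarizing them by evidence stream. Questions, scale, and summary rule
# are illustrative assumptions, not an NTP/OHAT or Navigation Guide tool.
from collections import Counter

RATINGS = ["definitely low", "probably low", "probably high", "definitely high"]

# One dictionary per study: evidence stream plus an answer to each
# risk-of-bias question considered applicable to that study design.
assessments = [
    {"study": "Cohort A", "stream": "human",
     "randomization": "not applicable",
     "confounding": "probably high",
     "outcome assessment": "probably low"},
    {"study": "Rodent bioassay B", "stream": "animal",
     "randomization": "definitely low",
     "confounding": "not applicable",
     "outcome assessment": "probably low"},
]

def summarize(stream: str) -> Counter:
    """Tally ratings across all applicable questions for one evidence stream,
    making the basis for an overall judgment explicit and auditable."""
    tally = Counter()
    for a in assessments:
        if a["stream"] != stream:
            continue
        for question, rating in a.items():
            if question in ("study", "stream") or rating == "not applicable":
                continue
            tally[rating] += 1
    return tally

if __name__ == "__main__":
    for stream in ("human", "animal"):
        print(stream, dict(summarize(stream)))
```

Keeping the ratings in a structured, question-level form like this is what allows the same evidence base to be re-examined, re-weighted, or integrated across streams without losing the trail from individual study to overall conclusion.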
Investments in biomedical research today must result in improvements in quality of life in the future. Addressing the reproducibility of published scientific findings is of vital importance for maintaining the integrity of biomedical research. We believe that widespread adoption of and adherence to elements of systematic review throughout the entire scientific process, including study concept, grant writing and review, study performance, study reporting, and ultimately study utilization for reaching conclusions in environmental health sciences or any other area of biomedical research, can significantly improve both public health decisions and our return on scientific investment.

The authors declare they have no actual or potential competing financial interests.

References

Begley CG, Ellis LM. 2012. Drug development: raise standards for preclinical cancer research. Nature 483(7391):531-533.
Collins FS, Tabak LA. 2014. Policy: NIH plans to enhance reproducibility. Nature 505(7485):612-613.
EQUATOR Network. 2014. Enhancing the Quality and Transparency of Health Research (EQUATOR) Network. Available: http://www.equator-network.org/ [accessed 18 December 2013].
Higgins JP, Green S, eds. 2011. Cochrane Handbook for Systematic Reviews of Interventions. Version 5.1.0 (updated March 2011). Available: http://handbook.cochrane.org/ [accessed 3 February 2013].
Hooijmans CR, Leenaars M, Ritskes-Hoitinga M. 2010. A gold standard publication checklist to improve the quality of animal studies, to fully integrate the Three Rs, and to make systematic reviews more feasible. Altern Lab Anim 38(2):167-182.
Hooijmans CR, Ritskes-Hoitinga M. 2013. Progress in using systematic reviews of animal studies to improve translational research. PLoS Med 10(7):e1001482; doi:10.1371/journal.pmed.1001482.
Ioannidis JPA. 2005. Why most published research findings are false. PLoS Med 2(8):e124; doi:10.1371/journal.pmed.0020124.
Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. 2010. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. PLoS Biol 8(6):e1000412; doi:10.1371/journal.pbio.1000412.
Landis SC, Amara SG, Asadullah K, Austin CP, Blumenstein R, Bradley EW, et al. 2012. A call for transparent reporting to optimize the predictive value of preclinical research. Nature 490(7419):187-191.
National Research Council. 2009. Science and Decisions: Advancing Risk Assessment. Available: http://www.nap.edu/openbook.php?record_id=12209&page=R1 [accessed 17 January 2013].
Nice M. 2013. NIH Acknowledges Irreproducibility in Experiment Results, Seeks New Validation Standards. BioNews Texas. Available: http://bionews-tx.com/news/2013/08/01/nih-acknowledges-irreproducibility-in-experiment-results-seeks-new-validation-standards/ [accessed 15 December 2013].
Rooney AA, Boyles AL, Wolfe MS, Bucher JR, Thayer KA. 2014. Systematic review and evidence integration for literature-based environmental health science assessments. Environ Health Perspect 122:711-718; doi:10.1289/ehp.1307972.
Schneider K, Schwarz M, Burkholder I, Kopp-Schneider A, Edler L, Kinsner-Ovaskainen A, et al. 2009. "ToxRTool", a new tool to assess the reliability of toxicological data. Toxicol Lett 189(2):138-144.
Schulz KF, Altman DG, Moher D. 2010. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMJ 340:c332; doi:10.1136/bmj.c332.
Sena E, Wheble P, Sandercock P, Macleod M. 2007. Systematic review and meta-analysis of the efficacy of tirilazad in experimental stroke. Stroke 38(2):388-394.
Tabak LA. 2013. Guest Director's Letter: NIH Initiative on Enhancing Research Reproducibility and Transparency. Available: http://www.niams.nih.gov/News_and_Events/NIAMS_Update/2013/tabak_letter.asp [accessed 15 December 2013].
Tsilidis KK, Panagiotou OA, Sena ES, Aretouli E, Evangelou E, Howells DW, et al. 2013. Evaluation of excess significance bias in animal studies of neurological diseases. PLoS Biol 11(7):e1001609; doi:10.1371/journal.pbio.1001609.
von Elm E, Altman DG, Egger M, Pocock SJ, Gotzsche PC, Vandenbroucke JP. 2008. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. J Clin Epidemiol 61(4):344-349.
Wadman M. 2013. NIH mulls rules for validating key results. Nature 500(7460):14-15.
Woodruff TJ, Sutton P; Navigation Guide Work Group. 2011. An evidence-based medicine methodology to bridge the gap between clinical and environmental health sciences. Health Aff 30(5):931-937.