Review · Open access · Peer reviewed

Open synthesis and the coronavirus pandemic in 2020

2020; Elsevier BV; Volume: 126; Language: English

10.1016/j.jclinepi.2020.06.032

ISSN

1878-5921

Authors

Neal Haddaway, Elie A. Akl, Matthew J. Page, Vivian Welch, Ciara Keenan, Tamara Lotfi

Topic(s)

Artificial Intelligence in Healthcare and Education

Abstract

• Open Science principles are vital for ensuring reproducibility, trust, and legacy.
• Evidence synthesis is a vital means of summarizing research for decision-making.
• Open Synthesis is the application of Open Science principles to evidence synthesis.
• Open approaches to planning, conducting, and reporting synthesis have many benefits.
• We call on the evidence synthesis community to embrace Open Synthesis.

The coronavirus disease 2019 (COVID-19) pandemic of 2020 has caused high levels of mortality and continues to threaten the lives of the global population [[1] World Health Organization. Coronavirus disease 2019 (COVID-19): situation report, 85. Geneva: World Health Organization; 2020]. The pandemic has amounted to a "once in a lifetime" event for humanity and has affected every sector of life: health, education, the economy, the environment, and beyond. It continues to threaten job prospects for millions of people and has caused widespread economic turmoil [[2] McKibbin WJ, Fernando R. The global macroeconomic impacts of COVID-19: seven scenarios. SSRN; 2020. https://www.ssrn.com/abstract=3547729 (accessed April 15, 2020)]. It has also led to the cancellation of numerous conferences (e.g., [[3] Robbins R. STAT's guide to health care conferences disrupted by the coronavirus crisis. STAT News; 2020. https://www.statnews.com/2020/03/07/stats-guide-health-care-conferences-disrupted-covid-19/ (accessed April 7, 2020)]) and of research fieldwork, and has closed offices across the globe. As the scientific community grapples with responding to this massive and rapidly evolving crisis, the volume of research literature published in relation to the outbreak has expanded rapidly (Figure 1). Simultaneously, efforts to synthesize this growing evidence base have begun, both through ongoing traditional approaches to independent systematic reviews (e.g., [[4] Sahin AR, Erdogan A, Mutlu Agaoglu P, Dineri Y, Cakirci AY, Senel ME, et al. 2019 novel coronavirus (COVID-19) outbreak: a review of the current literature. Eurasian J Med Oncol. 2020;4:1-7; [5] Salehi S, Abedi A, Balakrishnan S, Gholamrezanezhad A. Coronavirus disease 2019 (COVID-19): a systematic review of imaging findings in 919 patients. Am J Roentgenol. 2020:1-7]) and through rapid and living systematic reviews (e.g., https://covidrapidreviews.cochrane.org/search/site). Rapid systematic reviews deliver, in a timely way, the evidence needed to inform policy making under urgent circumstances; living systematic reviews, on the other hand, ensure that a synthesis stays up to date with the latest evidence (e.g., the L.OVE platform by the Epistemonikos team). As the volume of evidence increases and decision makers and scientists struggle to grapple with the rapidly expanding evidence base, many research groups are volunteering to support these efforts, using online collaborative tools and virtual workspaces both to continue working during challenging times and to help identify, map, and synthesize research as it emerges. This work faces a suite of challenges arising from the often closed nature of science: duplication of effort (leading to research waste), inefficiency in conducting research, and missed opportunities to address important questions. Open Science principles present an opportunity to address these challenges in the context of the COVID-19 pandemic, and would also make research in the field more collaborative, transparent, and rigorous. This article argues for, and illustrates how to apply, the principles of Open Science to the field of evidence synthesis, a concept we refer to as Open Synthesis [[6] Haddaway NR. Open Synthesis: on the need for evidence synthesis to embrace Open Science. Environ Evid. 2018;7:26].
We use the COVID-19 pandemic as a case in point to highlight the significant potential benefits of Openness to the research, policy, and practice communities. Evidence synthesis is the collective name for research methodologies that identify, collate, appraise, and summarize a body of research evidence using tried-and-tested systematic and robust literature review methods, i.e., systematic reviews and systematic maps [[7] Gough D, Oliver S, Thomas J. An introduction to systematic reviews. 2nd ed. London: SAGE; 2017]. Systematic reviews are now widely used in health care as a "gold standard" for summarizing evidence to support decision-making in policy and practice, through a variety of knowledge translation products and practice guidelines [[8] Alonso-Coello P, Schünemann HJ, Moberg J, Brignardello-Petersen R, Akl EA, Davoli M, et al. GRADE Evidence to Decision (EtD) frameworks: a systematic and transparent approach to making well informed healthcare choices. 1: Introduction. BMJ. 2016;353:i2016]. However, systematic reviewers face challenges as a result of an often closed academic system: research can be difficult to find and download without access to expensive bibliographic databases [[9] Livoreil B, Glanville J, Haddaway NR, Bayliss H, Bethel A, Lachapelle FF, et al. Systematic searching for environmental evidence using multiple tools and sources. Environ Evid. 2017;6:23]; primary research articles and the systematic reviews that synthesize them are hidden behind paywalls [[10] Chawla A, Twycross-Lewis R, Maffulli N. Microfracture produces inferior outcomes to other cartilage repair techniques in chondral injuries in the paediatric knee. Br Med Bull. 2015;116:93-103; [11] Piwowar H, Priem J, Larivière V, Alperin JP, Matthias L, Norlander B, et al. The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles. PeerJ. 2018;6:e4375]; reporting of the methods used in trials and syntheses is often deficient, hampering verification and methodological learning [[12] Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383:267-276]; research data are often not made public, particularly when produced by organizations with commercial interests, such as pharmaceutical companies [[13] Moynihan R, Bero L, Hill S, Johansson M, Lexchin J, Macdonald H, et al. Pathways to independence: towards producing and using trustworthy evidence. BMJ. 2019;367:l6576]; analytical code is rarely shared and statistical methods can be hard to verify [[14] Chiang ICA, Jhangiani RS, Price PC. From the "replicability crisis" to open science practices. In: Research Methods in Psychology. BCcampus; 2015. https://opentextbc.ca/researchmethods/chapter/from-the-replicability-crisis-to-open-science-practices/ (accessed April 22, 2020)]; and educational materials to train the next generation of evidence synthesists are often not made public [[15] Farrow R. Open education and critical pedagogy. Learn Media Technol. 2017;42:130-146]. Open Science has central premises relating to accessibility and to the collaborative nature both of knowledge creation and of knowledge itself [[16] Fecher B, Friesike S. Open science: one term, five schools of thought. In: Bartling S, Friesike S, editors. Opening Science: The Evolving Guide on How the Internet is Changing Research, Collaboration and Scholarly Publishing. Cham: Springer International Publishing; 2014. p. 17-47].
These principles (see Table 1) include concepts such as Open Access (unrestricted availability of research publications [11]) and Open Data (freely accessible research data used in analyses [[17] Gewin V. Data sharing: an open mind on open data. Nature. 2016;529:117-119]) that together support efficient, transparent, and rigorous research.

Table 1. Main concepts within Open Science (translated and adapted from OpenScienceASAP; http://openscienceasap.org/open-science)

Concept — Definition
Open Data — Freely available research data
Open Source — Use and production of freely accessible software and hardware
Open Methodology — Documentation of the methods of a research process, as far as possible
Open Peer Review — Transparent and traceable quality assurance through open peer review
Open Access — Publication of research articles in an accessible manner, making them usable and accessible for all
Open Educational Resources — Free and accessible materials for education and university teaching

There are various definitions of Open Science, ranging from relatively simple classifications of "data, analysis, publications, and comments" [[18] Foster ED, Deardorff A. Open Science Framework (OSF). J Med Libr Assoc. 2017;105:203-206] to somewhat more elaborate frameworks (see Table 1), all the way to complex hierarchical conceptual models [[19] Knoth P, Pontika N. Open Science Taxonomy. figshare; 2015. https://doi.org/10.6084/m9.figshare.1508606.v3]. Although these classifications differ in complexity, each attempts to cover all aspects of the research process from initiation to communication. Some of the problems with traditional approaches to evidence synthesis described above (access to data, methods, publications, etc.) can be, and indeed are being, mitigated by applying these Open Science principles to evidence synthesis; the result has been termed Open Synthesis [[6] Haddaway NR. Open Synthesis: on the need for evidence synthesis to embrace Open Science. Environ Evid. 2018;7:26]. Open Synthesis was first proposed as the application of Open Access, Open Data, Open Source, and Open Methodology to evidence synthesis, with the possible addition of Open Education. We propose a finer resolution based on more complex taxonomies (e.g., [19]). We suggest that such Open Synthesis would support the transfer of knowledge from primary research to decision support tools and evidence portals (e.g., the Teaching and Learning Toolkit), particularly during humanitarian crises; for example, Evidence Aid hosts a freely accessible repository of summaries of COVID-19-relevant evidence (https://www.evidenceaid.org/coronavirus-covid-19-evidence-collection/) [[20] Clarke M. Evidence Aid – from the Asian tsunami to the Wenchuan earthquake. J Evid Based Med. 2008;1:9-11]. Many Open Synthesis resources have been developed and assembled to facilitate access to the novel evidence base emerging in relation to the COVID-19 pandemic. These examples are (understandably) almost exclusively related to health, but the evidence base will become increasingly multidisciplinary and cross-sectoral as research focus spreads to include the societal and environmental impacts of the outbreak and of subsequent social policies, such as wide-scale lockdowns. The key components of Open Synthesis are described in Figure 2, and examples are given below.
The COVID-19 evidence map of emerging literature produced by the Meta-Evidence blog was open to interested collaborators (before the project was discontinued because of considerable overlap with several other projects) and involved substantial efforts to translate and extract information from literature written in Chinese. The synthesizing working group of the COVID-19 Evidence Network to support Decision-making (COVID-END; https://www.mcmasterforum.org/networks/covid-end/working-groups/synthesizing) supports efforts to synthesize existing evidence in ways that are more coordinated and efficient and that balance quality and timeliness. Cochrane's COVID Rapid Reviews repository provides space for Open Collaboration by connecting authors interested in addressing the same rapid review questions submitted by the public. To enable free (i.e., not paywalled) searching for relevant evidence, various efforts are building "living" bibliographies and databases of research on COVID-19: for example, the CORD19 database (MIT); the COVID-19 living systematic map (EPPI-Centre); Cochrane's COVID-19 Study Register; and the Norwegian Institute of Public Health's live map of COVID-19 evidence. Similarly, the McMaster GRADE Centre is collaborating with the Norwegian Institute of Public Health and others to map recommendations relevant to COVID-19 and make them publicly available, including the strength of the recommendations and the certainty of supporting evidence [[21] Schünemann HJ, Santesso N, Vist GE, Cuello C, Lotfi T, Flottorp S, et al. Using GRADE in situations of emergencies and urgencies: certainty in evidence and recommendations matters during the COVID-19 pandemic, now more than ever and no matter what. J Clin Epidemiol. 2020]. Efforts also exist to ensure that evidence syntheses use transparent and well-reported methods to improve repeatability and usability.
For example, the systematic review registry PROSPERO provides a link to already registered reviews of human and animal studies relevant to COVID-19. Freely accessible data (including those extracted and generated in the process of conducting a systematic review) are being made available for reuse and analysis; for instance, the Epistemonikos COVID-19 collection archives data extracted from reviews in a publicly accessible database (https://www.epistemonikos.cl/all-about-covid-19/). Freely usable and adaptable tools for analysis and visualization have been made available online to support the conduct and communication of COVID-19-relevant research, for example, corona-cli (code for analyzing and visualizing data on the outbreak) and the EviAtlas tool for mapping the geographical spread of evidence on COVID-19 [[22] Haddaway NR, Feierman A, Grainger MJ, Gray CT, Tanriver-Ayder E, Dhaubanjar S, et al. EviAtlas: a tool for visualising evidence synthesis databases. Environ Evid. 2019;8:22]. Many researchers routinely publish the analytic code accompanying their research (e.g., R scripts for statistical analyses), although to date this practice is not common in the syntheses we have examined, perhaps because it is challenging where reviewers have not used code-driven software and code does not readily exist (e.g., for reviews conducted using RevMan). However, examples of Open Code in primary research include code to web-scrape COVID-19 data from Worldometers and epidemiological modeling code for COVID-19. Several publishers and journals have made COVID-19-relevant research articles and evidence syntheses freely accessible, including the Cochrane COVID-19 evidence collection and several Elsevier journals, among them the Journal of Clinical Epidemiology and The Lancet (https://www.elsevier.com/connect/coronavirus-information-center).
Systematic reviewers can facilitate Open Access by ensuring their reviews are freely accessible (e.g., by publishing in open access journals or depositing preprints or postprints in publicly accessible repositories), but also by facilitating access to the primary research synthesized in their reviews (e.g., by providing DOIs for the full texts of included studies). Although most journals do not currently publish peer review reports and revisions of systematic reviews, some resources support this, including Outbreak Science Rapid PREreview for prepublication peer review. Various freely accessible training resources (e.g., courses, webinars, and handbooks) exist for evidence synthesis methodology, including #ESTraining, provided by the Collaboration for Environmental Evidence and the Stockholm Environment Institute, and webinars provided by the Global Evidence Synthesis Initiative. Systematic reviews have been shown to suffer from poor reporting of funding, the role of funders, and conflicts of interest [[23] Bou-Karroum L, Hakoum MB, Hammoud MZ, Khamis AM, Al-Gibbawi M, Badour S, et al. Reporting of financial and non-financial conflicts of interest in systematic reviews on health policy and systems research: a cross sectional survey. Int J Health Policy Manag. 2018;7:711-717]. Open Interests calls for individuals to transparently declare possible financial and nonfinancial interests; ideally, this would be done by all parties involved in the conduct and publication of systematic reviews (including educators, engaged stakeholders, review authors, advisory group members, peer reviewers, editors, and publishers), and declarations should be updated regularly. In practical terms, this could take the form of a declaration at the point of publication (e.g., of review publications, educational materials, or peer review comments) or of a freely accessible central database of interests.
At present, no Open Interests initiative exists. Although no criticisms have been fielded against Open Synthesis yet, some researchers have raised concerns about Open Science; we describe some of these in Table 2. These concerns relate either to openness itself as a practice or to the application and enforcement of Open Science within current institutions and incentive structures.

Table 2. Concerns relating to Open Science and their applicability to, and mitigation within, Open Synthesis

Concern: Exacerbation of power imbalance and inequality, or exclusion of minorities [[24] Bahlai C, Bartlett LJ, Burgio KR, Fournier AM, Keiser CN, Poisot T, et al. Open science isn't always open to all scientists. Am Sci. 2019;107:78-82].
Description: Open Science practices applied within current incentive structures and institutions can exacerbate power imbalance and inequality, particularly adversely affecting minorities and the vulnerable or oppressed.
Applicability to Open Synthesis: Highly applicable to evidence syntheses, just as with primary research.
Potential mitigations: Open Synthesis principles can be endorsed rather than enforced, to avoid penalizing vulnerable researchers who may struggle to be Open. Structures can be put in place to support minorities and vulnerable researchers (e.g., publication fee waivers for researchers in low- and middle-income countries [[25] Lawson S. Fee waivers for Open Access journals. Publications. 2015;3:155-167], mentoring in Open practices).

Concern: Risk of misuse [[26] Grand A, Wilkinson C, Bultitude K, Winfield AFT. Open science: a new "trust technology"? Sci Commun. 2012;34:679-689].
Description: Open Data and Code may be reused or reanalyzed incorrectly, potentially for nefarious reasons.
Applicability to Open Synthesis: Although some data in syntheses are in the public domain, data from unpublished studies or unpublished outcomes obtained from authors are not. Furthermore, the calculation of effect sizes may rely on assumptions that affect the estimates.
Potential mitigations: Ensure full methodological transparency to avoid misunderstandings, including annotation of analytic or statistical code and of any assumptions. Provide adequate referencing and easy linkage to the original data source for clarity.

Concern: Risk of public misunderstanding (e.g., [[27] Nielsen M. Reinventing discovery: the new era of networked science. Princeton University Press; 2020]).
Description: The detailed language and nuance of data may be misunderstood by lay people, nonspecialists, or those who did not collect the data.
Applicability to Open Synthesis: Systematic reviews are typically not intended as a means of communication with the public (plain language summaries serve that purpose), so the risk is no higher for Open Synthesis than for standard synthesis.
Potential mitigations: Synthesis methods must be detailed enough, and follow standard language, to allow full understanding.

Concern: Potential to be overwhelmed by information [[28] Grand A, Wilkinson C, Bultitude K, Winfield AFT. Mapping the hinterland: data issues in open science. Public Underst Sci. 2016;25:88-103].
Description: Publication of large volumes of data or information may make it difficult to find important details within or across studies.
Applicability to Open Synthesis: Information is typically more structured across evidence syntheses than across primary research, because syntheses use a common methodological framework.
Potential mitigations: Standardized reporting templates could be built to support metadata formatting so that information is readily found and understood. Reviewers could provide versions with different levels of detail for different audiences (e.g., plain language summaries for the lay public).

Concern: Fear of repercussions if mistakes are unearthed after publication [[29] Allen C, Mehler DMA. Open science challenges, benefits and tips in early career and beyond. PLoS Biol. 2019;17:e3000246].
Description: Authors may fear persecution if mistakes are identified in their methods after publication and so may prefer to keep data and analyses private.
Applicability to Open Synthesis: There is potential for error in the identification, selection, appraisal, and analysis of studies included in systematic reviews.
Potential mitigations: Reviewers should be incentivized to admit errors and supported when these occur. Institutional punitive measures for publishing corrections or retractions should first examine the reasons behind the action, avoiding blanket punishments, acknowledging authors who act ethically and responsibly, and promoting and rewarding Open behaviors. Open Synthesis should be reframed as an opportunity to validate findings rather than to detect mistakes.

Concern: Publication of data leads to "research parasitism" [[30] Longo DL, Drazen JM. Data sharing. N Engl J Med. 2016;374:276-277].
Description: Some researchers feel that reuse of data or methods by others is unfair and that authors alone should retain exclusive rights.
Applicability to Open Synthesis: Cochrane, the Campbell Collaboration, and the Collaboration for Environmental Evidence allow review teams the right to lead updates to their reviews for a fixed period. Moreover, data collected and used in an evidence synthesis are typically already in the public domain.
Potential mitigations: Raise awareness of the benefits in legacy and research impact that result from data reuse. Ensure that those reusing data provide appropriate and full acknowledgment of data sources. Reconsider rules for academic credit, reward, and promotion.

Concern: Belief that low-quality science will proliferate [[31] Lancaster A. Open Science and its discontents. Ronin Institute. http://ronininstitute.org/open-science-and-its-discontents/1383/ (accessed May 28, 2020)].
Description: Referring specifically to Open Peer Review and preprints, some argue that the lack of traditional peer review for preprints removes the gatekeeping that ensures research validity, so low-quality research will become common.
Applicability to Open Synthesis: Preprints are, in part, a response to a lack of immediate Open Access and to closed peer review; they are not an integral part of Open Science but rather an extension of it. Current institutions and incentive structures may not be sufficient to prevent low-quality evidence syntheses from being published, but this is also true of traditionally peer-reviewed syntheses.
Potential mitigations: Make use of opportunities for Open Peer Review that complement and strengthen preprints (i.e., postpublication peer review [31]). Raise awareness and establish standard practices for understanding preprints within the communications community (i.e., journalists and institutional communications officers). Ensure preprints follow standards for conducting and reporting evidence synthesis (e.g., PRISMA and ROSES).

Concern: Increased resources needed to attain Openness [26; [32] Beagrie N, Lavoie B, Woolard M. Keeping research data safe (Phase 2). Jisc; 2010. https://www.webarchive.org.uk/wayback/archive/20140613220103mp_/http://www.jisc.ac.uk/publications/reports/2010/keepingresearchdatasafe2.aspx (accessed June 7, 2020)].
Description: Ensuring that data and information are made fully Open may require resources (time and funding) that are not readily available to all.
Applicability to Open Synthesis: The large volumes of data potentially produced within a systematic review project could require considerable resources to clean and annotate if this is not planned from the outset, particularly for analytic code. Open Collaboration could require considerable time to manage if roles and tasks are not carefully predefined.
Potential mitigations: Openness can mostly be achieved using cost-free alternatives (e.g., self-archiving to avoid publication fees and the use of free data repositories) and by incentivizing and institutionalizing Open and transparent practices from an early career stage (e.g., good code annotation practices). However, this point is not trivial and highlights the need for careful planning across all aspects of Open Synthesis; planning can significantly reduce resource requirements. Standardizing the methods, processes, and tools used to abstract and store data could assist in this process [[33] Akl EA, Haddaway NR, Rada G, Lotfi T. Evidence synthesis 2.0: when systematic, scoping, rapid, living, and overviews of reviews come together. J Clin Epidemiol. 2020].

Concern: Risk of "platform capitalism" (i.e., commercialization of public data) [[34] Pievatolo MC. Open science: human emancipation or bureaucratic serfdom? SCIRES-IT. https://archiviomarini.sp.unipi.it/858/ (accessed June 1, 2020)].
Description: The free availability of data permits the development of subscription-based or pay-to-use services (e.g., Academia.edu) that provide additional services (e.g., analytics) using public data, and of platforms that may exploit or disadvantage certain groups (e.g., by charging for a service that is already free elsewhere).
Applicability to Open Synthesis: Grassroots and no-cost alternatives to these services are often available, but awareness of free-to-use services is vital to avoid entrapment by commercial enterprises (e.g., paying a publisher to access an article that is already Open Access).
Potential mitigations: Noncommercial-use Creative Commons licenses (e.g., CC BY-NC 3.0) may help restrict or prevent commercial use of Open Data, although they are not without criticism, for example, that Creative Commons licenses are based on copyright law that is overly restrictive for academic collaborations [[35] Corbett S. Creative Commons licences, the copyright regime and the online community: is there a fatal disconnect? Mod Law Rev. 2011;74:503-531].

Concern: Need to maintain confidentiality [[36] Cummings JA, Zagrodney JM, Day TE. Impact of open data policies on consent to participate in human subjects research: discrepancies between participant action and reported concerns. PLoS One. 2015;10:e0125208; [37] Walsh CG, Xia W, Li M, Denny JC, Harris PA, Malin BA. Enabling open-science initiatives in clinical psychology and psychiatry without sacrificing patients' privacy: current practices and future challenges. Adv Methods Pract Psychol Sci. 2018;1:104-114].
Description: Research subjects are typically guaranteed anonymity, which may mean that publication of raw data is not feasible or safe.
Applicability to Open Synthesis: Evidence syntheses often use summary data not disaggregated to the level of individual participants, so for these reviews this may not be an issue. Individual participant data (IPD) meta-analyses, however, may not be able to publish data openly.
Potential mitigations: For IPD meta-analyses, the requirements for Open Data may need to be relaxed or adapted in some contexts to ensure anonymity can be maintained; for example, data-on-request repositories for individual patient data exist [[38] van Middelkoop M, Lohmander S, Bierma-Zeinstra SMA. Sharing data – taming the beast: barriers to meta-analyses of individual patient data (IPD) and solutions. 2020. https://bjsm.bmj.com/content/early/2020/01/29/bjsports-2019-101892 (accessed June 5, 2020)]. Standardized ethical practices could be established where needed for IPD meta-analysis.

Concern: Institutional barriers, including career incentives that reward closed practices [[39] Gagliardi D, Cox D, Li Y. Institutional inertia and barriers to the adoption of open science. In: The transformation of university institutional and organizational boundaries. Leiden: Brill Sense; 2015. p. 107-133].
Description: Career incentives in academia typically, and historically, center on publication in high-impact journals that are prohibitively expensive to publish in Open Access. Recruitment and promotion in academia also typically do not reward or acknowledge Open practices, and institutions may not understand or accept the desire to be Open.
Applicability to Open Synthesis: Systematic reviewers often work within institutions established around primary research practices, so the same incentives apply. Organizations focusing primarily on evidence synthesis may already have Open practices.
Potential mitigations: Incentive structures are likely to change over time as Open Science practices become more common, but authorities must take a stand to support researchers who are likely to be disadvantaged by being more Open (e.g., early career researchers).
Open table in a new tab In addition, there are risks associated with some of the practices that may be facilitated by Open Synthesis, for example, 1) living systematic reviews may involve repeated incremental rerunning of meta-analyses, leading to increased chances of false positive that need to be accounted for (e.g., [[40]Mavergames C. Elliott J.H. Living Systematic Reviews: towards real-time evidence for health-care decision-making | BMJ Best Pract.https://bestpractice.bmj.com/info/toolkit/discuss-ebm/living-systematic-reviews-towards-real-time-evidence-for-health-care-decision-making/Date: 2020Date accessed: June 5, 2020Google Scholar]); 2) updates may need to account for changes in best practice in risk of bias assessments as novel methods become available, potentially involving reassessment of studies identified in the original review. These are not problems with Open Synthesis but rather important issues that should be addressed when planning incentives and infrastructure in support of Open Syntheses. However, a pathway to Open systematic reviews and systematic maps will involve many steps and a diverse array of different actions; these changes should not be expected overnight, and there is a need for detailed discussion about implications and pitfalls. That said, it is generally accepted that the advantages of Open Science outweigh the disadvantages [[41]LeBel E.P. Campbell L. Loving T.J. Benefits of open and high-powered research outweigh costs.J Pers Soc Psychol. 2017; 113: 230Crossref PubMed Scopus (36) Google Scholar]. At present, some of these Open Synthesis practices are enforced or encouraged by review coordinating bodies. Cochrane reviews can be made immediately Open Access at the point of publication for a fee (payable by authors) or made free after a 12 month period (otherwise requiring subscription to access, green Open Access). Cochrane does not yet require systematic review–extracted data to be made public [[42]Shokraneh F. Adams C.E. Clarke M. 
Amato L. Bastian H. Beller E. et al. Why Cochrane should prioritise sharing data. BMJ. 2018; 362: k3229Crossref PubMed Scopus (13) Google Scholar]. While methods in Cochrane reviews are typically well reported thanks to the Methodological Expectations for Cochrane Intervention Reviews reporting standards [[43]Higgins J. Churchill R. Lasserson T. Chandler J. Tovey D. Update from the methodological Expectations of Cochrane Intervention reviews (MECIR) project.in: Cochrane Methods. Cochrane. 2012Google Scholar], the "raw" data extracted from primary studies within a review are not typically included. All Campbell Collaboration reviews are published in their Open Access journal. Transparent and Open Methods are required by the Methodological Expectations for Campbell Collaboration Intervention Reviews. Open Data and Code are in the vision for the future of the journal [[44]Welch V.A. Campbell systematic reviews takes next step to meeting FAIR principles.Campbell Syst Rev. 2019; 15: e1032Google Scholar]. For both organizations, review protocols are published online and time-stamped before work commences, as should be done for all systematic reviews and maps (e.g., registered in PROSPERO, published in the Cochrane Database of Systematic Reviews, or published in a suitable journal). Adopting truly Open evidence synthesis approaches has the potential to globalize research, break down barriers to data sharing and collaboration, and mitigate inequality in knowledge availability (e.g., a large body of Chinese coronavirus trials was recently translated and mapped by researchers from Lanzhou University). Open Synthesis also supports both living systematic reviews and intermittent updates; it is agnostic toward the framework chosen for updating reviews. Importantly, it emphasizes the need to facilitate updates, however they occur.
Moreover, Open Synthesis of evidence will provide guideline developers with faster and better access to the synthesis methods, findings, conflict of interest information, and other elements necessary for guideline development, and subsequently improve its quality and efficiency. Achieving the optimal impact of Open Synthesis requires the consideration of other principles. Of utmost importance is responding to the knowledge needs of decision makers by adopting valid priority-setting approaches. Similarly, Open Synthesis must feed into knowledge translation tools that are appropriate to the target decision makers. In addition, it should build on emerging concepts, such as Evidence Synthesis 2.0 [[33]Akl E.A. Haddaway N.R. Rada G. Lotfi T. Evidence synthesis 2.0: when systematic, scoping, rapid, living, and overviews of reviews come together.J Clin Epidemiol. 2020; 0Google Scholar], to ensure the efficiency of the process and the appropriateness of the output. We encourage adoption of these principles across all disciplines to meet the social, legal, ethical, and economic challenges of the global COVID-19 pandemic, such as supporting home-based education for children out of school; mitigating the social impacts of isolation; and responding to the increased risk and severity of domestic violence, to global food insecurity, and to the implications of social lockdowns for environmental recovery from long-term anthropogenic disturbance and climate change. We call for increasing application of Open Science and Open Synthesis principles across disciplines, both within and beyond the COVID-19 epidemic, to support evidence production, synthesis, and evidence-informed policy.
By embracing Open Synthesis, evidence synthesis communities from all disciplines can maximize the efficiency, impact, and legacy of systematic reviews and better support decision-making, particularly in global crises such as the current COVID-19 pandemic, and establish a more resilient and collaborative response to similar global challenges. Neal R. Haddaway: Conceptualization, Data curation. Elie A. Akl: Conceptualization, Writing - original draft, Writing - review & editing. Matthew J. Page: Writing - original draft, Writing - review & editing. Vivian A. Welch: Writing - original draft, Writing - review & editing. Ciara Keenan: Writing - original draft, Writing - review & editing. Tamara Lotfi: Conceptualization, Writing - original draft, Writing - review & editing.
