Methodological Quality of Experimental Stroke Studies Published in the Stroke Journal
2015; Lippincott Williams & Wilkins; Volume 47, Issue 1; Language: English
DOI: 10.1161/strokeaha.115.011695
ISSN: 1524-4628
Authors: Jens Minnerup, Verena Zentsch, Antje Schmidt, Marc Fisher, Wolf-Rüdiger Schäbitz
Topic(s): Animal testing and alternatives
Time Trends and Effect of the Basic Science Checklist

Jens Minnerup, MD; Verena Zentsch, MD; Antje Schmidt, MD; Marc Fisher, MD; and Wolf-Rüdiger Schäbitz, MD

From the Department of Neurology, University of Münster, Münster, Germany (J.M., V.Z., A.S.); Department of Neurology, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA (M.F.); and Department of Neurology, Bethel-Evangelisches Krankenhaus, Bielefeld, Germany (W.-R.S.).

Originally published 10 Dec 2015. https://doi.org/10.1161/STROKEAHA.115.011695. Stroke. 2016;47:267–272.

The translational roadblock, characterized by beneficial effects of treatments in experimental studies and subsequent neutral or negative results in clinical trials, is a major problem in cerebrovascular medicine.1 Growing evidence suggests that this may be due, at least in part, to the low quality of experimental studies, resulting in systematically reduced internal validity of animal studies. The lack of blinding and randomization in experimental studies, for instance, leads to an overestimation of treatment effects and may, therefore, contribute to the discrepant results of preclinical and clinical studies.2 As a consequence, the Stroke Therapy Academic Industry Roundtable was founded to develop recommendations on quality characteristics of animal stroke studies.3

Stroke is generally regarded as one of the most important journals in the area of cerebrovascular disease, publishing clinical and experimental studies, with the latter focusing on basic mechanisms of cerebral ischemia and on the development of new stroke therapies.
To ensure the quality of experimental studies, authors submitting an article to Stroke must declare in a checklist the cornerstones of study conduct, such as randomization and blinding procedures and the definition of inclusion and exclusion criteria. However, whether implementation of this so-called Basic Science Checklist in the submission process in August 2011 improved the quality of preclinical studies has so far been unknown. The aim of this study was to analyze time trends in the quality and design of preclinical studies published in Stroke from January 2010 through December 2013 and to assess whether implementation of the Basic Science Checklist in August 2011 had an effect on key quality measures.

Methods

Retrieving Publications and Selection of Studies
We used the journal's Archive of All Online Issues4 to identify eligible articles published from January 2010 to December 2013. Publications were categorized according to the journal's article categories as original contributions (clinical or basic science), brief reports, letters to the editor, and review articles. We included only original basic science contributions in which the efficacy or side effects of a therapy were evaluated in an animal model of cerebral ischemia, intracranial bleeding, cerebral venous thrombosis, or intracranial aneurysms.

Data Extraction
We extracted data on the submission date, the year and month of print publication, the species used, and the stroke model used. We assessed whether the following items of the Basic Science Checklist were met5: (1) species, strains, and sources of animals defined; (2) statistical methods defined; (3) specific inclusion and exclusion criteria specified; (4) randomization, allocation concealment, and blinding performed; and (5) all postrandomization excluded animals reported. The checklist as it appears during the submission process is provided in Figure I in the online-only Data Supplement. The study characteristics animals defined (strain, species, and source) and randomization, allocation concealment, and blinding are each summarized as 1 item in the Basic Science Checklist but were also recorded separately for our analysis. In addition, we recorded whether the methods of randomization, blinding, and allocation concealment were stated, as suggested by Macleod et al.5 The item postrandomization excluded animals reported was met when the number of animals excluded after treatment allocation, whether because of mortality or for other reasons, was reported for each treatment group. We further assessed the methodological quality of the included studies according to a previously published quality scale derived from Stroke Therapy Academic Industry Roundtable (STAIR) recommendations.6–8 The realization of the following aspects of each study was evaluated: (1) dose–response relationship; (2) optimal time window of the treatment; (3) monitoring of physiological parameters (such as temperature, glucose level, or blood pressure); (4) assessment of at least 2 outcomes (infarct size and 1 functional outcome); (5) outcome assessment in the acute phase (1–7 days); (6) outcome assessment in the chronic phase (beyond 7 days); (7) animals with comorbidity (aged, diabetic, or hypertensive); (8) compliance with animal welfare regulations; (9) statement of potential conflict of interest; and (10) sample size calculation reported.
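To make the two scoring instruments concrete, the following minimal sketch (in Python; the field names, data layout, and example values are illustrative assumptions, not taken from the article) shows how a study could be coded against the 5 Basic Science Checklist items and the 10 STAIR-derived quality items, and how the two sum scores used in the analysis would then be derived.

```python
from dataclasses import dataclass

# Hypothetical encoding of the two instruments described in the Methods.
CHECKLIST_ITEMS = [
    "animals_defined",            # (1) species, strain, and source stated
    "statistics_defined",         # (2) statistical methods described
    "in_exclusion_criteria",      # (3) inclusion/exclusion criteria specified
    "rand_conceal_blind",         # (4) randomization, allocation concealment, blinding
    "excluded_animals_reported",  # (5) postrandomization exclusions reported per group
]
STAIR_ITEMS = [
    "dose_response", "time_window", "physiological_monitoring",
    "infarct_and_functional_outcome", "acute_outcome_1_to_7d",
    "chronic_outcome_beyond_7d", "comorbid_animals",
    "animal_welfare_compliance", "conflict_of_interest_statement",
    "sample_size_calculation",
]

@dataclass
class StudyRecord:
    """One published study, coded as met / not met for each quality item."""
    items: dict  # maps item name -> bool

    def checklist_score(self) -> int:
        # Sum of Basic Science Checklist items met (0-5), as tabulated in Table 1.
        return sum(bool(self.items.get(k, False)) for k in CHECKLIST_ITEMS)

    def total_quality_score(self) -> int:
        # Sum of all 15 quality characteristics (checklist + STAIR-derived items).
        return self.checklist_score() + sum(
            bool(self.items.get(k, False)) for k in STAIR_ITEMS)

# Made-up example for a single study:
study = StudyRecord(items={"animals_defined": True, "statistics_defined": True,
                           "rand_conceal_blind": False, "dose_response": True})
print(study.checklist_score(), study.total_quality_score())  # -> 2 3
```

Coding each item as a simple yes/no judgment mirrors how the sum scores are reported in the tables (0–5 for the checklist and 0–15 for all quality characteristics).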
Two authors (V.Z. and A.S.) independently extracted data. Disagreements were resolved after discussion of the study details (J.M., V.Z., and A.S.).

Statistical Analysis
To evaluate changes in study characteristics by year, we used the Mantel–Haenszel test of trend for categorical and ordinal variables (an illustrative computation of this trend test is sketched after Table 1). We performed a logistic regression analysis for categorical variables and the Student t test for continuous variables to determine the association between Basic Science Checklist items and whether the study was submitted before or after the implementation of the Basic Science Checklist in August 2011. Statistical significance was set at P<0.05.

Results
Figure. Trends in published original contributions categorized as clinical science contribution, basic science contribution not investigating a treatment, and basic science contribution investigating a treatment.

Study characteristics of the included basic science studies that investigated a treatment are given in Table 1. During the study period, the proportions of the different stroke models and species used in the studies did not vary over time. With respect to Basic Science Checklist items, the proportion of articles that reported the combination of randomization, allocation concealment, and blinding increased (P=0.001). The proportion of each of the subitems randomization, allocation concealment, and blinding also increased over time (P=0.01, 0.001, and 0.004, respectively). None of the studies reported the methods of blinding or allocation concealment. The proportion of studies that met 2 or 3 checklist items decreased, and the proportion that met 4 or 5 items increased, during the study period. Proportions of the other Basic Science Checklist characteristics and the mean sum of Basic Science Checklist items did not change significantly. Reporting of the STAIR-derived quality items investigation of a dose–response relationship (P=0.02), physiological monitoring (P=0.02), and compliance with animal welfare regulations (P=0.02) increased significantly. There was a trend toward increased reporting of the assessment of infarct size and functional outcome (P=0.08) and of outcome assessment in the acute phase (P=0.05). The mean sum of all quality characteristics increased significantly from 7.04 in 2010 to 8.33 in 2013 (P=0.002). The proportion of studies in the lowest category of the sum of quality characteristics (0–5 items met) decreased during the study period. As required by Stroke, all studies stated potential conflicts of interest.
Table 1. Characteristics of Animal Experimental Studies* Published in Stroke From 2010 to 2013
Columns: 2010 (n=28) | 2011 (n=42) | 2012 (n=24) | 2013 (n=39) | P value for trend. Values are n (%) unless indicated otherwise.

Stroke model
  Focal cerebral ischemia†: 20 (71.4) | 27 (64.3) | 18 (75.0) | 27 (69.2) | 0.89
  Carotid artery ligation or bilateral stenosis: 5 (17.9) | 6 (14.3) | 3 (12.5) | 3 (7.7) | 0.21
  Intracerebral or subarachnoid hemorrhage: 2 (7.1) | 8 (19.0) | 2 (8.3) | 10 (25.6) | 0.11
  Aneurysm or arteriovenous malformation: 0 (0.0) | 1 (2.4) | 2 (8.3) | 1 (2.6) | 0.42
  Cerebral venous sinus thrombosis: 1 (3.6) | 0 (0.0) | 0 (0.0) | 0 (0.0) | 0.17
Species
  Mice: 5 (17.9) | 18 (42.9) | 12 (50.0) | 18 (46.2) | 0.03
  Rats: 23 (82.1) | 23 (54.8) | 15 (62.5) | 22 (56.4) | 0.11
  Rabbits: 0 (0.0) | 1 (2.4) | 0 (0.0) | 1 (2.6) | 0.57
Basic Science Checklist
  (1) Animals defined (strain, species, and source): 24 (85.7) | 33 (78.6) | 19 (79.2) | 28 (71.8) | 0.20
    Species: 28 (100.0) | 42 (100.0) | 24 (100.0) | 39 (100.0) | N/A
    Strain: 28 (100.0) | 40 (95.2) | 24 (100.0) | 38 (97.4) | 0.86
    Source of animals: 24 (85.7) | 34 (81.0) | 19 (79.2) | 28 (71.8) | 0.16
  (2) Statistical method defined: 27 (96.4) | 42 (100.0) | 24 (100.0) | 39 (100.0) | 0.17
  (3) Inclusion and exclusion criteria defined: 9 (32.1) | 8 (19.0) | 8 (33.3) | 16 (41.0) | 0.17
  (4) Randomization, allocation concealment, and blinding‡: 0 (0.0) | 4 (9.5) | 4 (16.7) | 11 (28.2) | 0.001
    Randomization: 14 (50.0) | 27 (64.3) | 17 (70.8) | 31 (79.5) | 0.01
    Method of randomization stated: 1 (3.6) | 1 (2.4) | 7 (29.2) | 2 (5.1) | 0.28
    Allocation concealment: 0 (0.0) | 5 (11.9) | 4 (16.7) | 12 (30.8) | 0.001
    Blinding: 16 (57.1) | 33 (78.6) | 19 (79.2) | 35 (89.7) | 0.004
  (5) Postrandomization excluded animals reported: 15 (53.6) | 13 (31.0) | 3 (12.5) | 11 (28.2) | 0.64
Sum of Basic Science Checklist items§
  Mean (SD): 2.68 (0.77) | 2.38 (0.94) | 2.58 (1.02) | 2.85 (1.20) | 0.24
  0 or 1 item met: 2 (7.1) | 6 (14.3) | 3 (12.5) | 7 (17.9) | 0.26
  2 or 3 items met: 23 (82.1) | 31 (73.8) | 15 (62.5) | 20 (51.3) | 0.04
  4 or 5 items met: 3 (10.7) | 5 (11.9) | 6 (25.0) | 12 (30.8) | 0.02
STAIR-derived quality recommendations
  Dose–response relationship: 2 (7.1) | 11 (26.2) | 11 (45.8) | 12 (30.8) | 0.02
  Time window investigation: 3 (10.7) | 4 (9.5) | 9 (37.5) | 4 (10.3) | 0.54
  Physiological monitoring‖: 15 (53.6) | 31 (73.8) | 20 (83.3) | 31 (79.5) | 0.02
  Assessment of infarct size and functional outcome: 15 (53.6) | 22 (52.4) | 17 (70.8) | 27 (69.2) | 0.08
  Outcome assessment in the acute phase (1–7 d): 21 (75.0) | 37 (88.1) | 22 (91.7) | 36 (92.3) | 0.05
  Outcome assessment in the chronic phase (>7 d): 10 (35.7) | 15 (35.7) | 10 (41.7) | 15 (38.5) | 0.73
  Animals with comorbidity: 1 (3.6) | 2 (4.8) | 1 (4.2) | 4 (10.3) | 0.25
  Compliance with animal welfare regulations: 25 (89.3) | 40 (95.2) | 24 (100.0) | 39 (100.0) | 0.02
  Statement of potential conflict of interest: 28 (100.0) | 42 (100.0) | 24 (100.0) | 39 (100.0) | N/A
  Sample size calculation: 2 (7.1) | 3 (7.1) | 3 (12.5) | 7 (17.9) | 0.11
Sum of all quality characteristics¶
  Mean (SD): 7.04 (2.13) | 7.31 (1.65) | 8.46 (1.93) | 8.33 (2.07) | 0.002
  0–5 items met: 7 (25.0) | 4 (9.5) | 0 (0.0) | 2 (5.1) | 0.008
  6–10 items met: 20 (71.4) | 37 (88.1) | 20 (83.3) | 33 (84.6) | 0.33
  11–15 items met: 1 (3.6) | 1 (2.4) | 4 (16.7) | 4 (10.3) | 0.11

Percentages of stroke models and species sum to >100% because some studies used >1 model or species. N/A indicates not applicable; STAIR, Stroke Therapy Academic Industry Roundtable.
*Only experimental stroke studies that investigated the efficacy or the side effects of a treatment.
†Includes middle cerebral artery occlusion, photothrombosis, and endothelin-1 injection.
‡This criterion includes randomization, allocation concealment, and blinding but not the method of randomization.
§Sum of the items: (1) animals defined, (2) statistical method defined, (3) inclusion and exclusion criteria defined, (4) randomization, allocation concealment, and blinding, and (5) postrandomization excluded animals reported.
‖Includes temperature, glucose level, or blood pressure.
¶Includes Basic Science Checklist items and STAIR-derived quality recommendations.
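For illustration only, the following sketch shows one way the Mantel–Haenszel (linear-by-linear association) test of trend described in the Statistical Analysis section could be computed for a binary item across publication years, applied here to the blinding counts from Table 1. The equally spaced year scoring and the particular test variant are assumptions, so the resulting statistic need not match the published P=0.004 exactly.

```python
import numpy as np
from scipy.stats import chi2

# Blinding reported by publication year (counts taken from Table 1).
years     = np.array([2010, 2011, 2012, 2013])
n_blinded = np.array([16, 33, 19, 35])   # studies reporting blinding
n_total   = np.array([28, 42, 24, 39])   # studies published that year

# Expand the 2 x 4 table into one row per study (1 = blinding reported).
x = np.repeat(years, n_total).astype(float)          # year score per study
y = np.concatenate([np.r_[np.ones(b), np.zeros(t - b)]
                    for b, t in zip(n_blinded, n_total)])

# Linear-by-linear association statistic: M^2 = (N - 1) * r^2 with 1 df,
# where r is the Pearson correlation between year scores and outcomes.
N = len(x)
r = np.corrcoef(x, y)[0, 1]
m2 = (N - 1) * r ** 2
p_trend = chi2.sf(m2, df=1)

print(f"M^2 = {m2:.2f}, P for trend = {p_trend:.4f}")
```

Other choices of year scores, or a Cochran–Armitage formulation, would change the statistic slightly but test the same ordered-trend hypothesis.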
Table 2 shows the results of the univariate analysis comparing whether Basic Science Checklist items were met in published articles submitted before and after implementation of the checklist in August 2011. The proportion of studies that defined inclusion and exclusion criteria increased from 22.7% to 41.4% (P=0.02). More articles reported randomization, allocation concealment, and blinding after implementation of the checklist (6.7% versus 24.1%; P=0.007). Among the subitems, allocation concealment (8.0% versus 25.9%; P=0.008) and blinding (70.7% versus 86.2%; P=0.04) increased significantly after checklist implementation. The majority of studies defined the animals used, with no difference before and after checklist implementation (81.3% versus 74.1%; P=0.32). In particular, all studies reported the species, and almost all studies (97.3% and 98.3%, respectively) reported the strain used. All but 1 study described their statistical methods. There was a trend toward an increased mean sum of checklist items (P=0.09). The proportion of studies that met 2 or 3 items decreased from 78.7% to 51.7% (P=0.01), and the proportion that met 4 or 5 items increased from 10.7% to 31.0% (P=0.005), after checklist implementation.

Table 2. Comparison of Quality Characteristics of Animal Experimental Studies* Submitted to Stroke Before and After the Implementation of the Basic Science Checklist in August 2011, Univariate Analysis
Columns: Submission before checklist implementation (n=75) | Submission after checklist implementation (n=58) | OR | 95% CI | P value. Values are n (%) unless indicated otherwise.

Basic Science Checklist
  (1) Animals defined (strain, species, and source): 61 (81.3) | 43 (74.1) | 0.66 | 0.29–1.50 | 0.32
    Species: 75 (100.0) | 58 (100.0) | N/A | N/A | N/A
    Strain: 73 (97.3) | 57 (98.3) | 1.56 | 0.14–17.66 | 0.72
    Source of animals: 62 (82.7) | 43 (74.1) | 0.60 | 0.26–1.39 | 0.23
  (2) Statistical method defined: 74 (98.7) | 58 (100.0) | → ∞ | 0.00–∞ | 1.00
  (3) Inclusion and exclusion criteria defined: 17 (22.7) | 24 (41.4) | 2.41 | 1.14–5.11 | 0.02
  (4) Randomization, allocation concealment, and blinding†: 5 (6.7) | 14 (24.1) | 4.46 | 1.50–13.23 | 0.007
    Randomization: 46 (61.3) | 43 (74.1) | 1.81 | 0.85–3.82 | 0.12
    Method of randomization stated: 5 (6.7) | 6 (10.3) | 1.62 | 0.47–5.58 | 0.45
    Allocation concealment: 6 (8.0) | 15 (25.9) | 4.01 | 1.45–11.13 | 0.008
    Blinding: 53 (70.7) | 50 (86.2) | 2.59 | 1.06–6.36 | 0.04
  (5) Postrandomization excluded animals reported: 29 (38.7) | 23 (39.7) | 1.04 | 0.52–2.10 | 0.91
Sum of Basic Science Checklist items‡
  Mean (SD): 2.48 (0.86) | 2.79 (1.17) | N/A | N/A | 0.09
  0 or 1 item met: 8 (10.7) | 10 (17.2) | 1.75 | 0.64–4.75 | 0.28
  2 or 3 items met: 59 (78.7) | 30 (51.7) | 0.29 | 0.14–0.62 | 0.01
  4 or 5 items met: 8 (10.7) | 18 (31.0) | 3.77 | 1.50–9.46 | 0.005

CI indicates confidence interval; N/A, not applicable; OR, odds ratio.
*Only experimental stroke studies that investigated the efficacy or side effects of a treatment were included.
†This criterion includes randomization, allocation concealment, and blinding but not the method of randomization.
‡Sum of the items: (1) animals defined, (2) statistical method defined, (3) inclusion and exclusion criteria defined, (4) randomization, allocation concealment, and blinding, and (5) postrandomization excluded animals reported.
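As an arithmetic check on the univariate comparisons in Table 2, the odds ratio and 95% confidence interval for the item inclusion and exclusion criteria defined can be recomputed from the reported counts (17 of 75 studies before versus 24 of 58 after checklist implementation). The sketch below (Python; assuming a standard Wald interval on the log-odds scale, which is not stated in the article) reproduces the published 2.41 (1.14–5.11). A univariate logistic regression with a single binary before/after predictor, as used in the article, yields this same odds ratio as the exponentiated coefficient.

```python
import math

# Counts from Table 2: "inclusion and exclusion criteria defined".
yes_before, n_before = 17, 75   # submitted before checklist implementation
yes_after,  n_after  = 24, 58   # submitted after checklist implementation

a, b = yes_after,  n_after  - yes_after    # after:  criteria defined / not defined
c, d = yes_before, n_before - yes_before   # before: criteria defined / not defined

# Odds ratio with a Wald 95% confidence interval on the log-odds scale.
odds_ratio = (a * d) / (b * c)
se_log_or  = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI {lower:.2f}-{upper:.2f})")  # OR = 2.41 (95% CI 1.14-5.11)
```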
Discussion
Systematic assessment of all animal experimental studies investigating a treatment published in Stroke between January 2010 and December 2013 revealed improvements in the reporting of key characteristics of scientific quality, such as the definition of inclusion and exclusion criteria, randomization, allocation concealment, blinding, and physiological monitoring. The overall quality, measured by a sum score of quality items, also improved over time. When comparing whether items of the Basic Science Checklist were met in published studies submitted before and after checklist implementation, we found increased reporting of the definition of inclusion and exclusion criteria, allocation concealment, and blinding, and an increase in studies in the highest category of the checklist item sum score. However, relevant components of quality either did not improve over time or improved but still had a low prevalence, including inclusion and exclusion criteria defined, method of randomization stated, allocation concealment, and postrandomization excluded animals reported.

Our results should be interpreted in the context of previous systematic reviews and meta-analyses of preclinical studies. Systematic analyses of experimental stroke studies investigating neuroprotective treatments found that randomization of animals was reported in 42% to 50%, compared with 61% before and 74% after checklist implementation in our study.7,9 Studies on Alzheimer disease, multiple sclerosis, and cancer reported randomization in only 22%, 16%, and 28%, respectively.10–12 Although these comparisons suggest that the stroke field does better than other research areas, our study reveals that only a few studies describe whether a proper procedure for randomization was followed. Reporting the method of randomization is important because picking animals at random from a cage is considered unlikely to provide adequate randomization.5 Blinded assessment of outcome and allocation concealment are further key quality measures. Systematic evaluations found blinding to be reported in 40% to 58% of experimental stroke studies,7,9 in 22% of studies on Alzheimer disease,12 and in 16% of studies on multiple sclerosis,11 compared with 71% before and 86% after checklist implementation in the present analysis. A review of 100 articles on cancer showed that only 2% reported that observers were blinded to treatment.10 However, one might consider that these studies were published earlier and that quality might have improved within the last years. Before implementation of the checklist, reporting of allocation concealment in our study was within the range of previously published studies and increased 3-fold after checklist implementation.13 However, allocation concealment is still reported in only 26% of the analyzed studies.

Outliers in animal studies are not unusual, and their exclusion can influence the study results.
For example, a treatment may seem effective if it kills the most severely affected animals and their neurological outcome and infarct volume are not considered in the analyses.14 Therefore, the definition of exclusion criteria and the reporting of postrandomization excluded animals are part of the Basic Science Checklist. Reporting of a definition of inclusion and exclusion criteria almost doubled, from 23% to 41%, after checklist implementation, whereas the description of postrandomization excluded animals remained stable (39% before and 40% after checklist implementation). The definition of exclusion and inclusion criteria and the reporting of excluded animals have hardly been investigated in systematic studies so far. An analysis of studies published in the Journal of Cerebral Blood Flow and Metabolism in 2008 found that 19% reported inclusion and exclusion criteria and 8% reported mortality.11 However, reporting of excluded animals might be overstated in our study because this item was considered fulfilled once mortality was reported, although exclusion for other, unreported reasons might have occurred. Indeed, the relatively high number of excluded animals in a recent preclinical randomized controlled multicenter trial suggests that exclusion of animals for different reasons is more frequent than reported in monocenter studies.15

The need to improve the quality of animal studies is obvious because inadequate study designs and reporting were shown to correlate with overstated findings.2,16 Strategies for achieving this goal were adopted from the clinical trial community, which developed and implemented the Consolidated Standards of Reporting Trials (CONSORT) guidelines. Analogous to the CONSORT statement, the Animal Research: Reporting of In Vivo Experiments (ARRIVE) guidelines published in 2010 provide recommendations for the design and reporting of animal experimental studies.17 The question is how to enforce adherence to such recommendations. In a landmark article on how to improve the methodological reporting of animal studies, Landis et al14 proposed checklist implementation during submission. Our results support this strategy. However, a definitive causal relationship between the implementation of the Basic Science Checklist and the observed improvements in quality cannot be established. In fact, quality characteristics not included in the checklist, such as physiological monitoring, compliance with animal welfare regulations, and investigation of a dose–response relationship, were also increasingly met during the study period, suggesting that a general trend toward improved quality exists independent of checklist implementation.

Our study has strengths and limitations. A limitation might be that there is no final proof that Stroke Therapy Academic Industry Roundtable–derived quality criteria improve translation because no treatment, whether or not it followed these recommendations, has been shown to predict a positive phase III clinical trial. However, the importance of quality standards in preclinical studies has been unequivocally demonstrated.2 Another limitation is that some checklist items can hardly be judged unambiguously. Two examples are discussed above, that is, the reporting of excluded animals and randomization. In addition, although blinding and allocation concealment were frequently stated, their methods were not reported in any study. We further cannot exclude the possibility that some studies may have met key methodological parameters but did not report them.
However, previous analyses showed that deficiencies in the reporting of quality standards are themselves associated with overstated findings. Strengths of our study include the comprehensive analysis of experimental studies published in Stroke, a leading journal in the field, over 4 years that sufficiently cover the period before and after implementation of the Basic Science Checklist. Data were extracted independently by 2 assessors, and disagreements were resolved after discussion with a third party.

Conclusions
In this first systematic analysis of the methodological quality of experimental studies published in Stroke, we found that relevant key measures, including randomization, blinding, and allocation concealment, improved during the study period. Quality standards were more frequently met in studies submitted after implementation of the Basic Science Checklist, suggesting that the checklist contributed to the improved quality. However, we also identified deficiencies, such as low rates of reporting of the methods of randomization, inclusion and exclusion criteria, excluded animals, and allocation concealment. For further improvement, we therefore propose mandatory reporting of key methodological parameters in the published article and not only during submission. Such reporting could be implemented at the end of the article, like the disclosure statement, or in a separate formal panel. Beyond reporting quality characteristics, adherence to checklist items should be mandatory for article acceptance. Moreover, information on quality characteristics should be more detailed and unambiguous.

Sources of Funding
J. Minnerup was supported by the Else Kröner-Fresenius-Stiftung (2014_EKES.16), and A. Schmidt was supported by the Medical Faculty of the University of Münster.

Disclosures
None.

Footnotes
Guest Editor for this article was Miguel A. Perez-Pinzon, PhD.
*Drs Minnerup and Zentsch contributed equally.
The online-only Data Supplement is available with this article at http://stroke.ahajournals.org/lookup/suppl/doi:10.1161/STROKEAHA.115.011695/-/DC1.
Correspondence to Jens Minnerup, MD, Department of Neurology, University of Münster, Albert-Schweitzer-Campus 1, 48149 Münster, Germany. E-mail [email protected]

References
1. O'Collins VE, Macleod MR, Donnan GA, Horky LL, van der Worp BH, Howells DW. 1,026 experimental treatments in acute stroke. Ann Neurol. 2006;59:467–477. doi: 10.1002/ana.20741.
2. Crossley NA, Sena E, Goehler J, Horn J, van der Worp B, Bath PM, et al. Empirical evidence of bias in the design of experimental stroke studies: a metaepidemiologic approach. Stroke. 2008;39:929–934. doi: 10.1161/STROKEAHA.107.498725.
3. Stroke Therapy Academic Industry Roundtable (STAIR). Recommendations for standards regarding preclinical neuroprotective and restorative drug development. Stroke. 1999;30:2752–2758.
4. Stroke. Archive of All Online Issues. http://stroke.ahajournals.org/content/by/year. Accessed August 2, 2015.
5. Macleod MR, Fisher M, O'Collins V, Sena ES, Dirnagl U, Bath PM, et al. Good laboratory practice: preventing introduction of bias at the bench. Stroke. 2009;40:e50–e52. doi: 10.1161/STROKEAHA.108.525386.
6. Minnerup J, Heidrich J, Wellmann J, Rogalewski A, Schneider A, Schäbitz WR. Meta-analysis of the efficacy of granulocyte-colony stimulating factor in animal models of focal cerebral ischemia. Stroke. 2008;39:1855–1861. doi: 10.1161/STROKEAHA.107.506816.
7. Minnerup J, Wersching H, Diederich K, Schilling M, Ringelstein EB, Wellmann J, et al. Methodological quality of preclinical stroke studies is not required for publication in high-impact journals. J Cereb Blood Flow Metab. 2010;30:1619–1624. doi: 10.1038/jcbfm.2010.74.
8. Minnerup J, Heidrich J, Rogalewski A, Schäbitz WR, Wellmann J. The efficacy of erythropoietin and its analogues in animal stroke models: a meta-analysis. Stroke. 2009;40:3113–3120. doi: 10.1161/STROKEAHA.109.555789.
9. Philip M, Benatar M, Fisher M, Savitz SI. Methodological quality of animal studies of neuroprotective agents currently in phase II/III acute ischemic stroke trials. Stroke. 2009;40:577–581. doi: 10.1161/STROKEAHA.108.524330.
10. Hess KR. Statistical design considerations in animal studies published recently in cancer research. Cancer Res. 2011;71:625. doi: 10.1158/0008-5472.CAN-10-3296.
11. Vesterinen HM, Sena ES, ffrench-Constant C, Williams A, Chandran S, Macleod MR. Improving the translational hit of experimental treatments in multiple sclerosis. Mult Scler. 2010;16:1044–1055. doi: 10.1177/1352458510379612.
12. Sena ES, Currie GL, McCann SK, Macleod MR, Howells DW. Systematic reviews and meta-analysis of preclinical studies: why perform them and how to appraise them critically. J Cereb Blood Flow Metab. 2014;34:737–742. doi: 10.1038/jcbfm.2014.28.
13. Frantzias J, Sena ES, Macleod MR, Al-Shahi Salman R. Treatment of intracerebral hemorrhage in animal models: meta-analysis. Ann Neurol. 2011;69:389–399. doi: 10.1002/ana.22243.
14. Landis SC, Amara SG, Asadullah K, Austin CP, Blumenstein R, Bradley EW, et al. A call for transparent reporting to optimize the predictive value of preclinical research. Nature. 2012;490:187–191. doi: 10.1038/nature11556.
15. Llovera G, Hofmann K, Roth S, Salas-Pérdomo A, Ferrer-Ferrer M, Perego C, et al. Results of a preclinical randomized controlled multicenter trial (pRCT): anti-CD49d treatment for acute brain ischemia. Sci Transl Med. 2015;7:299ra121. doi: 10.1126/scitranslmed.aaa9853.
16. Macleod MR, van der Worp HB, Sena ES, Howells DW, Dirnagl U, Donnan GA. Evidence for the efficacy of NXY-059 in experimental focal cerebral ischaemia is confounded by study quality. Stroke. 2008;39:2824–2829. doi: 10.1161/STROKEAHA.108.515957.
17. Kilkenny C, Browne WJ, Cuthill IC, Emerson M, Altman DG. Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. PLoS Biol. 2010;8:e1000412. doi: 10.1371/journal.pbio.1000412.
Article Information
January 2016, Vol 47, Issue 1. © 2015 American Heart Association, Inc. https://doi.org/10.1161/STROKEAHA.115.011695. PMID: 26658439.
Manuscript received October 1, 2015; revised November 2, 2015; accepted November 4, 2015; originally published December 10, 2015.
Keywords: cerebral infarction; models, animal; animals; checklist; stroke
Subjects: Animal Models of Human Disease; Basic Science Research; Ischemic Stroke