Review (open access, peer-reviewed)

Part 8: Education, Implementation, and Teams

2015; Lippincott Williams & Wilkins; Volume: 132; Issue: 16_suppl_1; Language: English

10.1161/cir.0000000000000277

ISSN

1524-4539

Authors

Farhan Bhanji, Judith Finn, Andrew Lockey, Koenraad G. Monsieurs, Robert Frengley, Taku Iwami, Eddy Lang, Matthew Huei‐Ming Ma, Mary E. Mancini, Mary Ann McNeil, Robert Greif, John E. Billi, Vinay Nadkarni, Blair L. Bigham, Janet Bray, Jan Breckwoldt, Steven C. Brooks, Adam Cheng, Aaron Donoghue, Jonathan P. Duff, Dana P. Edelson, Henrik Fischer, Elaine Gilfoyle, Ming‐Ju Hsieh, David A. Kloeck, Patrick Chow‐In Ko, Marion Leary, Theresa M. Olasveengen, Jon C. Rittenberger, Robert D. Schultz, Dion Stub, Zuzana Triska, Traci A. Wolbrink, Chih‐Wei Yang, Joyce Yeung

Topic(s)

Disaster Response and Management

Abstract

Part 8: Education, Implementation, and Teams

2015 International Consensus on Cardiopulmonary Resuscitation and Emergency Cardiovascular Care Science With Treatment Recommendations

Farhan Bhanji, Judith C. Finn, Andrew Lockey, Koenraad Monsieurs, Robert Frengley, Taku Iwami, Eddy Lang, Matthew Huei-Ming Ma, Mary E. Mancini, Mary Ann McNeil, Robert Greif, John E. Billi, Vinay M. Nadkarni, and Blair Bigham, on behalf of the Education, Implementation, and Teams Chapter Collaborators

Originally published 20 Oct 2015. Circulation. 2015;132:S242–S268. https://doi.org/10.1161/CIR.0000000000000277

Introduction

Current evidence demonstrates considerable variability in cardiac arrest survival in and out of hospital and, therefore, substantial opportunity to save many more lives.1–3 The Formula for Survival4 postulates that optimal survival from cardiac arrest requires high-quality science, education of lay providers and healthcare professionals, and a well-functioning Chain of Survival5 (implementation).

The Education, Implementation, and Teams (EIT) Task Force of the International Liaison Committee on Resuscitation (ILCOR) set out to define the key PICO (population, intervention, comparator, outcome) questions related to resuscitation education (including teamwork skills) and systems-level implementation to be reviewed for 2015. The selection of questions was supported by an anonymous online vote open only to task force members, the results of which informed the task force's final consensus decisions. Topics from the 2010 evidence review process were scrutinized for relevance, for the potential to improve outcomes, and for the likelihood that new evidence had been published since 2010. Finally, PICO questions for which the Grading of Recommendations, Assessment, Development, and Evaluation (GRADE) process was not as well developed at the time of PICO selection were deferred until at least after the 2015 cycle. We planned to reduce the total number of PICO questions reviewed in order to provide more in-depth, evidence-based reviews of the included questions. New topics were determined on the basis of the evolving literature and changes in resuscitation practice.
Input on the selection of PICO questions was sought from the general public through the ILCOR website and from ILCOR member resuscitation councils through their council chairs and individual task force members.

The GRADE Process

The EIT Task Force performed detailed systematic reviews based on the recommendations of the Institute of Medicine of the National Academies6 and using the methodological approach proposed by the GRADE Working Group.7 After identification and prioritization of the questions to be addressed (using the PICO format),8 a detailed search for relevant articles was performed, with the assistance of information specialists, in each of 3 online databases (PubMed, Embase, and the Cochrane Library).

Articles were screened for further evaluation by using detailed inclusion and exclusion criteria. The reviewers for each question created a reconciled risk-of-bias assessment for each of the included studies, using state-of-the-art tools: Cochrane for randomized controlled trials (RCTs),9 Quality Assessment of Diagnostic Accuracy Studies (QUADAS)-2 for studies of diagnostic accuracy,10 and GRADE for observational studies that inform both therapy and prognosis questions.11

GRADE evidence profile tables12 were then created to facilitate an evaluation of the evidence in support of each of the critical and important outcomes. The quality of the evidence (or confidence in the estimate of the effect) was categorized as high, moderate, low, or very low,13 based on the study methodologies and the 5 core GRADE domains of risk of bias, inconsistency, indirectness, imprecision, and other considerations (including publication bias).14

These evidence profile tables were then used to create a written summary of evidence for each outcome (the Consensus on Science statements). Whenever possible, consensus-based treatment recommendations were then created. These recommendations (designated as strong or weak) were accompanied by an overall assessment of the evidence and a statement from the task force about the values and preferences that underlie the recommendations. Further details of the methodology that underpinned the evidence evaluation process are found in "Part 2: Evidence Evaluation and Management of Conflicts of Interest."

To our knowledge, this is the first time that GRADE has been applied on a large scale to the education literature in health. Detailed review of the evidence, the Consensus on Science statements, and treatment recommendations occurred within the task force, and most final recommendations reflect the consensus of the task force. In some instances, the task force could not reach consensus and a vote was required; greater than 50% agreement was adequate for standard decisions on wording, and 70% agreement was required for treatment recommendations that were discordant with the quality of evidence.

The EIT Task Force spent considerable time deliberating on the scoring of the importance of outcomes according to the GRADE approach, particularly with respect to educational studies. In contrast to clinical studies, where direct patient outcomes are commonly measured, educational research (which often includes manikin studies) very commonly measures participant learning outcomes. After considerable task force discussion, for education PICO questions, patient-related outcomes and actual performance in the clinical setting were deemed the critical outcomes, with learning-related outcomes (immediate and longer-term retention) classed as important.
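The quality labels used throughout this Part (high, moderate, low, very low) follow the GRADE logic summarized above: a design-based starting level that is lowered across the 5 core domains. The following sketch is purely illustrative and is not part of the CoSTR methodology; the function name and the simple one-level-per-domain bookkeeping are assumptions made for brevity, and GRADE's provisions for upgrading observational evidence are omitted.

# Purely illustrative sketch of the GRADE rating logic described above.
LEVELS = ["very low", "low", "moderate", "high"]

def grade_certainty(study_design, downgrades):
    """Return an overall certainty-of-evidence label.

    study_design: "rct" or "observational"
    downgrades:   levels deducted per domain, eg
                  {"risk_of_bias": 1, "imprecision": 1}
                  (domains: risk of bias, inconsistency, indirectness,
                  imprecision, publication bias)
    """
    start = 3 if study_design == "rct" else 1   # RCTs start high, observational low
    level = max(0, start - sum(downgrades.values()))
    return LEVELS[level]

# Example mirroring the wording used in this Part: a body of RCT evidence
# downgraded for serious risk of bias and imprecision rates as "low".
print(grade_certainty("rct", {"risk_of_bias": 1, "imprecision": 1}))  # -> low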
Kirkpatrick's classic model of program evaluation15 and McGaghie's16 T1-to-T3 framework for simulation research both align with the notion that patient-related (and system-related) outcomes are more relevant than transfer of learning from education programs to the clinical environment, which in turn is more important than an isolated demonstration of learning in the training setting. Recognizing the considerable body of evidence demonstrating decay of resuscitation skills within weeks to months after a course, the task force considered long-term retention of learning a more robust outcome than learning assessed at the time of training. Similarly, because resuscitation is a psychomotor and leadership/teamwork skill, "skills" were considered higher-level outcomes than "knowledge."

The published resuscitation education literature and the subsequent GRADE analysis were frequently limited by the heterogeneous nature of the interventions (with frequent downgrades for inconsistency) and by the quality of the assessment tools (outcome measures). In keeping with systematic review methodology, meta-analysis was conducted for specific PICO questions only when studies of similar design, interventions, and target populations reported comparable outcomes.

The EIT Task Force reviewed 17 PICO questions, a reduction of 15 questions from 2010. The questions selected included the following:

Basic Life Support Training
- Cardiopulmonary resuscitation (CPR) instruction methods (self-instruction versus traditional) (EIT 647)
- Automated external defibrillator (AED) training methods (EIT 651)
- Timing for basic life support (BLS) retraining (EIT 628)
- Resource-limited settings (EIT 634)
- BLS training for high-risk populations (EIT 649)
- Compression-only CPR training (EIT 881)

Advanced Life Support Training
- Precourse preparation for advanced life support (ALS) courses (EIT 637)
- High-fidelity manikins in training (EIT 623)
- Team and leadership training (EIT 631)
- Timing for advanced resuscitation training (EIT 633)

Implementation
- Implementation of guidelines in communities (EIT 641)
- Cardiac arrest centers (EIT 624)
- Social media technologies (EIT 878)
- Measuring performance of resuscitation systems (EIT 640)
- CPR feedback devices in training (EIT 648)
- Debriefing of resuscitation performance (EIT 645)
- Medical emergency teams (METs) for adults (EIT 638)

Summary of New Treatment Recommendations

The following is a summary of the most important new reviews or changes in recommendations for education, implementation, and teams since the last ILCOR review, in 2010:

Training
- High-fidelity manikins may be preferred to standard manikins at training centers/organizations that have the infrastructure, trained personnel, and resources to maintain the program.
- CPR feedback devices (providing directive feedback) are useful for learning psychomotor CPR skills.
- One- to 2-year retraining cycles are not adequate to maintain competence in resuscitation skills.
- The optimal retraining intervals are yet to be defined, but more frequent training may be helpful for providers likely to encounter a cardiac arrest.

Systems Level
- You cannot improve what you do not measure: systems that facilitate performance measurement and quality improvement initiatives should be used where possible.
- Data-driven, performance-focused debriefing can help improve the future performance of resuscitation teams.
- Out-of-hospital cardiac arrest (OHCA) victims should be considered for transport to a specialist cardiac arrest center as part of a wider regional system of care.
- There have been advances in the use of technology and social media for notification of the occurrence of suspected OHCA and for sourcing bystanders willing to provide CPR.

BLS Training

BLS is foundational in the care of cardiac arrest victims. For the OHCA victim, the goal is to increase rates of bystander CPR and deliver prompt defibrillation, because these are the major determinants of the community Chain of Survival. Unfortunately, only a minority of cardiac arrest victims actually receive bystander CPR, and it is difficult for potential rescuers to overcome barriers such as panic, fear of harming the victim, concern about being unable to perform CPR correctly, physical limitations, fear of liability or infection, or, in some instances, the victim's characteristics.17 Recent training in CPR,17–19 along with dispatcher-assisted CPR,20,21 may help overcome these barriers and save more lives. For healthcare professionals, the quality of CPR delivered is critical, because poor compliance with recommended guidelines has been associated with lower survival.22,23 Suboptimal CPR is common24 but should be considered a preventable harm, and quality improvement processes should be implemented to minimize its occurrence.

The ILCOR EIT Task Force chose the following PICO questions as part of the review of BLS training:
- Video- or computer-assisted self-instruction versus traditional courses
- Alternate methods to train in AED use
- Timing of BLS retraining

An additional review of the use of CPR feedback devices in training was also conducted and is documented later in this Part, along with the corresponding PICO questions on the use of feedback devices in clinical practice (BLS 361) and the use of feedback devices as part of the quality improvement process (EIT 640).

CPR Instruction Methods (Self-Instruction Versus Traditional) (EIT 647)

Among students who are taking BLS courses in an educational setting (P), does video or computer self-instruction (I), compared with traditional instructor-led courses (C), change survival, skill performance in actual resuscitations, skill performance at 1 year, skill performance at course conclusion, or cognitive knowledge (O)?

Consensus on Science

No studies addressed the critical outcomes of skill performance in actual resuscitations or survival of patients.

For the important outcome of cognitive knowledge, we have identified low-quality evidence (downgraded for serious risk of bias and imprecision) from 4 RCTs with a total of 370 students showing no differences between self-instruction and instructor-led courses (using a multiple-choice questionnaire at course conclusion and at 2 months to 1 year).25–28

For the important outcome of skill performance at course conclusion, we have identified very-low-quality evidence (downgraded for risk of bias, inconsistency, and imprecision) from 9 RCTs25,29–36 and 1 randomized cluster-controlled trial37 with a total of 2023 students showing no differences between self-instruction and instructor-led courses based on failure to pass the total performance evaluation by instructors using checklists (relative risk [RR], 1.09; 95% confidence interval [CI], 0.66–1.83).

For the important outcome of skill performance at 1 year, we have identified low-quality evidence (downgraded for risk of bias and imprecision) from 2 RCTs with a total of 234 students showing no differences between self-instruction and traditional instruction based on failure to pass the total performance evaluation by instructors using checklists (RR, 0.91; 95% CI, 0.61–1.35).28,38

Treatment Recommendations

We suggest that video and/or computer-based self-instruction with synchronous or asynchronous hands-on practice may be an effective alternative to instructor-led courses (weak recommendation, very-low-quality evidence).

Values, Preferences, and Task Force Insights

Despite heterogeneity in the delivery of video and/or computer-based instruction and in the evaluation methods among the different studies, we make this recommendation based on the absence of differences in outcomes between self-instruction and instructor-led courses. In making this recommendation, we place a higher value on the potential reduction in time and resources with self-instruction, which could translate into increased CPR training.

The EIT Task Force recognized the considerable heterogeneity in the self-instruction interventions (computer versus video assisted; with or without hands-on practice) and the challenge of lumping them together (ie, a poorly designed computer-based learning activity is very different from a well-designed one), yet they are grouped together in the GRADE process. Nonetheless, the task force developed consensus that this was an important PICO question with the potential to increase the number of lay providers available to respond to cardiac arrests, and potentially the subsequent survival of victims, in a time- and resource-efficient manner.

Knowledge Gaps
- Do students receiving self-instruction courses have better skill performance in actual resuscitations, and do such courses further improve the rate of return of spontaneous circulation (ROSC) and survival to hospital discharge of patients, when compared with traditional courses?
- The teaching material of the video or computer program, and the type of self-instruction course, might affect learning.

AED Training Methods (EIT 651)

Among students who are taking AED courses in an educational setting (P), does any specific training intervention (I), compared with traditional lecture/practice sessions (C), change clinical outcome, skill performance in actual resuscitations, skill performance at 1 year, skill performance at course conclusion, cognitive knowledge, or use of AEDs (O)?

Consensus on Science

No study addressed the critical outcomes of skill performance in an actual resuscitation or patient outcome.

All studies for this PICO question were manikin based, and all participants were adults.16,36,37,39–42 The included studies used manikin-based scenarios as the standard method for assessment, and end points did not extend beyond skill retention after 6 months. Substantial heterogeneity was found in interventions and controls, and in the time points of assessment. Except for 2 studies,40,41 none investigated AED training in isolation; all other studies addressed the whole BLS sequence together with AED-related outcomes. To account for the nature of training, 4 subquestions were specified.
For both lay providers and healthcare providers, the question was subdivided into (a) self-instruction without (or with minimal) instructor involvement versus a traditional instructor-led course, and (b) self-instruction combined with instructor-led training versus a traditional course.

For Lay Providers

For the subquestion of self-instruction without (or with minimal) instructor involvement versus a traditional instructor-led course, we identified low-quality evidence (downgraded for indirectness) addressing the important outcome of skill retention after 2 to 6 months.16,36,40,41

For 2 of the investigated DVD-based teaching methods, the RR of passing the overall test directly after the course was only 0.36 (95% CI, 0.25–0.53) and 0.35 (95% CI, 0.24–0.51), respectively, compared with instructor-led training.40 No significant difference was found 2 months after training when a computer-learning-only course was compared with instructor-led training.16 No significant difference was reported in AED performance (time to first shock and AED placement) for a 30-minute video self-learning intervention compared with 3 to 4 hours of instructor-led training.36 Training for senior citizens (an 11-minute self-training video plus 45 minutes of manikin training with minimal instructor involvement) was not significantly different from the control group; this study also suggested that the alternative training method saved resources.41

For the subquestion of self-instruction combined with instructor-led training versus traditional courses, we identified low-quality evidence (downgraded for indirectness) addressing the important outcome of skill retention after 2 months from the following 2 studies:
- A 45-minute interactive computer session plus 45 minutes of instructor-based practice led to results comparable with those of a traditional course of the same duration.16 AED skills remained relatively stable over 2 months, while CPR skills deteriorated significantly.
- A 9-minute DVD plus manikin training plus scenario training was inferior to traditional training, with an RR of passing the overall test of 0.55, which increased to 0.84 after 2 months.40 This may indicate a potential learning effect of the short postcourse test.

For Healthcare Providers

For the subquestion of self-instruction without (or with minimal) instructor involvement versus traditional instructor-led courses, we identified very-low-quality evidence (downgraded for indirectness and imprecision) addressing the important outcome of skill performance at the end of the course or 2 weeks after completion.

Isolated self-instructed training was as efficient as traditional training, but testing was limited to the end of the course.37 In another study, no differences were found between groups, and significant time (and financial) savings were reported39; however, the sample size was very small. A third study showed worse results for theory-only training, but it was flawed by an inadequate control group.42

For the subquestion of self-instruction combined with instructor-led training versus traditional courses, we identified low-quality evidence (downgraded for indirectness) for the important outcome of skill performance at the end of the course or 2 weeks after completion. Training time was reduced while performance was only slightly reduced.
A 40-minute skills-lab training session plus instructor involvement was associated with a higher rate of mistakes in AED operation.37 In another study, no differences were found between groups, but significant time (and financial) savings were reported in the self-instruction combined with instructor-led training group39; however, the sample size was very small.

Treatment Recommendations

For lay providers learning AED skills, we suggest that self-instruction combined with short instructor-led training may replace longer traditional courses (weak recommendation, low-quality evidence).

For healthcare providers learning AED skills, we suggest that self-directed training (as short as 40 minutes) may be used in place of traditional training (weak recommendation, low-quality evidence).

Values, Preferences, and Task Force Insights

In making this recommendation, we place value on pragmatic considerations: if instructor-led training is not available, then self-directed training (or no training at all ["just do it"]) is an acceptable pragmatic option for AED use, as stated in the 2010 International Consensus on Cardiopulmonary Resuscitation and Emergency Cardiovascular Care Science With Treatment Recommendations (CoSTR).18,19

Very little research has been conducted on AED teaching outside the context of a (standard) BLS course; only 2 studies40,41 reported on that setting. All data were extracted from studies in the context of BLS teaching.

The ILCOR 2010 CoSTR stated that laypeople and healthcare providers could use an AED without training16,43,44 and that untrained individuals could deliver a shock with an AED.45–47 The current systematic review investigated whether a specific training intervention in an educational setting changed clinical or learning outcomes.

The original intent was to produce a single consensus on science with treatment recommendations based on a single PICO question. As the literature was reviewed, it became clear that there was marked heterogeneity in the populations studied and the types of interventions, so multiple subsections were developed with multiple treatment recommendations.

Knowledge Gaps
- Properly powered studies are needed in which the primary outcome is AED use in the clinical setting and patient outcomes are considered.
- The optimal duration of AED training is still unclear.
- The effectiveness and optimal timing of brief refresher training should be evaluated.
- The most suitable methods to train children and adolescents need to be determined.

Timing for BLS Retraining (EIT 628)

Among students who are taking BLS courses (P), does any specific interval for update or retraining (I), compared with standard practice (ie, every 12 or 24 months) (C), change patient outcomes, skill performance in actual resuscitations, skill performance at 1 year, skill performance at course conclusion, or cognitive knowledge (O)?

Consensus on Science

For the critical outcomes of patient outcome and skill performance during actual resuscitation, we found no published evidence.

For the important outcome of skill performance 3 to 12 months after initial training, we identified very-low-quality evidence (downgraded for risk of bias, inconsistency, and indirectness) from 3 RCTs48–50 and 2 non-RCTs51,52 evaluating the effects of additional updates or retraining compared with standard practice (every 12–24 months). The heterogeneous nature of the studies prevented pooling of data.
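Where pooling was possible elsewhere in this Part (eg, the skill-performance outcomes for EIT 647 above), the pooled effect is reported as a relative risk with a 95% confidence interval. As background only, the following sketch shows the standard large-sample computation behind such an interval; the scenario, counts, and function name are hypothetical and do not reproduce the reviewers' meta-analytic methods.

import math

def relative_risk_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk with a large-sample (Wald) confidence interval computed
    on the log scale. Raw-count inputs are used here for illustration; real
    reviews pool study-level estimates rather than raw counts like this."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log_rr = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Hypothetical example: 30 of 200 learners fail the skills checklist after
# self-instruction versus 28 of 205 after an instructor-led course.
rr, lo, hi = relative_risk_ci(30, 200, 28, 205)
print("RR %.2f (95%% CI, %.2f-%.2f)" % (rr, lo, hi))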
Two studies (1 RCT and 1 non-RCT) evaluated the effect of high-frequency, low-dose training (6 minutes of monthly practice and a video reminder every 2 weeks) after standard BLS courses and demonstrated benefit in CPR performance (compression depth, 40.3±6.6 versus 36.5±7.7 mm)50 and in time to shock delivery (mean±SD, 60.0±12.9 versus 73.6±22 seconds).52 Two other RCTs and 1 non-RCT, using a variety of retraining approaches and evaluating performance 5 to 6 months after retraining, showed no benefit in chest compression quality or time to shock delivery.48,49,51

For the important outcome of cognitive knowledge, we identified very-low-quality evidence (downgraded for risk of bias, inconsistency, and indirectness) from 1 RCT48 demonstrating an improved self-reported confidence score (96 versus 92; P=0.038) after additional traditional BLS retraining and from 1 non-RCT52 demonstrating increased willingness to perform CPR (RR, 0.62; 95% CI, 0.40–0.96) after high-frequency, low-dose training (video reminder every 2 weeks).52

Studies evaluating BLS skill retention have demonstrated rapid decay in BLS skills (eg, chest compression quality and time to defibrillation) within 3 to 12 months after initial training.18,19

Treatment Recommendations

There is insufficient evidence to recommend the optimum interval or method for BLS retraining for laypeople.

Because there is evidence of skill decay within 3 to 12 months after BLS training and evidence that frequent training improves CPR skills, responder confidence, and willingness to perform CPR, we suggest that individuals likely to encounter cardiac arrest consider more frequent retraining (weak recommendation, very-low-quality evidence).

Values, Preferences, and Task Force Insights

In making this recommendation, we place emphasis on the need for individuals and organizations to determine the importance of BLS skill maintenance, based on their local context and the feasibility of more frequent training.

The search strategy for this PICO question focused on lay providers, but the results were considered generalizable. The EIT Task Force debated at length whether to recommend a specific interval for retraining but opted to leave this to the discretion of the organizations involved, because the only available evidence is that CPR skills decay before the currently recommended 12- to 24-month retraining intervals.

Knowledge Gaps
- There is limited evidence evaluating the effect of shorter intervals between BLS courses.
- High-frequency, low-dose training shows some promise and could potentially enhance BLS training and reduce skill decay. More studies are needed to confirm the role of such training.
- There is significant heterogeneity in initial training, in the timing and content of retraining, and in outcomes among current studies. Guidelines are needed to ensure uniform testing and reporting in BLS training and simulation research.

Basic Life Support: Other Considerations

Several issues affect the optimal design and implementation of BLS training within communities.
The ILCOR EIT Task Force chose to focus on PICO questions that aligned with the GRADE methodology for intervention questions and that could have a relatively immediate impact in helping to save more lives or could identify important knowledge gaps requiring further research. For 2015, the ILCOR EIT Task Force chose to focus on the following:
- Educational approaches to resuscitation training in resource-limited settings
- Focused training of likely rescuers for high-risk populations
- The impact of training communities to use compression-only CPR

Resource-Limited Settings (EIT 634)

Among students who are taking BLS or ALS courses in a resource-limited educational setting (P), does any educational approach (I), compared with other approaches (C), change clinical outcome, skill performance in actual resuscitations, skill performance at 1 year, skill performance between course conclusion and 1 year, skill performance at course conclusion, or cognitive knowledge (O)?

Consensus on Science

For the critical outcomes of change in clinical outcome and skill performance in actual resuscitations, and for the important outcome of skill performance at 1 year, we found no evidence in low-resource settings.

For the important outcome of skill performance between course conclusion and 1 year, we found very-low-quality evidence (downgraded for serious risk of bias, imprecision, and possible publication bias) from 2 RCTs.53,54 One study tested cognitive and skill retention 3 weeks after ALS refresher training in 3 arms: simulation (traditional course format), multimedia (computer-based learning), and self-directed reading.53 In the other study, students were tested 3 and 6 months after training.54 This study compared BLS training in a traditional course format with limited instruction (a larger student-to-instructor ratio) and with self-directed computer-based learning. All modalities were shown to be equivocal, or to have mixed but not consistent benefit, relative to the traditional format.

For the important outcome of skill performance at course conclusion, we identified 6 RCTs53–58 and 1 observational study.59 Studies varied significantly in the subject taught (from BLS to ALS), the range of participants (paramedic students, medical students at various stages of training, nursing staff, and general healthcare providers), the duration of the course, and the training methods. Educational strategies included traditional course format versus computer-based learning, telemedicine, self-directed reading, limited instruction (larger student-to-instructor ratio), 4-stage skill teaching, video instruction, and video-based group feedback. Studies ranged from very-low-quality evidence53 (downgraded for serious risk of bias and imprecision) to moderate-quality evidence55–58 (downgraded for imprecision).

Because skill performance in all 7 studies53–59 showed equivocal or minimal benefit compared with the traditional course format, we suggest the possibility of using other training methods for teaching BLS or ALS.
However, the heterogeneity of the studies makes it unclear what this alternative method might be (weak recommendation, low-quality evidence).

For the important outcome of cognitive knowledge, we identified 4 RCTs: 2 were of very low quality (downgraded for serious risk of bias, imprecision, and possible publication bias),53,54 1 was of low quality (downgraded for risk of bias and imprecision),57 and 1 was of moderate quality (downgraded for imprecision).55 These studies differed in the teaching methods used to compare cognitive outcomes, including simulation (traditional course format), multimedia (computer-based learning), self-directed reading, limited instruction (larger student-to-instructor ratio), and self-directed computer-based learning. In comparing traditio
