Error and Root Cause Analysis
2017; Elsevier BV; Volume 17, Issue 10. Language: English
DOI: 10.1093/bjaed/mkx019
ISSN: 2058-5357
Authors: Mhairi Jhugursing, Valerie Dimmock, Haresh Mulchandani
Topic(s): Healthcare Technology and Patient Monitoring
Key points

• An error can be defined as an act of commission or omission leading to an undesirable outcome or a potentially undesirable outcome.
• Errors can be classified as active or latent, and as individual or system based.
• Addressing an individual error may prevent that person repeating the same error, whereas addressing latent errors and contributing human factors may prevent an entire organization from making the error again.
• In the NHS, incident reporting systems are a form of voluntary, staff-led error data collection and collation.
• Root cause analysis is a systematic process whereby the factors that contribute to an incident are identified and learned from.

'Knowledge and error flow from the same mental sources, only success can tell one from the other.' [1]

An error is an act that can lead to an undesirable outcome. Everyone makes mistakes; it is part of the human condition. Cognitive psychologists believe that slips, lapses, and mistakes are inevitable: they are the price we pay for advanced higher cerebral function. Specifically, a medical error is a preventable adverse effect of care, whether or not it is evident or harmful to the patient. This might include an inaccurate or incomplete diagnosis or treatment of a disease, injury, behaviour, infection, or other ailment. But how do we reconcile this with modern health care, where safe, effective patient care is the pinnacle of our practice? James Reason, a British psychologist, established the framework of error classification and management in health care in the 1990s. As health care has evolved, so has our understanding of error and its management; Reason's work forms the basis of the rapidly expanding field of patient safety. Anaesthesia was one of the first specialities to introduce patient safety guidelines into practice, and patient safety was further highlighted by the Elaine Bromiley case in 2005. This article explores the theory of error occurrence and the systems we use to learn from it.

Classification of error

Errors can be defined as acts of commission or omission leading to an undesirable or potentially undesirable outcome. [2] Fortunately, not all errors lead to actual patient harm. Errors can be classified according to: [8,9]

• The time point at which the error occurred relative to its identification, i.e. active and latent errors.
• The thought process related to the error: poor planning or poor execution of a task.
• The conscious movement away from established rules within an organization, i.e. violations and malevolent acts.
• Team-working errors.

This four-way classification is sketched in code below.
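As an illustration only (this code is not part of the original article), the classification axes above can be captured in a small data model; all type and field names here are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Timing(Enum):
    """When the error arises relative to its identification."""
    ACTIVE = auto()  # the 'sharp end': a direct action in real time
    LATENT = auto()  # the 'blunt end': a dormant upstream condition

class CognitivePhase(Enum):
    """Which stage of the task failed."""
    PLANNING = auto()   # mistakes (knowledge- or rule-based)
    EXECUTION = auto()  # slips (attention) and lapses (memory)

class Intent(Enum):
    """Relation of the act to established rules."""
    UNINTENTIONAL = auto()
    VIOLATION = auto()   # deliberate deviation, usually to ease a task
    MALEVOLENT = auto()  # deliberate harm (rare)

@dataclass
class IncidentClassification:
    timing: Timing
    phase: CognitivePhase
    intent: Intent
    team_working_factor: bool  # did poor team interaction contribute?

# Example: a wrong drug drawn up during a familiar task is an active,
# unintentional execution error (a slip) with no team-working component.
slip = IncidentClassification(Timing.ACTIVE, CognitivePhase.EXECUTION,
                              Intent.UNINTENTIONAL, team_working_factor=False)
```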
Active and latent errors

Errors can be thought of as active or latent (Fig. 1). Active errors occur at the 'sharp end' of health care, where an action directly causes an error in real time. [8-10] These errors tend to be related to an individual or to an immediate equipment failure. Examples include an anaesthetist administering an incorrect drug dose, or not turning on the oxygen flow for pre-oxygenation during a rapid sequence induction of anaesthesia. Active errors are usually picked up quickly at the time of the incident and are therefore more easily identified.

Latent errors occur at the 'blunt end' of health care. They do not directly cause the incident; they lie upstream of the event. These errors are wide ranging: the building or room layout, organizational processes, human resources, equipment failure, or medication error. They tend to be 'systems' related, and they are the factors that contribute to an event. [8] For example, many drugs have similar packaging (see Fig. 2), so the wrong drug or concentration can easily be administered. Some infusion pumps require multiple programming steps, making it difficult to check that the correct data have been entered. An example of a human resource error is having too few anaesthetists available to attend a crisis in another theatre during a normal working day. Latent errors can lie dormant for a long time before a situation arises that exposes them. Many high-risk industries have invested heavily in latent error reduction, with human factors engineering playing an important role. To use an analogy, 'active failures are like mosquitoes. They can be swatted one by one, but they still keep coming. The best remedies are to create more effective defences and to drain the swamps in which they breed. The swamps, in this case, are the ever present latent conditions.' [11]
Executive errors: slips and lapses

An executive error is an active error occurring at an individual level. A task broadly has two stages: the planning phase and the action phase. An executive error occurs when an intended outcome is not achieved because of a deficit in the actions carried out. [8] Executive errors can be further divided into slips and lapses.

Slips are due to attentional failures, usually while undertaking familiar tasks. Familiar tasks have usually been performed many times before and are almost automatic behaviours, requiring little conscious effort. [3,8,12] A slip can take the form of an intrusion: for example, while thinking about the management of a difficult case ahead, you inadvertently draw up ephedrine into the intended metaraminol syringe. Slips tend to happen with familiar, skill-based tasks.

Lapses are due to memory failures. The tasks involved tend to be complex or to have multiple steps, [8,12] e.g. forgetting to flush the central venous catheter with 0.9% saline before inserting it, or forgetting to give surgical antibiotic prophylaxis before the incision is made (see Fig. 3).

Planning errors: mistakes

A planning error is a form of active error in which the intended outcome is not achieved because of a deficit in the planning that precedes execution of the task; such an error is termed a mistake. Planning relies on thought processes and applies largely to unfamiliar tasks. It is a conscious effort and requires more focus and energy than automatic, familiar tasks. Problem solving is required to formulate a plan, and three common cognitive tools facilitate the process: knowledge, rule application, and mental models. [8,12]

Knowledge-based mistakes are generated by a lack of facts relevant to the problem, [8,9,12] for example, not knowing that suxamethonium can trigger malignant hyperthermia, or that patients with diabetes can have gastric autonomic neuropathy, increasing the risk of acid reflux and aspiration.

Rule application concerns the use of a set of learned guidelines. Mistakes can occur through misapplication of a 'good' rule, e.g. giving a beta-blocker to control the heart rate of a patient with fast atrial fibrillation who also has asthma. Using the 'wrong' rule also causes mistakes, [8] e.g. using pneumonia guidelines to treat a patient with pulmonary embolus.

Mental models are psychological representations of a situation. We often rely on previous experience when encountering a new situation and tend to use the mental model that best fits an occasion when we previously felt in control. However, this may not reflect the reality of the current situation. [8]
For example, a general anaesthetic is administered to an obstetric patient because of inadequate epidural anaesthesia for an urgent Caesarean section. The patient becomes hypotensive and is resuscitated with i.v. fluids. The anaesthetist's previous experience of obstetric hypotension has been of maternal haemorrhage, so aggressive blood transfusion is commenced. The obstetricians have achieved haemostasis, but the mental model persists, and further fluids and blood products are given for presumed occult blood loss. The blood pressure remains low, and the oxygen saturations fall as fluid overload develops. The mental model has falsely reassured the anaesthetist, who has overlooked or dismissed other causes of hypotension, such as a 'high' regional block. Mental models should not be relied upon and should be re-evaluated frequently.

As anaesthetists progress through their careers, the likelihood of particular error types changes. In the early stages of a doctor's career, knowledge-based errors are more likely. As the novice advances and acquires basic competencies, practice becomes guided by a basic set of rules, so rule-based errors tend to occur at this stage. By the time an anaesthetist is a consultant, the majority of activities have become automatic skills, and slips and lapses can creep in. [8]

Violations

Slips, lapses, and mistakes are unintentional behaviours that may lead to patient harm. Violations differ in that they are deliberate acts that deviate from the accepted guidelines of safe practice. In rare instances they are malevolent acts, where staff deliberately harm patients, equipment, or other staff members. The majority of violations, however, are committed with the intention of making a task easier or quicker, despite falling outside the safety guidelines. Such 'shortcuts' may on the surface seem a better trade-off between efficiency and risk, but they can cause patient harm, and continued violation of a safety protocol at an individual or departmental level erodes the safety margin to a bare minimum, at which point serious harm can occur. [8] An example would be preoperative assessment of elective cases in the anaesthetic room. This would save time, but it must be balanced against the risk of feeling pressured to proceed with surgery even when investigations are insufficient. On most occasions all the required information will be available and anaesthesia will proceed safely.

Team-working errors

Team-working errors arise from poor interaction within the multi-professional theatre team. Key areas of potential poor performance include crisis resource management (CRM), leadership, situational awareness, and task prioritization and allocation. [4,11,13,14]
Human factors

Slips, lapses, and mistakes are all human errors, and as humans we are not infallible. Often we hear about an incident and think how easily it could have happened to any of us. By altering the environment or circumstances surrounding an incident, we can minimize future errors. Human factors influence the way we behave and interact at work. These factors include our work environment, the organizational structure, the remit of our job, and our personal traits. By looking into latent problems within a system and then placing safeguards, we can minimize active (human) errors. This is the basis of human factors engineering. [5,11]

Many latent errors and human factors can lead to patient harm. Some of the common safeguards implemented in the theatre environment are:

• Active errors by the individual can be minimized by the use of checklists, briefings, guidelines, structured handovers, read backs, double checks, 'Do not enter - anaesthesia in progress' signs, and simulation training (see Table 1).
• Latent errors within a system can be minimized by automated systems, standardization of equipment and drugs, and optimal equipment design with forced functions, i.e. equipment or systems that only allow specific standardized options, thereby minimizing error (see Table 2 and the sketch that follows it).

Table 1 Measures to reduce active errors [6,9,10,15]
• Checklists: World Health Organization (WHO) surgical checklist; Safe Anaesthesia Liaison Group (SALG) 'Stop Before You Block'; anaesthetic machine check; rapid sequence induction checklist; transfer checklist.
• Briefing and debriefing: theatre team planning and review of the surgical list.
• Guidelines: Association of Anaesthetists of Great Britain and Ireland (AAGBI) guidelines for anaphylaxis, malignant hyperthermia, and local anaesthetic (LA) toxicity; Difficult Airway Society (DAS) guidelines for difficult intubation.
• Handovers: obstetric 'Sick patients, At risk, Follow-ups, Epidurals' (SAFER) handover; postoperative paediatric intensive care unit handover.
• Read backs: patient details (name, date of birth, hospital number), blood results.
• Simulation training: crisis resource management (also termed team resource management), human factors training, anaesthetic critical incident management, emergency airway management.
• Double checks: drug calculations, drug infusion preparation, controlled drug administration.

Table 2 Measures to reduce latent errors [9,10,15]
• Automated systems: real-time electronic anaesthetic charts; bar code blood product checks; anaesthetic machine checks.
• Standardization: patient-controlled analgesia (PCA) and potassium drug infusions prepared by pharmacy in standardized concentrations; syringe tips specific to epidural/spinal anaesthesia; catheter mounts with 15/22 mm connectors; Schrader probes and non-interchangeable screw thread connections.
• Equipment design and function: monitors with visual and audio data (including pulse oximetry saturation percentage and tone); monitor displays arranged so that basic vital measurements are easy to see; target-controlled infusion (TCI) pumps that are easy to programme and that allow data review before proceeding, with pre-programmed TCI protocol options, e.g. the Marsh and Minto models; PCA pumps with preset standardized programmes.
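As a minimal sketch of a forced function (not from the article; the drug options, values, and class names are invented for illustration), consider pump programming that accepts only preset, pharmacy-standardized options, so that an arbitrary concentration cannot be keyed in at all:

```python
from enum import Enum

class StandardConcentration(Enum):
    """Pharmacy-prepared standard concentrations (illustrative values)."""
    MORPHINE_PCA_1_MG_PER_ML = 1.0
    POTASSIUM_40_MMOL_PER_L = 40.0

class PCAPump:
    """Programming as a forced function: the pump offers only preset
    options, so a free-typed (possibly ten-fold wrong) concentration
    simply cannot be entered."""

    def __init__(self) -> None:
        self.concentration: StandardConcentration | None = None

    def select_concentration(self, option: StandardConcentration) -> None:
        # Only enum members are accepted; arbitrary numbers are rejected.
        if not isinstance(option, StandardConcentration):
            raise TypeError("choose one of the preset standard concentrations")
        self.concentration = option

pump = PCAPump()
pump.select_concentration(StandardConcentration.MORPHINE_PCA_1_MG_PER_ML)  # OK
# pump.select_concentration(10.0)  # raises TypeError: not a preset option
```

The design point is that the error is excluded by construction rather than caught by vigilance; that is what distinguishes a forced function from a double check.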
The 'Swiss Cheese' model

No department is error proof. Reason proposed the 'Swiss Cheese' model to describe the trajectory of an error (see Fig. 4). [9] Multiple safeguards are put in place to avoid errors in a system, and each layer of protection has an area of weakness, representing a hole in the defence. With multiple layers, even if an error breaches one layer, the next layer should catch it before it causes patient harm. Such a model could be fully effective in a static environment, but the NHS is a complex, dynamic system with multiple 'moving' layers of protection, so at some point the holes in the layers can align, allowing a straight path to patient harm.

There are, however, limitations to the 'Swiss Cheese' model, in that it implies linear determinism. While that may be true in some industries, such as nuclear power and aviation, in health care it seldom is: events rarely unfold in a single straightforward pattern and are usually much more complicated and interrelated. [3,8,12] A comparably complex situation is the prediction of extreme weather events. It is not always possible to predict a severe hurricane or flood, because numerous factors interact to produce one; yet once it has occurred, the factors leading up to it can be analysed retrospectively in great detail.
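As a back-of-the-envelope illustration (not from the article), the alignment of holes across defensive layers can be computed and simulated; the layer count and per-layer miss probabilities below are invented numbers, and the independence assumption is exactly what the criticism above challenges.

```python
import random
from math import prod

def breach_probability(layer_miss_probs: list[float],
                       trials: int = 200_000, seed: int = 1) -> float:
    """Monte Carlo estimate of the chance that an error slips through
    every layer, assuming (unrealistically) independent layers."""
    rng = random.Random(seed)
    breaches = sum(
        all(rng.random() < p for p in layer_miss_probs)
        for _ in range(trials)
    )
    return breaches / trials

layers = [0.05, 0.05, 0.05, 0.05]  # four defences, invented numbers
print(prod(layers))                # exact independent-layer risk: 6.25e-06
print(breach_probability(layers))  # the simulation agrees in the long run
```

With independent layers the combined risk falls multiplicatively (here to roughly 6 in a million), but in a dynamic system such as the NHS the layers are coupled, so the true chance of alignment can be far higher than this calculation suggests.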
Error theory is constantly evolving. Many of the proposed theories and solutions apply best to a static environment, whereas health care is a dynamic one. An expanding population with increasing levels of co-morbidity, coupled with demand outstripping resources, means that the health care environment will only become more complex over time. Kinnear proposed a different approach that accepts the fluidity of the health care environment and focuses on our innate ability to problem solve, through resilience engineering. [16] This examines what works well in a challenging environment, in addition to what went wrong. Resilience engineering involves building a flexible organization that anticipates the dynamic nature of errors and continually revises its risk models to reduce errors before they occur. [11]

Incident reporting

Errors will occur within every health care system. It is important that these errors are identified and that lessons are drawn from them to improve patient safety. Incident reporting systems (IRS) collect error data with the aim of facilitating learning and improving patient safety. They are passive processes in which data are reported voluntarily by front-line staff. Voluntary incident reporting is highly variable and does not truly reflect the incidence of errors, which are generally under-reported within health care. However, these passive systems are relatively inexpensive and empower staff to identify and learn from error. If lessons are appropriately disseminated to the right people, the required changes can be made to prevent recurrence. Increased reporting can be encouraged by an open 'fair blame' culture that ensures timely feedback and improvement. [7,14] A 'fair blame' culture balances open investigation of the factors leading up to an incident against our individual responsibility, as health care professionals, to maintain good practice and safety. In the UK, 75% of Trusts use the Datix web-based software programme for incident reporting.
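Purely to illustrate the kind of structured record an IRS captures (this is not Datix's actual schema; every field and type name below is hypothetical):

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class HarmLevel(Enum):
    NO_HARM = "no harm (near miss)"
    LOW = "low"
    MODERATE = "moderate"
    SEVERE = "severe"
    DEATH = "death"

@dataclass
class IncidentReport:
    """Minimal structured incident record, as a front-line member of
    staff might file it: coded fields plus a free-text description."""
    reported_at: datetime
    location: str
    specialty: str
    harm: HarmLevel
    description: str
    contributing_factors: list[str] = field(default_factory=list)

report = IncidentReport(
    reported_at=datetime(2017, 3, 1, 14, 30),
    location="Main theatres",
    specialty="Vascular surgery / anaesthesia",
    harm=HarmLevel.SEVERE,
    description="Incorrect heparin dose administered during open AAA repair.",
    contributing_factors=["Two heparin concentrations stocked side by side"],
)
```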
Root cause analysis

Once an incident has been reported, the process of learning and improving patient safety begins with root cause analysis (RCA). There are two general approaches to error investigation: person based and system based. The person-based approach focuses directly on the unsafe act committed by the person involved in the incident and implies that the error is specific to that individual, separating the incident from the latent errors and human factors contributing to it. Unfortunately, the person-based approach is prevalent throughout many organizations. [8] With this approach, it is easier to blame an individual for carelessness, inattention, recklessness, or lack of education. This tends to produce a 'naming, blaming and shaming' culture, in which, were a similar circumstance to arise again, the same error would probably be made, and the morale and confidence of the individuals involved may be permanently damaged. It is not an effective approach to error reduction in health care. The Berwick Report (2013) on patient safety in the NHS stated that we should 'abandon blame as a tool'. [17]

The basic premise of the system-based approach is that humans are fallible and errors are expected, even in the best organizations. This approach is far more applicable to the NHS; it considers the organizational processes and chain of events that lead to the error. Errors are seen as consequences rather than causes, having their origins not so much in the perversity of human nature as in 'upstream' systemic factors. [11] Countermeasures are based on the assumption that although we cannot change the human condition, we can change the conditions under which humans work. RCA examines both aspects, with particular emphasis on the systems-based approach. [8]

RCA is the structured, thorough investigation of a patient safety incident to determine the underlying causes and contributing factors, which are then analysed to draw out learning points. [7] The learning points can be actioned to reduce the chance of the same or a similar incident recurring. The Berwick Report stated as its main objective regarding patient safety, 'A promise to learn - a commitment to act: improving the safety of patients in England'. [17] The most widely adopted RCA template is that of the National Patient Safety Agency (NPSA), which is detailed below. The process of RCA can be broken down into seven steps [18] (see Fig. 5).

Worked example

Incident Report System (Datix) summary from theatres. Vascular surgery, emergency open abdominal aortic aneurysm (AAA) repair (failed endovascular aortic repair (EVAR)): an incorrect dose of heparin was administered during surgery; the patient suffered coagulopathy causing major surgical haemorrhage and cardiovascular instability requiring multiple blood product administrations. The patient was admitted to the intensive care unit postoperatively and remains intubated, with high oxygen requirements secondary to transfusion-related acute lung injury (TRALI), and is receiving renal replacement therapy for acute renal impairment. The anaesthetist who administered the heparin had drawn up the incorrect dose. There are two different heparin vial concentrations in the anaesthetic room drug cupboard: heparin 100 units/ml and 1000 units/ml.
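The hazard created by stocking two concentrations is a ten-fold arithmetic error waiting to happen. A minimal sketch (the intended dose and function name are illustrative assumptions, not details from the report):

```python
def volume_to_draw_ml(dose_units: float, vial_units_per_ml: float) -> float:
    """Volume of heparin to draw up for a given dose."""
    return dose_units / vial_units_per_ml

intended_dose_units = 5000  # an illustrative intra-operative bolus

# Correct vial, 1000 units/ml:
print(volume_to_draw_ml(intended_dose_units, 1000))  # 5.0 ml

# The identical 5.0 ml drawn from the look-alike 100 units/ml vial
# delivers 5.0 * 100 = 500 units: a ten-fold underdose. Selecting the
# stronger vial when the weaker is intended gives a ten-fold overdose.
# Stocking a single concentration removes this latent error entirely.
```

This is precisely the kind of standardization measure listed in Table 2.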
To conduct an RCA, the incident must first be categorized by level of harm, and an appropriate team assembled (a simple sketch of this triage logic follows the worked example below):

• Level 1 - No harm ('near miss'), low harm, or moderate harm. A concise investigation, usually handled by one local person and comprising a one-page summary.
• Level 2 - Severe harm or death. A comprehensive, in-depth investigation requiring a multidisciplinary team that was not involved in the incident, nor drawn from the same locality or directorate. Expert opinion may be sought.
• Level 3 - Severe harm, death, or public interest. As above, plus incidents of public interest or notifiable serious incidents (e.g. never events). These investigations are carried out by investigators external to the organization.

The RCA team will routinely comprise a person trained in RCA, one or more experts in the incident field, an administrator, and a non-executive person (a layperson or patient representative). [18]

Worked example

The patient safety incident is Level 2 (severe harm), requiring a comprehensive investigation by an RCA team. The RCA team assembled comprises a consultant anaesthetist (not involved in the incident), a consultant vascular surgeon (not involved in the incident), a medical secretary, a pharmacist, and a patient advisory liaison officer.
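A minimal sketch of the harm-level triage described above (illustrative only; this is not an official NPSA algorithm, and HarmLevel repeats the hypothetical enum from the incident-record sketch earlier):

```python
from enum import Enum

class HarmLevel(Enum):  # as in the incident-record sketch above
    NO_HARM = "no harm (near miss)"
    LOW = "low"
    MODERATE = "moderate"
    SEVERE = "severe"
    DEATH = "death"

def rca_investigation_level(harm: HarmLevel,
                            public_interest: bool = False) -> int:
    """Map level of harm to the investigation level described above."""
    if public_interest:
        return 3  # investigators external to the organization
    if harm in (HarmLevel.SEVERE, HarmLevel.DEATH):
        return 2  # comprehensive multidisciplinary team investigation
    return 1      # concise local investigation, one-page summary

# The worked example: severe harm, no wider public-interest trigger.
assert rca_investigation_level(HarmLevel.SEVERE) == 2
```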
The next step is to gather all relevant facts surrounding the incident, avoiding opinions and other bias, e.g. the cultural bias of 'this is the way it has always been done'.

References

1. Mach E. Knowledge and Error. Reidel, Dordrecht, Holland, 1976.
2. Wachter RM. Understanding Patient Safety. 2nd Edn. McGraw-Hill Lange, New York, 2012: 33-51.
3. Wachter RM. Understanding Patient Safety. 2nd Edn. McGraw-Hill Lange, New York, 2012: 21-31.
4. Wachter RM. Understanding Patient Safety. 2nd Edn. McGraw-Hill Lange, New York, 2012: 149-159.
5. Wachter RM. Understanding Patient Safety. 2nd Edn. McGraw-Hill Lange, New York, 2012: 111-123.
6. Wachter RM. Understanding Patient Safety. 2nd Edn. McGraw-Hill Lange, New York, 2012: 303-319.
7. Wachter RM. Understanding Patient Safety. 2nd Edn. McGraw-Hill Lange, New York, 2012: 233-253.
8. St Pierre M, Hofinger G, Buerschaper C, Simon R. Crisis Management in Acute Care Settings. 2nd Edn. Springer, New York, 2015: 41-59.
9. Gregory B, Kaprielian VS. Anatomy of Error Module. Duke University School of Medicine. Available from http://patientsafetyed.duhs.duke.edu/module_e/module_overview.html
10. Wheeler SJ, Wheeler DW. Medication errors in anaesthesia and critical care. Anaesthesia 2005; 60: 257-273.
11. Reason J. Human error: models and management. BMJ 2000; 320: 768-770.
12. Zhang J, Patel VL, Johnson TR, Shortliffe EH. A cognitive taxonomy of medical errors. J Biomed Inform 2004; 37: 193-204.
13. Fortune PM, Davis M, Hanson J, Philips B; Advanced Life Support Group. Human Factors in the Health Care Setting. Wiley-Blackwell, 2013: 20-35.
14. Carthey J, Clarke J. Implementing Human Factors in Healthcare. Patient Safety First Group. Available from www.weahsn.net/wp-content/uploads/Human-Factors-How-to-Guide-v1.2.pdf
15. Human Factors and Managing Error. NHS Education for Scotland. Available from https://learn.nes.nhs.scot/Resource/View/800
16. Kinnear J. Damage Limitation - Minimising Unintentional Harm: Complexity, Team Working and Human Factors (presentation). The Royal College of Anaesthetists, Patient Safety in Peri-Operative Practice, 2014.
17. Berwick D. A Promise to Learn - A Commitment to Act: Improving the Safety of Patients in England. NHS England, 2013.
18. Root Cause Analysis. National Patient Safety Agency. Available from www.nrls.npsa.nhs.uk