Editorial · Open access · Peer reviewed

A new view of safety: Safety 2

2015; Elsevier BV; Volume: 115; Issue: 5; Language: English

DOI

10.1093/bja/aev216

ISSN

1471-6771

Authors

D. R. Ball, C. Frerk

Topic(s)

Healthcare Quality and Management

Abstract

Primum non nocere (first, do no harm) is a priority for our practice, and nowadays safety is under constant scrutiny by patients, politicians, and the press. This is increasingly recognized by our profession, and articles with a focus on risk and safety are starting to appear in UK anaesthetic journals [1, 2]. Safety is a concept that we intuitively believe we understand but that is difficult to define. A suitable definition might be 'the control of recognized hazards to achieve an acceptable level of risk'. A system is evidently not safe when an episode of harm has occurred (e.g. wrong-site surgery), but a system cannot be deemed safe simply because an adverse event has not occurred recently. The longer a team, department, organization, or service goes without anything going wrong, the more likely it is that the people managing it and working within it will believe it to be safe, but this is not necessarily true. While safe systems will usually go for long periods without adverse events, so, by chance, can unsafe systems, and superficially it is not possible to distinguish between the two.

Health care is continually being pushed to improve, not only in terms of safety but also with increased efficiency and economy. This invokes the law of stretched systems, where 'every system is stretched to operate at its capacity and as soon as there is some improvement, for example in the form of new technology, it will be exploited to achieve a new intensity and tempo of activity' [3]. Increased efficiency eats silently and progressively into safety margins without our noticing, and after a period of operating within this increasingly risky environment we are caught out by an adverse event occurring within the system that we thought we understood. Health-care organizations then immediately increase their focus on safety, investigations follow, and we offer assurances that lessons will be learned.

Established models of accident investigation are generally based on cascade or domino models of the serial, sequential worsening of an incident into an accident. Heinrich [4] first presented this notion in his 1931 book 'Industrial Accident Prevention', with five falling dominoes, one of them being 'human error'. James Reason's popular and influential 'Swiss cheese model' of accident evolution invokes a similar idea, with breaches in various defences (cultural, organizational, and personal) allowing propagation of an incident into an accident [5, 6]. These are essentially linear narratives, based on what Hollnagel calls a 'causality credo' [7, 8].

The main assumption in these analyses is that the event under review occurred in a system that can be deconstructed into its component parts, and that it is describable and understandable as the culmination of a number of identifiable factors, which form a stepwise narrative. These models are superficially simple and satisfying, appealing to our desire to discover causation, and perhaps culpability, in the aftermath of a distressing event. A key feature of this form of enquiry is that 'work as done' deviated from 'work as imagined' [7, 8]. Work as done refers to the practical and pragmatic way that tasks are achieved 'at the sharp end', where approximations and adjustments are continually made in order to achieve desired outcomes, and where there is necessary variation in activity between groups or individuals performing similar tasks in varying conditions. In this model, errors are continually prevented, detected, and managed using a mixture of proactive and reactive strategies. Work as imagined takes a different perspective, a key feature of which is that minimal variation in process is expected and that there is one correct way to achieve an outcome. This is often perceived to be the presiding view of those 'at the blunt end' and is reflected in the vast arrays of protocols and policies that populate health-care intranet sites.

Investigations into incidents and accidents typically identify human errors based on work as imagined. These retrospective reviews are subject to hindsight bias; reports are tailored to fit a linear narrative, and action plans are produced with lists of recommendations. This is known as 'solutionism', the belief that problems have easy answers, often of a technical nature [9]. It is a key part of 'Safety 1' culture, the generally dominant paradigm in health care. Because attention is directed solely at reported or discovered mistakes, only negative outliers in performance are identified. This is based on the notion of the 'just world' hypothesis [10], where a culture of 'name, blame, and shame' [11] is prevalent and societal pressure to assign accountability to individuals continues. We work in an environment where proscriptive lists of 'never events' are compiled [12], and many safety and quality indicators are published and publicized (there were more than 100 in a 2009 systematic review) [13].

Whilst calls to improve safety and reduce risk are laudable, most are based on the mantra of work as imagined, and practical, achievable improvements remain elusive. In the 21st century, our work has become sufficiently complex to defy simple analysis of many of the events that befall our patients. Incidents are no longer tractable or decomposable, even when subjected to exhaustive analysis [7, 8]. This is beginning to be understood in other safety-critical industries; for example, when describing a series of battery fires afflicting the Boeing 787 'Dreamliner' aircraft, Hans Weber, formerly a Federal Aviation Administration advisor, admitted that [after 250,000 flight hours] '… we don't know yet the root cause or causes' [14]. Health care, meanwhile, still demands a root-cause analysis and action plan within 60 days [15], not accepting that the system from which the accident emerged is often too complex for the real truth to be discovered that quickly, if at all.

Charles Perrow recognized that an appreciation of complexity is key to understanding outlier events. His model of 'normal accidents' [16] emphasizes that en route to an accident there are multiple design and equipment failures, most of which had not been considered problematic until after the accident occurred. Importantly, he identified 'negative synergy', explaining that the coupling of equipment, design, and human error leads to far greater consequences than each taken in isolation, and that when complexity and coupling reach critical, unsustainable levels, accidents will inevitably occur.

Uneventful, safe work usually attracts little attention, as a result of the basic psychological trait of habituation; humans are primed to respond to novelty, such as an unanticipated failure. Failure is what Safety 1 culture studies: it sees the bad but is blind to the good. New ways of looking at safety and risk are emerging, drawing once again on thinking in the aviation and nuclear industries. The study and promotion of success in complex 'sharp end' working is called 'Safety 2'. At its heart is the notion of resilience engineering [17]. Resilience is a form of toughness, a mixture of proactive defence coupled with reactive response such that most errors are prevented, avoided, or captured. Resilience enables adaptation to change and endurance during adversity. Hollnagel, a leading exponent of this approach, emphasizes that resilience requires anticipation of threats and opportunities, flexible responses to changing demands, and continual learning from both good and bad performance. With this thinking comes a realignment from a preoccupation with failure to the promotion of success, from Safety 1 to Safety 2; this is resilient health care [18].

Safety 1 and Safety 2 are not antagonistic but complementary approaches; Safety 1 investigates the detrimental outliers, while Safety 2 considers the rest, including those who excel. The Safety 2 approach recognizes that we work in a sociotechnical system [19]. This term tells us that we, our colleagues, our patients, and the technologies we use are crucially linked and interdependent, and that the human contribution is inseparable from the whole. Every system contains domains of complexity, risk, uncertainty, dynamism, and emotion, and each new day brings unique collections of interactions, mostly non-linear, meaning that both good and bad outcomes are emergent phenomena within a complex system and are not amenable to a full description of causation [20]. Whilst anaesthesia has shown consistent improvements in safety [21] and is cited as a model for health care [22], continual increases in demand and expectation place unprecedented challenges on us. Successful, safe work in this environment requires a high degree of flexibility and adaptability, with staff constantly having to make real-time trade-offs between efficiency and thoroughness [23].

Hudson described three waves of industrialization: technical, system, and culture waves [24]. The first looked at machines and mechanisms, where causation was seen as simple and linear: machines broke down and were fixed, more men were hired, and improvements in design and manufacture reduced failure. This approach developed in response to the demands of the Industrial Revolution and was the dominant perspective for more than 150 years. The second wave, that of human factors, started in the 1960s. Building on the first, it looked at the human contribution and, with an emphasis on Safety 1, still holds sway in many working cultures, including health care. We are now entering the third wave, that of systems and organizational safety, where linear narratives are most often unhelpful. This age appreciates our work in the context of a sociotechnical system and underpins the Safety 2 approach.

Managing demand and expectation in health care is one of the hard problems that continues to tax those who devote entire careers to it. In recognizing that the complexity of our work has brought about fundamental change for us, Safety 2 is a way forward. The message is simple and important, giving us a way of dealing with the challenges of life and work: 'study success, not just failure'. The key to understanding success is to recognize the importance of allowing the workforce to be flexible, innovative, and adaptable. The same qualities (flexibility, innovation, and adaptability) will, however, lead to failure in different circumstances. National strategies to promote success should be developed and promoted, with a rebalancing of effort from a reactive Safety 1 culture to a proactive Safety 2 culture.

Information systems must be developed to identify good performance more easily [25]. Visits to centres with demonstrably good outcomes could help us to review how this has been achieved, and may enable us to share best practice and drive up standards by learning not only what others do but also how they do it. At a departmental level, this way of looking at safety is something that we can all start to put into practice immediately. We should begin by examining our current systems, looking at the whole, including both the things that go right and the things that go wrong. We need to recognize that we are all continually making adjustments within our systems to create safety and balance risks. We must examine our practices dispassionately and honestly to understand where gaps exist between work as imagined and work as done, particularly when systems appear to be working well, and start to consider which gaps and adjustments increase the benefits more than the risks. Morbidity and mortality meetings should include 'safety and success' sections, where outliers in good performance are acknowledged and discussed. Morbidity and mortality should continue to be reviewed as part of Safety 1, while system functioning should be examined as part of Safety 2, where excellent performance can be studied and used as an opportunity to improve safety. Adjustments and work as done should be acknowledged, and variation should be accepted and embraced, with examination of both the things that go right and the things that go wrong. Only then will we genuinely start moving in the direction of resilience and safety.

Declaration of interest

None declared.

Acknowledgements

We thank Dr B. Vowles, Dr N. Cassells and Dr S. Hillier for their advice and expertise during the preparation of this manuscript.

References

1. Irwin MG, Kong VKF. Quantifying and communicating peri-operative risk. Anaesthesia 2014; 69: 1299-1303.
2. Moloney J. Error modelling in anaesthesia: slices of Swiss cheese or shavings of Parmesan. Br J Anaesth 2014; 113: 905-906.
3. Woods DD. Steering the reverberations of technology change on fields of practice: laws that govern cognitive work. Available from http://www.lrdc.pitt.edu/schunn/cogsci2002/program/plenaries/woods.pdf
4. Heinrich HW. Industrial Accident Prevention: A Scientific Approach. Columbus: McGraw-Hill, 1931.
5. Reason JT. Human Error. Cambridge: Cambridge University Press, 1990.
6. Reason J. Human error: models and management. Br Med J 2000; 320: 768-770.
7. Hollnagel E. Safety 1 and Safety 2. The Past and Present of Safety Management. Farnham: Ashgate Publishing, 2014.
8. Hollnagel E, Leonhardt J, Licu T, Shorrock S. From Safety-I to Safety-II: a white paper. Available from http://www.skylibrary.aer/bookshelf/books/2437.pdf
9. Morozov E. To Save Everything, Click Here. Technology, Solutionism and the Urge to Fix Problems That Don't Exist. London: Allen Lane, 2013.
10. Lerner MJ. The belief in a just world. In: The Belief in a Just World. A Fundamental Delusion. Perspectives in Social Psychology. London: Springer Verlag, 1980: 9-30.
11. Heard G. Errors in medicine: a human factors perspective. Australasian Anaesthesia 2005. Available from http://www.anzca.edu.au/resources/college-publications/pdfs/books-and-publications/Australasian%20Anaesthesia/australasian-anaesthesia-2005/05_Heard.pdf (accessed 21 December 2014).
12. Adyanthaya SS, Patil V. Never events: an anaesthetic perspective. Contin Educ Anaesth Crit Care Pain 2014; 14: 197-201.
13. Haller G, Stoelwinder J, Myles PS, McNeil J. Quality and safety indicators in anesthesia: a systematic review. Anesthesiology 2009; 110: 1158-1175.
14. Scott A. Boeing 787 receives U.S. approval for expanded flying. Available from http://www.articles.chicagotribune.com/2014-05-28/business/sns-rt-us-usa-boeing-faa-20140528_1_787-dreamliner-etops-faa
15. NHS England. Serious Incident Framework. Supporting learning to prevent recurrence. Available from http://www.england.nhs.uk/wp-content/uploads/2015/04/serious-incidnt-framwrk-upd.pdf
16. Perrow C. Normal Accidents. Living with High Risk Technologies. Princeton: Princeton University Press, 1984.
17. Hollnagel E, Pariès J, Woods DS, Wreathall J. Resilience Engineering in Practice. A Guidebook. Farnham: Ashgate Publishing, 2010.
18. Hollnagel E, Braithwaite J, Wears RL. Resilient Health Care. Farnham: Ashgate Publishing, 2014.
19. Cooper R, Foster M. Sociotechnical systems. Am Psychol 1971; 26: 467-474.
20. Funtowicz S, Ravetz JR. Emergent complex systems. Futures 1994; 26: 568-582.
21. Mellin-Olsen J, Staender S, Whitaker DK, Smith AF. The Helsinki Declaration on patient safety in anaesthesiology. Eur J Anaesthesiol 2010; 27: 592-597.
22. Gaba DM. Anaesthesiology as a model for patient safety in health care. Br Med J 2000; 320: 785-788.
23. Hollnagel E. The ETTO Principle. Efficiency–Thoroughness Trade-Off. Why Things That Go Right Sometimes Go Wrong. Farnham: Ashgate Publishing, 2009.
24. Hudson P. Implementing safety culture in a major multi-national. Safety Science 2007; 45: 697-722.
25. Cook TM, Coupe M, Ku T. Shaping quality: the use of performance polygons for multiprofessional presentation and interpretation of qualitative performance data. Br J Anaesth 2012; 108: 953-960.