Review (peer reviewed)

‘Where there is error, may we bring truth.’ A misquote by Margaret Thatcher as she entered No 10 Downing Street in 1979

2005; Wiley; Volume 60; Issue 3; Language: English

DOI

10.1111/j.1365-2044.2004.04114.x

ISSN

1365-2044

Author(s)

Hazel Adams

Topic(s)

Healthcare Quality and Management

Abstract

The headline in today's Times (14/08/04) is ‘Blundering hospitals kill 40 000 every year…at a cost of £2 billion per year’. Today's British Medical Journal reports that ‘about’ 850 000 medical errors occur in National Health Service (NHS) hospitals every year, resulting in 40 000 deaths [1]. The Department of Health estimates that adverse events occur in 10% of all hospital admissions [2]. Errors in anaesthesia have been studied by simulator groups and by population-based studies such as NCEPOD in the UK [3], the Harvard Medical Practice Study in the USA [4] and, in Australia, by the Quality in Australian Healthcare Study [5] and the Australian Incident Monitoring Study (AIMS) [6]. Large studies report dramatic figures, such as those in The Times headline, and suggest that up to 70% of adverse events are preventable [7-9]. However, the true prevalence and magnitude of errors are unknown despite these epidemiological reviews.

James Reason, a pioneer of the study of error, defines error as ‘occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome’ [10]. Another expert, Lucian Leape, has described how errors often result from system failures; accidents are the end results of a chain of events due to ‘latent’ errors, each error in turn related to the overlying organisation, its ownership and its regulation [11]. Despite the huge body of work defining types of error and their causes, and despite the comparative studies relating anaesthesia accidents to accidents in the airline industry, the error rate of an ordinary anaesthetist is unknown. By extension, what defines unsatisfactory practice is also unknown. Appraisal and revalidation, thought to be crucial in underpinning the delivery of clinical governance, cannot identify anything but the grossest malfunctions.

With the intention of improving my practice, I set about recording my every error. My prestudy impression was that I make errors during 10% of my anaesthetics. I decided to test this hypothesis and tried to enlist colleagues to record every error we made for a year. No-one would join me in my study and several colleagues cautioned me against the undertaking. Every time I made what I considered to be an error, I recorded it. After 6 months I realised I should be recording a background narrative describing each situation. I was rigorously honest and it became a joke in theatre: ‘Go on, Hazel, there's another one’, ‘Did you get that one?’, ‘Well, you haven't made a mistake yet.’ I have been unable to persuade anyone to give me an honest opinion about my relative level of caution. I think I am average: certainly not super-cautious and hopefully not slapdash.

During the year, I anaesthetised 160 neurosurgical patients, exactly half of whom were in a private hospital, and 131 patients for other surgical specialities, principally obstetrics. The numbers appear small because I work part-time, have a large administrative commitment and the neurosurgeon with whom I work specialises in posterior fossa procedures, including cerebello-pontine angle surgery. I made mistakes during 29 weeks of the year and none during 14 weeks. During the 29 weeks, the number of mistakes varied between one and six per week.

Every anaesthetist will be familiar with leaving the pressure-limiting valve screwed down or forgetting to turn off the nitrous oxide at the end of the operation. Other errors will be denied by some and would have been denied by me had I not recorded them.
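As a back-of-envelope check, and not part of the original study, the counts just reported bound the error rate being described. The minimal sketch below assumes, hypothetically, that each recorded error occurred during a distinct anaesthetic (the article does not say so); under that assumption the reported figures imply errors in roughly 10-60% of anaesthetics, so the prestudy impression of 10% is the most optimistic reading.

    # Bounds on the per-anaesthetic error rate implied by the reported counts.
    # Assumption (hypothetical, not stated in the article): each recorded
    # error occurred during a distinct anaesthetic.
    neuro_cases = 160                    # neurosurgical patients in the year
    other_cases = 131                    # other specialities, mainly obstetrics
    total_cases = neuro_cases + other_cases   # 291 anaesthetics in total

    weeks_with_errors = 29               # weeks containing at least one mistake
    min_per_week, max_per_week = 1, 6    # reported range of mistakes per week

    min_errors = weeks_with_errors * min_per_week   # 29
    max_errors = weeks_with_errors * max_per_week   # 174

    print(f"Implied error rate: {min_errors / total_cases:.1%} "
          f"to {max_errors / total_cases:.1%}")     # 10.0% to 59.8%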
Did I really forget to check the anaesthetic machine, although I'd noted the list started 2 h late? Other poignant remarks appear in my records: ‘A 7-h operation’, ‘11 h’, ‘rushed’, ‘feeling poorly’. During one operation I made five errors: I forgot to restart the remifentanil, the total intravenous anaesthesia became disconnected and I delivered increasing amounts of anaesthetic into the sheets, the central venous pressure monitor malfunctioned, and so on. I made drug errors and one right/left error; the majority were slips and lapses. I concluded that, with a little imagination and probably a lot of money, many of my mistakes could be prevented.

My study was not sufficiently rigorous. I cannot comment, for example, on whether a previous night on-call or concomitant teaching affected my performance, having failed to record such details on the days when I did not make errors. I enjoy perfect health and sleep well, and yet my records contain several remarks about feeling unwell or very tired. It is unlikely to be coincidence that the page recording most errors contains the words ‘Felt ill all week’. I recorded four occasions when I was teaching, on one occasion teaching another consultant, but most of my errors occurred when I was alone.

No patient suffered permanently. However, I must have caused morbidity, as, for example, when I failed to insert a central venous pressure line when I should have sought a better site. (I counted this as an error, as I thought I would get away with it, whereas if I had chosen the best site and failed, that would not have been an error.) I made four significant errors on my anaesthetic charts, failing to record drugs I had given, yet I am forever telling trainees that the chart must be the most honest record possible. I am simultaneously fascinated and appalled that I once commenced induction of anaesthesia without an ECG or, on another occasion, an oximeter. It is unbelievable, particularly as I have gone on to record that I checked the screen and failed to notice that crucial information was missing.

I am horrified at the number of mistakes I am making. Effectively, I am making errors all the time. Is this normal? I work in the most supportive environment possible, with people with whom I could not have a better relationship. I have worked with the same surgeon for 12 years and he is one of my closest personal friends. I have taught him some anaesthesia over the years and he can ventilate, intubate and make sense of my monitors. How many more mistakes would I make working with staff who might be less concerned to assist me? For more than 10 years I have worked in only three theatres, two NHS and the other in a private hospital. I had imagined that I would make fewer errors in the private hospital, where I feel less subject to constant interruption. Not so.

The cause of all these errors is distraction. I have no proof, and it needs further study, but most days, like everyone reading this, I come to work and wonder how I am going to get through all I have to do. Constant phone calls, a queue of trainees who know when they can catch me, nurses who just want a bit of advice, not to mention teaching the various hordes who are supposed to benefit from my knowledge: none of them understands the effect of these constant but justifiable demands as we juggle all the tasks that consultants are expected to handle willingly.
As my hospital Trust refuses to pay for anything other than direct patient care whilst shouting about improving working lives, leadership and team working, I wonder whether our managers live on the same planet. Two words feature prominently in my records: ‘forgot’ and ‘rushed’.

Two years ago I spent a sabbatical abroad and envied the doctors' lives, giving anaesthetics without any of the pressures of administration or teaching. I had time to think and spent much of it wondering why we take on all we do. It is challenging and interesting; intelligent people need stimulation and develop in response. I could have devoted half of this article to comparative studies between pilots and anaesthetists and their work. There is one important difference. When tested, anaesthetists were significantly more intelligent than another large group of doctors [12] and may be in the top 1% of the population [13]. Pilots, in contrast, are specifically selected not to challenge rules, and high intelligence is disadvantageous. Our current trainees are likely to spend over 30 years as consultants, and we all know bored and disillusioned colleagues. I love my job. Am I making more mistakes because I cram my life too full, or do bored and disillusioned doctors have the same problem?

How many mistakes are reasonable in the working life of one person, and what constitutes under-performance? Do under-performing staff make occasional catastrophic errors, a whole series of trivial errors, or both? Probably the answer is that differing types of error, made in differing situations where most people would not err, indicate suboptimal performance. I pray that is not me, but there is no way of knowing.

Detecting errors relies largely on incident reports or informal criticism following poor performance. Such mechanisms detect a small fraction of problems, and the usual response to a serious incident is an intensive search for a human culprit. A colleague was suspended for a well-publicised accident. During a major vascular operation he connected bupivacaine to the central venous pressure line. He is a most respected, careful and knowledgeable anaesthetist. He was forbidden to talk to anyone, not allowed to discuss what happened nor how the incident could have been avoided. It seems that the very people who could help us with errors are banned from doing so. ‘Retraining’ was prescribed. He watched consultant colleagues but was not allowed to touch a patient. Give some thought to that. Think about how he felt, what he could gain (or lose) and how the other consultants felt. Who could possibly imagine that retraining could prevent someone from picking up the wrong bag?

Errors are symptoms, and such a focus retards the search for improvement. The collective research shows that complex systems operate smoothly until several specific latent faults combine in a particular manner. Corrective measures aimed only at the human elements are unlikely to prevent future problems. Thirty-one reports of inadvertent epidural injection of drugs intended for non-epidural use have been published [14]. In the first 5000 AIMS reports there were four reports of drugs intended for epidural infusion administered intravenously in an obstetric setting [15]. The same paper illustrates the ‘pervasive nature of human error as part of everyday activity’. We deserve a better response than the easy ‘retraining’ option.

The classification of the varieties of human error is well known, and the reader is referred to Reason [10].
He makes the distinction between unintended errors (slips and lapses) and intended actions (mistakes and violations). An example of the former would be forgetting to label a syringe because of a distraction; of the latter, taking the decision not to label a syringe. He goes on to summarise the equally well-known model of how errors develop into an accident. An individual, neither prone to error nor intending to make an error, errs, and takes the whole blame despite the organisational deficiencies and the added contributory factors that summate to cause the accident. At every level there are constraints, e.g. financial, but equally at every level defences are built in, very few of which will be penetrated to produce a damaging outcome.

The evolving study of High Reliability Organisations, for example aircraft carriers, is proving valuable in preventing accidents but reveals further dilemmas. Should the human element be designed out of a system, at the risk of preventing a successful human intervention when the automated system, itself designed by humans, fails?

The psychological processes leading to error are the least controllable. Of the known error-producing conditions, unfamiliarity with the task (increases risk by a factor of 17) should rarely apply to an anaesthetist. Time shortage (×11), poor signal-to-noise ratio (×10) and information overload (×6) are all powerful error-producing conditions that apply most of the time [16].

The effects of stress have been studied extensively. Stress is normal, occurs in everyone's life and is usually transitory, but it is associated with errors, mistakes and conflict [17]. There is evidence that good team-working leads to less stress and, time and time again, teamwork is identified as essential to developing quality and safety. How much emphasis do we put on the evaluation of teamwork?

The function of a good team is dependent on the team leader. ‘Leadership is an essential ingredient of success in the search for safety’, says the British Medical Journal leader in the issue devoted to error and safety in medicine [18]. Firth-Cozens reports that organisational trust is given to those leaders who have ability, are benevolent and understanding, and demonstrate integrity [19]. Being strongly competitive or highly controlling is a hindrance to leadership, whereas the willingness to take on a challenge is an important positive factor. Those airline captains with the fewest errors are warm, friendly, self-confident and able to stand up to pressure [20]. Pilots are selected for technical skills, an ability to co-ordinate and an ability to learn from error [21]. Identifying these factors at interview should take on more importance, and I now challenge candidates who have been highly successful in a competitive field outside medicine, whereas previously I might have been impressed by their achievements. It is an area where further work needs to be done.

Improved teamwork and increased clinical exposure both diminish stress. The Association of Anaesthetists' Sick Doctors Scheme has found that reducing workload is rarely the solution for those doctors who are losing confidence in areas of their clinical practice and who believe that going part-time could be the answer [personal communication, D. Saunders]. Perhaps we should be looking to more formal ongoing training as a routine. This should be addressed by appraisal but, hands up, how many of you have found your appraisal useful, other than as an opportunity to talk about yourself for half an hour?
In a seminal paper in the British Medical Journal, Reason summarises the pros and cons of the person approach and the system approach to the analysis of error [22]. In the former, error arises from, for example, inattention. In the latter, there is recognition that one cannot change the human condition, but the situations in which people work can be changed, and an understanding of how and why the defences fail is possible. He describes the Swiss cheese model, in which the presence of holes in the defensive layers does not normally cause a bad outcome.

Occasionally I made errors in where I sited my lines, usually with no consequence, but on one occasion I sited an arterial line on the right although I sit on the left of a prone patient. The line malfunctioned, and then I did something I never do; I cannot explain why, and nowadays when I ‘violate’, alarm bells ring inside me. I turned off the alarm. My attention was held by inspecting the arterial cannula. After an indeterminate time I looked at the monitor to note a saturation of 86%. I always leave the alarms on until I have rectified the cause, but even if I had not departed from my usual practice, I would not have considered that an extra fault had developed during my period of distraction. No one in that close-knit theatre team noticed the anaesthetic tubing on the floor.

Anaesthesia is acknowledged as the leading medical speciality in addressing issues of patient safety. Simulators help us to design better equipment, test the man/machine interface and study team performance within the clinical context. Despite this, errors and system failures continue to plague us. We work when fatigue and stress lead to suboptimal performance, and we violate basic standards. Simulator training, which I have completed three times in the last 5 years, and resuscitation courses, which I attend annually, are not available to all and are not repeated frequently enough. Should we be forced to use simulators under adverse conditions, at 3 a.m. having been dragged out of bed, to learn how we truly react under such conditions?

My motives are philanthropic. Human error remains the primary cause of accidents. Voluntary reporting provides information not obtainable by any other means, and the very act of recording all my errors has altered my understanding of my behaviour substantially. Not once did I fill in a critical incident form, despite three critical incidents. Critical incident reporting lacks human factors analysis, the power of hindsight provides a narrow focus, and it is my impression that change rarely results, if indeed anyone takes any notice. It is an area where considerable improvement could be made.

There is a need for assessment and training in team management. Reluctance to question authority has resulted in commercial airline accidents in the past, and similar problems may arise within the working environment of an anaesthetist. Outstandingly supportive environments such as the one in which I work can be dangerous because of the very same reluctance to challenge, but I do not believe my errors arose from an unwillingness to speak up. The aviation industry deals with error non-punitively and, in the UK, pilots receive regular accident prevention leaflets in which they describe their near misses. A quarterly newsletter with the same aims is published by the independent Anesthesia Patient Safety Foundation and is circulated to every anesthesiologist and nurse anesthetist in the USA. Is a similar publication warranted here? Can we reward being open about errors?
What will be my ‘reward’ for admitting I make errors all the time, and will I wish I had kept quiet? After I had completed my study I was chatting to the aforementioned surgeon. ‘Did you get the one where you prescribed 100 mg morphine 2-hourly?’ he asked. I honestly thought he was joking, and it was with some difficulty that he persuaded me of the awful truth. It took 2 weeks of searching the theatre records and patients' notes to discover that, after a whole year of diligent record keeping, I am also guilty of the error of not recognising my errors. How many errors, then, do I really make?
