Editorial · Open Access · Peer Reviewed

Editorial: Are We All Better-than-Average Drivers, and Better-than-Average Kissers? Outwitting the Kruger–Dunning Effect in Clinical Practice and Research

2019; Lippincott Williams & Wilkins; Volume: 477; Issue: 10; Language: English

10.1097/corr.0000000000000948

ISSN

1528-1132

Author(s)

Seth S. Leopold

Topic(s)

Psychological and Educational Research Studies

Abstract

The greater a person's ignorance, the more confident that person is in his or her knowledge. This phenomenon—sometimes called the Kruger–Dunning effect [6]—is the premise behind the common observation that people who are below-average drivers or below-average kissers fancy themselves to be much more capable than they actually are. Unfortunately, the Kruger–Dunning effect is responsible for more than just bad dates; it also worsens the health of our patients. For example, the less a person knows about vaccines, the more certain (s)he is that the science favoring pediatric or influenza vaccine recommendations is flawed [13, 14]. The same goes for knowledge about genetically modified organisms in food [19]. When doctors become patients, we're just as susceptible to this harmful paradox. Physician-patients are hardly better than layfolk at selecting high- over low-value care, even when provided with convincing evidence favoring the former [4]. Worse still, we've been outed: The lay press is all over this [2]. The Kruger–Dunning effect carries at least three important implications for orthopaedic surgeons: (1) How to disseminate (and learn) new surgical techniques; (2) how to convince patients that unproven treatments they think they need are not as good as they believe; and (3) why discredited treatments persist in common orthopaedic practice.

When Are We Good Enough to Try a New Surgical Approach?

The Kruger–Dunning effect counsels caution when surgeons try out new interventions they've learned at courses or read about in journals; ignorance of this effect may result in serious injury to patients. In fairness, Drs. Kruger and Dunning did not evaluate surgical skills, but rather individuals' self-perceptions in cognitive domains, specifically logic, humor, and English-language grammar [6]. But subsequent work evaluating surgeons and other physicians as they acquired psychomotor skills [11, 22] arrived at conclusions similar to those of Drs. Kruger and Dunning.
This problem has caused serious patient harm in our specialty, particularly when a technically demanding surgical approach is advocated for wider dissemination by a small number of experienced surgeon-developers [12]; the history of the two-incision "minimally invasive" THA comes to mind here. Such techniques don't always survive the transition to more-widespread use, and their promise seldom is realized [3, 15]. The solution is rather obvious: Each of us needs to cultivate the awareness that we almost certainly overestimate our abilities, and the modesty to know what to do with that fact. We also need to recognize that the way we acquired new skills during residency—preceptored, expert instruction, perhaps now augmented with more-contemporary approaches like simulator training or cadaver-based courses—still represents the safest way to learn a novel or difficult surgical approach. Mastery takes time and supervised repetition. One wouldn't expect to read an article or watch an online video about golf and then be able to reproducibly chip the ball from bunker to green. The difference between bad golf and bad surgery, of course, is in the fallout, which is a great deal more severe in the operating room than on the links.

How to Speak with a Distrustful Public and Skeptical Patients?

Many surgeons adopt the knowledge-deficit model [5, 18, 21] when speaking with patients about the treatments they seek, particularly when those interventions are unsupported by good evidence. The knowledge-deficit model holds that physicians have the information our patients need, that we provide it objectively, and that these transactions will result in patients who make choices about their health that are consistent with the good information we've provided to them [5].
Yet how often has each of us had a conversation with a patient about some unproven treatment—stem cells for arthritis, minimally invasive whatever, you name it—only to have the patient depart our offices emptyhanded in search of someone else who will provide the treatment the patient "knows" (s)he needs? In this scenario, we're working uphill against the Kruger–Dunning effect, but we're generally using the wrong tool for the job. The knowledge-deficit model of scientific (or medical) communication has been shown again and again to be ineffective [5, 18, 21]. In this setting, we can't beat ignorance with knowledge; the solution here isn't a doctor-patient data dump. Rather, the remedies are modesty and empathy. Modesty, because we need to realize both that our communication approach here (the knowledge-deficit model) has repeatedly been shown to be ineffective [5, 18, 21], and that our preferred treatments are not a perfect fit for all patients. And empathy, because we are not the only party in these conversations with essential knowledge. The person whose actual skin is in the game sits across from us, and understanding his or her experiences with previous treatments—as well as past conversations with experts that may have resulted in resistance to future ones—is essential. Those conversations are key reasons why we need not merely listen to our patients' hopes and fears; we need to welcome their skepticism about our recommendations. Giving our patients room to express their skepticism can help alleviate it. It's important to remember that our own recommendations—well-intentioned though they may have been—have not always been correct. To achieve healthy partnerships with patients in this setting, we need first to listen carefully enough to discern that the patient is not accepting our care suggestions, and then let our own defenses down sufficiently to find out why.
Understanding and endorsing our patients' own narratives about their health is a key first step towards creating connections [10]. Considering patients' deeply held values in the context of shared decision-making [16] is an essential next part of the process. Finally, to the degree we can, we need to make sure our recommendations are scientifically sound [7]. When a surgeon who recommends against stem cells for osteoarthritis recommends instead discredited treatments like viscosupplementation [9] or surgical interventions like arthroscopic débridement (or even meniscectomy for "mechanical symptoms" [8]) later in that same conversation, it won't take a search-engine-savvy patient long to discern the hypocrisy.

Why Do Discredited Operations Persist in Practice?

Some pressures that delay the acceptance of high-quality evidence into practice include assessment bias and transfer bias [8]. When it comes to surgical results, surgeons get to grade their own tests, and surgeons—being human—can be rather permissive graders. And the phenomenon of transfer bias (happy patients tend to return; unhappy ones complain to the crosstown competition) tends to reinforce our often-mistaken self-perceptions of efficacy. High-quality randomized trials and large, national registry studies tend to minimize these biases, and for those reasons, we need to believe them when we see them. Most arthroscopic knee surgeons can't claim 99% follow-up 2 years after a knee scope for a degenerative medial meniscus tear, as was achieved recently in yet another high-quality randomized trial showing the inefficacy of this procedure [20]. If a patient is happy 3 months after a knee scope, that patient's surgeon might surmise the patient will remain happy ever after. But the likelihood is otherwise: Unhappy patients tend to follow up elsewhere when the pain returns, and the surgeon probably won't hear about it. The knowledge deficit is ours, and Kruger–Dunning is working against us.
Demonstrating objective competence and the ability to learn flexibly are graduation requirements in surgical training. Residents who don't accept new knowledge from reliable sources violate important professional norms, and risk being fired [1]. But somewhere along the way, we self-perceive competence in excess of our knowledge, and we develop obstructions to flexible thinking and to the integration of new discoveries into our practices. This strikes me as a peculiar and unexplored reversal of the Kruger–Dunning effect, which in principle diminishes with increasing expertise. Our profession may yet prove to be a rich ground for social psychologists. Regardless, every orthopaedic subspecialty has seen examples of operations we believed were effective until time, follow-up, and better-quality research proved us wrong. The window between the availability of adequate evidence and the abandonment of ineffective or dangerous interventions is one we should have difficulty explaining to patients and the public. With all this in mind, it seems fair to give the last word to a non-expert: a journalist in a business magazine [17] who has figured this out. He complained, appropriately, in 2015 about the then-persistent (and now increasing [9]) use of hyaluronic acid injections despite years of robust evidence demonstrating their inefficacy and strongly worded national guidelines against their use: "I can hear the response from some doctors already: 'In my experience,' they say, 'this works for some patients.' Sorry, but one doctor's experience (or 'clinical judgment', as some call it) doesn't trump science. That's why we do experiments, to determine whether or not our subjective impression is correct. In this case, the science is clear" [17] – Steven Salzberg, Forbes Magazine, 2015.

Reference(s)