Vaccine disinformation from medical professionals—a case for action from regulatory bodies?
2024; Wiley; Volume: 30; Issue: 4; Language: English
DOI: 10.1111/jep.13985
ISSN: 1365-2753
Authors: David Robert Grimes, Trisha Greenhalgh
Topic(s): Ethics and Legal Issues in Pediatric Healthcare
Abstract: With doctors among the most trusted members of society, David Robert Grimes and Trisha Greenhalgh argue that regulatory bodies have a duty to act against doctors who intentionally mislead over the safety and efficacy of vaccination.

Doctors enjoy a uniquely privileged position of public trust across most of the world.1 However, some doctors have emerged as perpetrators of inadvertent misinformation and deliberate disinformation about vaccines, with detrimental implications for public trust in vaccination and vaccine uptake.2-4 Vaccine hesitancy predates the current pandemic: a resurgence of diseases like measles across the European region and the United States led the WHO to declare vaccine hesitancy among the top ten global threats to public health in 2019.5 Anti-vaccine rhetoric on social and mainstream media has had a substantial impact on COVID-19 vaccine uptake, especially in certain demographic groups.6

Social media has had an especially significant influence on public perception of vaccine safety. Indeed, exposure to anti-vaccine conspiracy theory is a leading driver of vaccine hesitancy.7, 8 Antivaccine propaganda is designed to be emotive and to induce visceral fears. Its sheer abundance online means that the public can fall victim to the illusory truth effect, whereby the weight of repetition gives a fiction undeserved legitimacy.9 This is compounded by the availability heuristic—our tendency to give more memorable or concerning accounts greater weight than the evidence supports.

The highly trusted nature of medical professionals means they are well placed to play a critical role in combatting these fictions and reversing vaccine hesitancy. Medical advice strongly influences a person's intention to accept a vaccination (or have their child vaccinated)10, 11 and, at a public health level, instils confidence in immunisation. For example, when antivaccine propaganda about Human Papilloma Virus (HPV) hit Japan in 2013, it reduced vaccine uptake from 70% to under 1%.12 A similar incident in Denmark in 2014 was associated with a reduction in uptake from 79% to 37%.13 When the same crisis hit Ireland in 2015, initial panic saw uptake fall from 87% to 51%. Ireland responded with a coalition of doctors, scientists and patient advocates who were able to reverse the harms and bolster public support.14, 15 Consequently, Ireland became the only afflicted nation to recover its high level of HPV vaccine uptake, returning to and even exceeding previous levels by 2019.

Yet despite laudable efforts by most clinical and scientific professionals, the COVID-19 pandemic also saw the worrying rise of doctors and scientists who made assertions about an alleged lack of benefit and alleged serious harms of vaccines based on evidence that was selective, questionable or already refuted; some did so repeatedly, using emotive language, to huge audiences. A recent JAMA study identified 52 prominent physicians in the US alone who consistently shared medical misinformation with large audiences on social media platforms, most of it antivaccine in nature.16 Perceived experts make up a significant proportion of anti-vax social media communities.4 How medical regulators should respond to this phenomenon remains an open, urgent question.
With medico-scientific professionals viewed as highly credible sources,4, 17 their pronouncements on vaccination (both positive and negative) garner significantly more attention online than those of other individuals,4, 18 and there is increasing awareness among both the public and medical bodies that false assertions from doctors do disproportionate harm to public understanding.19 In the 1990s, for example, the South African government of Thabo Mbeki embraced AIDS denialism, citing the unorthodox opinions of scientist Peter Duesberg, resulting in an estimated 343,000 avoidable deaths before the policy was reversed.20 With regard to vaccination, perhaps the most infamous example is the 1998 paper by UK gastroenterologist Andrew Wakefield, which falsely linked the combined measles-mumps-rubella vaccine to autism. This led parents to reject or delay key childhood vaccinations and was followed by a rise in measles in the UK. While Wakefield's paper was exposed as fraudulent and retracted in 2010,21 its damaging legacy endures decades on.8, 22

The Wakefield case occurred in a pre-social media age, spreading mainly through conventional mass media. Whilst Wakefield was struck off the medical register by the GMC, this was for research fraud, not for the long-term damage to public health caused by his deliberate spreading of vaccine disinformation.21 The world is now a very different place. Social media platforms are now a major source of information and connection for millions of people around the world; they offer 24-hour coverage and are shaped by hidden algorithms, vastly magnifying the scope for harm from doctors and scientists who put out unbalanced messages. COVID-19 was the first 'digital pandemic', in which extensive communication about science and public health (both good and bad) occurred on platforms such as Twitter (now X), Facebook and YouTube. While many doctors and scientists used their platforms to provide accurate, measured advice, a few mobilised social media to garner huge audiences, appearing on podcasts with audiences of tens of millions and on popular YouTube channels, conveying unsupported claims with the appearance of professional authority. In 2021, the Centre for Countering Digital Hate traced 65% of all COVID-19 misinformation shared online to just 12 social media figures, half of whom were qualified physicians.22 Such individuals, a tiny fraction of medico-scientific professionals, have disproportionate influence. The ease with which video can now be made and distributed means that soundbites travel quickly from one social media platform to another and find their way into a wide range of conventional media outlets and even parliamentary exchanges. A doctor's 'influencer' status on social media makes them extremely marketable for in-person events and lecture tours.

Curtailing the speech of doctors and medical scientists risks suppressing clinical and scientific debate, including the sharing of novel hypotheses, alerting colleagues and the lay public to possible rare harms of drugs and vaccines, pointing out conflicts of interest (e.g. from pharmaceutical manufacturers) and questioning custom and practice. Box 1 gives examples of doctors who have gone public with their views on drugs or vaccines. In some examples, doctors clearly acted in good faith and with great urgency to spread a message which led to actions that saved lives.
In other examples, the doctor's claims were considered so misleading and dangerous that their regulatory body took action to restrict their licence to practise. But between these extremes there is undoubtedly a grey zone where doctors express unorthodox (and, in some people's view, potentially dangerous) opinions but are not considered in need of regulatory action. Rather, their views are rebutted by other doctors and scientists in the peer-reviewed literature and (sometimes) mainstream media. Some doctors, for example, feel passionately that patients should be taking fewer drugs, that polypharmacy should be taken more seriously than it currently is, and that the harms of some vaccines outweigh their benefits. Arguably, such views should be aired precisely because they invite counter-arguments which lead to greater scientific and public understanding.

Box 1. Doctors who have gone public with their views on drugs or vaccines

Doctors who raised early concerns about serious harms from drugs or vaccines

In 1955, Drs Harold Graning and Robert Dyar, public health physicians from the USA, sounded the alarm about a batch of polio vaccine which appeared to be causing polio in children, heralding the 'Cutter incident' in which killed vaccine contaminated with live virus caused thousands of cases of polio, 164 cases of paralysis and 10 deaths.23

In 1961, Dr William McBride, an obstetrician from Australia, published a letter in the Lancet about the teratogenic effects of thalidomide,24 which led—after some delay—to withdrawal of the drug, introduction of mandatory testing in animals before a drug could be given to pregnant women, and an extension of the powers of drug regulatory authorities.

In 1996, Professor Nancy Olivieri, a haematologist from Canada, raised concerns that an experimental iron-chelating drug (deferiprone) that she was trialling in patients with thalassaemia was losing efficacy and causing serious adverse effects, leading to premature termination of a major trial.25

In 2005, Dr Irene Frachon, a chest physician from France, alerted colleagues to serious adverse effects (including deaths) from the anti-obesity drug Médiator (benfluorex), eventually resulting in the manufacturer being found guilty of manslaughter, involuntary injury and aggravated deceit.26, 27

In 2017, Professor Silvia Brandalise, a paediatric oncologist from Brazil, raised the alarm on substandard childhood cancer drugs which lacked efficacy and contained harmful contaminants.27

Doctors who have published non-mainstream views on drugs and vaccines

In 1989, Dr Uffe Ravnskov, a physician from Denmark, published the Danish edition of a book called Kolesterolmyten (translated: The Cholesterol Myths28), arguing that in his view there was no evidence that high cholesterol levels were harmful to health and that the harms of cholesterol-lowering drugs outweighed their benefits. He propagated this argument widely on the Internet and his book has been translated into many languages. Critics have accused him of citing evidence selectively to support his views.
In 2012, Professor Peter Gøtzsche, an epidemiologist from Denmark, published a book, Mammography Screening: Truth, Lies and Controversy,29 arguing, controversially, that the harms of screening women for breast cancer using mammography outweighed the benefits; a peer-reviewed paper he co-authored on this topic was later unilaterally retracted by the journal.30 In 2013 his book Deadly Medicines and Organised Crime: How Big Pharma Has Corrupted Healthcare31 used several examples (including antidepressants) to argue that medicines are overprescribed and in many cases do more harm than good. In 2020 he published a book, Vaccines: Truth, Lies and Controversies,32 arguing that the harms of many vaccines (and the opportunity costs of vaccine programmes) outweigh their benefits. His 2022 systematic review on the harms of COVID-19 vaccines remains a preprint.33 However, Gøtzsche's work to highlight the harms caused by drugs and vaccines has made some important contributions to science—for example, by developing and disseminating more rigorous methodology for reporting such harms in randomised controlled trials.34

Doctors whose public statements on drugs or vaccines have led to regulatory action

In 2021, Dr Peter A McCullough, a cardiologist from the USA, promoted ivermectin and hydroxychloroquine for COVID-19 and spoke to the Texas Senate about the alleged harms of COVID-19 vaccines. McCullough developed a prominent profile on social media, claiming that COVID-19 vaccines were responsible for huge numbers of deaths. He repeated this claim on Joe Rogan's podcast and in other venues. In July 2023, he claimed in a preprint (which was rapidly retracted) that 74% of COVID fatalities were vaccine-induced, falsely portraying the SSRN preprint as a Lancet publication.35 In October 2022, the American Board of Internal Medicine recommended that McCullough's board certification be revoked due to his promotion of vaccine disinformation and refusal to correct false statements,36 determining that he had 'provided false or inaccurate medical information to the public. By casting doubt on the efficacy of COVID-19 vaccines with such seemingly authoritative statements, made in various official forums and widely reported in various media, [McCullough's] statements pose serious concerns for patient safety. Moreover, they are inimical to the ethics and professional standards for board certification'.36

In 2021, Mr Mohammed Adil, a consultant surgeon from the UK, began to post videos on social media claiming that the COVID-19 pandemic was a hoax. Some of these videos also claimed that COVID-19 vaccines would be administered to everyone, by force if necessary, and could be used to control the world's population. In June 2022, a Medical Practitioners Tribunal appointed by the General Medical Council (GMC) found that through these videos Mr Adil had used his position as a doctor in the UK to undermine public health and public confidence in the medical profession. Mr Adil was suspended from the Medical Register for 6 months. He appealed against this decision but in April 2023 the High Court upheld it.37 This was the first UK case in which the Court had been asked to consider the extent to which the GMC can place limits on a doctor's right to freedom of expression and sanction them for such expression.
The question for medical regulators is where to draw the line between freedom of speech and protection of the public, especially since over-assiduous investigation and sanction of a doctor who is acting in good faith is likely to cause both individual stress and potential harm to patients (e.g. if a doctor is suspended without due cause). One factor which should be taken into account is whether statements appear to have been made in good faith. Medical professionals, like all citizens, are entitled to their own opinions—but not their own facts. The authority ascribed to doctors who make statements to the public stems not just from their training and qualifications but from the reasonable assumption that they will accurately reflect what they believe to be the totality of evidence, conveying the spectrum of opinion and scientific uncertainty on contentious matters. If a doctor is speaking on a topic on which they do not have specialist knowledge, or if they misrepresent that knowledge, they are betraying the trust which the public has placed in the medical profession and medical science.

It is of course difficult to quantify the impact of a medical qualification on public acceptance of antivaccine sentiment, since people may follow these influencers because they already hold anti-vaccine views. Individuals with strong antivaccine views amplify ostensible experts who affirm their pre-existing beliefs; motivated reasoning is endemic across social media.35, 38 But vaccine hesitancy is a spectrum, not a binary, and exposure to authoritative antivaccine statements will nudge vaccine-cautious people along the spectrum of medico-scientific acceptance10, 39 towards fear and ultimately rejection. These effects are exacerbated by low media and health literacy,40 and consequently false or misleading claims by professionals are likely to affect the most vulnerable.

How medical regulators adapt to the challenge of health disinformation and maintain standards is critical as anti-vaccine views grow even within the medical profession (an estimated 10% of primary care physicians in the USA hold negative views on immunisation, for example41). The UK regulator seems to have no qualms about dealing assertively with misinformation complaints from individual patients. In July 2023, for example, the GMC struck off a GP and homoeopath who had spread antivaccine misinformation for over 15 years and advised parents to falsify their children's vaccination records.42 But despite the GMC charter obligating conduct which 'justifies the public's trust in the profession', the regulator appears to be taking a less proactive stance on members who make antivaccine statements in public. Contrast this with the position taken by medical boards in the United States:
'Physicians who generate and spread COVID-19 vaccine misinformation or disinformation are risking disciplinary action by state medical boards, including the suspension or revocation of their license. Due to their specialized knowledge and training, licensed physicians possess a public trust and therefore have a powerful platform in society, whether they recognize it or not. They also have an ethical and professional responsibility to practice medicine in the best interests of their patients and must share information that is factual, scientifically grounded and consensus-driven for the betterment of public health. Spreading inaccurate COVID-19 vaccine information contradicts that responsibility, threatens to further erode public trust in the medical profession and puts all patients at risk.'43

But on the other hand, as a reviewer of an earlier draft of this paper pointed out, the US has a long tradition of supporting freedom of speech through the First Amendment to the US Constitution ('Congress shall make no law… abridging freedom of speech'). A law passed in California in 2022 against misinformation by doctors, for example, was effectively repealed a year later after legal challenges from doctors asserting their First Amendment rights.44 These conflicts highlight that regulating what doctors say and write is a fine line to tread, particularly in the unregulated world of social media. There is clearly a need for scientific debate and open public discussion on any medico-scientific topic. But equally, there surely need to be some regulatory safeguards against doctors who persistently ignore or distort data to present disingenuous positions to the public. Box 2 presents some hypothetical scenarios where a regulator might act both to protect the public and to ensure that good-faith scientific discussion is not impeded. Professional bodies have a critical role in supporting members in good-faith public engagement,45 but they also have an obligation to prevent harms caused by members engaging in bad-faith discussion. As vaccine disinformation continues to escalate, it is vital that regulatory bodies do not risk placing the 'right' of their members to mislead the public above the right of the public not to be misled.

Box 2. Hypothetical scenarios where a regulator might act

Case 1: A doctor on social media with a large following persistently links the deaths of public figures and celebrities with their vaccination status, despite there being no evidence for this. Following public complaints, a regulator investigates and issues a warning to the physician over this conduct, directing them to desist from this behaviour in future. They are also directed to post a corrective to their social media, and the regulator issues a citable document outlining the warning. The physician initially complies but begins again several months later. This time, action is taken against the physician, including the issuing of an open report on the warning and the subsequent action.

Case 2: In a television discussion, a doctor cites a preprint alluding to potential harms from vaccination. They appear unaware, however, that the preprint was retracted for data fabrication. Following the broadcast, public complaints are raised over its uncritical inclusion. The regulator contacts the doctor to inform them of this, advising that this source is unreliable and should not be used, and issues a publicly citable document affirming this communication. The doctor agrees, and no further action is taken; but as a record of the corrective exists, any future repetition of that claim would be unlikely to be viewed as an innocent error and might lead to disciplinary action.

Case 3: In a popular podcast, a physician talks about the rare side effects of a vaccine reported in one paper. They are mindful to present these risks in the context of the demonstrated benefits, and clarify that this one paper has several methodological weaknesses. A complaint is raised by a member of the public, but no action is required as the doctor did not misrepresent or distort any aspects of the evidence base in their discussion.
The UK General Medical Council has on occasion suggested that one reason for not formally investigating high-profile doctors who make anti-vaccine statements is that such investigations could amplify their reach further through media publicity of the case. They are right to raise this concern, but the reality is that high-profile 'celebrity doctors' already have vast media reach for their fringe positions, and there is no evidence that this reach would be significantly extended by the publicity of a case against them. Regulatory action against doctors will not solve the complex problem of rising anti-vaccine sentiment among the lay public. But a rebuke from the regulator, should an investigation deem this appropriate, will send a clear signal to the public that a doctor has been judged to have acted inappropriately, which will help limit audience reach and mitigate potential harms. This is especially critical in the current era, as medical influencers have vast reach and social media companies may be motivated by the revenue generated by misinformation circulating on their platforms. Given that regulators exist to monitor and regulate the behaviour of doctors, there may be both an ethical and a legal requirement for the regulator to act in such cases. By publicly and robustly taking a stand against the propagation of misinformation by their members, regulators will maintain their own credibility as well as helping to bolster public trust in medicine and medical science.

Dr Grimes is a scientist with a research interest in medical disinformation and public understanding of science; he has written extensively for the public on the harms of misinformation and disinformation. Professor Greenhalgh is a medical doctor interested in the prevention of COVID-19 and in professional regulation and governance.

The idea for the article came from a case in which several doctors complained to the General Medical Council (GMC) about statements about vaccines made by a medical doctor with a high social media profile. The GMC refused to investigate that doctor. A search of similar cases suggested the GMC tended to act promptly when doctors gave their own patients potentially harmful advice but appeared to feel less compelled to act if the doctor was making false or misleading statements to the general public (e.g. to large social media followings or on public lecture tours). This led us to consider the question of whether the GMC is currently acting as an 'analogue regulator in a digital age', and what regulators should be expected to do in this digital age to counter the harms of vaccine disinformation from a minority of their members.

The article was read by five members of the lay public and small amendments were made to increase clarity. We thank academic and clinical colleagues for discussions which helped shape our ideas for this article.

DRG has authored several pieces for the popular press on the antivaccine movement, vaccine disinformation and the influence of fringe scientific figures on public perception. He also authored a book ('The Irrational Ape: Why We Fall for Disinformation, Conspiracy Theory and Propaganda', Simon & Schuster UK / 'Good Thinking: Why Flawed Logic Puts Us All at Risk and How Critical Thinking Can Save the World', The Experiment, New York) which focuses extensively on antivaccine propaganda, and he has publicly criticised some of the scientists and doctors mentioned in this piece for propagating falsehoods.
TG has complained to the GMC about a doctor who made repeated antivaccine statements on social media. She is supervising a PhD student who is studying antivaccine movements.

Data sharing is not applicable to this article as no datasets were generated or analysed during the current study.