What Does Relentless Misinformation Mean for Medicine?
26 Oct
The onslaught began just in time for school. Florida, where Dr. Geeta Nayyar, MD, MBA, and her 8-year-old daughter live, had gone through a summer of pain during which its handle on COVID-19 loosened by every measure. As a rheumatologist and med school professor, Dr. Nayyar had long ago grown accustomed to answering questions from her daughter's classmates and their parents about health issues as varied as the sniffles and sepsis. Why would they stay quiet amid the worst pandemic in a century?
The inquiries came through via “mommy circle” group chats and email threads. At a time when distrust in the press and the government had been growing and COVID-19 placed so much at risk, it was helpful to have a physician in the mix. What do you, as a doctor, think? Should I send my kid back to school? Are masks safe?
In the past, even when Dr. Nayyar advocated for actions targeted by conspiracy theories, like getting a flu shot, her audience mostly took the advice. But something had changed.
Among her child’s social circle, Dr. Nayyar was relegated to being just another voice in a chorus of conflicting, and often incorrect, information. It was as if her medical degree meant zilch.
“I’m in there trying to debunk all of these myths—and these are my friends and my daughters’ friends’ moms—but if they don’t like what I say, now they’re just like, ‘Oh, well, we’re not doing that,’” she says. “And it’s hard because it impacts us all.”
Dr. Nayyar now faced her own dilemma: How could she send her child to school when she couldn’t trust that others were taking proper precautions? The doctor kept her kid home, for virtual learning. The tougher question, however, focused on her profession and its role in a critical conversation that should have leaned on medical expertise. After nearly six months of enduring COVID-19 and all of the misinformation and disinformation that came with it, discrediting the nonsense had become exhausting.
“There was always misinformation out there, on the internet and social media. But now there’s so much more competition,” Dr. Nayyar says. “We’re not just competing with other sources of information—we’re competing with people’s inability to be objective.”
An Infodemic Infects the World
By the time COVID-19 blossomed into a pandemic, physicians and scientists worldwide had mobilized to fight an anticipated fog of misinformation. In mid-February, 27 public health researchers published a call for solidarity and data transparency in The Lancet, warning of the dangers of speculation and conspiracy theory. Less than a month later, two medical students in the UK described a “global rise” in the “spread of misinformation that has plagued the scientific community and public” by comparing it to bogus claims that swirled unchecked at the start of the HIV epidemic. Reporters and pundits, all too aware of how good-natured conjecture and malicious manipulation had infected their own industry, began echoing the World Health Organization’s belief that COVID-19 represented an “infodemic” as much as it did a pandemic.
They all were right to be worried.
By April, researchers found that 70 percent of the top 110 webpages returned by Google searches for “Wuhan Coronavirus” had low reliability scores, while none received a high score. Another benchmark suggested that just two of the websites were trustworthy, and all but 11 of them satisfied a third measure, set by JAMA. The findings meant that bad information was getting great exposure on the world’s top search engine. Social media amplified the noise. So too did opportunists, in medicine and media, who hoped to ride the decade’s biggest story to prominence. Plandemic, anyone?
Before long, US government leaders, including President Donald Trump, shared their own distorted takes on COVID-19. (He’s still arguing that the coronavirus will simply “go away.”) And the effects of misinformation became more concrete than estimated increases to infection rates and death tolls. For example, after Trump erroneously suggested bleach might kill the virus, the number of calls to poison control centers skyrocketed. Anti-mask protests cropped up across the country, gyms opened in defiance of local restrictions, and the unconvinced and unconcerned hosted parties that emerged as coronavirus hot spots.
Even now, in spite of more than 215,000 dead Americans, the CDC is burning through trust that took decades to build, as politicos appear to vie for control over what the agency says about the pandemic’s spread and how to stop it.
If the torrent of COVID-19 misinformation and disinformation is enough to make your head spin, imagine what it can do to the physicians and clinicians on the front lines of this pandemic. Conspiracy theories seem to only be getting wilder and stronger, and it’s our doctors who must rein them in or lose a patient’s life to a fantasy. They know firsthand that truth can be the difference between life and death.
But the misinformation conversation hasn’t much covered how the phenomenon affects healthcare workers, professionally and personally. We need research on that—and we might soon get it—but for now, all we have are anecdotes.
Why does it matter? “We have a problem wherein there is a non-trivial number of Americans who don’t trust advice from doctors and medical scientists,” says Dr. Matt Motta, PhD, a political science professor at Oklahoma State University who focuses on science communication, health policy, and anti-science attitudes. “And that distrust, at least based on my research and research from others, is increasingly common and prevalent on the ideological right.”
If we can’t build a world where the public trusts medicine and science, we can’t expect medicine and science to work as they should.
‘I Think It Is Burning Me Out’
Dr. Nayyar, the Florida-based rheumatologist, didn’t recall any political conversations with patients prior to the pandemic. That kind of discussion, after all, doesn’t belong in the clinic, according to the American Medical Association’s code of ethics. But nowadays, partisan beliefs appear to affect how her patients perceive and mitigate health risks, particularly the coronavirus. In some ways, it’s not so different—doctors have always treated stubborn patients—but in other ways, it’s draining, Dr. Nayyar says.
“We’re here to take care of people, and we’re even risking our own lives, only to then be completely undermined,” she adds. “To hear people say, ‘Don’t worry about wearing a mask’ when it’s actually hurting our colleagues and our own lives—it’s such a Catch-22 for so many of us right now.”
She’s not the only physician who believes that anti-science rhetoric is wearing out doctors, whether or not they’re practicing. Consider these tweets from early this month.
I think it is burning me out, esp when we were in the surge of COVID. It is truly disheartening to be helping pts who are so scared and then to see all the misinformation being spread. Professionally, I have to spend more time addressing misinformation.
— Linda Girgis MD (@DrLindaMD) September 1, 2020
Mentally draining – it became so bad I wrote a blog post that I use to answer those that continue to push this dangerous content
I am certain I have lost long time ‘friends’ but I follow the #science not the politics https://t.co/VZJ7gGj9Ya
— Nick van Terheyden, MD – “Dr Nick” (@drnic1) September 1, 2020
At a time when physicians are putting in extra hours, the burden of myth busting only worsens their workloads.
I have to dispel untruths and myths all the time with patients. A recent patient was extremely distressed and felt stigmatized due to her testing for covid and her employer’s attitude to her. Had to perform a lot of supportive counseling.
— Bhadrashil Modi (@BhadrashilModi) September 1, 2020
Despite hours spent searching Google Scholar and discussions with several experts, I failed to find any research into links between medical misinformation and physician burnout. Experts confirmed that, to their knowledge, the scientific community has yet to take on this question. But we can draw some insights from distant and recent history.
Let’s start with the influenza pandemic of 1918, better known as the Spanish Flu. Back then, physicians and scientists knew less about infectious diseases and their spread than we do today, but they weren’t totally oblivious. In December 1918, Dr. George Price wrote an article for The Survey (PDF) detailing a medical conference during which public health experts grappled with how to manage an outbreak that had already killed hundreds of thousands of Americans. Most attendees agreed that “isolation” was the nation’s best bet, but they stumbled through discussions on masks and vaccines.
Dr. Price’s piece reeked of frustration.
“The saddest part of my life,” he quoted an army doctor from Michigan as saying, “was when I witnessed the hundreds of deaths of the soldiers in the army camps and did not know what to do. At that moment, I decided never again to prate about the great achievements of medical science and to humbly admit our dense ignorance in this case.” Dr. Price went on to lament how helpless the physicians and scientists were in deciding on an appropriate course of action to protect patients against infection. Some lobbied for masks, while others questioned their effectiveness. (Some physicians went on to join the Anti-Mask League, and high-ranking public health officials received fines for not wearing one in public.) Put simply, the people who should have known best felt frustrated by knowing so little.
Dr. Price’s article spotlighted how this lack of concrete information pitted experts against perceived misinformation that stemmed from their own ranks, rather than powerful interests or a duped public. “All the various and conflicting testimony, however, was declared to be unreliable by no less an authority than Frederick L. Hoffman,” the author wrote, “who remarked that statistics were never so much abused as by the doctors and health officers in the epidemic…” As such, the speaker concluded, the data was “worthless.”
How this information gap affected doctors on the ground is unclear, but it’s conceivable that they shared at least some of the conference attendees’ frustrations.
Research into the anti-vaxxer movement has produced firmer findings into misinformation’s effects on physicians. A 2011 article published in the American Journal of Preventive Medicine found that more than half of physicians spent 10 to 19 minutes discussing vaccinations with parents who harbored “substantial concerns.” A small portion, 8 percent, said they spent 20 or more minutes talking to these misguided parents. That burden contributed to big problems for doctors: 46 percent of pediatricians and 21 percent of family medicine physicians found their job “less satisfying because of parental vaccine concerns,” according to the study.
Dousing the flames of misinformation might not be the kind of work that ignites a love for the job.
What Do Patients Think of Physicians?
As public trust in major institutions erodes, many people remain confident in their doctors. This is good news for the truth, but a new study published in Social Science & Medicine suggests the way the public views physicians could harm the physicians themselves.
Wait a second. How?
Dr. Amelia Goranson, PhD, who studies moral typecasting and perceptions at the Deepest Beliefs Lab at the University of North Carolina at Chapel Hill, found that Americans see doctors as “godlike.” Her research team probed how 681 people rated the mental capacities of doctors, including their agency, self-control, and invulnerability, compared with those of others. Doctors, it turns out, received similar ratings to God.
Again, is that a bad thing?
Dr. Goranson’s study participants went further, claiming that physicians are less apt to feel fear, pain, embarrassment, and even hunger. A second inquiry revealed that people believe doctors can ignore physical and mental health issues better than the rest of us. What’s more, the findings suggest that bodily and emotional constraints affect job performance less when it comes to physicians.
But science knows, thanks to intensive research into physician burnout, that doctors are not gods. They’re not robots. And their work can take its toll.
Our incorrect perceptions of physicians can hurt them by enabling regulators to develop harmful rules, administrators to institute grueling scheduling, and patients to become dissatisfied at the first sign of imperfection.
While this area requires more research, Dr. Goranson describes how the godlike perception might backfire. “Every time we’re a little disappointed, or every time we don’t get the exact right answer right away, perhaps that could feed into this cycle of, ‘Well, maybe this person just doesn’t have time for me, or maybe they think they’re better than me, or they’re just not paying attention,’” she says. Patients could begin to believe their doctor is stuck in the ivory tower, reading studies that few others comprehend. “And so they’re thinking, ‘You’re not treating me like I think you could treat a patient—which is perfectly—so maybe I shouldn’t trust this institution, or maybe I won’t come back to this doctor.’”
Although that scenario remains hypothetical, Dr. Goranson’s findings highlight the risk in believing that doctors are bulletproof. Because, despite their empathy and education, they’re not. And at some point, some doctor will let down some patient, and the implications of that distress in our hyper-partisan, misinformation-driven society are unclear.
In the End, the Truth Might Depend on Doctors
What about the people who don’t trust physicians? The folks who believe doctors are part of some grand conspiracy or simply don’t know what they’re doing? These people might be the ones who cause the many to suffer.
Dr. Motta, the Oklahoma-based political science professor who studies anti-science attitudes and health policy, makes a compelling case: While most Americans accept information from doctors and scientists, the number has dropped amid the COVID-19 pandemic, especially concerning Dr. Anthony Fauci, MD, and the CDC. “A majority is not consensus. It doesn’t mean that everyone is accepting those prescriptions,” Dr. Motta says. “And when it comes to something like, let’s say, resolving the current pandemic, we may be in a situation where we need somewhere between 70 and 80 percent of Americans to choose to vaccinate against coronavirus, in order to put the virus’s spread into decline.”
In other words, the few skeptics can prevent herd immunity—and delay the return to something of a normal life—for everyone else.
Unreliable media outlets condition their followers to buy the party line, and social media propagates their misinformation. Remember the doctors who discussed how COVID-19 misinformation has affected them? A QAnon adherent joined the very same thread to spread his unverified gospel. Twitter has since suspended his account.
The CDC and other government health leaders, meanwhile, are losing trust, mostly on the right but also on the left due to concerns of political manipulation. When we don’t know where to turn for the facts, we’re liable to turn to spin and straight lies. That’s especially true in a situation like the pandemic, where the information is always changing and we seem to know so little.
“It is absolutely a communications problem,” Dr. Motta says. “And I’d like to flip it on its head and say it’s potentially a communications solution.” Although more Americans are beginning to distrust medical and scientific institutions, most do trust their doctors, at least for now. So, Dr. Motta argues that physicians should deliver public health messaging, in one-on-one interactions with patients. Dr. Fauci has begun to embrace this tactic to promote acceptance of COVID-19 vaccinations, social distancing, and other positive behaviors.
Some patients will continue to reject recommendations from their physicians, as anti-vaxxers have in the pursuit of medical exemptions. Dr. Motta and the research have a solution for those individuals: ordinary people.
“What we can’t do is hit people over the head with the facts and say, ‘You’re wrong, and here’s why you’re wrong,’” he says. Too often, the people who reject science know a lot about the topic in question because they use that information to hone their arguments. The key, however counterintuitive, is to talk less about the science. “We talk about the other things that make people more likely to reject the science,” Dr. Motta says, “and we talk to people on their own terms.” Anti-vaxxers, for instance, might hold that position because they value bodily sanctity. Ordinary people can reach anti-vaxxers by noting that they share that value—and that a measles infection violates that sanctity.
Physicians can also use this strategy to clean up misinformation. It’s a lot to ask of doctors who are working long hours and dealing with widespread devastation, but then again, the effects of ignorance are inescapable. Misinformation and disinformation can worsen the pandemic and health crises to come. They can kill. And until we clean our media landscape, there’s little doubt that conspiracy theory will continue to enter the clinic.
— Jack Murtha. Connect with him on Twitter.