Almost three years into the pandemic, Covid-19 remains stubbornly persistent. So does misinformation about the virus.
As Covid cases, hospitalizations and deaths mount in parts of the country, myths and misleading stories continue to develop and spread, exasperating overworked doctors and evading content moderators.
What started in 2020 as rumors casting doubt on the existence or severity of Covid quickly evolved into often outlandish claims about dangerous technology lurking in masks and the supposed miracle cures of unproven drugs, such as ivermectin. Last year’s vaccine rollout stoked another wave of unfounded alarm. Amid all the claims still circulating, there are now conspiracy theories about the long-term effects of the treatments, researchers say.
The ideas still thrive on social media platforms, and the constant barrage, now a yearslong accumulation, has made it increasingly difficult for accurate advice to break through, misinformation researchers say. That leaves people already suffering from pandemic fatigue further inured to Covid’s ongoing dangers and susceptible to other harmful medical content.
“It’s easy to forget that misinformation about health, including about Covid, can still contribute to people not getting vaccinated or create stigma,” said Megan Marrelli, the editor-in-chief of Meedan, a nonprofit organization focused on digital literacy and access to information. “We certainly know that health misinformation contributes to the spread of disease in the real world.”
Twitter is of particular concern to researchers. The company recently dismantled the teams responsible for policing dangerous or inaccurate material on the platform, stopped enforcing its Covid misinformation policy and began basing some content moderation decisions on public polls posted by its new owner and chief executive, the billionaire Elon Musk.
From November 1 to December 5, Australian researchers collected more than half a million conspiratorial and misleading English-language tweets about Covid, using terms such as “deep state,” “hoax” and “bioweapon.” The tweets attracted more than 1.6 million likes and 580,000 retweets.
The researchers said the amount of toxic material soared late last month with the release of a film containing baseless claims that Covid vaccines caused “the largest orchestrated extinction in the history of the world.”
Naomi Smith, a sociologist at Federation University Australia who helped conduct the research along with Timothy Graham, a digital media expert at Queensland University of Technology, said Twitter’s misinformation policy had helped tamp down the anti-vaccine content that was common on the platform in 2015 and 2016. From January 2020 to September 2022, Twitter suspended more than 11,000 accounts for violations of its Covid misinformation policy.
Now, said Dr. Smith, “the protective barriers are falling down in real time, which is, academically, both interesting and absolutely terrifying.”
“Pre-Covid, people who believed in medical misinformation generally just talked to each other, within their own little bubble, and you had to put in some work to find that bubble,” she said. “But now you don’t have to do any work to find that information — it’s presented in your feed with other types of information.”
Several high-profile Twitter accounts suspended for spreading unfounded claims about Covid have been reinstated in recent weeks, including those of Representative Marjorie Taylor Greene, a Georgia Republican, and Robert Malone, a vaccine skeptic.
Mr. Musk himself has used Twitter to weigh in on the pandemic, predicting in March 2020 that the United States would likely have “almost zero new cases” by the end of that April. (In the last week of that month, more than 100,000 positive tests were reported to the Centers for Disease Control and Prevention.) This month, he took aim at Dr. Anthony S. Fauci, who will soon step down as President Biden’s top medical adviser and the longtime director of the National Institute of Allergy and Infectious Diseases, saying Dr. Fauci should be prosecuted.
Twitter did not respond to a request for comment. Other major social platforms, including TikTok and YouTube, said last week they remain committed to fighting disinformation about Covid.
YouTube bans content – including videos, comments and links – about vaccines and Covid-19 that contradicts recommendations from local health authorities or the World Health Organization. Facebook’s policy on Covid-19 content runs more than 4,500 words. TikTok said it had removed more than 250,000 videos containing Covid misinformation and was working with partners such as its content advisory council to develop its policies and enforcement strategies. (Mr. Musk disbanded Twitter’s advisory council this month.)
But the platforms are struggling to enforce their Covid rules.
Newsguard, an organization that tracks online misinformation, found this fall that searching “covid vaccine” on TikTok prompted suggestions including “covid vaccine injury” and “covid vaccine warning,” while the same search on Google led to suggestions for “walk-in covid vaccine” and “types of covid vaccines.” A TikTok search for “mRNA vaccine” surfaced five videos containing false claims within the first 10 results, the researchers found. TikTok said in a statement that its community guidelines “make it clear that we do not allow harmful misinformation, including medical misinformation, and that we will remove it from the platform.”
Not long ago, people got their medical advice from neighbors or tried to self-diagnose through Google searches, said Dr. Anish Agarwal, an emergency physician in Philadelphia. Now, years into the pandemic, he still sees patients who believe “crazy” claims from social media, such as that Covid vaccines will put robots in their arms.
“We fight that every day,” said Dr. Agarwal, who teaches at the University of Pennsylvania Perelman School of Medicine and is an associate director of Penn Medicine’s Center for Digital Health.
Online and offline discussions about the coronavirus are constantly evolving, Dr. Agarwal said, with patients lately asking him about booster shots and long Covid. He has a grant from the National Institutes of Health to study the Covid-related social media habits of different populations.
“As we move forward, understanding our behavior and thoughts around Covid will also likely shed light on how individuals interact with other health information on social media, how we can actually use social media to combat misinformation,” he said.
Years of lies and rumors about Covid have had a contagious effect, hurting public acceptance of all vaccines, said Heidi J. Larson, the director of the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine.
“The rumors about Covid will not go away – they will be repurposed and they will adapt,” she said. “We can’t remove this. No company can solve this.”
Some efforts to slow the spread of misinformation about the virus have run into First Amendment concerns.
A law that California passed several months ago, set to take effect next month, would penalize doctors for spreading false information about Covid vaccines. It is already facing legal challenges from plaintiffs who describe the regulation as an unconstitutional infringement on freedom of expression. Tech companies, including Meta, Google and Twitter, have faced lawsuits this year from people banned for spreading Covid misinformation, who claim the companies took their content moderation efforts too far; other lawsuits have accused the platforms of not doing enough to stop the spread of misleading stories about the pandemic.
Dr. Graham Walker, an emergency physician in San Francisco, said rumors spreading online about the pandemic drove him and many of his colleagues to social media to try to correct inaccuracies. He has posted several Twitter threads, some with more than a hundred tweets, brimming with evidence debunking misinformation about the coronavirus.
But this year he said he felt increasingly defeated by the onslaught of toxic content on a variety of medical issues. He left Twitter after the company abandoned its Covid misinformation policy.
“I was starting to think this wasn’t a winning battle,” he said. “It doesn’t feel like a fair fight.”
Now, Dr. Walker said, he is watching a “tripledemic” of Covid-19, R.S.V. and the flu bombard the health care system, pushing emergency room wait times from less than an hour to six hours in some hospitals. Misinformation about readily available treatments is at least partly to blame, he said.
“With the most recent vaccines, if we had a bigger surge in vaccinations, we would probably have fewer people getting extremely sick with Covid, and that would definitely put a dent in hospital admissions,” he said. “Honestly, right now we’ll take any dent we can get.”