Social media companies like Facebook and YouTube have ramped up their policies against coronavirus misinformation and banned false claims about Covid-19 vaccines. But as vaccine distribution begins, online accounts are exploiting loopholes in these new policies and successfully sharing misleading claims that attempt to discourage vaccination.
Throughout the pandemic, platforms have established and updated rules meant to curb false claims related to Covid-19. Between March and October, Facebook took down 12 million pieces of content on Facebook and Instagram, and it added fact-checking labels to another 167 million posts. But the rollout of an authorized Covid-19 vaccine has forced social media companies to adapt again, changing their approach to both Covid-19 misinformation and longstanding anti-vaccination content.
There are already plenty of examples of online content that sows doubt in Covid-19 vaccines. Posts that suggest vaccination is part of a government scheme and memes that imply the vaccine comes with extreme side effects either aren’t being caught by the platforms or don’t appear to violate their rules.
The platforms aren’t just contending with anti-vaccination communities. Conspiracy theorists, conservative groups, fringe outlets, and others are actively hyping concerns about vaccines, according to Yonder, a firm that’s advising companies involved in vaccine development. While recent polls indicate the number of Americans willing to get the vaccine has grown — to about 70 percent, according to the Kaiser Family Foundation — millions of Americans are still reluctant to take the vaccine, and many may not take it immediately.
Facebook has vowed to remove false Covid-19 vaccine claims that could cause imminent physical harm, and YouTube has said it will take down videos about Covid-19 vaccines that contradict health authorities like the World Health Organization. Twitter is taking the two-pronged approach of taking down Covid-19 misinformation it deems the most harmful plus labeling claims that are merely misleading.
But overall, these approaches thus far seem focused on removing misinformation rather than addressing the broader scope of vaccine hesitancy and skepticism — a hurdle that could be much more complicated to address.
While platforms tend to tout new policies designed to curb misinformation, they don’t always find and remove all the content that violates their rules. In searching Facebook, YouTube, and Twitter, Recode found plenty of vaccine misinformation that had yet to be removed or labeled as such.
On Facebook, Recode identified several posts that were only taken down after we flagged them. Some of those removed claim the pandemic was planned or that the vaccine would include a microchip, a claim that’s specifically banned under Facebook’s rules. Another post that was taken down by Facebook was a meme that jokingly implied that the vaccine comes with extreme side effects. The image had already been shared more than 100,000 times by the time Facebook took it down.
Other posts identified by Recode that appeared to violate the company’s rules include one Facebook post claiming that the Covid-19 vaccine will “alter your DNA” and “attack the uterus.” It linked to a YouTube video that references the “Plandemic” conspiracy theory and Bill Gates. The post had been shared in a Facebook group with more than 12,000 members, and the video was viewed more than 15,000 times on YouTube. Similarly, in a public Facebook group with 50,000 members, a post alleged that the Covid-19 vaccines were part of an attempt to “keep us from ascending into the spiritual beings that we were meant to be.”
While YouTube has promised to remove Covid-19 vaccine misinformation, Recode found a range of content on the platform that seemed to violate those policies, including easily discovered videos suggesting that the Covid-19 vaccine changes people’s DNA or that the vaccine is a ploy to intentionally kill the elderly in nursing homes. YouTube took down one video flagged by Recode that suggested the vaccine could be the “mark of the Beast” and connected it to the end times in the Book of Revelation.
Media Matters has found that, despite YouTube’s policies, videos suggesting that the Covid-19 vaccine included a microchip have received more than 400,000 views, and some of them had ads running on them. Meanwhile, Sam Clark, at the YouTube watchdog Transparency Tube, points out that plenty of channels known for pushing conspiracies are posting about vaccines.
Twitter will begin enforcing its new policies against Covid-19 misinformation starting on December 21, and research shows that the problem is significant and growing. November saw the greatest increase in the number of retweets of vaccine misinformation on Twitter this year, according to the misinformation-tracking company VineSight.
Individual posts on these platforms don’t necessarily gain a lot of engagement, but they can get a significant amount of traction in aggregate and can even spread to other platforms. According to data from Zignal Labs, between December 8 and 14, there were nearly 30,000 mentions of the claim that the Chinese Communist Party had ties to the vaccines and nearly 90,000 mentions of Bell’s palsy, an often temporary condition that causes parts of one’s face to sag. After four participants in the Moderna vaccine trial got the condition, the FDA warned people to watch for signs of Bell’s palsy, but the agency says there’s not enough information to link Bell’s palsy and the vaccine.
Meanwhile, much of the content sowing doubt about Covid-19 vaccines avoids making factual claims and doesn’t get removed. In an Instagram post, for instance, conservative commentator Candace Owens called people who get the vaccine “sheep.” The video was given a label by Facebook, but it was still viewed more than 2 million times.
Also fueling anxiety are those making false claims about mandatory vaccinations, which the US government is not considering. Research from Zignal Labs found that, between December 8 and 14, there were more than 40,000 mentions of a mandatory vaccine on the platforms it tracks.
“Factually, they’re fighting a ghost. They’re fighting a boogeyman,” notes David Broniatowski, who studies behavioral epidemiology at George Washington University. “There is nobody out there who’s saying that we’re going to pass a law mandating a Covid vaccine.”
These ideas don’t exactly amount to misinformation, and they often stop short of making claims about the vaccine itself. Still, they serve to undermine confidence in vaccination by raising the prospect of government control, politicizing the vaccine, or raising doubts about the science behind it.
“Somebody says, ‘Do you know what’s in the Covid vaccine?’ And they just leave it at that — it’s not really misinformation,” said Broniatowski. “But it’s certainly increasing distrust in the vaccine.”
This ambiguity makes the job of moderating what’s allowed on sites like Facebook and YouTube very difficult. These platforms don’t want to be accused of amplifying anti-vaccination content, but responsibly sorting through content that includes Covid-19 vaccine-oriented debates, humor, opinions, and facts as well as misinformation is a major endeavor, especially because we’re still learning more about Covid-19 vaccines. At the same time, public health experts have also emphasized that people ought to have space to ask questions about vaccines.
Importantly, these platforms are using strategies beyond takedowns, like applying labels and elevating accurate information from health authorities. But the primary concern is that the policies of Facebook, Twitter, and YouTube could ultimately exacerbate the problem of vaccine hesitancy, not only through how they police misinformation but also through how they handle those gray areas. So while the public might pressure platforms to take down objectionable content, deciding what to leave up is just as tricky.
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.