Facebook and YouTube are losing the fight against Covid-19 vaccine misinformation




Social media companies like Facebook and YouTube have stepped up their policies against coronavirus misinformation and banned false claims about Covid-19 vaccines. But as the vaccine begins to roll out, online accounts are taking advantage of loopholes in new policies and successfully sharing misleading claims that try to discourage vaccination.

Throughout the pandemic, platforms have established and updated rules aimed at curbing false claims related to Covid-19. Between March and October, Facebook removed 12 million pieces of content on Facebook and Instagram and added fact-check labels to another 167 million posts. But the launch of an authorized Covid-19 vaccine has forced social media companies to adapt again, widening their focus to cover both Covid-19 misinformation and long-standing anti-vaccination content.

There are already many examples of online content that casts doubt on Covid-19 vaccines. Posts suggesting that the vaccine is part of a government scheme, and memes implying that the vaccine has extreme side effects, have either not been detected by the platforms or do not appear to violate their rules.

The platforms are not only contending with anti-vaccination communities. Conspiracy theorists, conservative groups, fringe media, and others are actively stoking concerns about the vaccines, according to Yonder, a firm that advises companies involved in vaccine development. And while recent polling from the Kaiser Family Foundation indicates that the share of Americans who want to get the vaccine has grown to about 70 percent, millions of Americans remain reluctant, and many may not take it right away.

Facebook has vowed to remove false claims about Covid-19 vaccines that could cause imminent physical harm, and YouTube has said it will remove videos about Covid-19 vaccines that contradict health authorities such as the World Health Organization. Twitter is taking a two-pronged approach: removing the Covid-19 misinformation it considers most harmful and labeling claims that are merely misleading.

But overall, these approaches so far seem focused on removing misinformation rather than addressing the broader problem of vaccine hesitancy and skepticism, a hurdle that could prove much more difficult to clear.

While platforms tend to tout new policies designed to curb misinformation, they don’t always find and remove all the content that violates those rules. By searching Facebook, YouTube, and Twitter, Recode found plenty of vaccine misinformation that had not yet been removed or labeled as such.

On Facebook, Recode identified several posts that were removed only after we flagged them. Some of the removed posts claimed that the pandemic was planned or that the vaccine would include a microchip, a claim that is specifically prohibited under Facebook’s rules. Another post removed by Facebook was a meme that jokingly implied the vaccine has extreme side effects; the image had already been shared more than 100,000 times by the time Facebook took it down.

This meme, which implies that the vaccine has serious physical side effects, is no longer available on Facebook.

Other posts identified by Recode that appeared to violate company rules include a Facebook post claiming that the Covid-19 vaccine will “alter your DNA” and “attack the uterus.” It linked to a YouTube video that references the “Plandemic” conspiracy theory and Bill Gates. The post had been shared in a Facebook group with more than 12,000 members, and the video had been viewed more than 15,000 times on YouTube. Similarly, in a public Facebook group with 50,000 members, a post alleged that the Covid-19 vaccines were part of an attempt to “keep me from ascending the spirit beings we were meant to be.”

While YouTube has promised to remove misinformation about the Covid-19 vaccine, Recode found a variety of content on the platform that appeared to violate those policies, including easily discoverable videos suggesting that the Covid-19 vaccine changes people’s DNA or that the vaccine is a ploy to intentionally kill the elderly in nursing homes. YouTube removed a video flagged by Recode that suggested the vaccine could be the “mark of the beast” and connected it to the end times described in the Book of Revelation.

Media Matters has found that, despite YouTube’s policies, videos suggesting that the Covid-19 vaccine includes a microchip have received more than 400,000 views, and some of them carried advertisements. Meanwhile, Sam Clark of the YouTube watchdog Transparency Tube points out that many channels known for pushing conspiracy theories are posting about vaccines.

Twitter will begin enforcing its new policies against Covid-19 misinformation on December 21, and research shows the problem is significant and growing. November saw the largest increase in vaccine-misinformation retweets on Twitter this year, according to the misinformation-tracking company VineSight.

Individual posts on these platforms don’t necessarily get a lot of engagement, but together they can gain significant traction and even spread to other platforms. According to data from Zignal Labs, between December 8 and 14 there were nearly 30,000 mentions of the claim that the Chinese Communist Party has links to the vaccines, and nearly 90,000 mentions of Bell’s palsy, an often temporary condition that causes parts of the face to sag. After four participants in the Moderna vaccine trial developed the condition, the FDA advised people to watch for signs of Bell’s palsy, but the agency says there isn’t enough information to link the condition to the vaccine.

Meanwhile, much of the content that casts doubt on Covid-19 vaccines avoids making factual claims and so is not removed. In an Instagram post, for example, conservative commentator Candace Owens called people receiving the vaccine “sheep.” The video received a label from Facebook, but it was still viewed more than 2 million times.

Also fueling anxiety are false claims about mandatory vaccination, something the US government is not considering. Zignal Labs found that between December 8 and 14, there were more than 40,000 mentions of a mandatory vaccine on the platforms it tracks.

“In fact, they are fighting a ghost. They are fighting a bogeyman,” says David Broniatowski, who studies behavioral epidemiology at George Washington University. “There is no one out there who says that we are going to pass a law that requires a Covid vaccine.”

These ideas don’t exactly amount to misinformation, and they often stop short of making claims about the vaccine itself. Still, they serve to undermine confidence in vaccination by raising the specter of government control, politicizing the vaccine, or casting doubt on the science behind it.

“Somebody says, ‘Do you know what’s in the Covid vaccine?’ and they just leave it at that. It’s not really misinformation,” Broniatowski said. “But mistrust of the vaccine is certainly growing.”

This ambiguity makes it very difficult to moderate what’s allowed on sites like Facebook and YouTube. These platforms don’t want to be accused of amplifying anti-vaccination content, but responsibly sorting the debate, humor, opinion, and fact surrounding the Covid-19 vaccine from outright misinformation is a daunting undertaking, especially since we are still learning more about the vaccines themselves. At the same time, public health experts have emphasized that people should have space to ask questions about vaccines.

Importantly, these platforms are using strategies beyond removals, such as applying labels and surfacing accurate information from health authorities. But the bigger question is whether Facebook, Twitter, and YouTube’s policies can ultimately blunt the problem of vaccine hesitancy, not only by policing misinformation but also by addressing those gray areas. So while the public can pressure platforms to remove objectionable content, what they leave up is just as consequential.

Open Source is made possible by the Omidyar Network. All Open Source content is editorially independent and produced by our journalists.


