Facebook Extends Ban on Vaccine Disinformation

On Monday, Facebook announced that it was taking more aggressive steps to combat conspiracy theories and misinformation about vaccines. In a statement detailing its steps to promote authoritative information about COVID-19 vaccines, the social media giant shared an expanded list of false and dangerous claims about vaccines that will not be allowed on the platform.

Facebook first announced in December that it would remove debunked claims about coronavirus vaccines on both Instagram and Facebook, a move that came just as the first vaccines were approved for use. The initial list of false information the platform said it would remove included “false claims that COVID-19 vaccines contain microchips, or anything else that is not on the official list of vaccine ingredients,” statements that COVID-19 does not exist or is no worse than the flu, and claims that COVID-19 was caused by or is linked to 5G communications technology.

The expanded list of false information to be removed now includes claims that COVID-19 is man-made, that vaccines are not effective against the diseases they are meant to prevent, that vaccines cause autism, and that it is safer to contract COVID-19 than to get the vaccine.

Facebook said these new standards take effect “immediately,” adding that accounts, pages, and groups that “repeatedly share these discredited claims” on Facebook and Instagram may be removed.

Facebook launched its first major campaign against disinformation in 2016, after US intelligence agencies determined that Russia had meddled in the US election. The company partnered with fact-checking organizations but was reluctant to ban or remove false or misleading posts. Instead, for years it simply added a warning label flagging false or unverified information. Users could still read the posts, comment on them, and click any included links.

Facebook continued to add these warnings to posts that contained misinformation during the 2020 election, including flagging a large number of posts by former President Trump, who repeatedly spread unsubstantiated claims about voter fraud. Following the deadly Capitol riot on January 6, Facebook blocked Trump’s account and banned him from the platform a day later.

Before Trump’s ban, and after facing pressure for years, the platform also tried to crack down on hate speech by blocking the likes of Alex Jones, Louis Farrakhan and Milo Yiannopoulos.

When it came to anti-vaccination content, however, the platform’s earlier commitment to combating misinformation focused more on reducing the visibility of hoaxes and misinformation in feeds and search results than on removing the content outright.

The anti-vaccine movement has grown rapidly in recent years. Myths about the supposed harms of vaccinating children persist, despite multiple studies showing that vaccines do not cause autism. Some experts have blamed social media platforms’ unwillingness to regulate vaccine misinformation for a 2015 measles outbreak, even though the disease had been declared eliminated in the United States in 2000.