Facebook ‘keeps making money with anti-vax sites’




Facebook allows users to profit from the spread of potentially dangerous false theories and misinformation about the pandemic and vaccines, including by providing fundraising tools on pages whose content has been flagged by the social network giant’s own fact-checkers.

An investigation has found 430 pages, followed by 45 million people, using Facebook tools, including virtual “stores” and fan subscriptions, while spreading false information about Covid-19 or vaccines.

The findings come despite the platform’s promise last year that no user or business should directly benefit from false information about immunization against Covid-19.

Facebook generally doesn’t share this revenue, but it occasionally takes a cut and financially benefits from users engaging with the content and staying on its services, exposing them to more ads.

The investigation, conducted by the London-based Bureau of Investigative Journalism, is likely to have uncovered just a small snapshot of the large amount of monetized misinformation on Facebook related to the pandemic and vaccines.

A Facebook spokesperson said the company was investigating the examples pointed out to it and that it had “removed a small number of the pages shared with us for violating our policies.”

However, many of the posts identified as disinformation do not violate Facebook’s rules, the spokesperson added, without providing any details.

“Our initial investigation shows that a large number of the flagged pages did not have any violations of our harmful disinformation policies, and we would dispute the overall accuracy of the data that is provided,” he said.

The identified pages included sites for comedians and religious leaders, social media personalities, and traditional media reporters.

There is a host of alternative health sites, focused on topics ranging from nutrition to yoga to wellness. Only a minority are clearly focused on the pandemic or anti-vaccine sentiment; the others share such content alongside material for broader audiences.

Seven languages are represented, including German, Hebrew, Polish and Spanish, and the pages reach readers around the world.

More than 260 of the pages identified by the bureau have posted misinformation about vaccines. The rest include false information about the pandemic, about vaccines in general, or a combination of the two. More than 20 of the identified pages carry Facebook’s blue verification badge as a mark of authenticity.

For Facebook, offering ways to make money is likely a route to encouraging people to use its platform rather than its competitors’, according to Dr Claire Wardle, CEO of First Draft, a US-based nonprofit that fights misinformation online and contributed to the bureau’s investigation.

However, Facebook can also benefit directly from the popularity of brands and people spreading misinformation. It takes a cut of between 5% and 30% on its Stars currency, used by fans to tip creators who stream live video.

From January last year Facebook also took up to 30% of the subscription fees paid by new followers, before reversing the charge in August.

The bureau found two pages using Stars: An0maly and Sid Roth’s It’s Supernatural, a religious site that has blamed abortion for the pandemic and featured guests describing a dream in which God showed them the virus being created in a Chinese lab. Between them, the pages reach more than 2.6 million people.

The page run by An0maly, whose real name is AJ Feleski and who describes himself as a “news analyst and hip-hop artist”, is one of the most influential misinformation-sharing pages identified by the investigation, with more than 1.5 million followers.

A video from last March, in which he questions whether the pandemic is “bioterrorism,” is one of at least three posts on the page that Facebook fact-checkers have flagged as containing false or partially false information.

Yet as of Saturday a banner still appeared under his videos inviting viewers to “Become a Supporter” and “Support An0maly and Enjoy Special Benefits”.

Facebook’s policies for creators using monetization tools include rules against disinformation, especially medical disinformation.

In November, Facebook, along with Google and Twitter, agreed a joint statement with the UK government committing to “the principle that no user or business should directly benefit from Covid-19 vaccine disinformation and misinformation. This removes an incentive for this type of content to be promoted, produced and distributed.”

The bureau’s findings suggest that Facebook has breached this agreement and failed to enforce its own policies.

A Facebook spokesperson said: “Pages that repeatedly violate our community standards, including those that spread misinformation about Covid-19 and vaccines, are prohibited from monetizing on our platform.

“We take aggressive steps to remove Covid misinformation that leads to imminent physical harm, including false information about approved vaccines.”

The company removed 12 million pieces of Covid misinformation between March and October, and placed fact-checking warning labels on 167 million other pieces of content, it added.

Organizations such as the UN, the World Health Organization and UNESCO said in September that online disinformation “continues to undermine the global response and jeopardize measures to control the pandemic.”

Some of the pages identified in the research also directed their followers to more extreme content that has been largely removed from social media.

Veganize, a Brazilian-based Portuguese site with 129,000 followers, offers paid subscriptions for followers.

A pinned post, which stays at the top of the page even as new content is added, links to a collection of Google-hosted files including “Plandemic”, a pair of conspiracy-laden and thoroughly discredited videos that briefly went viral last summer before social media platforms moved to remove them.

Groups that spread information flagged by fact-checkers as false have also used Facebook to raise funds. The Informed Consent Action Network (Ican), a US nonprofit, is one of the best-funded anti-vaccine organizations in the country.

Facebook and YouTube removed the pages of Highwire, an online show run by Ican founder Del Bigtree that made claims repeatedly branded false by fact-checkers; Ican says it is suing the tech companies over the removals.

Yet despite removing the Highwire page, Facebook still allows Ican to solicit donations from its more than 44,000 followers on a page that has had at least two posts flagged by fact-checkers. According to its page, Ican has raised nearly £24,000 since February 2020.

Organizations that sign up to raise funds must be approved by Facebook, and misinformation about vaccines is explicitly cited as a reason why fundraising can be removed from an organization.

First Draft’s Wardle believes that Facebook’s money-generating systems could encourage people to spread misinformation.

“It is human nature. We know that one of the motivations is financial,” she said.

“They have started to believe these things, but once you are in that circle you also realize there is a way to make money, and then you realize that the more hits you get, the more money you make. It’s more than the dopamine hit; it’s dopamine plus dollars.”
