Websites spreading misinformation about health attracted nearly half a billion views on Facebook in April alone, as the coronavirus pandemic escalated worldwide, a report has found.
Facebook had promised to crack down on conspiracy theories and inaccurate news early in the pandemic. But despite its executives' promises of responsibility, its algorithm appears to have driven traffic to a network of sites sharing dangerously false news, campaign group Avaaz has found.
False medical information can be deadly: researchers led by the International Centre for Diarrhoeal Disease Research, Bangladesh, writing in The American Journal of Tropical Medicine and Hygiene, have directly linked a single piece of coronavirus misinformation to 800 deaths.
Pages from the top 10 sites peddling inaccurate health information and conspiracy theories received almost four times as many views on Facebook as the top 10 reputable health information sites, Avaaz warned in a report.
The report focused on Facebook pages and websites that shared large numbers of false claims about coronavirus. The pages and sites came from a variety of backgrounds, including alternative medicine, organic farming, far-right politics and general conspiracy theorising.
It found that global networks of 82 sites spreading misinformation about health across at least five countries had generated an estimated 3.8bn views on Facebook in the last year. Their audience peaked in April, with 460m views in a single month.
“This suggests that just when citizens were most in need of credible health information, and although Facebook was proactively trying to raise the profile of authoritative health institutions on the platform, the algorithm was potentially undermining these efforts,” the report said.
A relatively small but influential network is responsible for driving enormous amounts of traffic to health misinformation sites. Avaaz identified 42 "super-spreader" sites with 28m followers between them, generating an estimated 800m views.
One article, which falsely claimed that the American Medical Association was encouraging doctors and hospitals to over-estimate deaths from Covid-19, was viewed 160m times.
This vast collective reach suggests that Facebook’s own internal systems are incapable of protecting users from misinformation about health, even at a critical time when the company has promised to keep users “safe and informed”.
“Avaaz’s latest investigation is yet another damning indictment of Facebook’s capacity to crack down on false or misleading health information during the pandemic,” said the British MP Damian Collins, who led a parliamentary inquiry into disinformation.
“The majority of this dangerous content is still on Facebook with no warning or context whatsoever… The time for [Facebook CEO, Mark] Zuckerberg to act is now. He needs to clean up his platform and help stop this harmful infodemic.”
According to a second research paper, published in The American Journal of Tropical Medicine and Hygiene, the potential harm of health disinformation is immense. From media and social media reports across 87 countries, researchers identified more than 2,000 claims about coronavirus in circulation, of which more than 1,800 were demonstrably false.
Some of the false claims were directly damaging: one, suggesting that pure alcohol could kill the virus, has been linked to 800 deaths, with a further 60 people going blind after drinking methanol as a cure. “In India, 12 people, including five children, became sick after drinking liquor made from the poisonous seed Datura (ummetta plant in local parlance) as a cure for coronavirus disease,” the paper says. “The victims reportedly watched a video on social media claiming that Datura seeds give immunity against Covid-19.”
Beyond such directly dangerous falsehoods, much misinformation is merely useless, yet may still contribute to the spread of coronavirus, as with one South Korean church that believed spraying salt water could combat the virus.
“They put the nozzle of the spray bottle inside the mouth of a follower who was later confirmed as a patient before doing the same for other followers as well, without disinfecting the sprayer,” an official said later. More than 100 followers were infected as a result.
“National and international agencies, including the fact-checking agencies, should not only identify and debunk rumors and conspiracy theories, but they should also engage social media companies to disseminate accurate information,” the researchers conclude.
As part of Facebook’s strategy to combat disinformation on the platform, independent fact-checkers are given the ability to place warning labels on items they judge to be untrue.
Zuckerberg said false news would be marginalised by the algorithm, which determines what content viewers see. “Posts that are rated as false are demoted and lose on average 80% of their future views,” he wrote in 2018.
But Avaaz found that enormous amounts of misinformation slipped through Facebook’s verification systems, despite having been flagged by fact-checking organisations.
It analysed nearly 200 pieces of health misinformation that were shared on the site after being identified as problematic. Fewer than one in five carried a warning label; the vast majority – 84% – slipped through checks after being translated into other languages, or republished in whole or in part.
“These findings point to a gap in Facebook’s ability to detect clones and variations of fact-checked content – especially across multiple languages – and to apply warning labels to them,” the report said.
Two simple steps could hugely cut the reach of misinformation, Avaaz said. The first would be to retroactively correct misinformation that users had seen before it was labelled false, by issuing corrections in their news feeds.
Recent research has found that corrections such as these can halve belief in false reporting, Avaaz said. The second step would be to improve the detection of translated and cloned material, so that Zuckerberg’s promise to starve such pages of their audience is actually kept.
A Facebook spokesperson said: “We share Avaaz’s goal of limiting misinformation, but their findings don’t reflect the steps we’ve taken to keep it from spreading on our services. Thanks to our global network of fact-checkers, from April to June, we applied warning labels to 98m pieces of Covid-19 misinformation and removed 7m pieces of content that could lead to imminent harm. We’ve directed over 2 billion people to resources from health authorities and when someone tries to share a link about Covid-19, we show them a pop-up to connect them with credible health information.”