Index – Tech-Science – As of now, Facebook is watching who’s lying




Fake news has been a hot topic on Facebook for years: deliberately spread misinformation can even sway the outcome of elections and damage democracy, politics and public life. Facebook has repeatedly been urged to take serious action against fake news, but so far Mark Zuckerberg has delivered mostly promises rather than tangible results. Recently, however, the US company has engaged outside fact-checking organizations that take a closer look at the veracity of certain suddenly pervasive, high-profile articles and posts.

Facebook currently works with more than eighty third-party fact-checking organizations that verify content in more than sixty languages around the world, and it promises that this number will grow.

The French will check on us

The Palo Alto company announced on Tuesday that its external fact-checking program will launch in Hungary with the help of the news agency AFP. Facebook regards AFP as one of its most prominent fact-checking partners, not least because the French national news agency works with a large number of journalists around the world.

Which, in turn, is somewhat surprising, given that reportedly

only one person in Hungary will review the news.

However, this lone journalist will not be on his own, as he “will work closely with the international AFP network, as part of the European team,” says Bronwen Roberts, the AFP staff member who will coordinate the European fact-checking cooperation.

Facebook’s external, independent partners are certified by the International Fact-Checking Network (IFCN) and must adhere to the IFCN’s code of principles in their work. The social media company also outlined the IFCN’s guiding principles, which are:

  • impartiality and fairness,
  • transparency of funding and organization,
  • transparency of sources,
  • transparency of methodology,
  • an open and honest corrections policy.

After Poland, Romania, Slovakia, the Czech Republic and the Balkans, AFP has now strengthened its fact-checking network in Central and Eastern Europe with the addition of Hungary and Bulgaria, and can therefore combat the spread of fake news in this region as well. AFP plays a key role here: the French agency now covers 17 European countries and works in 12 different European languages.

They also go after deepfakes

The most important task of the third-party fact-checkers is to identify the most widely circulating fake news first and deal with it appropriately, especially obvious fake news that is entirely baseless.

In the work of the fact-checking partners, a special role is given to provocatively worded false claims, especially those that live in the public consciousness, concern popular topics and are considered important by the average person. The checks extend not only to text but also to other types of content, such as photos and videos. The proliferation of deepfake images and videos is perhaps the hottest problem today, which is why fact-checking companies pay particular attention to them.

The organizations do not ban …

It is important to note that these third-party fact-checking companies will not delete anyone’s Facebook account or page, as they are not authorized to do so. If a news item or other piece of content is found to be false, partly false or manipulated, it will appear much less often in users’ news feeds.

Not only Facebook but also Instagram is fighting fake news: on the company’s other social media platform, such posts cannot appear on the Explore and hashtag pages, and here too they are ranked much lower in the feed, making the content in question harder to find. This method slows the spread of false information and reduces the number of users who encounter it.

… but Facebook does punish

Of course, the fact that the organizations do not punish does not mean that Facebook itself refrains from retaliation. If a Facebook page or website has shared too much of this type of content, Facebook cuts off its monetization and advertising opportunities, and it also faces a drastic drop in reach.

When Facebook deems a particular post to be bogus, it flags it in users’ feeds in several ways. If a post is marked as false by an external fact-checking partner, Facebook alerts both its readers and those who shared it.

Content classified as partly or entirely false is marked with a distinctive Facebook label, but it remains the user’s responsibility to decide whether to read or share it. The labels appear on top of false and manipulated photos and videos, both in Facebook posts and in Instagram stories, along with a link to the fact-checker’s review.

Three strikes, Facebook style

In their work, the fact-checking companies follow Facebook’s strategy, which the company developed to improve the quality and credibility of the stories appearing in our news feeds.

This strategy is based on three pillars, according to the company’s announcement:

  • Facebook continues to remove content – and, in more serious cases, entire accounts – that violates its Community Standards. Fake accounts spreading fake news remain under close scrutiny: in the last quarter of 2020 alone, algorithms banned 1.3 billion fake profiles. Facebook pays particular attention to fake news aimed at influencing voters and to misinformation that can cause real harm to users.
  • Beyond outright removal, a less spectacular but still painful step is to reduce a piece of content’s reach. This applies to posts that, while not violating the Community Standards, are considered to be of “poor quality”. The external fact-checking partners play an important role here: if they find something false, its reach is drastically cut back. Moderators also crack down on spam-like content, especially clickbait posts.
  • Finally, the third method is “informing users”: Facebook does not remove the flagged content, but gives users the opportunity to decide for themselves whether or not to read the fake news.

(Cover Image: Jaap Arriens / NurPhoto / Getty Images)


