Facebook’s strict rules for blocking “end of life” content | Technology




On Facebook, content related to assisted suicide, the end of life or self-harm is subject to a series of rules intended to reconcile user safety with freedom of expression, as illustrated by the case of Frenchman Alain Cocq.

This 57-year-old man, who suffers from an incurable disease and demands a “dignified” end of life, announced on the night of Friday to Saturday that he would stop all treatment and let himself die live on Facebook.

But the platform announced on Saturday that it was blocking the broadcast of Cocq’s video.

“Although we respect his decision to want to draw attention to this complex issue, based on the advice of experts we have taken measures to prevent live broadcasting on Alain’s account, as our regulations do not allow the depiction of suicide attempts,” a Facebook spokesperson told AFP.

The balance is delicate for the social network, which has 1.8 billion users and is sometimes accused of not doing enough to prevent the spread of violent or shocking content.

These rules have been tightened over time, after a series of cases that caused scandal, such as the 2017 death in the United Kingdom of Molly Russell, a 14-year-old who took her own life after viewing content related to self-harm and suicide on Instagram, a Facebook subsidiary.

Facebook’s rules are very precise: although they contain no specific provisions on the end of life, they are very strict about content that could resemble a promotion of suicide or self-harm.

“In order to promote a safe environment on Facebook, we remove any content that encourages suicide or self-harm, including some explicit images (…) that, according to experts, could incite certain people to adopt similar behavior,” the network warns in its rules.

However, these same regulations authorize “photos or videos depicting a person who has died by suicide in a news context”, as well as “photos or videos depicting a person who has undergone assisted suicide or euthanasia in a medical context”.

In these cases, access is limited to those over 18 years of age, and a warning message is displayed.

Furthermore, the network can relax its own rules if it considers that otherwise prohibited content is in the public interest.

It also sometimes leaves controversial content online for the time needed to assist those who posted it, if doing so could save lives.

“We have been informed by experts that we should not remove live videos of self-harm while the victim’s family or relatives can still intervene,” Facebook says, by way of example.


