Video-sharing site TikTok is struggling to remove clips that show a man committing suicide.
The footage, which has been circulating on the platform for several days, originated from Facebook and has also been shared on Twitter and Instagram.
TikTok is very popular with young people, and many have reported that they came across the video and were traumatized by the content.
The app said it would ban accounts that repeatedly upload the clips.
‘Warned others’
“Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies or promotes suicide,” said a representative.
“We appreciate the members of our community who have reported the content and warned others not to view, engage with or share such videos on any platform, out of respect for the person and their family.”
Facebook told BBC News: “We removed the original video from Facebook last month, on the day it was streamed, and have used automation technology to remove copies and uploads since that time.
“Our thoughts remain with Ronnie’s family and friends during this difficult time.”
Graphic self-harm
TikTok’s algorithms often recommend content from accounts a user does not follow.
Several people have broadcast their suicides on Facebook Live since its launch in 2015.
Facebook has also faced criticism over graphic self-harm and suicide content hosted on Instagram.
After Molly Russell’s death in 2017, her father said the platform had “helped kill” his daughter.
If you have been affected by the issues raised in this article, help and support is available via the BBC Action Line.