TikTok says it is working to remove videos of a man who apparently took his own life and ban users who keep trying to post the clips on the popular social media platform.
It is the latest example of the continued struggle by big tech companies to police their platforms for harmful content amid mounting pressure from regulators.
The video was originally streamed live on Facebook before circulating on other platforms, including TikTok, the company said.
The company did not elaborate on the video, but news reports say it has been circulating on TikTok since Sunday and shows the death of a man.
READ MORE:
* Schools issue warnings to parents about graphic death on social media
* Could you build a better TikTok?
* Local TikTok ban unlikely until New Zealand shifts away from Chinese economy, says expert
“Our systems, along with our moderation teams, have been detecting and blocking these clips for violating our policies against content that displays, praises, glorifies or promotes suicide,” TikTok said in a statement.
“We are banning accounts that repeatedly try to upload clips,” the company said, adding that it appreciated users who reported the content.
TikTok has become very popular with teens in large part due to the company’s algorithms, which decide which videos users see without first requiring them to follow other users or specify their preferences. US President Donald Trump has ordered TikTok’s Chinese owner ByteDance to sell its US operations over concerns about cybersecurity and censorship.
Facebook said it removed the original video last month, on the day it was streamed, and has “used automation technology to remove copies and uploads since then.”
Social media users have been warning others about the clips, saying some have been edited into videos of cats to fool viewers. Others are posting a screenshot of the beginning of the video so people know which clips to avoid.
TikTok urged people who are struggling with suicidal thoughts, or who are worried about someone who is, to seek support.
It comes days after another social media controversy over a live-streamed death. On Saturday, Facebook blocked live broadcasts by a bedridden French man with a chronic illness who wanted to show what he expected would be a painful end to his life, after he had asked French President Emmanuel Macron for a doctor-assisted death.
Separately, on Tuesday TikTok signed up to the European Union code of conduct aimed at preventing and countering illegal hate speech online, authorities said.
“It is good that #TikTok, a company favored by young users who are particularly vulnerable to online abuse and illegal hate speech, has joined the Code of Conduct,” tweeted EU Commissioner Vera Jourova. “Of course, I hope that TikTok will adhere not only to (the) principles of the Code, but also to fully respect EU law when operating on EU territory.”
The EU launched the code in 2016, but the problem has only grown since then, with social media companies accused of amplifying divisions, hatred and misinformation on their platforms.
Facebook, Microsoft, Twitter, and YouTube were the first to subscribe to the code when it launched, and Instagram, Snapchat, and Dailymotion joined last year.