Similar spot checks of Twitch’s less popular competitors, YouTube Gaming and Facebook Gaming, yielded far fewer cases of apparent live streaming by children. To stream on YouTube from a mobile device, a user must have more than 1,000 subscribers. Facebook Live doesn’t have a comparable restriction, but its live channel discovery sections for Facebook Gaming and Facebook Live appear more curated or moderated than those on Twitch. (Facebook also works with some 15,000 content moderators in the US alone.) That doesn’t mean those platforms are flawless; Facebook Live in particular has publicly struggled to moderate violent or disturbing live broadcasts. And the problems of child predation and exploitation extend far beyond live streaming; The New York Times reported earlier this year that reports of online child sexual abuse material increased 50 percent in 2019, including 60 million photos and videos flagged by Facebook alone.
The dozens of active accounts WIRED discovered on Twitch sometimes contain heartbreaking conversations between apparent children and strangers. In some cases, strangers “dare” young streamers to do things for their entertainment, including asking girls to flip their hair or kiss a friend on camera. Other times, strangers request young streamers’ contact information on other apps, such as Instagram or WhatsApp, both owned by Facebook. (Twitch also has a built-in private chat feature.) Strangers also pretend to donate money, which makes a chat message appear as a verified donation, or post inappropriate ASCII art in the chat. The streamers themselves are generally unsupervised.
WIRED shared dozens of apparent kid accounts with Twitch. Some have since been disabled.
“The security of our global community is a priority for Twitch and one that we are constantly investing in,” a Twitch spokesperson told WIRED. “We are continually working to ensure that all members of the community experience Twitch in the way that we intend and are investing in technologies to support this effort. In addition, we periodically evaluate and update policies and practices to ensure that we adequately address emerging and evolving behaviors.” The spokesperson says that Twitch has a dedicated law enforcement response team, and that it works with the law enforcement team at parent company Amazon. When appropriate, the company reports violations to law enforcement and works with the Technology Coalition and the National Center for Missing and Exploited Children.
Dr Martha Kirby, child safety policy manager at the UK’s National Society for the Prevention of Cruelty to Children, says that Covid-19 lockdowns have exacerbated the risk of online sexual abuse “like never before.”
“Bad design choices on live streaming sites can be exploited by groomers to abuse children in real time,” she says. “Technology firms have consistently failed to design their sites with the safety of children in mind, allowing criminals to easily view children’s live broadcasts and send them messages directly.”
In a video archived three days ago, an apparent young girl describes herself as “boring” and asks people to talk to her. She sits outside her house eating ice cream and chats with a stranger. “I really don’t have much to do,” she says before asking the stranger where they live. At least half a dozen other videos streamed live over the past day show children apparently mentioning boredom.
Ben Halpert, safety expert and founder of Savvy Cyber Kids, also notes that under quarantine, children often go unsupervised as they spend more time online. “Kids feel connected to other people when they broadcast live and interact on services like Twitch,” says Halpert. At the same time, live content is notoriously difficult to moderate, especially at high volume. According to analyst firm Arsenal, hours watched in Twitch’s Just Chatting section increased from 86 million in January to 167 million in June.