TikTok has asked other social media platforms to join it in establishing a partnership to better combat content depicting self-harm and suicide, after clips from a Facebook livestream of a man taking his own life circulated for weeks across TikTok, Facebook, Instagram, YouTube and other platforms earlier this month.
The August 31 Facebook livestream of Ronnie McNutt, a 33-year-old veteran, taking his own life remained on Facebook for nearly three hours after his death, and quickly went viral on other social media platforms, which struggled to keep up with accounts reposting clips of his death, sometimes disguised as videos of cute animals.
On TikTok, where an estimated 18 million daily users are 14 or younger, teens and their parents complained that videos were recommended on the “For You” discovery page, with users warning each other to stay off the app until the problem was fixed.
In response, TikTok interim chief Vanessa Pappas has written to the chief executives of nine social and content platforms—Facebook, Instagram, Google, YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit—asking to create a partnership through which violent and graphic content can be better addressed, the company said in a blog post Tuesday.
“What we are proposing is that, the same way these companies already work together around child sexual imagery and terrorist-related content, we should now establish a partnership around dealing with this type of content,” Theo Bertram, the company’s public policy head in Europe, said on Tuesday during an appearance before members of the U.K.’s parliament.
Bertram admitted the platform needs to “do better” following what he described as “a coordinated effort by bad actors to spread this video across the internet and platforms.” He said TikTok will also make changes to its machine learning and emergency systems, tweaking how its algorithms detect violent content and how they coordinate with moderators for quicker takedowns.
When asked by Forbes if TikTok has received responses from any other social media platforms, spokesperson Jamie Favazza said “the letter was sent yesterday, and we look forward to hearing their thoughts” (the nine platforms that received the letter did not immediately respond to questions from Forbes about whether they’ll join the proposed collaboration).
“Recently, social and content platforms have once again been challenged by the posting and cross-posting of explicit suicide content that has affected all of us—as well as our teams, users and broader communities,” wrote Pappas in the letter. “We believe each of our individual efforts to safeguard our own users and the collective community would be boosted significantly through a formal, collaborative approach to early identification and notification amongst industry participants of extremely violent, graphic content, including suicide.”
Bertram said the way the video was shared on TikTok suggested a coordinated attack, possibly from bot accounts. “We saw people searching for content in a very specific way. Frequently clicking on a profile of people as if they’re kind of anticipating that those people had uploaded a video,” he said.

The virality of these clips has been traumatizing to some social media users, as well as to the family of McNutt, who watched him take his own life on Facebook and said they were then inundated with bot accounts reposting the clips and spamming their accounts, as well as McNutt’s memorial page on Facebook.

Josh Steen, a longtime friend of McNutt’s who has since launched an initiative called #ReformForRonnie urging social media platforms to better address violent content, said he believes McNutt would still be alive if Facebook had taken down the video faster. Steen said he and other friends of McNutt’s reported the two-hour-long livestream to Facebook hundreds of times while McNutt was still alive, but didn’t hear anything back until nearly an hour and a half after his death. “Ronnie’s video was up for eight hours and it had already been shared to a viral level before it was pulled down,” Steen explained. Facebook told Forbes at the time it was reviewing how the livestream could have been taken down faster.
“We also fully agree that [TikTok], along with every single other major social platform must do better,” said Steen in a Tuesday statement to Forbes. “It’s the least these companies can do to make sure the content we are shown on their platforms is safe and responsible.”
If you or someone you know is thinking about suicide, please call the National Suicide Prevention Lifeline at 800-273-TALK (8255) or text the Crisis Text Line at 741-741.
Further Reading: “YouTube reverts to human moderators in fight against misinformation” (Financial Times)