TikTok and other social media apps are technically in the clear – at least in the eyes of Australian law – over a viral graphic suicide video that continues to circulate across their platforms.
Australian legislation passed after the live-streamed Christchurch massacre, which requires social media platforms to quickly remove abhorrent violent material, does not apply to suicide videos, according to the eSafety commissioner’s office.
The law requires social media companies to alert the AFP of any abhorrent violent material circulating across their platforms and swiftly remove it. Failing to do so could earn social media platforms a $10.5m fine or 10 per cent of turnover.
Yet videos of suicide, it appears, do not fall within the legal definition of abhorrent violent material.
A spokesperson for the eSafety commissioner’s office said: “The video is not considered abhorrent violent material – because it is not violent terrorism, murder or attempted murder, rape, torture or kidnapping”.
However, eSafety commissioner Julie Inman Grant is currently investigating the viral suicide video and working with TikTok and other platforms to remove it.
Meanwhile, TikTok has provided an updated response to the ongoing issue. TikTok ANZ general manager Lee Hunter said the app is working “quickly and aggressively” to remove the video.
“On Sunday night clips of a suicide that had originally been livestreamed on Facebook circulated on other platforms, including TikTok.
“This content is both distressing and a clear violation of our Community Guidelines, and we have acted quickly and aggressively to detect and remove videos, and take action against accounts responsible for re-posting the content, through a mix of machine learning models and human moderation teams.
“We appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.
“We have also updated related hashtags to surface a public service announcement, with resources for where people can seek help and access our Safety Centre.”
Hunter also said TikTok is working with local policymakers and relevant organisations to keep them informed of the situation, while also acknowledging the “serious responsibility” it has to “address harmful content”.
If you or anyone you know needs help:
- Lifeline on 13 11 14
- MensLine Australia on 1300 789 978
- Beyond Blue on 1300 224 636
- Headspace on 1800 650 890