Instagram has announced upcoming changes to its content policies, with graphic self-harm images set for removal.
The shift comes after the 2017 death of British teenager Molly Russell.
Her parents believe that Russell, 14, took her own life in response to images of suicide that she had seen on the social media platform.
In an interview with the BBC, her father, Ian Russell, said: “I have no doubt that Instagram helped kill my daughter”.
Instagram VP of product Adam Mosseri agreed that the platform's current content policies needed to be revised in order to make it safer.
While meeting with UK health secretary Matt Hancock, Mosseri said: “We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community”.
“We will get better and we are committed to finding and removing this content at scale.”
He clarified that exceptions would be made for content that is non-graphic and deemed acceptable.
Mosseri said: “I might have an image of a scar or say, ‘I’m 30 days clean’, and that’s an important way to tell my story”.
“That kind of content can still live on the site but the next change is that it won’t show up in any recommendation services so it will be harder to find”.
In a press release, Facebook, which owns Instagram, noted that independent experts “unanimously reaffirmed that Facebook should allow people to share admissions of self-harm and suicidal thoughts, but should not allow people to share content promoting it”.
The statement went on to stress the importance of building products and services that can support those at risk in their time of need.
“The experts emphasised the importance of building products that facilitate supportive connections, finding more opportunities to offer help and resources, and importantly, avoiding shaming people who post about their self-harm thoughts or actions”.