Instagram has pledged to remove images, drawings and even cartoons showing methods of self-harm or suicide.
The move is its latest response to the public outcry over the death of British teenager Molly Russell.
The 14-year-old killed herself in 2017 after viewing graphic content on the platform.
Molly’s father has described the Facebook-owned app’s commitment as “sincere” but said managers needed to act more swiftly.
Instagram’s latest promise covers explicit drawings, cartoons and memes about suicide, as well as any other content “promoting” methods of self-harm.
“It will take time to fully implement… but it’s not going to be the last step we take,” Instagram chief Adam Mosseri told BBC News.
It extends measures announced in February, which banned “graphic images of self-harm” and restricted material with suicidal themes, covering both still images and videos.
Instagram has been under pressure to act after Mr Russell said he believed the US-based service had been partly responsible for his daughter’s death.
After she died, Mr Russell found large amounts of graphic material about self-harm and suicide on her Instagram account. He also found similar content on her Pinterest account.
The 56-year-old went public in January of this year.
The UK government, charities and the media were among those who subsequently called on Instagram and other technology companies to make changes.
‘Lack of responsibility’
Instagram’s latest announcement coincided with a visit by Mr Russell to Silicon Valley.
There, he told BBC News: “The big platforms really don’t seem to be doing much about it.”
During his visit, Florida-based internet safety campaigner and paediatrician Dr Free Hess showed him content still available on Instagram.
It included graphic photographs, videos of self-harm and cartoons advocating suicide.
She said hashtags had helped lead young people to the content.
“It’s grooming that young person to self-harm more, consider suicide more,” the doctor said.
Mr Russell responded: “I was rather hoping that the steps taken would have made it at least harder to find that stuff.”
Mr Russell also met Jim Steyer, the founder of Common Sense Media – the US’s largest charity dealing with child safety online.
“The lack of responsibility of the social media platforms is absolutely mind-boggling,” said Mr Steyer, who wants new regulations to be imposed on the companies.
Instagram response
Instagram says it has doubled the amount of self-harm and suicide-related material it removes since the first quarter of 2019.
Between April and June this year, it said, it had removed 834,000 pieces of content, 77% of which had not been reported by users.
“There is still very clearly more work to do. This work never ends,” said Mr Mosseri.
To which Mr Russell responded: “I just hope he delivers.”