Meta has announced it will begin hiding more types of content from teens on Instagram and Facebook, including results for searches related to suicide, self-harm and eating disorders.
The company said today’s announcement was made “in line with expert guidance” and that it wants teens to have safe, age-appropriate experiences on its apps.
The apps will continue to allow people to share content discussing their own struggles with such topics, but under the new policy the platforms will no longer recommend it, and direct searches will hide related results and instead guide users towards helpful resources.
Andy Burrows, a spokesperson for the Molly Rose Foundation, said: “Our recent research shows teenagers continue to be bombarded with content on Instagram that promotes suicide and self-harm and extensively references suicide ideation and depression.
“While Meta’s policy changes are welcome, the vast majority of harmful content currently available on Instagram isn’t covered by this announcement, and the platform will continue to recommend substantial amounts of dangerous material to children.”
The research, produced in partnership with the Bright Initiative by Bright Data, showed that almost half (48%) of the most-engaged posts on Instagram using well-known suicide and self-harm hashtags contained material that promoted or glorified suicide and self-harm, referenced suicide ideation, or otherwise contained intense themes of misery, hopelessness or depression.
What’s more, 99% of algorithmically recommended Reels, viewed on an account that had accessed suicide and self-harm content, contained harmful references to suicide, self-harm and depression.
Much of the harmful content identified was posted by meme-style accounts and so would not be covered by today’s announcement.
Burrows added: “Unfortunately this looks like another piecemeal step when a giant leap is urgently required.”
If you’re struggling, text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.