Half of girls saw high risk suicide, self-harm, depression or eating disorder content on social media in a week, major new research shows


One in two (49%) girls were exposed to high risk suicide, self-harm, depression or eating disorder content on major social media platforms in a single week, comprehensive new research has found.

Conducted just weeks before the Online Safety Act took effect, the research suggests that teens aged 13-17 were being algorithmically recommended potentially harmful content at deeply worrying levels.

Girls were twice as likely as boys to see high risk content, and young people experiencing low wellbeing or with special educational needs and disabilities (SEND) were also more likely to encounter it.

The study, commissioned by the Molly Rose Foundation to understand the scale of children's exposure to harmful content before regulation took effect, surveyed 1,897 13-17 year-olds across 20 schools.

The research found that 68% of children categorised as having low wellbeing saw high-risk suicide, self-harm, depression or eating disorder content in a single week. These findings suggest that algorithms were continuing to push potentially dangerous content to vulnerable teens and targeting those at greatest risk of its effects.

Since the Online Safety Act took effect, major platforms have been required to either prevent these high-risk types of content from appearing in children’s feeds or prevent them from appearing as often.

This research suggests that, prior to the Act coming into force, exposure to the highest risk types of suicide and self-harm content was in fact much greater than previously understood. The Molly Rose Foundation says that Ofcom's unambitious implementation of the Online Safety Act fails to match the levels of harm children are exposed to.

The charity said it should be a wake-up call for the urgent need to strengthen the Online Safety Act and called on new Technology Secretary Liz Kendall to act decisively with robust legislation to prevent more young lives being lost.

The study found:

  • More than a third (37%) of teenagers had seen at least one type of high risk content in the previous week. This equates to roughly 1.5 million 13-17 year olds across the UK.
  • Some children were being repeatedly bombarded by harmful content. Of those who had seen high risk content, between 13% and 27% had seen this 10 times or more on at least one platform.
  • Children with SEND were also at higher risk than their peers. Two in five (43%) of those with SEND reported that they had seen the highest risk content.
  • Children were considerably more likely to be exposed to high risk content on TikTok and X. Comparing those who spent a similar amount of time on each platform, children using TikTok and X were more than twice as likely to have encountered the high risk content compared to users of other platforms.
  • Young people were not actively searching for harmful content. Over 50% reported being exposed to potentially high risk content algorithmically, via platforms’ recommender feeds.

Andy Burrows, Chief Executive of Molly Rose Foundation, said: “This groundbreaking study shows that teenagers were being exposed to high-risk suicide, self-harm or depression content at an incredibly disturbing scale just weeks before the Online Safety Act took effect, with girls and vulnerable children facing markedly increased risk of harm.

“The extent to which girls were being bombarded with harmful content is far greater than we previously understood and heightens our concerns that Ofcom’s current approach to regulation fails to match the urgency and ambition needed to ultimately save lives.

“The Technology Secretary Liz Kendall must now seize the opportunity to act decisively to build on and strengthen the Online Safety Act and put children and families before the Big Tech status quo.”

The research also found that many children were at risk of cumulative harm, either repeatedly seeing high risk content or seeing it alongside high volumes of material which can compound distressing or negative feelings.

For example, over a third (34%) of those who had seen content that encourages or promotes self-harm or suicide had also seen content about feeling down, sad or lonely more than ten times on at least one platform.

Molly Rose Foundation said the findings must inject new urgency and ambition into Ofcom’s implementation of the Online Safety Act.

They warned that the findings heighten concerns about Ofcom’s approach, with its current measures poorly placed to respond to the scale of harmful content and cumulative harm young people were exposed to this summer.

It comes after Ofcom estimated that addressing algorithmic harm would cost platforms just £80,000 at most, a figure the Molly Rose Foundation labelled inadequate to tackle the high levels of harmful content with which teenagers are being bombarded.

The charity also said that stronger legislation is required to address structural weaknesses in the Act itself.

This should include a harm reduction duty on Ofcom with clear targets which would incentivise the regulator to drive down exposure to harmful content.

Read Children’s exposure to suicide, self-harm, depression, and eating disorder content online here. Read a research briefing here.

If you’re struggling just text MRF to 85258 so you can speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.