November 2023

New research exposes tech giants’ amplification of content promoting suicide and self-harm

**Trigger warning**

This report contains extensive references to suicide, self-harm and poor mental health

  • Data shows the astonishingly high reach that suicide and self-harm related posts achieve because of poorly designed, algorithm-powered consumption models
  • Research published in the week that Molly Russell would have turned 21
  • Ian Russell: this fundamental systemic failure will cost young lives

Today, a first-of-its-kind report from the suicide prevention charity the Molly Rose Foundation (MRF), produced in partnership with The Bright Initiative by Bright Data, warns of fundamental systemic failings by leading social media giants in handling self-harm and suicide content.

Released in the week that Molly Russell should have been celebrating her 21st birthday, the research demonstrates the shocking scale and prevalence of harmful content on Instagram, TikTok and Pinterest. On TikTok, some of the most-viewed posts referencing suicide, self-harm and intensely depressive themes have been viewed and liked over one million times.

The new data reveals that young people are routinely recommended large volumes of harmful content, fed to them by high-risk algorithms, which, when viewed in large amounts, presents a clear risk of exacerbating feelings of depression, hopelessness and misery.

In partnership with the data-for-good organisation The Bright Initiative by Bright Data, the Molly Rose Foundation collected and analysed data from 1,181 of the most engaged posts that were publicly available on TikTok and Instagram and posted using well-known suicide, self-harm and depression hashtags.

In the context of the report, posts were considered harmful if they promoted and glorified suicide or self-harm, referenced suicide and self-harm ideation, or contained relentless themes of misery, hopelessness and depression.

The report demonstrates there is a clear and persistent problem with readily available and discoverable harmful content. Almost half of the most engaged posts surveyed on Instagram (48%) and TikTok (49%) were found to be harmful, with a particular risk that these posts will be algorithmically recommended and viewed in large volumes.

Algorithms enable harmful content to reach staggeringly high audiences. More than half (54%) of the most engaged harmful posts surveyed on TikTok had been viewed over one million times, and almost one in eight (12%) had been liked by over one million accounts.

The Molly Rose Foundation is concerned that the design and operation of social media platforms is sharply increasing the risk profile for some young people, with teenagers being bombarded with huge amounts of harmful content, able to save posts with a single click, and readily able to ‘binge watch’ large volumes of material in saved collections and albums. 

These risks are increased by the stark failure of platforms to assess the risks posed by user engagement features that make it easier for content to be readily discovered and binge-watched. For example, Instagram has user prompts to ‘use hashtags’ such as #letmedie, and TikTok recommends search results such as ‘the quickest way to end it’ and ‘attempt tonight’.

The Molly Rose Foundation is concerned that commercial pressures may be increasing the risks faced by young people. On Instagram Reels, the platform’s short-form video competitor to TikTok, we were algorithmically recommended significantly larger volumes of harmful content than on any other part of the site.

After we engaged with suicide, self-harm and depressive content elsewhere on Instagram, 99 per cent of the Reels we were algorithmically recommended were considered harmful. Instagram has identified Reels as a major area for its revenue and user growth.

The Molly Rose Foundation believes these findings underscore the scale of the challenge facing the new online safety regulator, Ofcom. Social media companies are continuing to prioritise growing their user base, at the expense of user safety, in a race for market share.

Ian Russell, father of Molly Russell and Chair of Trustees at the Molly Rose Foundation, said: “This week, when we should be celebrating Molly’s 21st birthday, it’s saddening to see the horrifying scale of online harm and how little has changed on social media platforms since Molly’s death. The longer tech companies fail to address the preventable harm they cause, the more inexcusable it becomes. Six years after Molly died, this must now be seen as a fundamental systemic failure that will continue to cost young lives.

“Just as Molly was overwhelmed by the volume of the dangerous content that bombarded her, we’ve found evidence of algorithms pushing out harmful content to literally millions of young people. This must stop. It is increasingly hard to see the actions of tech companies as anything other than a conscious commercial decision to allow harmful content to achieve astronomical reach, while overlooking the misery that is monetised with harmful posts being saved and potentially ‘binge watched’ in their tens of thousands.

“Our findings show the scale of the challenge facing Ofcom and underline the need for them to establish bold and ambitious regulation that delivers stronger safety standards and protects young lives.”

Or Lenchner, CEO of Bright Data, said: “The report has some incredibly disturbing findings regarding the significant failings of social media giants in terms of their inconsistent and at times erratic moderation of harmful content. Tech giants must take responsibility for the implications on individuals, often children and young people, who consume large amounts of harmful material on their platforms.

“To mitigate the risks of technology-facilitated self-harm and suicide, strong regulation and supporting data is needed. It is crucial for Ofcom and government to remove any barriers to conducting research so organisations can continue their invaluable work in holding social media companies accountable.”

Sir Peter Wanless, NSPCC Chief Executive, said: “I will never forget sitting in the Molly Russell inquest to hear the coroner conclude that a contributing factor to her tragic death was the suicide and self-harm content she was exposed to on social media. This should have been a seminal moment and a turning point for the wellbeing of children online, but a year on this damning report shows the response by tech firms has been far too slow and far too narrow.

“The worrying reality is that vulnerable children are still being put at an increased risk of harm because of the actions and choices of some of the world’s biggest companies.

“Tech firms should be acting now to make sure that children are not exposed to this harmful content, rather than waiting until Ofcom has the powers to fine companies and bring criminal sanctions against individual managers.”

Imran Ahmed, Chief Executive of the Center for Countering Digital Hate, said: “The children’s mental health crisis is getting worse and public health experts are increasingly raising the alarm that it’s linked to social media use.

“The longer users stay glued to their screens the more money social media companies make. And so algorithms have been designed to feed us content that will keep us scrolling – no matter how damaging or destructive that content is.

“As this report shows, not only is there a staggering amount of harmful content out there – but shockingly, algorithms actively recommend it to the kids it identifies as being vulnerable.

“Social media companies know their products are unsafe. This is yet more proof that their guardrails are woefully inadequate, and they are unwilling to fix them.”

Prof Louis Appleby, Government adviser on suicide prevention and Professor of Psychiatry at the University of Manchester, said: “We’ve moved on in how we view the online world. We are in a new era of social responsibility and tech companies need to do more about their images and algorithms.

“We need modern laws and policies too. The Online Safety Act aims to protect people, especially young people, who may be put at risk, but Ofcom as regulator faces a fast-moving challenge. The new national suicide prevention strategy highlights online safety.

“We would not have come this far without the extraordinary campaign of Ian Russell, turning his family tragedy into help for others. As a society we have to show the same determination.”

If you are affected by the issues raised in this report, please visit the “Where to Find Help” section of our website for a list of trusted support organisations. With other mental health charities, we have put together some mental wellbeing resources and guides that we feel may be of use at this time. They can be found on our Mental Wellbeing Resource page.

If you’re struggling, text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.
