January 2024

Self-harm and suicide content readily available on internet search engines – Ofcom report

Ofcom has published ‘ground-breaking’ research that confirms content which glorifies or celebrates suicide and self-harm is widely available on search engines. 

Carried out by the Network Contagion Research Institute, the research found that roughly one in every five links (22%) in the search results analysed led to content that either glorified or offered instructions about self-harm, suicide or eating disorders.

Ian Russell, Chair of Trustees at the Molly Rose Foundation, said: “This research highlights how major search engines fail to exercise their duty of care by recommending online content that encourages, promotes or instructs young people to self-harm.

“It’s particularly concerning that major search engines are choosing to recommend harmful content on the very first page of their search results.

“As search engines rush to introduce AI into their products, without sufficient regard for safety, regrettably this is a problem that is only likely to get worse in the months and years before regulation starts to be enforced.” 

The report states search engines pose a potential risk to users suffering from mental health issues and to children, “because they serve as an open portal that encourages harmful behaviours.”

It added that some members of the team of professional researchers, which included clinical psychiatrists and psychology professors, were left distressed by the often disturbing and graphic images easily accessed in the study. 

“It is crucial to recognise the severity of some of the content and the need for safeguarding measures, especially when considering the potential effect on unequipped internet users, particularly children and young adults… the importance of awareness of these risks and safeguarding against them should not be underestimated,” it concluded. 

In November, Ofcom issued draft Codes of Practice to big tech firms in order to protect users from illegal content online as part of the Online Safety Act. Firms will be required to assess the risk of users being harmed by illegal content on their platform and take appropriate steps to protect them from it. Child abuse, grooming and encouraging suicide are of particular focus as “priority offences” set out in the legislation. 

The final versions of the Codes of Practice are due to be published in Autumn this year. Services will then have three months to conduct their risk assessments, while the plans are subject to Parliamentary approval. This process is expected to conclude by the end of 2024, after which the Codes will come into force.

In Spring, Ofcom will publish a consultation on additional protections for children from harmful content promoting, among other things, suicide, self-harm, eating disorders and cyberbullying.

If you’re struggling, just text MRF to 85258 so you can speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.
