Critical Harm Intelligence Briefing: Weaponised Loneliness

Molly Rose Foundation has issued a “public warning” about the prevalence and growth of online networks that coerce girls into sexual abuse, acts of violence and self-harm, and that encourage suicide.

A new report by Resolver Trust and Safety, in partnership with Molly Rose Foundation, finds that so-called Com networks are recruiting young victims and coercing them into becoming perpetrators of violence and abuse.

Content warning: This report details multi-faceted harms including (but not limited to) Child Sexual Exploitation and Abuse (CSEA), self-harm and suicide, hate speech, harassment, violent extremism and graphic violence. 

If you’re struggling, text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.

Social media bans – early evidence from Australia and the importance of an evidence-based approach

Bold and decisive action is needed to tackle the acute and chronic harms caused by social media. However, early evidence from Australia points to a range of unintended consequences and a malicious compliance strategy from major platforms, with many under-16 accounts still active.

Multiple studies are underway to determine the effectiveness of Australia’s ban and gauge the positive and adverse impacts that may result. Until then, we cannot be confident that a ban will not simply swap one failed experiment in children’s safety and wellbeing for another.

Parliamentary briefing – social media bans are the wrong approach for children

Recent weeks have seen growing momentum behind a social media ban for under-16s in the UK, following the introduction of similar measures in Australia.

The intention behind calls for a ban is entirely legitimate. Parents across the country are profoundly concerned about the growing scale and complexity of online risks, and the awful reality is that every week we lose a young person to suicide in which technology plays a role.

But social media bans are not the answer. Though well-intentioned and appealing in their simplicity, they are a blunt measure that risks doing more harm than good, and that penalises children for tech firms’ and successive Governments’ failures to act. Bans will not deliver the long-term and fundamental improvements in safety that parents, children and parliamentarians want.

Systematic failures in Online Safety Act risk assessments – why is Ofcom failing to act?

Risk assessments are a cornerstone of the Online Safety Act, with online services required to produce ‘suitable and sufficient’ risk assessments for both the illegal and child safety parts of the regime. However, new Ofcom analysis suggests there were ‘notable issues’ with many of the first assessments produced.

Duty of Candour letter

A joint letter to Victims Minister Alex Davies-Jones outlining the case for a Duty of Candour to be extended to social media companies where they are suspected of being involved in a death.

Bereaved families and survivors letter to Ofcom

Members of Families and Survivors to Prevent Online Suicide Harms wrote to Ofcom Chief Executive Melanie Dawes urging further enforcement action to tackle a pro-suicide forum.

Missed chances, lost lives: How a substance and suicide forum cost lives and the state missed countless chances to act

Our report sets out how Government departments were warned 65 times about pro-suicide forums and a substance they promote, as bereaved families and survivors call for a public inquiry into missed opportunities to save lives.


Children’s exposure to suicide, self-harm, depression and eating disorder content on social media

Research briefing – October 2025

It is now almost 8 years since Molly Russell died after being algorithmically bombarded with harmful suicide, self-harm and depression content. This report reveals that, immediately before the Online Safety Act took effect, far too little had changed. Using a nuanced and in-depth approach to understand the true scale of children’s exposure to different types of potentially harmful content, the findings suggest that young people continued to be put at risk on a deeply worrying scale.

Teen Accounts, Broken Promises: How Instagram is failing to protect minors

The report is the result of a landmark partnership between civil society and academia in the US and UK, and was conducted by Meta whistleblower Arturo Béjar, Molly Rose Foundation, Fairplay, ParentsSOS and Cybersecurity for Democracy, based at NYU and Northeastern University.

Letter to Ofcom: Meta’s AI chatbots

Molly Rose Foundation writes to Ofcom Chief Executive Melanie Dawes urging the regulator to hold Meta to account for failures under the Online Safety Act.

Parliamentary briefing: Pervasive-by-design

From July 2025, social media platforms have been required to comply with new measures set out in the Online Safety Act to protect children from harmful content. Just weeks before regulation took effect, our analysis found that Instagram and TikTok continued to algorithmically recommend harmful suicide, self-harm and intense depression content to teenage accounts at an industrial scale.

Pervasive-by-design report

Suicide, self-harm and intense depression content on TikTok and Instagram, and how their algorithms recommend it to teens.

The report contains examples of non-graphic but disturbing posts which are freely available to teens and may be distressing to read.

Joint letter to Ofcom on Meta risk assessments

Molly Rose Foundation coordinated a letter to Ofcom to warn against Meta’s plans to automate 90% of its risk assessments.

The letter was signed by 24 charities and online safety experts.

Briefing on Ofcom’s child safety measures

This briefing sets out our initial assessment of Ofcom’s Protection of Children measures, which in our view fail to rise to the challenge of protecting children from algorithmically driven, preventable harm.

Letter to Business Secretary

Molly Rose Foundation writes to Business Secretary Jonathan Reynolds amidst reports the Online Safety Act could be watered down to facilitate a US trade deal.

Meta’s rollback of safety protections – why the Government and Ofcom must act

Policy briefing – This briefing presents the results of new representative polling of adults across Great Britain, showing that the public wants and expects a stronger legislative and regulatory response in the face of significant weakening of safety measures by large social media sites.

The Online Safety Act: public support for a stronger approach

Policy briefing – Online safety is at the top of the political agenda. With just days to go until the Online Safety Act takes effect, Molly Rose Foundation (MRF) has warned that Ofcom’s implementation has proven disastrous – and that a strengthened and reworked Act is urgently required.

The economic case for a stronger Online Safety Act

Parliamentary briefing – Molly Rose Foundation aims to challenge the flawed assumption that stronger online safety legislation is incompatible with the Government’s primary mission for growth.

The Online Safety Act: why we need further action to protect young lives

Parliamentary briefing – A new Online Safety Act that strengthens the regime – and that fixes weaknesses in the statutory framework – should be urgently brought forward.

Molly Rose Foundation urges Ofcom to act on Meta changes

Molly Rose Foundation has written to Ofcom urging the regulator to commit to significant new, fast-tracked measures to prevent teens from being exposed to a tsunami of harmful content on Facebook and Instagram.

Ian Russell writes to the Chancellor

MRF Chair Ian Russell has written to the Chancellor of the Exchequer, Rachel Reeves, arguing that online safety can drive economic growth for the country.

Ian Russell writes to the Prime Minister

Molly Rose Foundation Chair Ian Russell has written to Prime Minister Sir Keir Starmer calling on him to act urgently in order to protect young people online.

Further and faster

Public and parental views on and support for a new Online Safety Act.

Response to Ofcom’s consultation on protecting children from harms online

Molly Rose Foundation responds to Ofcom’s consultation on its Children’s Safety Codes.

How effectively do social networks moderate suicide and self-harm content?

This report is the first major analysis of DSA transparency data on content moderation decisions concerning suicide and self-harm material.

It analyses over 12 million decisions taken by six major platforms between September 2023 and April 2024: Instagram, Facebook, TikTok, Pinterest, Snapchat and X.

Please be aware that this report contains extensive references to suicide, self-harm and poor mental health.

General Election Manifesto 2024

In its General Election 2024 manifesto, Molly Rose Foundation set out five bold policies that can have a transformational impact on children’s online safety and well-being.


Consultation response to Ofcom’s illegal harms approach

The Molly Rose Foundation responded to Ofcom’s consultation on its proposed approach to illegal online content, the first substantive part of the Online Safety Act to be consulted on.


Preventable yet pervasive

The prevalence and characteristics of harmful content, including suicide and self-harm material, on Instagram, TikTok and Pinterest.

This is a first-of-its-kind report, in partnership with Bright Initiative by Bright Data.

Please be aware that this report contains extensive references to suicide, self-harm and poor mental health.