Teen Accounts, Broken Promises: How Instagram is failing to protect minors

The report is the result of a landmark partnership between civil society and academia in the US and UK. It was conducted by Meta whistleblower Arturo Béjar, Molly Rose Foundation, Fairplay, ParentsSOS and Cybersecurity for Democracy, based at NYU and Northeastern University.

Letter to Ofcom: Meta’s AI chatbots

Molly Rose Foundation writes to Ofcom boss Melanie Dawes urging the regulator to hold Meta to account for failures under the Online Safety Act.

Parliamentary briefing: Pervasive-by-design

From July 2025, social media platforms have been required to comply with new measures set out in the Online Safety Act to protect children from harmful content. Just weeks before regulation took effect, our analysis found that Instagram and TikTok continued to algorithmically recommend harmful suicide, self-harm and intense depression content to teenage accounts at an industrial scale.

Pervasive-by-design report

Suicide, self-harm and intense depression content on TikTok and Instagram, and how their algorithms recommend it to teens.

The report contains examples of non-graphic but distressing posts that are freely available to teens.

Joint letter to Ofcom on Meta risk assessments

Molly Rose Foundation coordinated a letter to Ofcom to warn against Meta’s plans to automate 90% of its risk assessments.

The letter was signed by 24 charities and online safety experts.

Briefing on Ofcom’s child safety measures

This briefing sets out our initial assessment of Ofcom’s Protection of Children measures, which in our view fail to rise to the challenge of protecting children from algorithmically driven, preventable harm.

Letter to Business Secretary

Molly Rose Foundation writes to Business Secretary Jonathan Reynolds amidst reports the Online Safety Act could be watered down to facilitate a US trade deal.

Meta’s rollback of safety protections – why the Government and Ofcom must act

Policy briefing – This briefing presents new representative polling of adults across Great Britain, showing that the public wants and expects a stronger legislative and regulatory response to the significant weakening of safety measures by large social media sites.

The Online Safety Act: public support for a stronger approach

Policy briefing – Online safety is at the top of the political agenda. With just days to go until the Online Safety Act takes effect, Molly Rose Foundation (MRF) has warned that Ofcom’s implementation has proven disastrous – and that a strengthened and reworked Act is urgently required.

The economic case for a stronger Online Safety Act

Parliamentary briefing – Molly Rose Foundation aims to challenge the flawed assumption that stronger online safety legislation is incompatible with the Government’s primary mission for growth.

The Online Safety Act: why we need further action to protect young lives

Parliamentary briefing – A new Online Safety Act that strengthens the regime – and that fixes weaknesses in the statutory framework – should be urgently brought forward.

Molly Rose Foundation urges Ofcom to act on Meta changes

Molly Rose Foundation has written to Ofcom urging them to commit to significant new, fast-tracked measures to prevent teens from being exposed to a tsunami of harmful content on Facebook and Instagram.

Ian Russell writes to the Chancellor

MRF Chair Ian Russell has written to the Chancellor of the Exchequer Rachel Reeves to say that online safety can result in economic growth for the country.

Ian Russell writes to the Prime Minister

Molly Rose Foundation Chair Ian Russell has written to Prime Minister Sir Keir Starmer calling on him to act urgently in order to protect young people online.

Further and faster

Public and parental views on and support for a new Online Safety Act.

Response to Ofcom’s consultation on protecting children from harms online

Molly Rose Foundation responds to Ofcom’s consultation on their Children’s Safety Codes.

How effectively do social networks moderate suicide and self-harm content?

This report is the first major analysis of DSA transparency data on content moderation decisions relating to suicide and self-harm material.

It analyses over 12 million decisions taken by six major platforms between September 2023 and April 2024: Instagram, Facebook, TikTok, Pinterest, Snapchat and X.

Please be aware that this report contains extensive references to suicide, self-harm and poor mental health.

General Election Manifesto 2024

In its General Election 2024 manifesto, Molly Rose Foundation set out five bold policies that can have a transformational impact on children’s online safety and well-being.

Consultation response to Ofcom’s illegal harms approach

The Molly Rose Foundation responded to Ofcom’s consultation on its proposed approach to illegal online content, the first substantive part of the Online Safety Act to be consulted on.

Preventable yet pervasive

The prevalence and characteristics of harmful content, including suicide and self-harm material, on Instagram, TikTok and Pinterest.

This is a first-of-its-kind report, produced in partnership with the Bright Initiative by Bright Data.

Please be aware that this report contains extensive references to suicide, self-harm and poor mental health.