Molly Rose Foundation generates high-quality evidence and insights to inform and empower the online safety sector. Together with our Lived Experience networks, this evidence underpins our policy and legislative objectives, creating powerful calls for change.
Our research programme focuses on identifying and tracking online harm, understanding its links to mental health and wellbeing in young people, highlighting products with the potential to cause online harm, and identifying and understanding solutions, with the aim of pushing for the systemic conditions and changes needed to deliver them.

A new report by Resolver Trust and Safety, in partnership with Molly Rose Foundation, finds so-called “Com networks” are recruiting young victims and coercing them into becoming perpetrators of violence and abuse (Jan 26)

Report published immediately before the Online Safety Act came into effect (Oct 25)

Collaborative report revealing how Instagram is failing to protect minors (Sept 25)

Suicide, self-harm and intense depression content on TikTok and Instagram, and how their algorithms recommend it to teens (Aug 25)

Report setting out how a substance and suicide forum costs lives, and how the state missed countless chances to act (Oct 25)
of 13-17-year-olds had seen high-risk suicide, self-harm, depression or eating disorder content on social media in the last week (Oct 25 report)
girls had seen high-risk suicide, self-harm, depression or eating disorder content in the last week, including one in five who had seen content showing self-harm (Oct 25 report)
of children with SEND had seen high-risk suicide, self-harm, depression or eating disorder content in the last week (Oct 25 report)
of Meta’s safety tools are either substantially ineffective or no longer exist. Just 1 in 5 worked as described (Sept 25 report, Teen Accounts, Broken Promises)
of recommended harmful posts on TikTok’s ‘For You’ page contained references to suicide and self-harm ideation (Aug 25, Pervasive by Design report)
of Instagram Reels and 96% of TikTok videos were found to be harmful (Aug 25, Pervasive by Design report)

Report by global trust and safety intelligence group Resolver, in partnership with Molly Rose Foundation, exposes disturbing scale and nature of “Com networks”

Ofcom’s unambitious implementation of the Online Safety Act fails to match the levels of harm children are exposed to.

Instagram’s Teen Accounts are abjectly failing to keep young people safe despite Meta’s PR claims, a major new report has revealed.

We are dedicated to ensuring that children and young people are protected from online harm, and to ending preventable deaths by suicide where technology plays a role.