Technology plays a role in at least one in four deaths by suicide among young people aged 10 to 19, and we now lose a young person to technology-influenced suicide every single week.
Across the country, children and families are left to bear the devastating consequences of inaction by tech firms. Although progress has been made, urgent and decisive action is still needed to build and strengthen the legislative and regulatory guardrails that can turn the tide on preventable online harm.
Our policy work sets out to deliver meaningful change. Our Roadmap for a better online future lays out a five-point plan designed to earn the confidence and support of parents, children, and civil society experts.
We launched our Roadmap in Westminster in 2026.


Read our policy briefing, which provides an overview of the Molly Rose Foundation's five-point plan set out in our Roadmap for a better online future.

A new online safety settlement for children, parents and families.

A new report by Resolver Trust and Safety, in partnership with the Molly Rose Foundation, finds that so-called Com networks are recruiting young victims and coercing them into becoming perpetrators of violence and abuse (Jan 26)

immediately before the Online Safety Act came into effect (Oct 25)

Bold and decisive action is needed to tackle the acute and chronic harms caused by social media.

Recent weeks have seen growing momentum behind a social media ban for under-16s in the UK, following the introduction of similar measures in Australia.

We are dedicated to ensuring that children and young people are protected from online harm, and to bringing an end to preventable deaths by suicide in which technology plays a role.

Ian Russell demands a bold new online safety settlement to deliver quick, meaningful and decisive action on preventable harms

Report by global trust and safety intelligence group Resolver, in partnership with Molly Rose Foundation, exposes disturbing scale and nature of “Com networks”

Many parents and campaigners are calling for a ban on social media for under-16s, and their concerns are entirely legitimate. Children have been exposed to alarming levels of harm online for far too long, made worse by design choices that push content at them through powerful recommendation systems.