Search our resource hub to find expert, evidence-led briefings and reports that guide effective and robust policy change in the online safety sector, with the aim of reducing tech-facilitated self-harm and suicide.

Recent weeks have seen growing momentum behind a social media ban for under-16s in the UK, following the introduction of similar measures in Australia.

Risk assessments are a cornerstone of the Online Safety Act, with online services required to produce ‘suitable and sufficient’ risk assessments for both the illegal and child safety parts of the regime.

A joint letter to Victims Minister Alex Davies-Jones outlining the case for extending a Duty of Candour to social media companies suspected of being involved in a death.

Members of Families and Survivors to Prevent Online Suicide Harms wrote to Ofcom Chief Executive Melanie Dawes urging further enforcement action to tackle a pro-suicide forum.

Report setting out how a substance and suicide forum costs lives and how the state missed countless chances to act (October 2025)

Research briefing – October 2025

Molly Rose Foundation writes to Ofcom boss Melanie Dawes urging the regulator to hold Meta to account for failures under the Online Safety Act.

From July 2025, social media platforms have been required to comply with new measures set out in the Online Safety Act to protect children from harmful content.

Molly Rose Foundation coordinated a letter to Ofcom warning against Meta’s plans to automate 90% of its risk assessments.