Meta’s policy changes risk young lives without decisive action from Ofcom
  • Meta’s new rules will put Facebook and Instagram back to where they were when Molly Russell died, Molly Rose Foundation warns
  • Ofcom must commit to fast-track rules to comprehensively tackle algorithmically suggested depression, suicide and self-harm content
  • PM should be on watch and stand ready to legislate to rein in Big Tech’s cavalier conduct

Meta’s new rules on content moderation risk pushing social media back to where it was when Molly Russell died, with Ofcom’s current approach to regulation nowhere near strong enough to stop it, Molly Rose Foundation (MRF) warns today.

The charity is calling on the regulator to commit to significant new, fast-tracked measures to prevent teens from being exposed to a tsunami of harmful content on Facebook and Instagram.

Mark Zuckerberg announced plans to scale back moderation worldwide earlier this month, including in the UK and Europe. While the platform confirmed it would still proactively scan for posts that promote or glorify suicide and self-harm, it has offered no assurances that it will act on other forms of harmful material.

Crucially, this includes content that references extreme depression and that normalises suicide and self-harm behaviours. As Molly’s case shows, these types of content can have a devastating effect on teens when viewed or algorithmically recommended in large amounts.

Last spring, Ofcom set out plans to tackle this so-called ‘cumulative harm’ as part of a package of measures designed to “tame toxic algorithms”.

At the time, MRF warned the proposals were likely to be insufficient, with Ofcom only requiring platforms to stop harmful content being recommended where it had already been identified through content moderation.

However, Meta now proposes to predominantly rely on user reports, rather than proactive technology, to perform content moderation. This is despite Meta’s own data showing just 0.9% of suicide and self-injury content it took action on between July and September 2024 came from user reports.

Ahead of Ofcom’s final codes to protect children online, expected before Easter, MRF has today written to the regulator asking it to urgently strengthen its approach.

The letter urges Ofcom to:

  • bolster its requirements for content moderation and algorithms: this should include a clear requirement to proactively scan for all types of intense depression, suicide and self-harm content;
  • fast-track new measures: Ofcom has previously claimed it must consult separately on any new measure, a process that might typically take 18 months;
  • confirm Meta’s new hate speech policies cannot apply to children: MRF is particularly concerned about the mental health impacts of Meta’s relaxed hate speech policies on children identifying as LGBTQ+, and those from ethnic minorities and other vulnerable groups;
  • clarify whether Meta can change its rules without following established processes: media reports claim Mark Zuckerberg changed the company’s policies without following the usual internal processes, leaving policy and integrity teams ‘blindsided’. Ofcom should clarify this cannot happen again.

Andy Burrows, Chief Executive of Molly Rose Foundation, said: “Meta’s bonfire of safety measures is hugely concerning and Mark Zuckerberg’s increasingly cavalier choices are taking us back to what social media looked like at the time that Molly died.

“Ofcom must send a clear signal it is willing to act in the interests of children and urgently strengthen its requirements on tech platforms.

“If Ofcom fails to keep pace with the irresponsible actions of tech companies the Prime Minister must intervene. Amid a strategic rollback of their safety commitments, preventable harm is being driven by Silicon Valley but the decision to stop it in its tracks now sits with the regulator and Government.”

The intervention comes weeks after Ian Russell told the Prime Minister the UK was going backwards on online safety.

He set out the need for new legislation that establishes rules targeting the conduct of tech companies, as well as an overarching Duty of Care.

The charity’s previous research found Meta’s approach to tackling suicide and self-harm content is already unfit for purpose, with the company responsible for just two per cent of industry-wide takedowns.

Meta has advised Molly Rose Foundation it is not currently able to discuss its recent announcements.

Read the letter to Ofcom in full here.

If you’re struggling, just text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.
