
- Seven in ten of those with an opinion oppose the Meta policy changes that Molly Rose Foundation says will sharply increase suicide and self-harm risks
- By a 16-to-one margin, the public think social media platforms should be prevented from changing their policies to scale back existing safety measures
- Government and Ofcom must act on ‘bonfire of safety measures’ as Molly Rose Foundation warns a timid response could cost lives
A majority of the public opposes Meta’s rollback of moderation on its platforms and supports decisive new action from Government and Ofcom, a new report by Molly Rose Foundation finds.
New large-scale YouGov polling of 2,275 British adults shows that, by a two-to-one margin, the public oppose Meta’s decision to stop automatically searching for and removing harmful content in key policy areas.
By a remarkable 16-to-one margin, British adults think social media companies should be prevented from changing their policies to scale back their existing safety commitments. Excluding don’t knows, 93% would support this measure: in effect, a ‘no rollback’ duty that could be added to the Online Safety Act by amendment.
In January, Meta announced it would stop using proactive technology to detect harmful content and rely instead on user reports, and that it would also roll back its hate and unacceptable speech policies.
This is despite Meta’s own data showing that just 0.9% of the suicide and self-injury content it took action on between July and September 2024 came from user reports.
Molly Rose Foundation is warning that the changes put children’s safety at risk and that Ofcom’s current proposals do not prevent Meta’s safety rollbacks. Under Meta’s changes, teenagers are likely to be exposed to more depression content and hate speech, and to face a greater risk of cumulative harm driven by algorithms.
The polling finds overwhelming public support for additional new duties to be placed on platforms to protect children and young adults from tech companies slashing their safety measures:
- 86% of adults support a new duty on social media platforms to proactively search for harmful content: in effect, preventing Meta’s rollback of safety measures and stopping them from putting the onus for safety onto users.
- 88% of adults support social media companies being required to prevent children from being exposed to harmful content even if it is allowed for adults: Ofcom has so far failed to say it will strengthen its Children’s Safety Codes to prevent Meta’s changes from applying to under-18s. This means children could be exposed to hate speech such as “women are property” and “LGBTQ people are mentally ill”.
- 87% believe social media bosses responsible for safety should be held to minimum standards of conduct: making them fully responsible for safety rather than merely required to follow a prescriptive set of rules.
Molly Rose Foundation wrote to Ofcom in January warning of the risks of Meta’s rollback, but the regulator has chosen not to give any public assurances or to fast-track additional measures into the Children’s Safety Codes due out in a few weeks.
Meanwhile, the Government has failed to commit to amending the Online Safety Act with stronger measures to stop Meta’s changes from going ahead.
Andy Burrows, Molly Rose Foundation Chief Executive, said: “Mark Zuckerberg’s reckless changes pose a fundamental risk to children and young people. We fear they will sharply increase the suicide, self-harm and depression risks they face.
“This is a big first test of regulation and a timid response could cost lives. The Online Safety Act is the best vehicle we have to protect young people and society from harm, but Meta knows only too well that unless the legislation is strengthened there is nothing to stop them lighting the touchpaper on a disturbing bonfire of safety measures.
“The public want and expect urgent action to stop us going backwards on online safety. Decisions on how we protect our children must be taken by our Prime Minister and democratically elected government, not determined by tech oligarchs and the demands of the White House.”
If you’re struggling, just text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.