Charities speak out against Meta’s plans to automate risk assessments


A group of charities and online safety experts has criticised Meta’s plans to use AI to complete 90% of its risk assessments on new products and updates.

A letter coordinated by Molly Rose Foundation was sent to Ofcom Chief Executive Melanie Dawes urging the regulator to speak out and say the move is incompatible with the Online Safety Act.

It was signed by 24 organisations and online safety experts and said Meta’s actions “threaten the safety and wellbeing of children, women, LGBTQ+ and black and minoritised communities”.

Andy Burrows, Chief Executive of Molly Rose Foundation, said: “This crass decision is another deeply retrograde step that confirms Mark Zuckerberg is determined to lead a race to the bottom on Meta’s product safety standards.

“Robust risk assessments are a cornerstone of the Online Safety Act and should be undertaken with expert human oversight, not treated as a box-ticking exercise that makes a mockery of the regulatory process.

“It is crucial that Ofcom acts quickly to tell Meta and the rest of the tech industry that it won’t accept them cutting corners when the safety of our children and society is at stake.”

You can read more about this news in The Guardian and download the full letter here.

If you’re struggling, text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.