November 2022

Meta announces new measures to protect teens

The Molly Rose Foundation (MRF) welcomes any steps to help make the online world safer but we are disappointed that one of the world’s largest tech companies isn’t doing more to protect children on its platforms. Some of the recent measures Meta has announced are long overdue, while others are still in development.

This announcement reinforces our view that there is an urgent need for government regulation to compel platforms to adopt effective safety measures for their young users. Without legislation, tech safety improvements are too often too little, too late.

We hope both Facebook and Instagram will move quickly to demonstrate their oft-stated commitment to online safety by introducing truly ground-breaking safety measures to keep children safe while on their platforms.

Most of all, we hope Meta changes its algorithms to stop the spread of harmful content. Research shows that the vast majority of teens have seen disturbing content online, most without ever having searched for it.

The inquest into Molly’s death showed how the algorithmic amplification of a platform’s harmful content can result in tragedy, and sadly this is a digital danger faced by most children. For the sake of these children and their loved ones, the MRF hopes more meaningful change comes without further delay.

If you’re struggling, text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.
