Big Tech should be regulated like the banks, with a conduct-based regime that holds senior managers accountable for product safety risks, under a sweeping new approach to online safety proposed by Molly Rose Foundation.
The Government must listen to and act on the urgent calls from parents for meaningful change as we set out a five-point plan to take quick, bold and effective action on preventable online harms.
The plan was launched in Parliament at an event addressed by Technology Secretary Liz Kendall.
It calls for immediate action to fix and strengthen the Online Safety Act; new powers to target harmful and addictive design; and action to tackle the growing risks not just on social media but also on gaming sites, messaging apps and high-risk AI chatbots.
Molly Rose Foundation warns that an Australia-style ban is likely to offer parents a false sense of safety.
It comes as new polling by Molly Rose Foundation and Savanta shows three quarters (73%) of UK adults said they “support new legislation to strengthen regulation of social media platforms for children and young people to protect children and young people from harm”.
This is higher than recent polling from the Good Growth Foundation that found that 66% supported an Australia-style social media ban.
We specifically want new legislation that would implement robustly enforced risk-based age limits. This would mean platforms with the highest risk functionalities, including live streaming and AI chatbots, would have the highest age ratings.
For the first time, social media and gaming platforms would be incentivised to build safer and more age-appropriate products if they want to be offered to younger teens.
Under the proposals, social media algorithms would not only need to stop recommending harmful material, but would also have to include high-quality and age-appropriate content for teens. This means that algorithmic feeds, such as TikTok’s For You page, would be required to include trusted sources of mental health support, educational content, and high-quality children’s content from public service media.
The new rules would see a de facto ban on AI chatbots for under-16s unless they can be demonstrated to be safe and to protect wellbeing by design.
Ian Russell, Chair of Molly Rose Foundation, said: “We need a bold new reset of online safety laws that can decisively reverse years of quick fixes and put an end to addictive design and aggressive algorithms once and for all.
“Parents are right to demand tough action and we are right behind them. However, children and families deserve a comprehensive strategy that will actually work, not the false sense of safety being offered by a flawed and ineffective Australia-style ban.
“The Government should have the courage to act on the evidence and stand up for children by delivering the tough and wide-reaching regulation that they promised in opposition but are yet to deliver.”
Andy Burrows, Chief Executive of Molly Rose Foundation, said: “We are at an inflection point for online safety and too many parents continue to feel they are on their own when it comes to protecting their children from online risk.
“This has to change and that’s why the Government should act quickly and decisively to address the root causes of online harm, listening to the urgency from parents and channelling their concern into a strong, world-leading and evidence-based approach.
“Children and families deserve better than a choice between the appalling status quo and well-intentioned but simplistic solutions. Ministers must act with courage and integrity, knowing they will be harshly judged if they continue to dodge the issues or fail to get this right.”
Molly Rose Foundation’s five-point online safety settlement includes:
If you’re struggling just text MRF to 85258 so you can speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.