Regulate Big Tech like the banks to protect children from addictive design and harmful algorithms

  • Ian Russell demands a bold new online safety settlement to deliver quick, meaningful and decisive action on preventable harms
  • 73% of UK adults back stronger regulation to protect children online, higher than recent support for Australia-style bans
  • New legislation would end harmful and addictive design, enforce risk-based age ratings and make safety and wellbeing ‘the price of admission’ for tech firms in the UK

Big Tech should be regulated like the banks with a conduct-based regime that holds senior managers accountable for product safety risks, in a sweeping new approach to online safety proposed by Molly Rose Foundation.

The Government must listen to and act on the urgent calls from parents for meaningful change as we set out a five-point plan to take quick, bold and effective action on preventable online harms.

The plan was launched in Parliament at an event addressed by Technology Secretary Liz Kendall.

It calls for immediate action to fix and strengthen the Online Safety Act; new powers to target harmful and addictive design; and action to tackle the growing risks not just on social media but also on gaming sites, messaging apps and high-risk AI chatbots.

Molly Rose Foundation warns that an Australia-style ban is likely to offer parents a false sense of safety.

It comes as new polling by Molly Rose Foundation and Savanta shows three quarters (73%) of UK adults said they “support new legislation to strengthen regulation of social media platforms for children and young people to protect children and young people from harm”.

This is higher than recent polling from the Good Growth Foundation that found that 66% supported an Australia-style social media ban.

We specifically want new legislation that would implement robustly enforced risk-based age limits. This would mean platforms with the highest risk functionalities, including live streaming and AI chatbots, would have the highest age ratings.

For the first time, social media and gaming platforms would be incentivised to build safer and more age-appropriate products if they want to be offered to younger teens.

Under the proposals, social media algorithms would not only need to stop recommending harmful material, but would also have to include high-quality and age-appropriate content for teens. This means that algorithmic feeds, such as TikTok’s For You page, would be required to include trusted sources of mental health support, educational content, and high-quality children’s content from public service media.

The new rules would impose a de facto ban on AI chatbots for under-16s unless they can be demonstrably shown to be safe and to protect wellbeing by design.

Ian Russell, Chair of Molly Rose Foundation, said: “We need a bold new reset of online safety laws that can decisively reverse years of quick fixes and put an end to addictive design and aggressive algorithms once and for all.

“Parents are right to demand tough action and we are right behind them. However, children and families deserve a comprehensive strategy that will actually work, not the false sense of safety being offered by a flawed and ineffective Australia-style ban.

“The Government should have the courage to act on the evidence and stand up for children by delivering the tough and wide-reaching regulation that they promised in opposition but are yet to deliver.”

Andy Burrows, Chief Executive of Molly Rose Foundation, said: “We are at an inflection point for online safety and too many parents continue to feel they are on their own when it comes to protecting their children from online risk.

“This has to change and that’s why the Government should act quickly and decisively to address the root causes of online harm, listening to the urgency from parents and channelling their concern into a strong, world-leading and evidence-based approach.

“Children and families deserve better than a choice between the appalling status quo and well-intentioned but simplistic solutions. Ministers must act with courage and integrity, knowing they will be harshly judged if they continue to dodge the issues or fail to get this right.”

Molly Rose Foundation’s Five Point online safety settlement includes:

  • Fixing and strengthening the Online Safety Act: Immediate measures to strengthen the Online Safety Act and new legislation that refocuses the regime on delivering meaningful harm reduction. A new Act would draw on the successful approach to financial services regulation, with new outcomes- and conduct-based measures that would apply to corporate entities and senior managers, and that are much better placed to take on the size and financial power of Big Tech.
  • Extending the scope of the Act to cover children’s wellbeing: Ending addictive and aggressive algorithms, tackling the chronic harms that are driving parental concern, and placing new duties on platforms to ensure products are age-appropriate, high quality and nourishing by design.
  • Requiring new levels of transparency, accountability and candour from Big Tech: Large platforms and senior managers would have to proactively disclose information about risks on their products. Learning lessons from tackling climate change, companies would have to report exposure to online risks across their supply chain, including if they advertise on social media.
  • A ‘polluter pays’ and whole-platform approach to harm reduction: The levy which currently funds Ofcom should be extended to pump-prime academic and civil society research into online harm. A new Code of Practice should set consistent minimum standards for app stores, parental controls and operating systems from providers such as Apple and Google.
  • Education as inoculation – a bold investment in critical digital and media literacy education: An overhaul of critical digital and media literacy education to inoculate children from the worst effects of online harm while also equipping young people with the skills to thrive in the AI and digital economy of the future.

If you’re struggling, text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.
