The Molly Rose Foundation has welcomed Ofcom’s draft Codes of Practice to create a safer life online, but has urged the regulator to strengthen its plans.
The plans state that tech firms must use a range of measures to protect their users from illegal content online, from child sexual abuse and grooming to pro-suicide content and fraud.
Firms will be required to assess the risk of users being harmed by illegal content on their platform and take appropriate steps to protect them from it. Child abuse, grooming and encouraging suicide are a particular focus as “priority offences” set out in the legislation.
Ian Russell, Chair of MRF Trustees, said: “We welcome Ofcom’s first draft code of practice as a step towards delivering the protections set out in the Online Safety Act.
“However, these draft measures must be the floor, not the ceiling, of Ofcom’s ambitions. It’s vital we see the regulator strengthen its plans to tackle entirely preventable harm.
“Ofcom must develop a bold set of proposals to protect children’s safety and we look forward to working closely with them to improve these draft plans.”
The final versions of the Codes of Practice are due to be published in autumn 2024. Services will then have three months to conduct their risk assessments, while the plans are subject to Parliamentary approval. This process is expected to conclude by the end of 2024, after which the Codes will come into force.
In spring next year, Ofcom will publish a consultation on additional protections for children from harmful content promoting, among other things, suicide, self-harm, eating disorders and cyberbullying.
If you’re struggling, text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.