England’s Children’s Commissioner Anne Longfield has called for tech firm bosses to face criminal prosecution if children come to serious harm using their apps. She said that future generations will look back on the current dangers of social media in the same way we now wonder how children were ever allowed to travel in cars without seatbelts, adding that today’s children will be angry when they grow up and realise they were exposed to a “wild and dangerous” digital environment.
Interviewed by the Daily Telegraph to mark the end of her six-year term as England’s third Children’s Commissioner, Mrs Longfield also expressed her “frustration” with the pace of progress on duty of care laws to protect children online. She told the newspaper she would “never forget” the self-harm images she saw on social media in the wake of Molly Russell’s death in November 2017.
“I remember searching at that time and actually seeing those images. I don’t think I’ll ever forget seeing them, they were so shocking. It really did mean that action had to be taken. I think we could still go online and we could still find on platforms enough distressing content to make us concerned. So I don’t think enough has been done. But certainly there was a change at that point.
“It’s really important that criminal charges are held because at the end of the day it’s comparable to the level of harm that the company (concerned) is allowing to take place. I think it’s justified in those terms and that it will demonstrate the commitment to change that really is needed,” she told the Telegraph’s Mike Wright.
Mrs Longfield, whose role is being taken over by Dame Rachel de Souza from 1st March 2021, also talked about ‘building back better for children’ in her final speech as Children’s Commissioner for England and challenged the Government to “tackle the basic issues holding children back…we can help people with mental health problems”.
The Commissioner’s ‘Building back better’ report also references Molly and declares: “Too many children are still experiencing harm in their digital lives. At the root of the problem is the fact that the online platforms which play such a central role in their lives have been left to regulate themselves, and have failed to do so. Online companies have refused to take responsibility until after harm is done, and even then have failed to address the most fundamental problems on their platforms.
“This could be set to change. The introduction of the Age Appropriate Design Code and forthcoming Online Safety Bill are ground-breaking developments, signalling that government is willing to step in and end the era of self-regulation.”
If you’re struggling, just text MRF to 85258 so you can speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.