The Facebook Files published in this week’s Wall Street Journal have revealed two faces of Facebook: one imperfect and private, the other polished and public facing. The shocking 14th September WSJ report, “Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show”, reveals that since 2019 Facebook has learnt from its own undisclosed research how harmful its Instagram platform can be, while publicly defending the safety of its platforms. This alarming WSJ story has given the Molly Rose Foundation cause to re-examine another story concerning social media and online harms, one close to our hearts and also in the news during this period.
On 22nd January 2019, when the BBC Six O’Clock News first reported the death of Molly Russell, there was widespread public outcry. Within a fortnight Instagram’s head, Adam Mosseri, had flown to the UK to announce how the social media platform would be made safer. In the Daily Telegraph, Mr Mosseri wrote that he had been “deeply moved” by the tragic stories: “It was overwhelming. It’s the kind of thing that hits you in the chest and sticks with you.”
It was announced that the platform’s community guidelines were to be expanded to ban all graphic suicide and self-harm content. Mr Mosseri also admitted Instagram had work to do, “We are not yet where we need to be on the issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe.”
Perhaps above all, it is now timely to recall that, in his Daily Telegraph piece, Mr Mosseri stated, “We are committed to publicly sharing what we learn. We deeply want to get this right and we will do everything we can to make that happen.”
It is this statement that comes to mind when reading, in this week’s WSJ, the revelations about what Facebook/Instagram learnt from their “teen mental health deep dive” but decided to keep to themselves, such as:
- “We make body image issues worse for one in three teen girls,” said one slide from 2019, summarizing research about teen girls who experience the issues.
- “Teens blame Instagram for increases in the rate of anxiety and depression,” said another slide. “This reaction was unprompted and consistent across all groups.”
- Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram, one presentation showed.
- Internal research from March 2020 warned that the Explore page, which serves users photos and videos curated by an algorithm, can send users deep into content that can be harmful.
- “Aspects of Instagram exacerbate each other to create a perfect storm,” the research states.
- “Teens told us that they don’t like the amount of time they spend on the app but feel like they have to be present,” an Instagram research manager explained to colleagues, according to the documents. “They often feel ‘addicted’ and know that what they’re seeing is bad for their mental health but feel unable to stop themselves.”
- Teen boys aren’t immune. In the deep dive Facebook’s researchers conducted into mental health in 2019, they found that 14% of boys in the U.S. said Instagram made them feel worse about themselves. In their report on body image in 2020, Facebook’s researchers found that 40% of teen boys experience negative social comparison.
So, in response to these revelations, we repeat the 2019 call made in a joint letter to the social platforms from the Molly Rose Foundation and the NSPCC, and once again urge them to strive to make their platforms safer: “Please use your power for good and take this chance to put fundamental protections in place to help keep children safe, both now and for future generations.”
We are also reminded of the importance of an independent regulator that can enforce a duty of care on platforms and ensure they take consistent action to protect children and we again emphasise how important the draft Online Safety Bill and Ofcom could be in legislating and regulating our digital world.
And we reiterate a point made when the MRF gave oral evidence on Monday 13th September to the Joint pre-legislative scrutiny Committee on the Draft Online Safety Bill: tech companies should be compelled to supply anonymised data to bona fide academics for independent research, funded by an industry levy, in order to explore the effect the online world has on our lives offline.
The online world needs to mirror the established safety standards we expect of our offline world if we are to improve the wellbeing and protect the lives of our young people. Let’s move fast to achieve this.