The Molly Rose Foundation (MRF) is backing a report highlighting dangerous content promoting eating disorders and self-harm on TikTok.
Compiled by the Centre for Countering Digital Hate (CCDH), the report claims some young TikTok users are being shown potentially dangerous content which could encourage eating disorders, self-harm and suicide.
Research conducted by the US-based online safety group found that accounts it set up were repeatedly exposed to harmful content within minutes of joining the platform.
MRF Chair of Trustees Ian Russell was invited to co-author the Parents’ Guide accompanying the report, which was released today.
The guide encourages parents to speak “openly” with their children about social media and online safety and to seek help from support groups if concerned about their child.
Mr Russell said: “Sadly, in their research CCDH found that vulnerable teens were more likely to have harmful content fed to their ‘For You’ page by the platform’s algorithms.
“Suicide content was recommended within 2.6 minutes and eating disorder content was recommended within 8 minutes.
“The report contains some examples of the harmful content and is another shocking example of how easily vulnerable young people can be connected to online harms.”
A spokesperson for the MRF said: “The Molly Rose Foundation endorses this report for its important work in highlighting the dangerous content promoting eating disorders and self-harm on TikTok.
“Exposing the underlying toxic content that infects so much of social media is vital in the battle to combat it. Self-regulation has failed in Big Tech and platforms which sell themselves as providing entertainment must pay heed to these disturbing findings.
“Through public scrutiny we must shine a spotlight on this harmful content and take affirmative action to help protect the vulnerable people exposed to it.”
Imran Ahmed, chief executive of the CCDH, said: “Parents will be shocked to learn the truth and will be furious that lawmakers are failing to protect young people from big tech billionaires, their unaccountable social media apps and increasingly aggressive algorithms.”
In response to the research, a TikTok spokesperson said: “This activity and resulting experience does not reflect genuine behaviour or viewing experiences of real people.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.
“We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”
If you’re struggling, text MRF to 85258 to speak to a trained volunteer from Shout, the UK’s Crisis Text Line service.