
How effectively do social networks moderate suicide and self-harm content?

This report is the first major analysis of DSA (Digital Services Act) transparency data on content moderation decisions concerning suicide and self-harm material.

It analyses over 12 million decisions taken between September 2023 and April 2024 by six major platforms: Instagram, Facebook, TikTok, Pinterest, Snapchat and X.

Please be aware that this report contains extensive references to suicide, self-harm and poor mental health.
