TikTok and Instagram were recommending harmful suicide and self-harm content to teens at industrial levels just weeks before the Online Safety Act came into effect, actively putting young lives at risk. Almost eight years on from the death of Molly Russell, new research found algorithmically driven depression, suicide and self-harm content being recommended at vast scale to accounts opened as a 15-year-old girl.
The report found:

Almost all of the recommended videos watched on Instagram Reels (97%) and TikTok (96%) were found to be harmful, bombarding teens with this material in a similar way to what happened to Molly.

Over half (55%) of the recommended harmful posts on TikTok’s For You Page contained references to suicide and self-harm ideation, and 16% referenced suicide methods. Recommended videos included posts that promoted and glorified suicide and normalised intense feelings of misery and despair.

While both platforms had enabled teenagers to give negative feedback on content being recommended to them, as required by Ofcom, they had also provided an option to be recommended more of this harmful content, including suicide and intense depression posts.