Investigation Reveals How TikTok’s Algorithm Pushes Harmful Content to Teens

Ay, Dios mío, the things I have to write about sometimes. A new investigation into TikTok's algorithm has revealed something that parents have been worried about for years – the platform actively pushes harmful content to teenage users, including material related to eating disorders, self-harm, and suicide, within MINUTES of a new account expressing interest in these topics.
The investigators created dozens of test accounts posing as 13-year-olds, and what they found was alarming but not exactly surprising to anyone who's been paying attention. The Wall Street Journal conducted similar research showing how quickly the algorithm serves concerning content to users who engage with it even briefly.
The way TikTok's For You page works is simple enough in theory: it learns what you engage with and shows you more of it. But in practice, this means a teenager who pauses on one video about dieting can quickly find their entire feed dominated by increasingly extreme content about weight loss, body image, and eating disorders. The algorithm doesn't distinguish between healthy interest and dangerous obsession.
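To make that feedback loop concrete, here is a toy Python sketch of the general pattern. To be clear, this is not TikTok's actual code, and the topics, weights, and dwell times are all invented for illustration; it just shows how a system that treats watch time as pure "interest" lets one topic snowball until it dominates a feed.

```python
import random
from collections import defaultdict

# Hypothetical engagement-driven recommender (NOT TikTok's real system).
# The loop: dwell time on a topic boosts that topic's weight, a higher
# weight means the topic is served more often, which produces more dwell.

TOPICS = ["comedy", "pets", "sports", "dieting"]

def pick_topic(weights):
    """Sample the next video's topic in proportion to learned weights."""
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

def simulate(feed_length=200, seed=42):
    random.seed(seed)
    weights = defaultdict(lambda: 1.0)  # start with no preference
    served = defaultdict(int)

    for _ in range(feed_length):
        topic = pick_topic({t: weights[t] for t in TOPICS})
        served[topic] += 1

        # Invented user behavior: the teen lingers on dieting videos
        # (high dwell) and flicks past everything else quickly.
        dwell = 0.9 if topic == "dieting" else 0.2

        # The core problem: dwell is counted as pure "interest", with no
        # notion of whether that interest is healthy or harmful.
        weights[topic] += dwell

    return served

if __name__ == "__main__":
    served = simulate()
    total = sum(served.values())
    for topic in TOPICS:
        print(f"{topic:>8}: {served[topic] / total:5.1%} of the feed")
```

Run it and the "dieting" share starts at roughly an even split and ends up dominating the simulated feed, even though the simulated user never searched for anything. That's the shape of the problem: nothing in the loop asks whether more of a topic is good for the person watching.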

TikTok has responded by pointing to its community guidelines and content moderation efforts. They say they remove content that violates their rules and have implemented features like content warnings and limits on search results for certain terms. The Verge reported on TikTok's response and the broader debate about algorithmic responsibility.
But here's my thing – and I say this as someone who spends probably too much time on social media myself – these platforms know EXACTLY what they're doing. They have the smartest engineers in the world. They can detect and remove copyrighted music in seconds. But somehow they can't figure out how to stop pushing self-harm content to literal children? Por favor. The connection between social media and teen mental health has been extensively documented.
Parents need to be aware of what their kids are seeing online. Teens need media literacy. But at some point we also need to hold these platforms accountable for the systems they design. Because right now those systems are optimized for engagement, not wellbeing. And that has real consequences for real kids.
