TikTok’s Dirty Little Secret: The Algorithm That’s Killing Our Kids
A shocking new ruling from the Third Circuit Court of Appeals has exposed the dark underbelly of TikTok’s algorithmic recommendations on the For You Page (FYP). In a twist on the tech industry’s own free speech arguments, the court ruled that TikTok’s algorithm is, in fact, the company’s own speech, and that TikTok can therefore be held accountable in court for what it chooses to promote.
This bombshell decision has significant implications for the entire tech industry, because it pokes a major hole in the legal shield known as Section 230, which has protected online platforms from being sued over their users’ posts. But make no mistake, this ruling is not just about protecting kids from harmful content. It’s about holding corporate America accountable for the damage it has done to our society.
The case, Anderson v. TikTok, revolves around the tragic death of 10-year-old Nylah Anderson, who "unintentionally hanged herself" after watching videos of the so-called blackout challenge on her FYP. The "challenge" was a sick and twisted trend that encouraged viewers to "choke themselves until passing out." TikTok’s algorithmic recommendations had led Nylah to this dangerous content, and now her mother’s lawsuit against the company can move forward.
The Third Circuit’s opinion draws on the Supreme Court’s ruling in Moody v. NetChoice, which gave lower courts a guide for deciding which actions by social media platforms count as First Amendment-protected speech. Applying that logic here, the judges found that TikTok’s algorithmic recommendations are not passive hosting of user-generated content but active promotion of some videos over others, an editorial choice the company makes itself.
The implications are staggering. If TikTok’s algorithm is its own speech, then Section 230 no longer shields it, and the company can be sued for the harm its recommendations cause. It’s a game-changer for parents, policymakers, and anyone who’s been affected by the dark side of social media.
But don’t just take our word for it. The judges themselves said that TikTok’s algorithm "was TikTok’s own ‘expressive activity,’… and thus its first-party speech." It’s a powerful rebuke to the company’s attempts to hide behind Section 230, and a call to action for lawmakers to take a closer look at the tech industry’s role in perpetuating harm.
So, what does this mean for the future of social media? One thing is clear: it’s time for corporate America to take responsibility for the damage it has caused. The Third Circuit’s ruling is a beacon of hope for those who’ve been harmed by TikTok’s algorithmic recommendations. It’s a reminder that we don’t have to accept the status quo, and that we can fight back against the powerful tech companies that are shaping our world.
The game is changing, and it’s time to get ready for the fight of our lives.