The mother of a young girl who died after attempting a dangerous social media challenge will be allowed to pursue her case against TikTok. A U.S. Appeals Court has determined that the family of Nylah Anderson has the right to argue that TikTok’s recommendation algorithm exposed the 10-year-old to the “blackout challenge,” which ultimately led to her death.
The ruling is significant because of its interpretation of Section 230 of the Communications Decency Act of 1996. For years, that “safe harbor” provision has protected social media companies from culpability for harmful content that is proliferated on their platforms.
Judges have tended to uphold the authority of Section 230. The U.S. Supreme Court ruled in YouTube’s favor in Gonzalez v. Google, which concerned ISIS terrorists who had been radicalized through online videos. The original ruling on Anderson’s mother’s lawsuit also invoked Section 230.
But recently, some courts have started to rethink their interpretation of Section 230. A July ruling from the U.S. Supreme Court stated that algorithmic recommendations represent an editorial choice, and therefore may not qualify for preexisting protections.
The classification of the algorithm is critical to the Anderson case, since the plaintiff is arguing that TikTok promoted the blackout challenge by serving it as a recommended video. In her ruling, U.S. Circuit Judge Patty Shwartz invoked the recent Supreme Court precedent, stating that the lawsuit could continue because TikTok’s algorithm falls outside Section 230’s proverbial safe harbor. “TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech,” Shwartz wrote.
Anderson was one of several children who died in 2021 as a result of the blackout challenge. Her mother sued TikTok six months after Nylah’s death. The platform initially offered condolences and vowed to deplatform the blackout challenge. It has not yet issued a comment on the Appeals Court ruling.
The lawsuit is far from a foregone conclusion, but the latest twist in the tale suggests that Section 230 protections are not as ironclad as they once were. The provision’s co-author, Ron Wyden, has argued that tech companies need a safe harbor to protect them from spam and bad actors. But as platforms tinker with their terms and encourage community members to chip in on the moderation front, the rules that govern them may start evolving as well. Will Nylah Anderson’s mother be the catalyst who sparks that change?