News

Appeals court breaks from precedent, rules TikTok must face “blackout challenge” lawsuit

The mother of a young girl who died after attempting a dangerous social media challenge will be allowed to pursue her case against TikTok. A U.S. appeals court has determined that the family of Nylah Anderson has the right to argue that TikTok’s recommendation algorithm exposed the 10-year-old to the “blackout challenge,” which ultimately led to her death.

The ruling is significant because of its interpretation of Section 230 of the Communications Decency Act of 1996. For years, that “safe harbor” provision has protected social media companies from culpability for harmful content that is proliferated on their platforms.

Judges have tended to uphold the authority of Section 230. The U.S. Supreme Court ruled in YouTube’s favor in Gonzalez v. Google, which concerned ISIS terrorists who had been radicalized through online videos. The original ruling on Anderson’s mother’s lawsuit also invoked Section 230.


But recently, some courts have started to rethink their interpretation of Section 230. A July ruling from the U.S. Supreme Court stated that algorithmic recommendations represent an editorial choice, and therefore may not qualify for preexisting protections.

The classification of the algorithm is critical to the Anderson case, since the plaintiff is arguing that TikTok promoted the blackout challenge by serving it as a recommended video. In her ruling, U.S. Circuit Judge Patty Shwartz invoked the Supreme Court precedent, writing that the lawsuit could continue because TikTok’s algorithm falls outside Section 230’s proverbial safe harbor. “TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech,” Shwartz wrote.

Anderson was one of several children who died in 2021 as a result of the blackout challenge. Her mother sued TikTok six months after Nylah’s death. The platform initially offered condolences and vowed to remove blackout challenge content. It has not yet commented on the appeals court ruling.

The lawsuit is far from a foregone conclusion, but the latest twist in the tale suggests that Section 230 protections are not as ironclad as they once were. Ron Wyden, one of the provision’s original authors, has argued that tech companies need a safe harbor to protect them from spam and bad actors. But as platforms tinker with their terms and encourage community members to chip in on the moderation front, the rules that govern them may start evolving as well. Will Nylah Anderson’s mother be the catalyst who sparks that change?

Published by
Sam Gutelle
