A lawsuit alleges that TikTok’s For You page recommended a deadly challenge

On December 12, 2021, ten-year-old Nylah Anderson passed away five days after she attempted a dangerous stunt called the “blackout challenge.” Anderson’s family, which alleges that the child discovered the challenge on TikTok, has filed a lawsuit against the company in the U.S. District Court for the Eastern District of Pennsylvania.

The lawsuit names both TikTok and its parent company ByteDance as defendants. It claims that Anderson was drawn to the blackout challenge after it popped up on her For You page. The suit targets TikTok’s recommendation algorithm, which allegedly “determined that the deadly blackout challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson and she died as a result.”

Participants who take on the blackout challenge intentionally asphyxiate themselves until they fall unconscious, then describe the euphoric rush they feel as they regain consciousness. According to the lawsuit, at least five children have died after attempting the dangerous stunt. One of them, 12-year-old Joshua Zeryihun, lost his life this past March.


A TikTok spokesperson extended condolences to Anderson’s family and told Bloomberg that the company plans to ban all blackout challenge videos on its app. It had previously issued a statement condemning “this disturbing challenge, which people seem to learn about from sources other than TikTok.”

The wording of that statement is a reminder that some parties are eager to blame TikTok for all sorts of viral ills. A March 2022 report in The Washington Post revealed Facebook’s attempts to tie dangerous challenges to TikTok, even though some of those stunts originated on Facebook.

In this particular case, forensic evidence has shown that Anderson was active on TikTok at the time of her accident, according to an attorney representing her family.

This is not the first time a tech company has been sued over allegedly harmful services linked to the deaths of minors. A decision in this case could set a precedent: Can an algorithm be held responsible for a user’s deadly decision? We may soon find out.

Published by Sam Gutelle
