On December 12, 2021, ten-year-old Nylah Anderson passed away five days after she attempted a dangerous stunt called the “blackout challenge.” Anderson’s family, which alleges that the child discovered the challenge on TikTok, has filed a lawsuit against the company in the U.S. District Court for the Eastern District of Pennsylvania.
The lawsuit names both TikTok and its parent company ByteDance as defendants. It claims that Anderson was drawn to the blackout challenge after it popped up on her For You page. The suit targets TikTok’s recommendation algorithm, which allegedly “determined that the deadly blackout challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson and she died as a result.”
Participants in the blackout challenge intentionally asphyxiate themselves until they fall unconscious, then describe the euphoric feelings that rush through them as they regain consciousness. According to the lawsuit, at least five children have died after attempting the stunt. One of those kids, 12-year-old Joshua Zeryihun, lost his life this past March.
A TikTok spokesperson extended condolences to Anderson’s family and told Bloomberg that the company plans to ban all blackout challenge videos on its app. It had previously issued a statement condemning “this disturbing challenge, which people seem to learn about from sources other than TikTok.”
The wording of that statement is a reminder that some parties are eager to pin all sorts of viral ills on TikTok. A March 2022 report in The Washington Post revealed Facebook's attempts to tie dangerous challenges to TikTok — even though some of those stunts originated on Facebook.
In this particular case, forensic evidence has shown that Anderson was active on TikTok at the time of her accident, according to an attorney representing her family.
This is not the first time a tech company has been sued for allegedly providing harmful services that contributed to the deaths of minors. A decision in this case could set a precedent: Can an algorithm be held responsible for the dangerous content it recommends? We may soon find out.