News

TikTok, Google, and Meta to face multiple lawsuits related to the mental health effects of algorithms

A recent ruling could lead to more court time for three embattled Big Tech firms. In the U.S. District Court in Oakland, California, Judge Yvonne Gonzalez Rogers ruled that a group of lawsuits filed by school districts against TikTok, Google, and Meta can move forward.

According to Bloomberg, Rogers backed more than 150 cases filed by school districts, all of which concern the effect social media algorithms may have on the youth mental health crisis. For years, scholastic authorities have argued that tech platforms are liable for the harmful content their recommendation algorithms serve to teenage viewers.

The standard defense against those allegations invokes Section 230 of the Communications Decency Act of 1996, a provision that grants "safe harbor" to platforms: as long as they act quickly to deal with violative content, they are exempt from legal liability for it. A ruling earlier this year supported platforms' Section 230 rights, reinforcing a Supreme Court precedent and seemingly shielding them from liability in 600 cases related to addictive algorithms.


But elsewhere in the legal world, there are questions about how and when Section 230 can be applied to algorithmic recommendations. A pivotal decision in August 2024 determined that TikTok will have to face a lawsuit brought by the mother of a child who died after attempting the dangerous "blackout challenge." That ruling built on a Supreme Court decision that defined recommendation algorithms as "editorial judgements" that may not qualify for Section 230 protections.

Judge Rogers ruled that some of the claims brought before her are invalid due to Section 230, but others should be allowed to move forward. Rogers said that TikTok, Google, and Meta “deliberately fostered compulsive use of their platforms which foreseeably caused” school districts to expend resources on mental health services for students.

The outcomes of these cases are far from certain, but if the courts continue to strip away Section 230's protections in the realm of algorithms, the ramifications for the social media industry would be massive. Who knows: maybe this time next year, U.S. users will be getting depersonalized algorithms to match the ones available in Europe.

Published by
Sam Gutelle
