Meta, TikTok, and Snap agree to be rated for teen safety

02/10/2026
(Photo illustration by Spencer Platt/Getty Images)

In some countries, regulators are resorting to social media bans to keep kids and teens away from harmful social media content. A project backed by the Mental Health Coalition proposes a gentler solution: safety ratings.

Meta, TikTok, and Snap are the first three Big Tech firms to agree to have their social media platforms independently rated. The grades will indicate how safe those platforms are for users under the age of 18. Among other criteria, evaluators will gauge how easy it is for teens to take breaks from their feeds and determine how often those users encounter inappropriate posts.

The highest-rated platforms will be able to display a shield-safe emblem that certifies the safety of their feeds. The worst offenders, meanwhile, will be called out for their inability to root out harmful content.


The Mental Health Coalition, which is throwing its weight behind the new ratings, is an advocacy group founded by fashion designer Kenneth Cole. Meta, TikTok, and Snap previously signed on as founding members of Thrive, an MHC-led program that uses hashes to track problematic content as it moves between platforms. In a statement, Cole said that the ratings “recognize that technology and social media now play a central role in mental health — especially for young people — and they offer a clear path toward digital spaces that better support well-being.”

The ratings don’t just serve the needs of regulators — they offer a softer solution for the Big Tech firms that have criticized the severity of nationwide social media bans. Meta, for example, has complied with Australia’s crackdown on teen accounts, but it has also called on regulators to consider alternatives that aren’t so restrictive for kids and teens.

One thing’s for sure: Regulators have to do something. Social media has become a dangerous space for many kids and teens, who encounter inappropriate videos and unrealistic body standards. As disruptions plague schools and concerned parents sue Silicon Valley titans, some governments have decided that wholesale bans are the best path forward.

In recent weeks, the calls to address these societal ills have grown louder. The European Union recently announced that it could force TikTok to rewrite its algorithm, and a landmark jury trial in California will interrogate the supposed links between social feeds and mental health struggles like addiction.

With the sharks circling, social media companies have good reason to accept the proposed ratings. Many of them have already adopted the health and safety features regulators are looking for, and now they can receive a seal of approval to certify the work they’ve already done.
