NEW YORK, NEW YORK - JANUARY 31: In this photo illustration, a teenager uses her mobile phone to access social media on January 31, 2024 in New York City. Technology executives appeared at a Senate hearing to address accusations that their companies are endangering children's lives by not adequately policing predators and others who seek to harm and exploit young social media users. (Photo illustration by Spencer Platt/Getty Images)
In some countries, regulators are resorting to outright social media bans to keep kids and teens away from harmful content. A project backed by the Mental Health Coalition proposes a gentler solution: safety ratings.
Meta, TikTok, and Snap are the first three Big Tech firms to agree to have their social media platforms independently rated. The grades will indicate how safe those platforms are for users under the age of 18. Among other criteria, evaluators will gauge how easy it is for teens to take breaks from their feeds and how often those users encounter inappropriate posts.
The highest-rated platforms will be able to display a shield emblem certifying the safety of their feeds. The worst offenders, meanwhile, will be called out for their failure to root out harmful content.
The Mental Health Coalition, which is throwing its weight behind the new ratings, is an advocacy group founded by fashion designer Kenneth Cole. Meta, TikTok, and Snap previously signed on as founding members of Thrive, an MHC-led program that uses hashes to track problematic content as it moves between platforms. In a statement, Cole said that the ratings “recognize that technology and social media now play a central role in mental health — especially for young people — and they offer a clear path toward digital spaces that better support well-being.”
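Thrive's hash-matching system isn't publicly documented in detail, but the general idea — fingerprinting a piece of content so other platforms can recognize it without the content itself being re-shared — can be sketched with a cryptographic hash. All names below are illustrative, and real systems like this typically rely on perceptual hashes that survive re-encoding and cropping rather than exact digests:

```python
import hashlib

# Hypothetical shared registry of hashes flagged by participating platforms.
# In practice this would be a service the platforms query, not a local set.
flagged_hashes: set[str] = set()

def content_hash(data: bytes) -> str:
    """Fingerprint a piece of content with a SHA-256 hex digest."""
    return hashlib.sha256(data).hexdigest()

def flag_content(data: bytes) -> None:
    """One platform flags content; only the hash enters the shared registry."""
    flagged_hashes.add(content_hash(data))

def is_flagged(data: bytes) -> bool:
    """Another platform checks an upload against the shared registry."""
    return content_hash(data) in flagged_hashes

flag_content(b"example harmful post")
print(is_flagged(b"example harmful post"))   # True: exact copy is recognized
print(is_flagged(b"an unrelated post"))      # False: no hash match
```

The design point is that platforms exchange only fingerprints, never the problematic content itself — which is why an exact-match digest fails on even slightly altered copies, and why perceptual hashing is the usual choice in production.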
The ratings don’t just serve the needs of regulators — they offer a softer solution for the Big Tech firms that have criticized the severity of nationwide social media bans. Meta, for example, has complied with Australia’s crackdown on teen accounts, but it has also called on regulators to consider alternatives that aren’t so restrictive for kids and teens.
One thing’s for sure: Regulators have to do something. Social media has become a dangerous space for many kids and teens, who encounter inappropriate videos and unrealistic body standards. As disruptions plague schools and concerned parents sue Silicon Valley titans, some governments have decided that wholesale bans are the best path forward.
In recent weeks, the calls to address these societal ills have grown louder. The European Union recently announced that it could force TikTok to rewrite its algorithm, and a landmark jury trial in California will interrogate the supposed links between social feeds and mental health struggles like addiction.
With the sharks circling, social media companies have good reason to accept the proposed ratings. Many of them have already adopted the health and safety features regulators are looking for, and now they can receive a seal of approval to certify the work they’ve already done.