Digital platforms are introducing new age verification systems to keep kids safer online. Lawmakers are taking it even further.

Twitch is the latest platform to roll out amped-up age verification systems and policies amid mounting concern over children’s safety online.

YouTube recently introduced a system that will use machine learning to parse a user’s search and watch activity, and will automatically apply age restrictions if it believes a user is under 18. (If a user gets mistakenly flagged as a minor, they can prove their age by submitting a government-issued ID—a solution that’s drawn criticism from people like VTubers, who are worried their personal information will be leaked.)

Meanwhile Roblox, facing several lawsuits from families of children who were abused and in some cases kidnapped by men who found them on its platform, introduced a system where users have to send in videos of their faces and sometimes government IDs to prove they’re age 17 or older. Users who haven’t verified their ages are not allowed in games or spaces that have “private” locations like bedrooms and bathrooms, nor virtual bars or clubs.

Twitch’s new system requires users to submit a live scan of their face when they sign in; it uses facial recognition tech to judge the person’s age based on how they look.

Now, an important note: This system is only being implemented in the U.K., in accordance with the Online Safety Act, a law that requires users to be 18 years old to access what it refers to as potentially "harmful" content. "Harmful" broadly covers things like graphic imagery (which could include video game violence), but, as digital rights advocacy organization the Electronic Frontier Foundation points out, it can also be subjectively applied to content some people wouldn't consider "family friendly," like LGBTQ+ discussions.

Per the Online Safety Act, people who fail Twitch’s facial scan age verification and cannot otherwise prove their age will not be able to access the platform.

Twitch assured users their scan data won’t be used for anything else: “To protect your privacy, Twitch and k-ID (a third-party vendor we partner with to verify your age) do not store your face scan video selfies,” it said. “The video selfie used for facial age estimation is analyzed entirely on your device and will never leave it.”

It’s not clear if users will have to scan their face every time they log in, or if this will be a one-time thing, like YouTube and Roblox’s age verification systems.

The Online Safety Act and Twitch’s new system may only be of consequence to people in the U.K. for now, but again, as the Electronic Frontier Foundation points out, this could be a glimpse of America’s digital future.

“Nearly half of U.S. states have some sort of online age restrictions in place already, and the Supreme Court recently paved the way for even more age blocks on online sexual content,” the EFF said. “But Americans—including those under 18—still have a First Amendment right to view content that is not sexually explicit, and EFF will continue to push back against any legislation that expands the age mandates beyond porn, in statehouses, in courts, and in the streets.”

The latest age verification law comes out of New York, where Attorney General Letitia James recently unveiled proposed rules for the Stop Addictive Feeds Exploitation (SAFE) For Kids Act. The law, which was passed in June 2024, doesn't target "harmful" content so much as the algorithms platforms use to dish out any and all content. It applies to all platforms that use automated systems to surface or recommend user-generated content and whose users "spend at least 20 percent of their time on the platform's addictive feeds." That's YouTube, TikTok, Instagram, and more.

If the proposed rules hold, SAFE would require all platforms to serve users under 18 a chronological feed of posts instead of one that's interest-driven and designed to encourage further engagement. Platforms would also have to turn off notifications between 12 a.m. and 6 a.m.

The law states that if a platform cannot verify a user's age for whatever reason, it must serve them the chronological feed and disable nighttime notifications by default. SAFE notes that platforms can verify users' ages via a "number of different methods, as long as the methods are shown to be effective and protect users' data." They must, however, provide at least one alternative to government ID, like a face scan.

“Children and teenagers are struggling with high rates of anxiety and depression because of addictive features on social media platforms,” Attorney General James said in a statement. “The proposed rules released by my office today will help us tackle the youth mental health crisis and make social media safer for kids and families.”

SAFE, though passed, will not take effect for at least another year and a half. There is a 60-day period for the public to comment on proposed rules, then a further 12-month period for the Office of the Attorney General to finalize said rules. Once the rules are set in stone, it’ll be another 180 days before SAFE goes live.

As The Verge notes, this already lengthy process could be complicated by pushback from orgs like the Electronic Frontier Foundation, which has called SAFE “an assault on free speech” and alleged that in practice, the law would “block adults from content they have a First Amendment right to access.”

Digital platforms have struggled to protect children’s safety since they first came online, but with our increasingly fraught political climate and documented instances of children being harmed on platforms that claim to have robust safety mechanisms, the situation is tougher than ever. Effectively protecting kids from predators is a must—but at the same time, the systems being set in place to do so could be co-opted by those who want to broaden the definition of “harmful content.”

For now, we’ll have to see how things pan out in the U.K.

Published by
James Hale
