U.S. attorneys general want TikTok and Snapchat to give parents more control of kids’ DMs

The National Association of Attorneys General (NAAG) has sent a letter to TikTok and Snapchat urging them to beef up parental controls by allowing third-party monitoring apps to check what kids are up to not just in the public portions of their platforms, but also in direct messages to other users.

Attorneys general from 44 U.S. states signed the letter, which pitches the idea that social media platforms’ community guidelines and the content moderators they hire to enforce those rules “are not always sufficient to protect children and teenagers who are particularly vulnerable to online threats, especially with regard to direct messaging.”

The letter cited a study done by parental control app Bark, which examined 3.4 billion direct messages sent across 30 social media apps in 2021.


According to Bark, the messages revealed that 74.61% of teenagers expressed self-harm or suicidal ideation, that 90.73% of teenagers had encountered nudity and/or sexual content on social media, that 93.31% of teens "engaged in conversations surrounding drugs/alcohol," that 94.5% of teens "expressed or experienced violent subject matter/thoughts," and that 85% of teens "experienced bullying as a bully, victim, or witness."

NAAG argues these findings illustrate the negative impact of social media on teens (not a novel revelation), as well as the potential for teens to be exposed to sexual predators and cyberbullying.

Lawmakers hammering social media platforms for failing to adequately protect young users is far from new. But this letter takes a different tack than previous warnings and investigations in two ways: one, it’s specifically pointing to direct messages; and two, it’s not necessarily asking platforms to improve their own in-house systems.

Instead, this letter urges platforms to partner with—and thus give access to—third-party parental control apps.

“Parental control apps can alert parents or schools to messages and posts on your platforms that have the potential to be harmful and dangerous,” the letter says. “On other platforms where these apps are allowed to operate appropriately parents have received notifications of millions of instances of severe bullying and hundreds of thousands of self-harm situations, showing that these apps have the potential to save lives and prevent harm to our youth.”

NAAG did not endorse a particular third-party app. It also did not publicize details about which platforms have embraced parental control apps, and it does not appear to have addressed the child privacy and data security issues that could come with allowing third-party apps to access kids' private communications.

