TikTok could see an influx of new users over the next few weeks, not because it’s trying to draw in more Gen Z’ers, but because it may inevitably draw in some of their parents.
The short-form video platform is launching Family Pairing, a set of parental controls that allows adults to monitor what their children are doing on TikTok by creating their own accounts and linking them to their kids’. With their accounts linked, parents will be able to limit how much time their children spend on TikTok, control which users can direct message them, and turn on Restricted Mode.
What exactly Restricted Mode does is not entirely clear, but a TikTok spokesperson told Variety that it “essentially enables you to limit videos that you might find uncomfortable or unsuitable for yourself or your kids.” (As TechCrunch points out, the feature may rely on individual users manually flagging inappropriate content rather than something like YouTube’s algorithms, which automatically seek and flag videos using machine learning.)
“We are committed to giving parents insight into, and control over, how their teens use TikTok and helping facilitate important conversations within families about the responsible navigation of digital platforms,” the platform said in an announcement. “We believe these options promote a safer and more trustworthy experience for our users of all ages, but our progress in this area is also never finished.”
TikTok is implementing Family Pairing for users everywhere “over the coming weeks.”
TikTok is also removing direct-messaging capabilities for users 16 and under
Parents aren’t the only ones who will have an impact on users’ ability to send direct messages. In the same announcement, TikTok said that as of April 30, users 16 years old or under will no longer have access to its direct message system at all.
The platform didn’t say much about this decision, or why it chose to remove the DM function specifically for users 16 and under. It simply noted the change is being made “with user safety in mind.”
It’s possible TikTok is trying to stay ahead of the regulation curve. The app and its China-based parent company, ByteDance, have already faced a $5.7 million Federal Trade Commission fine for illegally collecting personal information from children under 13, a violation of the Children’s Online Privacy Protection Act (COPPA).
COPPA has always focused on kids under 13, which is why the vast majority of digital platforms and websites require new users to be at least 13. But last month, Democratic Sen. Ed Markey, who co-wrote COPPA, introduced a new bill: the KIDS Act, which would force platforms with users age 16 or under to radically alter the way they operate, including disabling core features and preventing kids from seeing sponsored content. TikTok removing DMs for users 16 and under could mean its execs are already keeping an eye on Markey’s as-yet-unpassed legislation.
Family Pairing is not the first user safety-related initiative TikTok has rolled out in the past few months. Under scrutiny from the federal government over cybersecurity concerns, the company has established an Advisory Council stacked with tech and online safety experts; expanded its Trust & Safety hubs (which work with regional policymakers to ensure TikTok is abiding by the law) in the U.S. and U.K.; hired its first chief information security officer; and launched a Transparency Center (where outside experts can examine its data handling procedures) alongside the publication of its first transparency report.