TikTok Parent Company Bytedance Fined $5.7 Million For Illegally Collecting Personal Information From Users Under 13

By 02/27/2019

TikTok's parent company Bytedance will pay $5.7 million and dramatically change the way TikTok handles young users after being charged with illegally collecting personal information from children without their parents' consent.

The ruling, handed down by the Federal Trade Commission (FTC), is the result of charges originally brought against Musical.ly, a lip-syncing app Bytedance acquired for $800 million in November 2017. But because Bytedance shuttered Musical.ly in August 2018 and shuttled its primarily young userbase over to TikTok, Musical.ly's problems became TikTok's.

The FTC ruled that Musical.ly violated the Children’s Online Privacy Protection Act (COPPA), a U.S. law that prevents sites and apps from collecting information like email addresses and geolocation information from kids under 13 years old. Bytedance’s $5.7 million fine is the largest civil penalty ever brought in by a children’s privacy case, and it was calculated based on the “degree of culpability of the company,” Andrew Smith, director of the FTC’s Bureau of Consumer Protection, told Variety.


Here's why Musical.ly was in violation of COPPA: In order to sign up for Musical.ly, all users — including those under 13, who were permitted to create accounts — had to provide an email address, phone number, username, first and last name, a short bio, and a profile picture. Musical.ly users' profiles, which displayed some of that information, were automatically set to public, and many young users self-identified as being under 13, Smith said. He added that a number of kids listed their hometowns and schools in their profiles as well.

“The operators of Musical.ly — now known as TikTok — knew many children were using the app but they still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13,” Joe Simons, FTC chairman, added in a statement.

Even if young users manually set their profile to private, other users could still direct-message them, the FTC noted. Part of its complaint alleges that some adult users of Musical.ly tried to use the app’s messaging system to contact obviously underage children.

On top of all that, the FTC says Musical.ly violated COPPA's rules against collecting geolocation information, as the app had a feature that showed users the locations of other users within a 50-mile radius (though that feature was removed back in 2016, long before Bytedance bought the app).

“While we’ve always seen TikTok as a place for everyone, we understand the concerns that arise around younger users,” TikTok said in a blog post today. It added that it’s now asking all users to verify their ages. Current users under 13 will be sent to a “limited, separate app experience” that “does not permit the sharing of personal information, and […] puts extensive limitations on content and user interaction.”

Those “extensive limitations” mean users under 13 will no longer be able to upload any new videos to TikTok. And, as part of the FTC settlement, TikTok has to remove every single video already posted to the app by children under 13.

The separate app experience isn’t TikTok’s only attempt at damage control. This morning, ahead of the FTC’s ruling becoming public, the app announced the launch of You’re in Control, a series that aims to educate its young users about its community guidelines, settings, controls, and resources available on the platform.
