In the months since the U.S. government opened an official national security investigation into TikTok and its China-based parent company, ByteDance, TikTok representatives have repeatedly denied claims that Chinese officials have any influence over how the app moderates content.
But internal documents obtained by The Intercept show that users were liable to be permanently banned from TikTok if their livestreams contained a range of potentially contentious political content. This included videos “endangering…national honor and interests” or “inciting subversion of state power,” posts containing “documents, speeches, images, and videos that undermine national unity,” and videos that “exaggera[ted] the ethnic conflict between black and white.” Also bannable were videos containing topics the Chinese government has historically disapproved of, including the May 1998 riots of Indonesia, genocide in Cambodia, and Tiananmen Square.
The newly revealed guidelines also instructed moderators to ban livestreamers who are pregnant teenagers or show them, who show women in bikinis if they aren’t swimming, who show “defamation, spoofing, or criticism” of “the heads of the Eastern and Western countries,” and who promote competing apps.
These policies are similar to moderation guidelines leaked by The Guardian in September. At the time, TikTok said those guidelines had been retired in May 2019, and that they had at first been created as “a blunt approach to minimising conflict” in “TikTok’s early days.”
Josh Gartner, a TikTok spokesperson, tells Tubefilter, “The livestream guidelines in question are largely the same or similar to the guidelines The Guardian already reported on last year, which were removed both before The Guardian’s reporting and also prior to when The Intercept says the document was accessed. Over the past year, we have established Trust and Safety hubs in the U.S., Dublin, and Singapore, which oversee development and execution of our moderation policies and are headed by industry experts with extensive experience in these areas. Local teams apply the Community Guidelines that we published in January, all aimed at keeping TikTok a place of open self-expression and a safe environment for users and creators alike.”
Gartner also tells Tubefilter these guidelines were never used for U.S.-based users, only users in other markets, and that they were retired in early 2019.
TikTok allegedly suppressed content made by “ugly” creators, worried it would drive away new users
Livestream guidelines aren’t the only information obtained by The Intercept. It also got ahold of documents showing moderators being told to suppress content by “chubby” and “obese” people, “too thin” people, people with “ugly facial looks,” “senior people with too many wrinkles,” and users whose filming environments are “slummy,” with cracked walls or “old and disreputable decorations.”
Moderators were told to keep these videos off TikTok’s For You page, which is the bread and butter of its content creators. The For You page is the first thing TikTok users see when they log in, and it contains a virtually endless, curated feed of content that’s popular on the app and/or recommended based on a user’s viewing activity. Many creators tag their videos “#foryou” or “#foryoupage” in an attempt to get added to other people’s feeds. Keeping a user’s videos off the For You page restricts them to being seen only by people who already follow that user.
Like the political content guidelines, these guidelines aren’t totally unfamiliar. In December, German digital rights blog Netzpolitik posted moderation guidelines that showed TikTok suppressed content made by people it deemed at high risk of bullying, including fat, disabled, and LGBTQ+ people.
Gartner tells Tubefilter these new documents “are the same or similar to those published by Netzpolitik late last year, and as we told Netzpolitik at the time, they represented an early blunt attempt at preventing bullying. We recognize that this was not the correct approach and have ended it.”
One notable difference between the “same or similar” December documents and these new documents is the reason for suppression. The December documents showed that TikTok was ostensibly trying to protect potentially vulnerable users from ridicule based on personal characteristics. New documents appear to show that TikTok wanted this content suppressed because it’s not “attractive” to potential new users.
“[I]f the character’s appearance or the shooting environment is not good, the video will be much less attractive, not worthing (sic) to be recommended to new users,” the document lists as the reason for suppression. It adds that “slummy” filming locations are “less fancy and appealing” to new users.
TikTok says this moderation didn’t happen in the U.S. as it makes efforts to be more transparent
Yesterday, the same day The Intercept‘s report was published, TikTok revealed it will stop using Chinese moderators to screen content in other countries, a decision that will reportedly see around 100 ByteDance employees either leave the company or shuffle into positions within other departments. Gartner tells Tubefilter these moderators were not involved in going through U.S.-based users’ content.
The U.S. government’s investigation into TikTok is still ongoing. Last week, the app announced it will introduce a “transparency center” within its Los Angeles headquarters that will allow outside experts to examine its content moderation and data handling practices. Set to open in May, the site will give people “a chance to evaluate our moderation systems, processes, and policies in a holistic manner,” Vanessa Pappas, TikTok’s general manager, wrote in a news post about the center. “We expect the Transparency Center to operate as a forum where observers will be able to provide meaningful feedback on our practices.”