TikTok today introduced new Community Guidelines and tools it says are intended to “better support the well-being of our community.”
Among the updates are a detailed policy explicitly banning numerous kinds of sexual harassment, a dangerous pranks policy similar to YouTube’s, click-through warning labels on potentially graphic content, mental health information popups for search terms like “selfharm” and “hatemyself,” and an expanded COVID-19 resource hub.
“At TikTok, safety isn’t a nice-to-have or an afterthought; it’s central to all our work,” Cormac Keenan, TikTok’s head of trust & safety, wrote in an official blog post. TikTok’s Community Guidelines broadly cover 10 areas of content, including violent extremism, nudity, and illegal activities/goods. Keenan says today’s updates “[add] more specifics to each area based on behavior we’ve seen on platform, feedback we’ve heard from our community, and input from academics, civil society organizations, and our Content Advisory Council.”
Perhaps the most significant policy change is the sizeable new set of rules about sexual harassment. Before today, TikTok filed sexual harassment under its Abusive Behavior section, where a single line broadly banned "content that sexually harasses a user by disparaging their sexual activities or attempting to make unwanted sexual contact."
Now, the new sexual harassment policy is its own section. Like the previous policy, it explicitly bans slut-shaming content and content attempting to make unwanted sexual contact. To those rules, it adds bans on content that simulates sexual activity with another user; content that alters the image of another person to “portray or imply sexual suggestiveness or engagement in sexual activity”; content that reveals or threatens to reveal details of a person’s private sexual life, including names of previous partners; and content that exposes/threatens to expose a person’s sexual orientation without their consent or knowledge.
Another large policy change is the YouTube-esque ban on dangerous pranks, a not-surprising addition, though perhaps a late one, considering the dozens of risky challenges that have trended among some TikTok users over the past couple of years. The new policy bars users from uploading, streaming, or sharing content that shows inappropriate use of dangerous tools, vehicles, or objects; content that "depicts or promotes ingesting substances that are not meant for consumption or could lead to severe harm" (ahem, Tide Pods); and all dangerous games, dares, or stunts that could lead to injury.
Today's changes also include a warning screen on graphic content. TikTok says it already prevents content that may be graphic or distressing from appearing on people's #ForYou pages; the warning label is likely to cover other areas of the platform, like users watching videos by going directly to a creator's profile. Labels will warn viewers that what they are about to watch could be disturbing, and must be clicked through for the content to be visible.
As for mental health resources, TikTok partnered with organizations like Providence, Samaritans of Singapore, Live for Tomorrow, and its Content Advisory Council to create more informational materials about things like self-harm and suicide. Beginning this week, these materials, along with "evidence-based actions" users can take, will be shown to folks who search for troubling topics on TikTok.
And last but not least, TikTok is working with Team Halo (an organization supporting scientists around the world who are working on COVID-19 vaccines) and the Centers for Disease Control to roll out a regularly updated coronavirus/vaccination FAQ, plus video updates showing progress on vaccines.
According to Keenan, TikTok’s coronavirus resource hub (linked from the platform’s main Discover page, search results, and popup banners) has been viewed more than 2 billion times over the past six months.