TikTok Is Trying To Stop Its Algorithm From Recommending “Clusters” Of Mental Health-Affecting Content

12/16/2021

TikTok is looking for ways to adjust its powerful recommendation algorithm so users aren’t served strings of content that could negatively affect their mental health.

Earlier this year, we covered a Wall Street Journal study that showed how TikTok’s algorithm susses out users’ interests and then serves them video after video catered to those interests. That can be fine when a user expresses interest in benign topics, but things get dicey when that interest isn’t in dogs or painting, but in depression or weight loss.

In a company blog post today, TikTok explained that it’s “testing ways to avoid recommending a series of similar content—such as around extreme dieting or fitness, sadness, or breakups—to protect against viewing too much of a content category that may be fine as a single video but problematic if viewed in clusters.”

TikTok said its recommendation algorithms are already designed to spice up users’ For You feeds by tossing in the occasional video that’s outside of their indicated preferences.

“Doing so enriches the viewing experience and can help promote exposure to a range of ideas and perspectives on our platform,” the company said.
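TikTok hasn’t described how that interleaving works under the hood, but the basic idea of periodically slotting an out-of-interest video into an otherwise personalized ranking can be sketched in a few lines of Python. Everything below (the function name, the eight-video interval, the candidate pools) is a made-up illustration, not the platform’s actual logic.

```python
import random

def interleave_exploration(personalized, exploration_pool, every_n=8, seed=None):
    """Illustrative sketch: insert one out-of-interest video after every
    `every_n` personalized videos. Not TikTok's actual logic; the interval
    and pools are hypothetical placeholders."""
    rng = random.Random(seed)
    feed = []
    for i, video in enumerate(personalized, start=1):
        feed.append(video)
        if i % every_n == 0 and exploration_pool:
            # Pull a random video from outside the user's indicated interests.
            feed.append(rng.choice(exploration_pool))
    return feed

# Example: a feed of dance videos with an occasional unrelated clip mixed in.
feed = interleave_exploration(
    personalized=[f"dance_{i}" for i in range(20)],
    exploration_pool=["cooking_101", "astronomy_clip", "woodworking_demo"],
    every_n=8,
    seed=42,
)
```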

But if those occasional videos are brief interruptions in a feed that’s otherwise made up of hours upon hours of videos about sadness and anxiety, it’s still an issue. To that end, TikTok says it’s “looking at how our system can better vary the kinds of content that may be recommended in a sequence.”
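The company didn’t explain what varying a sequence might look like in practice. One generic way to think about it is a re-ranker that refuses to let the same topic run on for too many consecutive videos; the sketch below illustrates that idea under assumed topic labels and an assumed limit of two in a row, and shouldn’t be read as TikTok’s method.

```python
def disperse_by_topic(ranked, max_run=2):
    """Illustrative re-ranking sketch: keep the original order where possible,
    but never let more than `max_run` consecutive videos share a topic.
    `ranked` is a list of (video_id, topic) tuples, best first."""
    remaining = list(ranked)
    output = []
    while remaining:
        recent = [t for _, t in output[-max_run:]]
        # Prefer the highest-ranked video that wouldn't extend a same-topic run.
        pick = next(
            (v for v in remaining
             if not (len(recent) == max_run and all(t == v[1] for t in recent))),
            remaining[0],  # fall back if every remaining candidate shares the topic
        )
        remaining.remove(pick)
        output.append(pick)
    return output

# Three "sadness" videos in a row get broken up by the next-best other topic.
candidates = [("v1", "sadness"), ("v2", "sadness"), ("v3", "sadness"),
              ("v4", "fitness"), ("v5", "sadness")]
print(disperse_by_topic(candidates))
```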

It’s also “working to recognize if our system may inadvertently be recommending only very limited types of content that, though not violative of our policies, could have a negative effect if that’s the majority of what someone watches, such as content about loneliness or weight loss,” it said.
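Again, TikTok offered no detail on how it would recognize that kind of over-concentration. A toy way to frame the problem is to measure what share of a user’s recent views falls into a sensitive category and flag anything above a threshold; the category list and the 50% cutoff below are assumptions for illustration only.

```python
from collections import Counter

# Hypothetical categories that may be fine individually but worrying in bulk.
SENSITIVE = {"extreme_dieting", "sadness", "breakups", "loneliness", "weight_loss"}

def flag_concentrated_topics(recent_topics, threshold=0.5):
    """Return sensitive topics that make up more than `threshold` of the
    user's recent views. A toy sketch of 'cluster' detection, not TikTok's system."""
    counts = Counter(recent_topics)
    total = len(recent_topics)
    return {
        topic: counts[topic] / total
        for topic in SENSITIVE
        if total and counts[topic] / total > threshold
    }

# 7 of 10 recent videos were about sadness -> flagged at 70%.
history = ["sadness"] * 7 + ["dogs", "painting", "fitness"]
print(flag_concentrated_topics(history))  # {'sadness': 0.7}
```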

TikTok didn’t give any specifics about what kinds of changes it might implement, or when those changes might happen. It did say it’s working with “experts across medicine, clinical psychology, and AI ethics, members of our Content Advisory Council, and our community.”

In the same post, TikTok said it’s developing a new user feature that will let people block specific words or hashtags from showing up in their For You feed. The platform didn’t provide details about the feature, but presumably it would work similarly to Twitter’s muted-words feature: if someone blocks the word “depression” or a certain dance challenge hashtag, videos with those words and hashtags in their descriptions won’t be served in their For You feed.
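For a rough sense of how such a filter could behave, here’s a minimal sketch that drops any video whose description contains a blocked word or hashtag. It assumes plain-string descriptions and case-insensitive matching; none of it reflects the actual feature.

```python
def filter_feed(videos, blocked_terms):
    """Illustrative sketch of a word/hashtag filter for a feed.
    `videos` is a list of dicts with a 'description' field; `blocked_terms`
    is whatever the user chose to block (words or '#hashtags')."""
    blocked = {term.lower() for term in blocked_terms}
    return [
        v for v in videos
        if not any(term in v["description"].lower() for term in blocked)
    ]

feed = [
    {"id": 1, "description": "Coping with depression #mentalhealth"},
    {"id": 2, "description": "My dog learning a new trick #dogsoftiktok"},
    {"id": 3, "description": "Join the #SadDanceChallenge"},
]
print(filter_feed(feed, ["depression", "#saddancechallenge"]))
# -> only the dog video (id 2) remains
```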
