Two TikTok moderators are suing ByteDance over exposure to “highly toxic and extremely disturbing” content

03/25/2022

Two former TikTok moderators have filed what they hope will be a class action lawsuit alleging they suffered psychological trauma and received inadequate mental health support while reviewing multitudes of “highly toxic and extremely disturbing” videos in the course of their job.

Ashley Velez and Reece Young filed the federal suit against TikTok’s parent company, ByteDance, on March 24.

They are working with the San Francisco-based Joseph Saveri Law Firm, which filed a similar lawsuit on behalf of Facebook moderators in 2018. That suit ended with Facebook agreeing to a $52 million settlement in 2020.


Per NPR, Velez and Young’s suit alleges TikTok broke California labor laws by putting them in an unsafe work environment.

It argues TikTok’s practices “[made] them extremely ill-equipped to handle the mentally devastating imagery their work required them to view without any meaningful counseling or meaningful breaks during their work.”

Neither Velez nor Young was a direct employee of TikTok. Instead, as is common with digital platforms populated by user-generated content, they were contractors working for third-party firms that provide moderation services to TikTok. Velez worked for Telus International, and Young worked for Atrium.

The suit alleges Velez and Young were subject to stringent content review quotas. During 12-hour shifts with a one-hour lunch and two 15-minute breaks, they were expected to review hundreds of potentially violative videos, spending “no longer than 25 seconds” on each, and to decide with at least 80% accuracy whether each clip did in fact violate TikTok’s community guidelines.

A number of videos they viewed contained child abuse, animal abuse, and people being killed, the suit alleges.

“Underage nude children was the plethora of what I saw,” Velez said, per NPR. “People like us have to filter out the unsavory content. Somebody has to suffer and see this stuff so nobody else has to.”

Velez said she booked one 30-minute session with a counselor provided by Telus, but felt the appointment did not address the source of moderators’ mental distress.

“They saw so many people that it didn’t seem like they had time to actually help you with what you were suffering with,” she said. “It would have been nice if they would even acknowledge that the videos were causing a problem in the first place.”

The suit also claims that although TikTok is an official corporate partner of the National Center for Missing & Exploited Children, it does not follow the center’s best practices for protecting moderators.

The center advises blurring or placing a grid over exploitative images of children so moderators are not as dramatically affected by the sheer volume of content they review. The suit says TikTok’s offices do not do this, and instead focus on meeting quantitative moderation quotas.

A TikTok spokesperson told NPR that the platform “strives to promote a caring working environment for our employees and contractors.”

Moderators are offered “a range of wellness services so that [they] feel supported mentally and emotionally,” the spokesperson said.

TikTok reportedly works with around 10,000 content moderators.
