YouTube CEO Susan Wojcicki announced yesterday that the company is taking precautions to limit the amount of disturbing content that its human moderators must police every day. The new policy arrives as YouTube said it intends to have 10,000 human moderators working to purge inappropriate videos in 2018.
YouTube will limit moderators to four hours of viewing disturbing videos daily to preserve their mental health, Wojcicki said at South by Southwest. “This is a real issue and I myself have spent a lot of time looking at this content over the past year,” she explained, per The Verge. Wojcicki added that Google would also provide content moderators — who are generally part-time contract workers — with “wellness benefits,” though she did not elaborate on what those benefits might entail.
While the glut of inappropriate videos — containing sex, violence, extremist language, and toilet humor targeting children — is primarily policed by machines, YouTube has increasingly looked to human beings to improve its algorithms and ensure that nothing slips through the cracks. (Four hundred hours of content are uploaded to the site every minute.) Last year, Wired reported on the tenuous nature of working as a YouTube content moderator, including poor communication with Google, a lack of job security, and tough working conditions.
Earlier this month, YouTube said that some newly tapped moderators had misinterpreted its policies and mistakenly punished some far-right and pro-gun creators. Affected creators and channels were promptly restored.