YouTube Will Have 10,000 Humans Working To Purge Questionable Content In 2018

12/05/2017

Amid what amounts to a second wave of the YouTube Adpocalypse, this time centered on the discovery of inappropriate content targeting young children, CEO Susan Wojcicki is personally addressing the video giant's expanded efforts to wipe out policy-violating videos.

Wojcicki says that YouTube is taking lessons learned from the first wave of the Adpocalypse, which was triggered by violent and extremist content, and applying them to other forms of problematic content (as well as video comments), including videos that pose child safety concerns or feature hate speech. (Last month, for instance, the company terminated Toy Freaks, a controversial channel that featured kids in bizarre and suggestive skits.)

To that end, Wojcicki says that YouTube will have 10,000 people working to address questionable content in 2018, noting that human reviewers are essential to training the company's machine learning systems. Since June, she says, YouTube's teams have manually reviewed nearly 2 million videos for violent extremist content, with more than 150,000 videos ultimately taken down during that time.

Machine learning is central to this process, and YouTube has already harnessed it to great effect in flagging violent extremist content (as well as comparable comments) for human review. Now, YouTube will train the same technology to flag disturbing videos aimed at children as well as hate speech. Machine learning helps human reviewers remove five times as many videos, Wojcicki says, and since June has done the equivalent work of 180,000 people working 40 hours per week. To date, 98% of the videos removed for violent extremism have been flagged by machine learning algorithms, and 70% of violent extremist videos are taken down within eight hours of upload.

In the highly contentious realm of advertising, Wojcicki says the company will apply stricter criteria, conduct more manual curation, and ramp up its ad reviewer team to ensure that campaigns aren’t appearing next to offending videos. “This will also help vetted creators see more stability around their revenue,” she says.

Finally, while YouTube already works closely with the National Center for Missing and Exploited Children (NCMEC) and the Internet Watch Foundation (IWF), it plans to expand its partner network of academics, industry groups, and subject matter experts next year. And to provide greater transparency into how these policies are being enforced, YouTube will publish regular reports throughout 2018 aggregating data about the flags it receives and the actions it takes.

You can read more about the company's plans in Wojcicki's latest blog post.