
YouTube Will Have 10,000 Humans Working To Purge Questionable Content In 2018

Amid what amounts to the second wave of the YouTube Adpocalypse, in which inappropriate content targeting young children was discovered on the platform, CEO Susan Wojcicki is personally addressing the video giant’s expanded efforts to wipe out policy-violating videos.

Wojcicki says that YouTube is taking lessons learned from the first wave of the Adpocalypse — which was ultimately triggered by violent and extremist content — and applying them to tackle other forms of problematic content (as well as video comments), including those that pose child safety concerns and feature hate speech. (Last month, for instance, the company notably terminated a controversial channel called Toy Freaks that featured kids in bizarre and suggestive skits).

To that end, Wojcicki says that YouTube will have 10,000 people working to address questionable content in 2018 — noting that human reviewers are essential to training the company’s machine learning systems. Since June, Wojcicki says, YouTube’s teams have manually reviewed nearly 2 million videos for violent extremist content — with more than 150,000 videos ultimately being taken down during this time.


Machine learning is key to this process, and YouTube has already harnessed it to great effect in flagging violent extremist content — as well as comparable comments — for human review. Now, YouTube will train its tech on disturbing videos aimed at children, and hate speech. Machine learning helps human reviewers remove five times as many videos, Wojcicki says, and, since June, has ultimately completed the work of 180,000 people working 40 hours per week. To date, 98% of all removed videos have been flagged by machine learning algorithms, and 70% of violent extremist videos are taken down within eight hours of upload.

In the highly contentious realm of advertising, Wojcicki says the company will apply stricter criteria, conduct more manual curation, and ramp up its ad reviewer team to ensure that campaigns aren’t appearing next to offending videos. “This will also help vetted creators see more stability around their revenue,” she says.

Finally, while YouTube already works closely with the National Center for Missing and Exploited Children (NCMEC) and the Internet Watch Foundation (IWF), it plans to expand its partner network of academics, industry groups, and subject matter experts next year. And, to provide greater transparency into how the aforementioned policies are being enforced, YouTube will publish regular reports throughout 2018 aggregating data about the flags it receives and the actions it has taken.

You can read more about the company’s plans in Wojcicki’s latest blog post right here.

Published by
Geoff Weiss
