YouTube Will Have 10,000 Humans Working To Purge Questionable Content In 2018

Amid what amounts to the second wave of the YouTube Adpocalypse, in which inappropriate content targeting young children has come to light, CEO Susan Wojcicki is personally addressing the video giant’s expanded efforts to wipe out policy-violating videos.

Wojcicki says that YouTube is taking lessons learned from the first wave of the Adpocalypse — which was ultimately triggered by violent and extremist content — and applying them to tackle other forms of problematic content (as well as video comments), including those that pose child safety concerns and feature hate speech. (Last month, for instance, the company notably terminated a controversial channel called Toy Freaks that featured kids in bizarre and suggestive skits).

To that end, Wojcicki says that YouTube will have 10,000 people working to address questionable content in 2018 — noting that human reviewers are essential to training the company’s machine learning systems. Since June, Wojcicki says, YouTube’s teams have manually reviewed nearly 2 million videos for violent extremist content — with more than 150,000 videos ultimately being taken down during this time.

Machine learning is key to this process, however, and YouTube has already harnessed it to great effect in flagging violent extremist content — as well as comparable comments — for human review. Now, YouTube will train its tech on disturbing videos aimed at children, as well as hate speech. Machine learning helps human reviewers remove five times as many videos, Wojcicki says, and, since June, has completed the equivalent work of 180,000 people working 40 hours per week. To date, 98% of videos removed for violent extremism have been flagged by machine learning algorithms, and 70% of violent extremist videos are taken down within eight hours of upload.

In the highly contentious realm of advertising, Wojcicki says the company will apply stricter criteria, conduct more manual curation, and ramp up its ad reviewer team to ensure that campaigns aren’t appearing next to offending videos. “This will also help vetted creators see more stability around their revenue,” she says.

Finally, while YouTube already works closely with the National Center for Missing and Exploited Children (NCMEC) and the Internet Watch Foundation (IWF), it plans to expand its partner network of academics, industry groups, and subject matter experts next year. And, to provide greater transparency into how the aforementioned policies are being enforced, YouTube will publish regular reports throughout 2018 aggregating data about the flags it receives and the actions it takes.

You can read more about the company’s plans in Wojcicki’s latest blog post right here.

Published by
Geoff Weiss
