In a corporate blog post shared today following YouTube CEO Susan Wojcicki’s letter to creators about the importance of an open platform, the company launched a four-part series outlining its multi-pronged plan of attack against harmful content: to remove it, raise authoritative voices, reward trusted creators, and reduce the spread of content that brushes up against policy lines.
The first installment of the blog series focuses on the ‘removal’ component, an area where YouTube says it has redoubled its efforts in recent years. Many policy updates here, the company says, “are actually clarifications to our existing guidelines,” such as its rules around dangerous challenges. Entirely new policies, by contrast, can take several months to develop; during that time the company consults experts and creators and weighs regional differences, as it did with its new policy on hate speech.
The policy targets videos alleging that a group is superior in order to justify discrimination, segregation, or exclusion, as well as content claiming that well-documented events like the Holocaust or the Sandy Hook shooting never happened. Since implementing the policy in June, YouTube says it has removed 100,000 videos and 17,000 channels for hate speech, a fivefold increase on both counts over the previous quarter. Total comment removals for hate speech doubled in the second quarter to 500 million, YouTube said. The spike reflects the takedown of videos, channels, and comments that were permitted under the old rules.
While YouTube’s machine learning excels at detecting violating content that tends to look alike, such as spam or adult content, hate speech is harder to handle because it depends heavily on context and thus frequently requires human review. Even so, 87% of the 9 million videos YouTube removed last quarter were first flagged by its automated systems. And at the end of the day, YouTube notes that hate speech still represents a relatively small piece of the pie: “The nearly 30,000 videos we removed for hate speech over the last month generated just 3% of the views that knitting videos did over the same time period,” the company said.
You can check out YouTube’s most recent Community Guidelines Enforcement Report, a quarterly update on content removal released last week, right here. All told, YouTube says it removed 4 million channels and 9 million individual videos last quarter, the majority of them for spam.