Earlier this year, the presence of YouTube videos from extremist groups — and the fact that ads were running before those videos — triggered a large-scale controversy that continues to reverberate across the online video community. While YouTube has already pledged to safeguard its brand partners from any association with content that contains hate speech or incites terrorism, its parent company has announced four additional steps it will take to curb the spread of those clips. In a blog post, Google discusses the ways it is using its technology and human resources to quash extremist videos.
The blog post, authored by Google general counsel Kent Walker, reaffirms the commitment of both Google and YouTube to the battle against extremism. “While we and others have worked for years to identify and remove content that violates our policies,” reads the post, “the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now.”
The first step Google discusses is a continued effort to train systems like Content ID to better identify terrorist videos. To prevent informative or educational videos from being improperly flagged, the tech giant says it is applying its “most advanced machine learning research” to this task.
For videos that contain dicey rhetoric but do not violate YouTube’s terms of service, Google will add warnings that it hopes will discourage potential viewers without explicitly curbing free speech. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints,” reads the blog post.
Google is also empowering two groups to counter the presence of extremists on its platforms. Its so-called “Trusted Flaggers,” whose reports on rule-breaking content are usually accurate, will receive operational grants to further their work. Creators will have a chance to do their part as well: an expansion of the Creators for Change program will give partners the chance to make counter-terrorist videos that debunk extremist viewpoints. These PSAs will be surfaced alongside terrorist content in order to shield viewers from radical propaganda.
Given the amount of content uploaded to YouTube every day, it is hard to believe Google can completely eliminate the presence of groups like ISIS among its users. That said, the tech giant has a wide array of resources at its disposal, and proper application of those tools will hopefully curb the presence of extremism on YouTube and beyond.