News

Google Announces Four More Steps It's Taking To Fight Extremist Content On YouTube

Earlier this year, the presence of YouTube videos from extremist groups — and the fact that ads were running before those videos — triggered a large-scale controversy that continues to reverberate across the online video community. While YouTube has already pledged to safeguard its brand partners from any association with content that contains hate speech or appeals to terrorism, its parent company has announced four additional steps it will take to curb the spread of those clips. Google has authored a blog post in which it discusses the ways it is using its technology and human resources to quash extremist videos.

The blog post, authored by Google general counsel Kent Walker, reaffirms the commitment of both Google and YouTube to the battle against extremism. “While we and others have worked for years to identify and remove content that violates our policies,” reads the post, “the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now.”

The first solution Google discusses is a continued effort to train systems like Content ID to better identify terrorist videos. In order to prevent informative or educational videos from being improperly flagged, the tech giant says it is applying its “most advanced machine learning research” to this task.


For videos that contain dicey rhetoric but do not violate YouTube’s terms of service, Google will install warnings that it hopes will discourage potential viewers without explicitly curbing free speech. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints,” reads the blog post.

Google is also empowering two groups to counter the presence of extremists on its platforms. Its so-called “Trusted Flaggers,” whose reports on rule-breaking content are usually accurate, will receive operational grants to further their work. Creators will have a chance to do their part as well; an expansion of the “Creators For Change” program will give partners the chance to make counter-terrorist videos that debunk extremist viewpoints. These PSAs will be linked to terrorist content in order to shield viewers from radical propaganda.

Given the amount of content that is uploaded to YouTube on a daily basis, it is hard to believe Google can completely eliminate the presence of groups like ISIS among its users. That said, the tech giant has a wide array of resources at its disposal, and proper application of those tools will hopefully curb the presence of extremism on YouTube and beyond.

Published by
Sam Gutelle
