YouTube finally took down videos posted by violent neo-Nazi group Atomwaffen Division a day after the Anti-Defamation League suggested the video platform do so “immediately.”
On Monday, The Daily Beast reported on YouTube’s failure to remove violent propaganda videos published by the neo-Nazi group. Atomwaffen spouted hateful rhetoric like “gas the kikes, race war now” in its YouTube videos and has been linked to five recent murders. As YouTube told The Daily Beast, “We announced last June that we would be taking a tougher stance on videos that are borderline against our policies on hate speech and violent extremism by putting them behind a warning interstitial and removing certain features, such as recommended videos and likes.”
Atomwaffen (which means “atomic weapons” in German) more than “borderline” violated YouTube’s assertion that the platform doesn’t “permit hate speech.” Specifically, YouTube’s guidelines state, “If the primary purpose of the content is to incite hatred against a group of people solely based on their ethnicity, or if the content promotes violence based on any of these core attributes, like religion, it violates our policy.” A video calling for a “race war” undoubtedly fits into this category. Yet rather than being removed, some of Atomwaffen’s videos merely carried the warning, “The following content has been identified by the YouTube community as inappropriate or offensive to some audiences.”
Yesterday, the ADL deemed this course of action insufficient. The organization’s CEO, Jonathan Greenblatt, said in a statement, “These videos are not only disgusting racist content that has no place in our society, but they incite hatred against one religious group—in this case, Jews—therefore violating YouTube’s own Community Guidelines. YouTube should take them down immediately.”
Another concern with videos like these, beyond their content itself, is the recommended content that accompanies them. Thanks to YouTube’s algorithm, one hateful, racist video leads to another, and dangerous content can benefit from the popularity of similar material. Jonathan Albright, research director at the Tow Center for Digital Journalism, encountered this problem when he looked into conspiracy theory videos accusing students from Marjory Stoneman Douglas High School in Florida—where a mass shooting took place on February 14—of being actors bent on promoting the liberal agenda of stricter gun laws. One such conspiracy video led to a total of 9,000 similar clips on YouTube through the site’s various recommendation protocols.
YouTube appears to have reassessed its original response to concerns over neo-Nazi content being published on the video site. Having just scoured YouTube for Atomwaffen’s videos, I couldn’t find anything published by the group. A recent follow-up report from The Daily Beast confirmed that YouTube has indeed banned the neo-Nazis from its platform.