The YouTube Radicalization Pipeline Exists, And It’s Driving Users Toward Increasingly Alt-Right Content (Study)

08/26/2019

A new study out of Cornell University has found “strong evidence for radicalization among YouTube users,” and concludes that viewers who consume “mild” radical right-wing content (it cites Joe Rogan, a YouTuber with a talk show and 6.1 million subscribers, as a creator of such “mild” content) often migrate to viewing much more radical alt-right content.

The study was prompted by ongoing concerns that YouTube offers a kind of “pipeline” that funnels viewers toward increasingly disturbing white supremacist content.

Researchers Manoel Horta Ribeiro, Raphael Ottoni, Robert West, Virgílio A. F. Almeida, and Wagner Meira looked at English-language radical content uploaded to YouTube from around the world, and divided that content into three types, produced by three communities. To examine how this growing body of radical content moves across YouTube, they examined 331,849 videos with a collective total of more than 79 million comments.

At the mildest end, there’s content produced by the Intellectual Dark Web (IDW), which is populated by people who believe they’re shut out of the mainstream media because they hold “controversial” views (often meaning Islamophobic, transphobic, and sexist views, the study says).

Then, in the middle, there’s the alt-lite, which the Cornell researchers class as people who “constantly flirt with” espousing white supremacist ideals (including popularizing conspiracy theories like the anti-Semitic concept that a collective of “lizard people” secretly runs the world), but who deny being white supremacists outright.

Both of these types of content are deemed “gateways” to the final, most extreme type, per the study: alt-right content. That content is made by people who are openly white supremacist, and who admit to wanting things like a state in which white people live separately from people of color. The researchers cite Richard Spencer, a white nationalist who’s perhaps most famous for taking a solid punch to the face, as an example of the alt-right.

All three types of content began “skyrocketing” — in the number of videos uploaded, the number of views and likes received, and, most importantly, the number of comments left — during the leadup to the 2016 election, the researchers say.

Processing the comments showed “that the three communities increasingly share the same user base,” the study concludes, which means it’s becoming more and more difficult to draw lines between the three factions, since many of the same users are watching the same YouTube content and expressing the same views.

Even more crucially, though, the study showed “that users consistently migrate from milder to more extreme content,” and that “a large percentage of users who consume alt-right content now consumed alt-lite and IDW content in the past.” That backs up claims that YouTube is functioning as a “pipeline” for extremism.
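To make that migration claim concrete, here’s a minimal sketch of how such a pattern could be detected from comment data. Everything in it (the toy comment log, the single yearly cutoff, the community labels) is an invented simplification for illustration; the study’s actual methodology operates on tens of millions of real comments.

```python
from collections import defaultdict

# Toy comment log of (user, community, year) records. The data, labels, and
# single yearly cutoff are illustrative assumptions, not the study's method.
comments = [
    ("u1", "IDW", 2016), ("u1", "alt-lite", 2017), ("u1", "alt-right", 2018),
    ("u2", "alt-lite", 2016), ("u2", "alt-lite", 2018),
    ("u3", "alt-right", 2016), ("u3", "alt-right", 2018),
]

def communities_by_window(records, cutoff):
    """Split each user's commenting history into before/after community sets."""
    before, after = defaultdict(set), defaultdict(set)
    for user, community, year in records:
        (before if year < cutoff else after)[user].add(community)
    return before, after

before, after = communities_by_window(comments, cutoff=2018)

# A "migrated" user comments on alt-right content now, but earlier commented
# only on the milder IDW or alt-lite communities.
migrated = [
    user for user, now in after.items()
    if "alt-right" in now
    and before.get(user)                     # had earlier activity...
    and before[user] <= {"IDW", "alt-lite"}  # ...only in milder groups
]
print(migrated)  # -> ['u1']
```

On real data, this kind of before-and-after comparison is what lets researchers say that a large percentage of today’s alt-right commenters started out in the milder communities.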

But does its recommendation algorithm play a part in that pipeline? Researchers also looked at more than 2 million videos and 10,000 channels recommended by YouTube’s algorithm. They concluded that “YouTube’s recommendation algorithms frequently suggest alt-lite and IDW content,” but noted that extreme alt-right videos are not recommended by YouTube’s algorithm. Instead, alt-right content is linked to IDW and alt-lite content through channel recommendations, where viewers of a channel are recommended other channels based on similarity of content.
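The distinction between video and channel recommendations matters, because channel suggestions form a graph that can be walked. As a rough illustration (with entirely invented channel names and edges, not data from the study), here’s a sketch showing how alt-right channels can sit only a few hops away from IDW or alt-lite channels even when no video recommendation points at them directly.

```python
from collections import deque

# Hypothetical channel-recommendation graph: each channel maps to the channels
# suggested alongside it. Names and edges are invented for illustration.
channel_recs = {
    "idw_channel": ["altlite_channel"],
    "altlite_channel": ["altright_channel"],
    "altright_channel": [],
}

def reachable(graph, start):
    """Breadth-first walk over recommendation edges from a starting channel."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# No edge points from IDW straight to alt-right, yet two hops get there.
print(reachable(channel_recs, "idw_channel"))
# -> {'idw_channel', 'altlite_channel', 'altright_channel'}
```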

“This is, to our knowledge, the first large-scale quantitative audit of user radicalization on YouTube,” researchers said in the study’s write-up. “We find strong evidence for radicalization among YouTube users,” they added, and suggest “appropriate measures” need to be taken.

It’s worth noting YouTube has taken steps to suppress the spread of alt-right content, including, most recently, updating its policies to ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion,” as well as videos claiming that events like the Holocaust and the Sandy Hook shooting didn’t happen. (Those are common conspiracy theories traded among alt-righters.)

This study comes on the heels of another, published earlier this month, which chronicled YouTube’s part in the radicalization of Brazilian citizens, including the spread of right-wing, conspiracy-theory-themed videos from the YouTuber who would later become Brazil’s current president, Jair Bolsonaro.

You can read the Cornell study in its entirety here.
