This Is Why Researchers Studying Radicalization On YouTube Need To Be Logged In

01/29/2020

At the 2020 ACM Conference on Fairness, Accountability, and Transparency, a team of researchers from two universities presented their conclusion that there is a “radicalization pipeline” on YouTube, shuffling viewers from mildly alt-right content to extreme alt-right content.

“Non-profits, as well as the media, have hypothesized the existence of a radicalization pipeline on YouTube, claiming that users systematically progress towards more extreme content on the platform,” the researchers, from Switzerland’s Ecole polytechnique fédérale de Lausanne and the Federal University of Minas Gerais in Brazil, wrote in their study report. “Yet, there is to date no substantial quantitative evidence of this alleged pipeline.”

To investigate the existence of a pipeline, researchers pulled a selection of 330,925 videos uploaded to 349 channels. Then they divided the videos into four classes: media (not alt-right at all), alt-lite (videos that contain alt-right ideology, but not to extremes), the Intellectual Dark Web (more extreme content, commonly posted by people who believe they’re shut out of mainstream media because they have “controversial” views), and straight-up alt-right.

They focused primarily on scraping the comments (more than 72 million of them) from the videos to see if users were watching content from multiple categories, and if they progressed from watching alt-lite to extreme alt-right. By “processing 72M+ comments, we show that the three channel types indeed increasingly share the same user base,” they wrote. They also found that users “consistently migrate from milder to more extreme content” and that “a large percentage of users who consume alt-right content now consumed alt-lite and I.D.W. content in the past.” Additionally, they concluded there was a low overlap between general media content and alt-right content.
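
In rough terms, that analysis comes down to noting when each commenter first shows up under each category of channel. Below is a minimal illustrative sketch of the idea, not taken from the paper; the sample records, the category labels, and the counting logic are all assumptions, meant only to show how commenter overlap and milder-to-more-extreme migration could be measured from timestamped comments.

```python
# Illustrative sketch only, not the researchers' actual pipeline.
# Assumes hypothetical comment records of (user ID, Unix timestamp, channel category).
from collections import defaultdict

comments = [
    ("u1", 1514764800, "alt-lite"),
    ("u1", 1546300800, "alt-right"),
    ("u2", 1514764800, "media"),
]

# For each user, record the earliest time they commented in each category.
first_seen = defaultdict(dict)  # user_id -> {category: earliest timestamp}
for user, ts, category in comments:
    if category not in first_seen[user] or ts < first_seen[user][category]:
        first_seen[user][category] = ts

# Overlap: how many alt-right commenters also commented on alt-lite videos?
alt_right_users = {u for u, cats in first_seen.items() if "alt-right" in cats}
overlap = sum(1 for u in alt_right_users if "alt-lite" in first_seen[u])
print(f"alt-lite overlap among alt-right commenters: {overlap}/{len(alt_right_users)}")

# Migration: how many of them commented on alt-lite *before* their first alt-right comment?
migrated = sum(
    1
    for u in alt_right_users
    if "alt-lite" in first_seen[u]
    and first_seen[u]["alt-lite"] < first_seen[u]["alt-right"]
)
print(f"commented on alt-lite before alt-right: {migrated}/{len(alt_right_users)}")
```

In the actual study, of course, the channel categories were assigned by the researchers rather than arriving pre-labeled as they do here, and the dataset ran to more than 72 million comments rather than a handful of toy records.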

Probing that hard comment data gave them a list of users they were able to follow from video to video. However, the researchers also drew conclusions about how YouTube’s recommendation algorithm functioned in moving those users along, and those conclusions are less concrete, considering the researchers didn’t use a logged-in account while conducting their study.

As we mentioned while covering another recent study, one that unusually argued YouTube’s recommendation algorithm “actively discourages viewers from visiting radicalizing or extremist content,” researchers who trawl the platform without a logged-in account simply cannot get a true picture of the part YouTube’s algorithm could play in radicalization. Oft-cited anecdotes from YouTube viewers who experienced radicalization rely on those viewers having had accounts, with YouTube allegedly recommending them more and more radical content based on what they’d watched.

Bottom line: YouTube’s recommendation algorithm is unable to recommend if it can’t track a user’s watch history, so this study, and the previous one, are fatally flawed when it comes to drawing conclusions about recommendations.

In their writeup, researchers noted this challenge (“Importantly, our recommendations do not account for personalization.”), but nevertheless concluded, “Still, even without personalization, we were still able to find a path in which users could find extreme content from large media channels.” But that path is not necessarily one a logged-in user, or another non-account user, would encounter.

Researcher Manoel Horta Ribeiro, who presented the study at ACM FAT, faced questions about this aspect of the study during his post-presentation Q&A, TechCrunch reports.

“In a sense, I do agree that it’s very hard to make the claim that the radicalization is due to YouTube or due to some recommender system or that the platform is responsible for that,” he said onstage. He added that there is “solid evidence toward radicalization because people that were not exposed to this radical content become exposed.”

But, he said, “It’s hard to make strong causal claims, like YouTube is responsible for that.”

Whether or not YouTube’s recommendation algo funnels people toward alt-right content, the platform has a part in disseminating alt-right content, Horta Ribeiro believes.

“We do find evident traces of user radicalization, and I guess the question asks, ‘Why is YouTube responsible for this?’” he said. “And I guess the answer would be because many of these communities, they live on YouTube and they have a lot of their content on YouTube, and that’s why YouTube is so deeply associated with it.”

You can see the published paper here.
