YouTube’s recommendation algorithm has been one of the most talked-about stories in the online video industry over the past year, and part of its fascination is how opaque it is. As Tom Scott explained in a May 2017 video, the algorithm is hard to figure out because its inner workings can’t be inspected from the outside. We’ve published a few attempts to reverse engineer the YouTube algorithm, but there’s still more to learn.
Enter Guillaume Chaslot. Chaslot is a former YouTube engineer who worked on the video site’s algorithm before leaving Google entirely in 2013. Now, he has built a website called AlgoTransparency, which attempts to uncover which videos YouTube’s recommendation algorithm promotes when it is fed some common searches.
To create AlgoTransparency, Chaslot chose a few search strings — things like “Donald Trump,” “dinosaurs,” and “Parkland shooting” — and built a program that feeds those strings into YouTube. AlgoTransparency then lists the clips that show up most often as recommended videos for each of those subjects. “Everybody should know, if you start to spend time on YouTube, where it’s going to take you,” Chaslot told Technology Review.
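The core of that approach is a frequency tally: for each seed query, count how often each video surfaces as a recommendation across many crawls. The sketch below illustrates that counting step with invented sample data — the query, video titles, and function name are all hypothetical stand-ins, not AlgoTransparency’s actual code, which would pull real recommendation lists from YouTube.

```python
from collections import Counter

# Hypothetical stand-in for real crawler output: for each seed query,
# the recommended-video titles observed on successive crawls.
# (These titles are invented for illustration only.)
crawl_results = {
    "dinosaurs": [
        ["Video A", "Video B"],
        ["Video B", "Video C"],
        ["Video B", "Video A"],
    ],
}

def most_recommended(results, top_n=3):
    """Tally how often each video appears as a recommendation
    for each seed query, and return the top_n per query."""
    rankings = {}
    for query, runs in results.items():
        counts = Counter(video for run in runs for video in run)
        rankings[query] = counts.most_common(top_n)
    return rankings

print(most_recommended(crawl_results))
# "Video B" appears in all three crawls, so it ranks first for "dinosaurs".
```

The interesting part of the real system is not the tally but the crawling: repeatedly following YouTube’s “up next” suggestions from a seed query, which is what surfaces the videos the algorithm keeps steering viewers toward.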
Some of the results are expected. Others, such as this Caillou video being recommended for “global warming” searches, are odd. Some of AlgoTransparency’s findings, however, are downright concerning. On searches for “vaccines facts,” the video below, which promotes anti-vaccine conspiracy theories, appears most often as a recommendation:
At the very least, Chaslot’s website is a fun place to poke around for a few minutes. You can read more about its methodology here.