Mozilla Is Trying To Dissect YouTube’s Recommendation Algorithm Using Data Donated By Viewers

09/18/2020

Over the past year, the Mozilla Foundation, the open-source-software-focused nonprofit that developed the Firefox browser, has been conducting research into YouTube's oft-criticized content recommendation algorithm, trying to understand why it drives users toward increasingly extremist content on harmful topics like conspiracy theories and white supremacy. Mozilla has also met with YouTube to urge the platform to be more transparent about how its machine-learning systems decide which videos users should watch.

But according to Ashley Boyd, Mozilla’s VP of advocacy and engagement, little has changed. “Despite the serious consequences, YouTube’s recommendation algorithm is entirely mysterious to its users,” she said in a statement.

That’s why Mozilla’s next step is dissecting the algorithm by collecting data directly from users. The foundation this week launched RegretsReporter, a browser extension through which users can donate general data about their YouTube usage, as well as detailed reports about recommended videos.

When a Firefox or Chrome user installs the extension, it runs in the background, automatically gathering data about how much time they spend on YouTube. RegretsReporter will not collect any search or watch history data unless and until the user decides to file a report.
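
For the technically curious, the kind of background time tracking Mozilla describes could look roughly like the sketch below, written against the standard WebExtensions tabs API. The variable names and the time-accounting logic are illustrative assumptions, not RegretsReporter's actual code:

```typescript
// background.ts -- illustrative sketch only; not RegretsReporter's implementation.
// Assumes the standard WebExtensions API via the webextension-polyfill package,
// with the "tabs" permission declared in the extension manifest.
import browser from "webextension-polyfill";

let activeYouTubeSince: number | null = null; // when a YouTube tab last became active
let totalMsOnYouTube = 0;                     // accumulated time spent on YouTube

function isYouTube(url: string | undefined): boolean {
  if (!url) return false;
  return new URL(url).hostname.endsWith("youtube.com");
}

async function onFocusChange(): Promise<void> {
  const [tab] = await browser.tabs.query({ active: true, currentWindow: true });
  const now = Date.now();
  // Close out any running YouTube interval.
  if (activeYouTubeSince !== null) {
    totalMsOnYouTube += now - activeYouTubeSince;
    activeYouTubeSince = null;
  }
  // Start a new interval only for non-private YouTube tabs, mirroring the
  // stated policy of collecting nothing from incognito windows.
  if (tab && isYouTube(tab.url) && !tab.incognito) {
    activeYouTubeSince = now;
  }
}

browser.tabs.onActivated.addListener(onFocusChange);
browser.tabs.onUpdated.addListener(onFocusChange);
```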

Mozilla says reports should be filed whenever a user is recommended a concerning video. When they see a "regrettable" video, they can click the extension and submit a report explaining why it was a bad recommendation. (The form offers reasons like "It was untrue," "It was offensive," and "It was bizarre," along with a write-in option.) RegretsReporter will then scrape up to five hours of the user's YouTube data, backtracking through their viewing activity to understand why they were served that particular video.
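
Speaking illustratively again, that five-hour backtrack could be modeled as a simple filter over view events the extension has recorded locally. The report shape and field names below are assumptions made for the example, not Mozilla's actual schema:

```typescript
// Hypothetical report structure; every field name here is an assumption.
interface ViewEvent {
  videoId: string;
  recommendedFrom?: string; // the watch page on which this video was recommended
  watchedAt: number;        // epoch milliseconds
}

const FIVE_HOURS_MS = 5 * 60 * 60 * 1000;

// Keep only the events from the five hours preceding the report, matching
// Mozilla's description of scraping "up to five hours" of viewing activity.
function buildReport(
  history: ViewEvent[],
  regrettedVideoId: string,
  reason: string, // e.g. "It was untrue", "It was offensive", or a write-in
): { regrettedVideoId: string; reason: string; trail: ViewEvent[] } {
  const cutoff = Date.now() - FIVE_HOURS_MS;
  return {
    regrettedVideoId,
    reason,
    trail: history.filter((event) => event.watchedAt >= cutoff),
  };
}
```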

Mozilla will look for usage patterns that might lead to “regrettable” recommendations

Mozilla intends to study the data to see if there are "specific YouTube usage patterns" that lead to users being recommended "racist, violent, or conspiratorial content," Boyd said. "With this information, Mozilla, along with fellow researchers, journalists, policymakers, and even engineers within YouTube, can work towards building more trustworthy systems for recommending content."

Users who contribute their data to Mozilla will not be identified by their actual account names; instead, each will be assigned a randomly generated user ID to protect their privacy, Boyd said. She explained that Mozilla plans to share the results of its research publicly, and will disclose them "in a way that minimizes the risk of users being identified."
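
That pseudonymization step is easy to picture: generate a random identifier once per install, store it locally, and never derive it from the account. A minimal sketch, assuming the WebExtensions storage API and the browser-standard crypto.randomUUID() (the function and key names are hypothetical):

```typescript
import browser from "webextension-polyfill";

// Illustrative only: a random, stable pseudonym created once per install and
// never derived from the user's YouTube account. Requires the "storage" permission.
async function getOrCreateUserId(): Promise<string> {
  const stored = await browser.storage.local.get("userId");
  if (typeof stored.userId === "string") return stored.userId;
  const userId = crypto.randomUUID(); // standard Web Crypto in modern browsers
  await browser.storage.local.set({ userId });
  return userId;
}
```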

RegretsReporter does not gather data about users’ browsing activity outside YouTube, and doesn’t gather any data when users are in an incognito/private browser window, Mozilla says.

The foundation urges anyone who installs the extension to not alter the way they use YouTube. “Don’t seek out regrettable content,” Boyd said. “Instead, use YouTube as you normally do. That is the only way that we can collectively understand whether YouTube’s problem with recommending regrettable content is improving, and which areas they need to do better on.”

You can find more information about RegretsReporter on Mozilla's website.
