YouTube’s mysterious algorithm is once again under fire after a new crowdsourced Mozilla study found that the technology frequently recommends videos that users regret watching.
The study examined whether viewers regretted watching a YouTube video and found that 71% of regrettable videos were recommended by YouTube’s own algorithm, CNET reports. Furthermore, 9% of regrettable videos (a total of 189 clips with more than 160 million views) were later removed by YouTube for violating community guidelines or infringing copyright.
However, as noted by Casey Newton in his Platformer newsletter, recommendations account for the majority of watch-time, so it follows that they would account for the majority of regrets as well.
YouTube responded to the study by telling CNET its internal surveys show that users are satisfied with its algorithm, which has undergone 30 changes over the past year in order to reduce harmful recommendations. YouTube also said it had not been able to review the validity of Mozilla’s data or the study’s definition of ‘regrettable’.
As for their reasons for regret, participants most frequently balked at videos containing misinformation, with hate speech and sexualized content the second and third biggest points of contention. Additionally, the study found the rate of regret to be 60% higher in countries where English is not the first language, such as Brazil, Germany, and France.
The study was conducted via a Mozilla-created browser extension called RegretsReporter, available for both Chrome and Firefox, which was downloaded by 37,000 people. Participants were asked to flag videos they found regrettable from July 2020 to May 2021. Mozilla concluded, per CNET, that YouTube should set up an independent audit of its recommendation systems (a process YouTube has said it is open to) and add the ability to opt out of personalized recommendations.