A new study from Firefox developer Mozilla suggests that YouTube’s video moderation tools are ineffective, as the site continues to recommend videos you’re not interested in.
The way it’s supposed to work is that users have multiple tools to teach YouTube’s cryptic algorithm what they don’t want to watch: the Dislike button, the Do not recommend channel option, and the ability to remove videos from your watch history. But according to the Mozilla study, users still get these “bad recommendations.” At best, YouTube’s tools reduce unwanted videos by almost half. In the worst case, YouTube does the opposite: it increases the number of unwanted videos you see.
The full 47-page study can be found on the Mozilla website, where it details the researchers’ methodology, how the organization obtained the data, its findings, and what it recommends YouTube do.
The study drew on more than 22,000 volunteers who downloaded Mozilla’s RegretsReporter browser extension, which lets users track recommendations on YouTube and file reports for researchers. Via RegretsReporter, the researchers analyzed over 500 million videos.
According to the findings, YouTube’s tools are inconsistent. 39.3% of participants saw no change in their recommendations. One user, identified as Participant 112 in the study, used the moderation tools to stop getting medical videos on their account, only to be flooded with them a month later. 23% reported a mixed experience: for this group, unwanted videos disappeared for a while before reappearing soon after. And 27.6% of respondents said they stopped getting bad recommendations after using the moderation tools.
The most effective standalone tool turned out to be the Do not recommend channel option, which reduced unwanted recommendations by around 43%. The Not interested option and the Dislike button were the worst, stopping only 11% and 12% of unwanted videos, respectively.
The researchers also found that people change their behavior to manage recommendations. In the study, users said they would change their YouTube settings, use a different account, or avoid watching certain videos so as not to get more of them. Others would use VPNs and privacy extensions to help keep their feeds clean.
At the end of the study, Mozilla’s researchers offer their own recommendations on how YouTube should change its algorithm, with an emphasis on transparency. They want the controls to be easier to understand, they want YouTube to listen to user feedback more often, and they ask the platform to be more open about how its algorithm works.
In response, a YouTube spokesperson made a statement to The Verge criticizing the study, saying the researchers did not take into account how the “systems actually work” and did not understand how the tools function. Apparently, the moderation tools don’t block an entire topic, just the specific video or channel. By the researchers’ own admission, the study “is not a representative sample of YouTube’s user base,” but it does provide some insight into user frustration.
That said, YouTube’s algorithm and the changes around it have drawn considerable ire from users. When YouTube removed the dislike counter from the site, people created browser extensions just to restore it. Furthermore, there are claims that YouTube capitalizes on controversial content to increase engagement. Assuming Mozilla’s data is correct, unwanted recommendations may be a by-product of the platform pushing content people don’t want in order to get more views.
If you’re interested in learning more about YouTube, check out this story of malware being spread through gaming videos.