YouTube’s “Dislike” button doesn’t do what you think

YouTube says its systems are working as intended. “The Mozilla report doesn’t take into account how our systems actually work, so it’s hard for us to gather many insights,” says Elena Hernandez, a YouTube spokesperson. Hernandez says YouTube gives users control over their recommendations, including “the ability to block suggesting a video or channel for them in the future.”

Where Mozilla and YouTube differ in interpreting how well these “don’t recommend” controls work seems to revolve around similarity of topic, person, or content. YouTube says that asking its algorithm not to recommend a video or channel stops the algorithm from recommending only that particular video or channel; it does not affect a user’s access to a given topic, opinion, or speaker. “Our controls don’t filter entire topics or perspectives, as this can have negative effects on viewers, such as creating echo chambers,” Hernandez says.
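The distinction YouTube draws can be made concrete with a small sketch. This is purely illustrative Python, not YouTube’s actual code or API; the function and field names are invented. It shows item-level suppression as YouTube describes it: blocking one video leaves other videos on the same topic recommendable.

```python
# Hypothetical sketch of the behavior YouTube describes: "don't recommend"
# suppresses only the exact video or channel, never the surrounding topic.
# All names and data here are illustrative assumptions.

def filter_recommendations(candidates, blocked_videos, blocked_channels):
    """Drop only exact video/channel matches; same-topic items pass through."""
    return [
        video for video in candidates
        if video["id"] not in blocked_videos
        and video["channel"] not in blocked_channels
    ]

candidates = [
    {"id": "a1", "channel": "ChanX", "topic": "gaming"},
    {"id": "b2", "channel": "ChanY", "topic": "gaming"},
    {"id": "c3", "channel": "ChanZ", "topic": "cooking"},
]

# Blocking video "a1" does not touch "b2", even though both are gaming videos.
kept = filter_recommendations(candidates, blocked_videos={"a1"},
                              blocked_channels=set())
```

Under this model, a user who dislikes one gaming video keeps seeing gaming recommendations, which is exactly the behavior Mozilla’s participants found surprising.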

That claim isn’t entirely supported by the public data and research YouTube has published about its recommendation systems, says Jesse McCroskey, a data scientist who worked with Mozilla on the study. “We have some small glimpses into the black box,” he says, which show that YouTube broadly considers two types of feedback: on one side, engagement signals, such as how long users spend on YouTube and how many videos they watch; on the other, explicit feedback, including dislikes. “They have some balance in the degree to which they respect those two types of feedback,” McCroskey says. “What we saw in this study is that the weight toward engagement is quite strong, and that other types of feedback get very little respect.”
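The imbalance McCroskey describes can be sketched as a weighted score. This is a toy model under stated assumptions, not YouTube’s actual ranking function; the weights and signal names are invented for illustration. The point is only that when explicit feedback carries a small weight, engagement can drown out a dislike.

```python
# Toy ranking score blending the two feedback types McCroskey describes:
# engagement (e.g. predicted watch time) and explicit feedback (a dislike).
# The weights are illustrative assumptions, chosen to show how a heavy
# engagement weight lets a disliked video still rank highly.

def score(watch_minutes, disliked, w_engagement=1.0, w_explicit=0.1):
    engagement = watch_minutes           # stand-in for an engagement signal
    explicit = -1.0 if disliked else 0.0  # a dislike is a negative signal
    return w_engagement * engagement + w_explicit * explicit

# A disliked but highly engaging video still outscores a neutral,
# less-watched one when explicit feedback is weighted lightly.
disliked_but_engaging = score(watch_minutes=30, disliked=True)
neutral_short_watch = score(watch_minutes=10, disliked=False)
```

With `w_explicit` raised to, say, `50.0`, the ordering flips, which is the kind of rebalancing toward user feedback that Mozilla’s findings implicitly argue for.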

Robin Kaplan, a senior researcher at Data & Society, a New York nonprofit that has previously investigated YouTube’s algorithm, says the distinction between what YouTube says about its algorithms and what Mozilla found is important. “Some of these findings don’t contradict what the platform says, but they show that users don’t have a good understanding of which features exist to let them control their experience versus which features exist to provide feedback to creators,” she says. Kaplan welcomes the study and its findings, saying that while they may be more muted than the researchers had hoped, they highlight an important problem: users are confused about how much control they have over their YouTube recommendations. “This research speaks to the broader need to regularly poll users about site features,” Kaplan says. “If these feedback mechanisms aren’t working as intended, it could turn people off.”

Confusion over the intended function of these controls is a central theme of the second part of Mozilla’s study: a follow-up qualitative survey of roughly one-tenth of the people who installed the RegretsReporter extension and took part in the research. Those Mozilla spoke to understood that their input targeted specific videos and channels, but expected it to inform YouTube’s recommendation algorithm more broadly.

“I thought this was an interesting finding because it reveals that these people are saying, ‘It’s not just me telling you I’ve blocked this channel. This is me trying to exercise more control over the other kinds of recommendations I’ll get in the future,’” says Rex. In its report, Mozilla recommends that YouTube give users more options to proactively shape their own experience by setting their own content preferences, and that the company do a better job of explaining how its recommendation systems work.

For McCroskey, the primary issue is the gap between what YouTube’s feedback tools signal to users and what those tools actually do. “There is a disconnect in the degree to which they respect those signals,” he says.