
YouTube’s ‘Dislike’ Button Doesn’t Do What You Think

YouTube says its systems are working as intended. “Mozilla’s report doesn’t take into account how our systems actually work, so it’s difficult for us to glean many insights,” said YouTube spokesperson Elena Hernandez, adding that viewers have control over their recommendations. This includes “the ability to block a video or channel from being recommended to them in the future.”

Where Mozilla and YouTube differ is in their interpretations of what a successful “don’t recommend” input looks like, and whether it should extend to similar topics, people, or content. YouTube says that if you ask its algorithm not to recommend a video or channel, it will simply stop recommending that particular video or channel; it will not affect a user’s access to a specific topic, opinion, or speaker. “Our controls don’t filter out entire subjects or points of view, as this can have negative effects for viewers, such as creating echo chambers,” Hernandez says.
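As a rough illustration of the distinction Hernandez describes, consider the Python sketch below. Everything in it is hypothetical (the Video and UserControls types and the filter_candidates function are invented for this example; none of it is YouTube’s actual code): it shows per-item suppression, where blocking one channel removes its videos from a candidate pool but leaves same-topic videos from other channels untouched.

```python
# Purely illustrative sketch; hypothetical types and function names,
# not YouTube's actual code or data model.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Video:
    video_id: str
    channel_id: str
    topic: str

@dataclass
class UserControls:
    blocked_videos: set = field(default_factory=set)
    blocked_channels: set = field(default_factory=set)

def filter_candidates(candidates: list, controls: UserControls) -> list:
    """Suppress only the exact videos/channels the user blocked.

    Same-topic videos from other channels pass through, matching
    Hernandez's point that the controls don't filter whole subjects.
    """
    return [
        v for v in candidates
        if v.video_id not in controls.blocked_videos
        and v.channel_id not in controls.blocked_channels
    ]

controls = UserControls(blocked_channels={"channel_a"})
pool = [
    Video("v1", "channel_a", "politics"),  # dropped: channel is blocked
    Video("v2", "channel_b", "politics"),  # kept: same topic, other channel
]
print([v.video_id for v in filter_candidates(pool, controls)])  # ['v2']
```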

Jesse McCrosky, a data scientist working with Mozilla on the study, says YouTube’s public statements and published research leave it unclear exactly how its recommendation systems weigh user feedback. “We’ve got some small glimpses of the black box,” he says. Those glimpses suggest that YouTube broadly considers two types of feedback: engagement, such as how long users spend on YouTube and how many videos they watch, and explicit feedback, including dislikes. “They have a certain balance in the degree to which they respect those two types of feedback,” McCrosky says. “What we’ve seen in this study is that the weight toward engagement is quite exhaustive, and other types of feedback are respected quite minimally.”
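To make the weighting idea concrete, here is a minimal sketch of an engagement-dominated blend. The function, formula, and weights are all hypothetical, chosen only to illustrate McCrosky’s point, not YouTube’s actual ranking model:

```python
# Purely illustrative sketch; the weights and formula are hypothetical,
# not YouTube's actual ranking model.
def recommendation_score(watch_fraction: float,
                         disliked: bool,
                         engagement_weight: float = 0.9,
                         feedback_weight: float = 0.1) -> float:
    """Blend implicit engagement with explicit feedback.

    McCrosky's claim, loosely: when engagement_weight dominates,
    a dislike barely moves the final score.
    """
    explicit = -1.0 if disliked else 0.0
    return engagement_weight * watch_fraction + feedback_weight * explicit

# A heavily watched but disliked video still outranks a lightly
# watched one the user never disliked.
print(recommendation_score(0.9, disliked=True))   # 0.71
print(recommendation_score(0.3, disliked=False))  # 0.27
```

Under these assumed weights, a dislike shifts the score by at most 0.1, so engagement swamps it, which is the imbalance the study claims to observe.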

The difference between what YouTube says about its algorithms and what Mozilla’s research found matters, says Robyn Caplan, senior researcher at Data & Society, a New York-based nonprofit that has previously researched YouTube’s algorithm. “Some of these findings don’t contradict what the platform is saying, but they show that users don’t have a good understanding of which features exist so they can control their experiences, versus which features exist to provide feedback to content creators,” she says. Caplan welcomes the research and its findings, saying that while Mozilla’s intended slam-dunk reveal may be more muted than the researchers had hoped, it highlights a key problem: users are confused about the control they have over their YouTube recommendations. “This research speaks to the broader need to regularly survey users about the site’s features,” Caplan says. “If these feedback mechanisms don’t work as intended, it could drive people away.”

Confusion about what user inputs are meant to do is a major theme of the second part of Mozilla’s study: a follow-up qualitative survey of roughly one in ten of the people who had installed the RegretsReporter extension and taken part in the research. Those Mozilla spoke with said they understood that the inputs were specific to individual videos and channels, but expected them to inform YouTube’s recommendation algorithm more broadly.

“I thought that was an interesting theme because it reveals that these are people saying, ‘It’s not just me telling you I’ve blocked this channel. This is me trying to exert more control over the other kinds of recommendations I’m going to get in the future,’” Ricks says. In its research, Mozilla recommends that YouTube give users more options to proactively shape their own experiences by outlining their content preferences, and that the company better explain how its recommendation systems work.

For McCrosky, the main problem is the gap between what YouTube’s feedback controls signal to users and what those controls actually do. “There’s a discrepancy in the extent to which they respect those cues,” he says.