New study claims that YouTube’s “dislike” has almost no effect

According to research by Mozilla, YouTube’s dislike button does not prevent the majority of similar recommendations


by Fabio Ferreira

According to a recent Mozilla study, even when viewers tell YouTube that they are not interested in certain kinds of content, similar recommendations keep appearing.

The study was carried out to learn more about how search engines, social media, and other platforms shape what people see as they browse online.

Mozilla, the company behind the Firefox web browser, discovered that YouTube continues to promote videos to users who have stated they are not interested in them.

Using data on video recommendations from more than 20,000 YouTube viewers, Mozilla researchers found that buttons such as “don’t recommend channel,” “remove from watch history,” “dislike,” and “not interested” do little to change the recommendations a user receives. Even at their best, these buttons let through more than half of the recommendations similar to content users had said they were not interested in, according to the analysis. At their worst, the buttons barely made a difference in keeping similar videos out of users’ feeds.

To collect data from real videos and users, Mozilla researchers recruited volunteers who used the foundation’s RegretsReporter, a browser extension that overlays a generic “stop recommending” button on YouTube videos the participants watch. On the back end, users were randomly assigned to groups, so a different signal was sent to YouTube each time they clicked the Mozilla-placed button: dislike, not interested, don’t recommend channel, or remove from history, plus a control group for which no feedback reached the platform.
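To illustrate the design, here is a minimal Python sketch of how such a randomized feedback experiment could be wired up. This is not RegretsReporter’s actual code; the arm names, the seeding scheme, and the `send_signal_to_youtube` helper are all assumptions for illustration.

```python
import random

# Hypothetical sketch of the study's random-assignment design: each
# participant is placed in one feedback arm, and every click of the
# overlaid button sends that arm's signal (or nothing, for control).

FEEDBACK_ARMS = [
    "dislike",
    "not_interested",
    "dont_recommend_channel",
    "remove_from_history",
    "control",  # button click sends no signal to YouTube
]

def assign_arm(user_id: str) -> str:
    """Deterministically assign a participant to one feedback arm."""
    rng = random.Random(user_id)  # seed on the user ID for stability
    return rng.choice(FEEDBACK_ARMS)

def send_signal_to_youtube(signal: str, video_id: str) -> None:
    # Placeholder for the real network call the extension would make.
    print(f"sending {signal!r} for video {video_id}")

def on_stop_recommending_clicked(user_id: str, video_id: str) -> None:
    arm = assign_arm(user_id)
    if arm == "control":
        return  # record the click locally, but send nothing
    send_signal_to_youtube(arm, video_id)

if __name__ == "__main__":
    on_stop_recommending_clicked("user-123", "abc123")
```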

Using data from more than 500 million recommended videos, research assistants assembled over 44,000 pairs of videos, each consisting of one “rejected” video and one video YouTube subsequently recommended. The researchers then assessed the pairs themselves or used machine learning to decide whether the recommendation was too similar to the video the user had rejected.
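Mozilla’s study used human review and a trained similarity model for this step; as a rough stand-in, the sketch below uses a simple word-overlap (Jaccard) score over video titles to flag a recommended video as too similar to a rejected one. The threshold and the `is_bad_recommendation` helper are illustrative assumptions, not Mozilla’s method.

```python
# Minimal sketch of the pair-evaluation step: a pair is flagged as a
# "bad recommendation" when the recommended video looks too much like
# the one the user rejected.

def jaccard(a: str, b: str) -> float:
    """Word-level Jaccard similarity between two video titles."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def is_bad_recommendation(rejected_title: str,
                          recommended_title: str,
                          threshold: float = 0.5) -> bool:
    # Threshold is an assumption; the study used a trained model.
    return jaccard(rejected_title, recommended_title) >= threshold

pairs = [
    ("gun assembly tutorial part 1", "gun assembly tutorial part 2"),
    ("gun assembly tutorial part 1", "sourdough bread for beginners"),
]
for rejected, recommended in pairs:
    print(recommended, "->", is_bad_recommendation(rejected, recommended))
```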

“We compared the recommendations YouTube made to people who indicated they were not interested in particular topics with those made to people who stated they were interested in those same topics. We discovered that YouTube continues to propose material after a user has stated that they are not interested in it, even when that video is unrelated to what they have previously watched,” Mozilla wrote in a blog post on Thursday.

Sending YouTube’s “dislike” and “not interested” signals was only “marginally effective,” preventing just 12 percent and 11 percent of unwanted recommendations, respectively, compared with the baseline control group. The “don’t recommend channel” and “remove from history” options were somewhat more successful, preventing 43 percent and 29 percent of bad recommendations, respectively. Even so, the researchers warn that the platform’s controls remain insufficient for steering unwanted content away.
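To make the percentages concrete, here is a small illustrative calculation with made-up counts: if a control-group user saw 100 bad (too-similar) recommendations, the study’s prevention rates imply roughly how many each control would still let through. Only the four rates come from the study; the baseline count is hypothetical.

```python
# Illustrative arithmetic only: apply the study's prevention rates to
# a hypothetical control-group baseline of 100 bad recommendations.

prevention_rates = {
    "dislike": 0.12,                # per the Mozilla study
    "not_interested": 0.11,
    "dont_recommend_channel": 0.43,
    "remove_from_history": 0.29,
}

baseline_bad = 100  # hypothetical control-group count
for signal, rate in prevention_rates.items():
    remaining = baseline_bad * (1 - rate)
    print(f"{signal}: ~{remaining:.0f} of {baseline_bad} "
          "bad recommendations still shown")
```

Even the strongest control in this illustration, “don’t recommend channel,” still lets through more than half of the bad recommendations, which is the study’s central point.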

The researchers used a Chrome extension to transmit the signals to YouTube, where they were processed by the platform’s recommendation algorithms.

“Even when we showed YouTube that a user had expressly said that they were not interested in a topic, it did not filter out information regarding that topic,” the report continued.

“We noticed that YouTube’s algorithm frequently promotes items a user has expressed disinterest in. This implies that, depending on a user’s interests, YouTube may expose them to information that perpetuates negative prejudices and biases,” according to Mozilla.

The researchers stated, “YouTube should respect the feedback users offer about their experience, treating it as a meaningful signal about how people wish to spend their time on the platform.”

“YouTube’s dislike feature works,” claims the company

According to YouTube spokesperson Elena Hernandez, this behavior is intentional because the platform does not attempt to block all content related to a given topic. However, Hernandez questioned the research, claiming that it fails to consider how YouTube’s dislike and recommendation controls are actually designed.

“What is most important to us is that individuals can get genuine and trustworthy information,” she explained. “As a result, rather than attempting to delete all incorrect content, our methods make it easier for individuals to access information from reliable sources.”

“Most importantly, our controls do not block out whole topics or opinions, since this might have severe consequences for viewers, such as creating echo chambers,” said Hernandez. “Academic research is encouraged on our platform, which is why we recently increased Data API access through our YouTube Researcher Program. Because Mozilla’s study does not consider how our systems truly function, we cannot gain many insights from it.”

Hernandez also claims that Mozilla’s definition of “similar” does not reflect how YouTube’s recommendation system works. According to her, the “not interested” option removes a specific video from recommendations, and the “don’t recommend channel” button prevents that channel from being recommended in the future. Neither is intended to stop all recommendations related to a particular topic, opinion, or speaker.
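Hernandez’s description suggests per-video and per-channel suppression rather than topic-level filtering. The sketch below models that distinction under those stated assumptions; the data structures and the `filter_recommendations` function are hypothetical, not YouTube’s implementation.

```python
# Hypothetical model of the scoping Hernandez describes: "not
# interested" suppresses a single video, "don't recommend channel"
# suppresses everything from one channel, and no control filters a topic.

from dataclasses import dataclass, field

@dataclass
class Video:
    video_id: str
    channel_id: str
    topic: str

@dataclass
class UserFeedback:
    not_interested_videos: set = field(default_factory=set)
    blocked_channels: set = field(default_factory=set)

def filter_recommendations(candidates, fb: UserFeedback):
    return [
        v for v in candidates
        if v.video_id not in fb.not_interested_videos
        and v.channel_id not in fb.blocked_channels
        # note: no check on v.topic -- topics are never filtered out
    ]

candidates = [
    Video("v1", "ch1", "politics"),
    Video("v2", "ch2", "politics"),
    Video("v3", "ch1", "cooking"),
]
fb = UserFeedback(not_interested_videos={"v1"}, blocked_channels={"ch2"})
print([v.video_id for v in filter_recommendations(candidates, fb)])  # ['v3']
```

Under this model, a video on a rejected topic from a new channel would still get through, which is consistent with both Mozilla’s findings and YouTube’s stated design.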

Other platforms, such as TikTok and Instagram, have introduced a growing number of feedback tools for users to train the algorithm, ostensibly so it serves them appropriate content. TikTok, for example, offers a Family Safety Mode feature that requires users to enter their birth date, though it is unclear whether this is used to filter content or simply to ensure that users are old enough to register for the app. Even so, people frequently complain that similar recommendations keep appearing after they have indicated they do not want to see something.

According to Mozilla researcher Becca Ricks, it is not always apparent what the various controls actually do, and platforms are not always transparent about how feedback is taken into account.

“In the case of YouTube, I think the platform strikes a balance between user engagement and user satisfaction, which is ultimately a tradeoff between promoting content that encourages users to spend more time on the site and content that the algorithm predicts users would appreciate,” Ricks said. “While the platform can adjust which of these signals receive the greatest weight in its algorithm, our research reveals that user feedback may not always be the most significant.”

YouTube has not yet announced any plans to change how its feedback controls shape recommendations.

#youtube

Fabio Ferreira
Tech lead and Talent Acquisition Specialist. Helping SaaS companies and scrappy startups. Nothing makes me happier than meeting new people, building relationships, solving problems, and helping businesses succeed.