Similar recommendations keep coming even when users say they aren't interested in certain types of videos

Clicking buttons like "not interested," "dislike," and "stop recommending channel" is largely ineffective at preventing similar content from being recommended: more than half of subsequent recommendations were similar to videos a user had said they weren't interested in.

To collect data from real videos, the researchers enlisted volunteers who used the foundation's browser extension. On the back end, each user was randomly assigned to a group, so a different signal was sent to the platform each time they clicked the button: "dislike," "not interested," "don't recommend channel," or, for a control group, no feedback at all.
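
The randomized design described above can be sketched as follows. This is an illustrative assumption, not Mozilla's actual implementation; the arm names and hashing scheme are invented for the example:

```python
import hashlib
from typing import Optional

# Hypothetical sketch of the randomized assignment described above: each
# volunteer is deterministically mapped to one study arm, so every click
# from that user sends the same signal to the platform.
ARMS = ["dislike", "not_interested", "dont_recommend_channel", "control"]

def assign_arm(user_id: str) -> str:
    """A stable hash of the user ID picks one of the four arms."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return ARMS[int(digest, 16) % len(ARMS)]

def signal_for_click(user_id: str) -> Optional[str]:
    """Control-group clicks send no feedback; other arms send their signal."""
    arm = assign_arm(user_id)
    return None if arm == "control" else arm
```

Deterministic assignment keeps each participant in one arm for the whole study, which is what lets the control group serve as a baseline.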

Research assistants drew on data from over 500 million recommended videos to assess over 44,000 pairs of videos, each matching a rejected video with one recommended afterward. Researchers also used machine learning to decide whether a recommendation was too similar to the video a user had rejected.

Compared with the baseline control group, the "dislike" and "not interested" signals prevented only 12 percent and 11 percent of bad recommendations, respectively. The most effective of the platform's tools, "don't recommend channel" and "remove from watch history," still prevented only 43 percent and 29 percent of bad recommendations.
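
The percentages above measure each signal against the control group. A minimal sketch of that arithmetic, using invented example rates rather than the study's raw counts:

```python
# A signal's "prevention rate" is the relative drop in bad recommendations
# versus the control group. The rates below are invented for illustration.
def prevention_rate(bad_rate_signal: float, bad_rate_control: float) -> float:
    """Fraction of bad recommendations prevented relative to control."""
    return 1.0 - bad_rate_signal / bad_rate_control

# Example: if 40% of control-group recommendations were bad but only 22.8%
# were bad in another arm, that arm's button prevented 43% of bad ones.
rate = prevention_rate(0.228, 0.40)  # ≈ 0.43
```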

Researchers say that feedback users share about their experience should be treated as signals about how they want to spend their time on the platform.

YouTube says these behaviors are intentional: the platform doesn't try to block all content related to a topic, and the report doesn't take into account how its controls are designed.

The company says its controls do not exclude entire topics or viewpoints, as this could have negative effects for viewers. It welcomes academic research on the platform, it adds, which is why it recently expanded data access through its researcher program, but it's hard to glean many insights because the report doesn't take into account how its systems actually work.

The report's definition of "similar," the company argues, fails to take into account how the recommendation system works: the "not interested" and "don't recommend channel" buttons stop a specific video or channel from being recommended in the future, not all content related to a topic, opinion, or speaker.

Platforms like TikTok and Instagram offer a growing number of feedback tools, yet users complain that recommendations persist even after they signal they don't want to see such content. Platforms aren't transparent about how feedback is taken into account, and it's not always clear what different controls actually do.

According to Ricks, the platform is balancing user engagement with user satisfaction, which is ultimately a tradeoff between recommending content that leads people to spend more time on the site and content the algorithm thinks they will enjoy. The platform has the power to change which signals get the most weight in its algorithm, but the study suggests that user feedback may not be among the most important.