To find out how well YouTube's feedback controls actually work, the researchers analyzed seven months of activity from more than 20,000 participants, tracking whether people could change the recommendations they received.

Every participant installed a browser extension that added a button for signaling that they didn't want to see a video or others like it. Each time a participant hit it, the extension triggered one of YouTube's built-in negative-feedback responses.

Dozens of research assistants then compared the rejected videos with the ones YouTube subsequently recommended, judging how similar they were. The controls turned out to have a negligible effect on the recommendations participants received: over the course of seven months, a single rejected video spawned an average of 115 bad recommendations.
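The study's headline figure, bad recommendations per rejected video, is just an average over the raters' similarity judgments. The sketch below illustrates the arithmetic on toy data; all names and values are hypothetical, and the actual study relied on human raters and a far larger dataset.

```python
from collections import defaultdict

# Hypothetical rater log: (rejected_video, recommended_video, is_similar).
# is_similar stands in for a research assistant's judgment that the
# recommendation resembles the video the participant rejected.
log = [
    ("rejected_1", "rec_a", True),
    ("rejected_1", "rec_b", True),
    ("rejected_1", "rec_c", False),
    ("rejected_2", "rec_d", True),
]

def bad_recs_per_rejected_video(entries):
    """Average number of similar ('bad') recommendations spawned
    by each rejected video."""
    counts = defaultdict(int)
    for rejected, _recommended, is_similar in entries:
        counts[rejected] += int(is_similar)
    if not counts:
        return 0.0
    return sum(counts.values()) / len(counts)

print(bad_recs_per_rejected_video(log))  # 1.5 on this toy log
```

On the toy log, rejected_1 spawns two similar recommendations and rejected_2 spawns one, for an average of 1.5; the study's real-world equivalent of that number was 115.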

Recommending videos that viewers are likely to agree with has been shown to contribute to political radicalization, and the platform has also been criticized for promoting sexually explicit or suggestive videos of children. YouTube has promised to crack down on hate speech, enforce its guidelines more consistently, and stop using its recommendation system to promote borderline content.

The study found that despite this negative feedback, similar content kept being recommended to users.

Hitting Dislike is the most visible way to give negative feedback, and such controls are advertised as ways to tune what the algorithm recommends.

YouTube has said that its controls deliberately do not exclude entire topics or viewpoints, because doing so could have negative effects for viewers, like creating echo chambers. The study also cannot account for how the video-sharing site's recommendation system actually works; given the sheer number of inputs and the company's limited transparency, that is something no one outside the company really knows. Researchers can only peer into the black box by studying its outputs.