Pressing YouTube's "dislike" button makes little difference for dissatisfied viewers, according to a new research report.

Users can signal that they don't want to watch similar videos in a number of ways, but researchers at the Mozilla Foundation found that none of the controls was very effective: users continued to receive unwanted recommendations on the world's largest video site.

According to their report, the "dislike" button reduced similar, unwanted recommendations by only 12 percent. Pressing "not interested" was 11 percent effective at reducing unwanted recommendations, and removing a video from one's watch history was 29 percent effective.

The researchers analyzed more than 567 million video recommendations on the platform, gathered with the help of 22,700 participants using a tool developed by Mozilla.

Jesse McCrosky, one of the researchers who conducted the study, said that users should be given more control over what they see on the site.

In an interview, Mr. McCrosky said that YouTube should listen to what people are saying instead of shoving content down their throats.

One research participant asked not to be recommended a video of a cow trembling in pain that included an image of a discolored hoof, but later received a recommendation for a video titled "There Was Pressure Building in This Hoof," which featured a graphic image of the end of a cow's leg. Other unwanted recommendations included videos of guns, violence from the war in Ukraine and Tucker Carlson's show on Fox News.

After another user rejected a recommendation for a video titled "A Grandma Ate Cookie Dough for Lunch Every Week. This Is What Happened to Her Bones," the user continued for the next three months to see recommendations for similar videos about what happened to people after they ate large amounts of something.

One user said the unwanted content would come back eventually.

Each user is shown a personalized version of the platform based on past viewing behavior and other variables, and researchers have warned that these recommendations can send people down a rabbit hole of misinformation.

Of the videos that participants said featured misinformation, hate speech and other objectionable content, 71 percent had been recommended to them by YouTube.

YouTube has said that it is difficult to provide transparency about how the recommendation system works because the system is constantly evolving.

According to the company, a number of signals build on each other to help inform its system about what a viewer likes and doesn't like.