YouTube is more likely to serve problematic videos than useful ones

This study is backed by an experience many of us already know firsthand on YouTube. Sometimes the streaming video company's recommendation algorithm sends you on a binge that lasts hours, so absorbing that you don't notice the time pass. According to the Mozilla Foundation's study, trusting the algorithm makes it more likely that you will be served videos with sexualized content or false claims than videos that match your stated interests.

In a study involving more than 37,000 volunteers, Mozilla found that 71% of the videos participants flagged had been recommended to them by YouTube's algorithm. Volunteers used a browser extension to track their YouTube usage over roughly 10 months; the extension recorded whether they found each video through YouTube's recommendations or on their own.

The study called these videos "YouTube Regrets," referring to any regrettable experience that YouTube content may have caused. The Regrets include videos that "champion pseudo-science, promote 9/11 conspiracies, showcase mistreated animals, [and] encourage white supremacy." The parents of a 10-year-old girl told Mozilla that she fell into extreme dieting videos while searching for dance content, which led her to restrict her eating.

These videos get recommended because they have the potential to go viral: content that may be dangerous but racks up thousands or millions of views keeps getting pushed to other users.

YouTube removed 200 videos flagged by the study. A company spokesperson told The Wall Street Journal that YouTube has reduced recommendations of content it considers harmful to less than 1% of all videos viewed, and that it has made 30 changes over the past year to address the problem. Its automated system detects and removes 94% of videos that violate YouTube's policies before they reach 10 views.

Even though it may be easy to agree that videos featuring violence or racism should be removed, YouTube faces the same misinformation-policing challenges as other social media sites. It previously removed QAnon conspiracy content that it deemed capable of causing real-world harm, yet many similar videos still slip through under arguments of free speech or entertainment value.

Because YouTube treats its recommendation algorithm as proprietary and refuses to share details about it, it is impossible to tell whether the company is doing everything it can to keep such videos out of circulation. Making 30 improvements in a year is a welcome step, but if YouTube truly wants to keep harmful videos off its platform, letting users see its efforts would be a good start toward meaningful action.