
Despite the platform's rules and guidelines designed to make it harder to post conspiracy theories, such content is still flourishing on the site.

The study was published in the Harvard Kennedy School Misinformation Review.

The comments analyzed covered topics such as Bill Gates' hidden agenda, his role in vaccine development and distribution, his body language, and his connection to convicted sex offenders.

The results suggest that YouTube's comment sections can function much like the anonymous message boards 4chan and 8kun, which have been linked to the growth of conspiracy theories.

Previous studies have argued that misinformation is a social phenomenon.

According to Dr. Gray, one of the study's authors, developing a conspiracy theory is a social process: people join the dots by sharing pieces of information that they use to build conspiratorial narratives. The current approach to moderation taken by social media platforms is not good at detecting this type of social conspiracy theorizing.

Lan Ha is a co-author of the study.

Conspiracy theories in YouTube comments

YouTube put new policies and guidelines in place to limit the spread of misinformation about the COVID-19 pandemic.

The comments feature, however, is unmoderated and has low barriers to entry, and many posts violate the platform's rules: for example, comments proposing that vaccines are used for mass sterilization or to implant microchips in recipients.

The researchers examined comments on three COVID-19-related videos featuring Bill Gates that were posted by news media organizations. Between April 5, 2020, and March 2, 2021, each video attracted between 13,500 and 15,000 comments.

The study found that the comments for each video were dominated by conspiratorial statements.

Some comments were considered "borderline content" because, although conspiratorial, they did not clearly cross the lines set by the platform's rules.

Comments raising doubts about Bill Gates' motives in vaccine development and distribution are examples of such borderline content; these comments implied that vaccines could be used to control large populations of people.

Recommendations for YouTube

According to the researchers, the platform should consider design and policy changes that respond to the "chatter strategies" used by conspiracy theorists, in order to prevent similar outcomes for future high-stakes public interest matters.

Defending a conspiracy theory is one of three common strategies the researchers identified; readers can then "like" the comment, amplifying it.

Dr. Gray said that community-led or human moderation features are needed to detect these kinds of strategies.

The researchers said that for YouTube to address this problem adequately, it needs to redesign the space to provide users with the tools they need to self-moderate effectively.

News publishers and commenters

According to the study, content moderation guidelines for news publishers should outline the strategies used by conspiracy theorists that are not visible to automated moderation. News publishers can also turn off comments on high-stakes public interest videos to ensure they do not promote conspiracy theories.

Dr. Gray said the study showed that YouTube needs to redesign the space to provide social moderation infrastructure. Otherwise, the strategies of conspiracy theorists will continue to evade detection systems, pose insurmountable challenges for content creators, and play into the hands of content producers who benefit from and encourage such activity.

More information: Lan Ha et al, Where conspiracy theories flourish: A study of YouTube comments and Bill Gates conspiracy theories, Harvard Kennedy School Misinformation Review (2022). DOI: 10.37016/mr-2020-107