Misinformation thrives on YouTube. Here's how fact-checkers want to stop it.

More than 80 fact-checking organizations have come together to list four ways YouTube could combat the spread of misinformation on its platform, if it felt so inclined.

The International Fact-Checking Network wrote an open letter to the CEO of the video sharing website, saying that it was one of the major conduits of online misinformation worldwide.

The letter stated that "YouTube is allowing its platform to be weaponized by unscrupulous actors to manipulate and exploit others, and to organize and fundraise themselves." The Washington Post Fact Checker is among the U.S.-based organizations that signed it.

"Your company platform has framed discussions about disinformation as a false dichotomy of deletion or not deletion," the letter continued. By doing this, YouTube is avoiding the possibility of doing what has been proven to work: our experience as fact-checkers together with academic evidence tells us that fact-checked information is more effective than deletion.
The International Fact-Checking Network noted that COVID-19 misinformation is the most obvious issue, but that medical misinformation has been hosted on YouTube for years. The letter claims that political misinformation and hate speech have a damaging impact on multiple countries, including the US.

The International Fact-Checking Network wrote that the examples are too many to count, arguing that the company's efforts to address the problem are not working and that YouTube has not produced quality data to prove their effectiveness.

In September of last year, YouTube announced an update to its medical misinformation policy.

YouTube did not comment on the International Fact-Checking Network's invitation to collaborate, but it said in a statement that it sees "more nuance" in the situation than simply requiring more fact-checking.

"Fact checking is a crucial tool to help viewers make their own informed decisions, but it's one piece of a much larger puzzle to address the spread of misinformation," Elena Hernandez said in a statement.

The International Fact-Checking Network offered four suggestions for how YouTube could stop facilitating the spread of misinformation. They are:

Commit to "meaningful transparency" about misinformation by supporting independent research and publishing its full misinformation moderation policy.

Invest in independent fact-checking, prominently debunking misinformation and providing context either superimposed on misleading videos or as additional video content.

Stop its recommendation systems from promoting videos by creators whose content is repeatedly flagged as disinformation, and prevent those creators from monetizing it.

Provide country-specific data and expand its efforts to combat misinformation in other languages; the International Fact-Checking Network noted that misinformation online often goes unnoticed.

For its part, YouTube said it is working with international publishers to add third-party context in information panels under some videos in some countries. The company said it already has policies against COVID-19 misinformation that "poses a serious risk of egregious harm," hate speech, harassment, and election misinformation, and claims its systems "raise authoritative content and reduce recommendations of borderline misinformation in all countries." (The Google News Initiative has given the International Fact-Checking Network $1 million.)

The International Fact-Checking Network's point, though, is that YouTube's current policies haven't proved very effective, and that more needs to be done, especially since misinformation doesn't stop at national borders.

"Over the years, we have invested heavily in policies and products in all countries that we operate in to connect people to authoritative content, reduce the spread of borderline misinformation, and remove violative videos," he said. "We have seen important progress, with keeping consumption of recommended borderline misinformation below 1% of all views, and only about 0.11% of all views being violative content that we later remove." We are always looking for ways to improve and strengthen our work with the fact checking community.

These small percentages add up to a lot when you consider that there are over 2 billion monthly users of YouTube.

"And every day, people watch over a billion hours of video and generate billions of views," says the site.

At 0.11 percent of billions of daily views, that still works out to millions of views of violative content every day. In other words, videos spreading misinformation get a lot of views.