Massive Study Involving YouTube Finds 'Pre-Bunking' Inoculates People Against Misinfo

A large peer-reviewed study published Wednesday says that pre-bunking is the best way to stop people from believing the misinformation they see on the internet. The study was carried out by researchers at the Universities of Cambridge and Bristol, who worked with YouTube and Google's Jigsaw unit to conduct seven experiments. The goal of these experiments was to see whether web users could be persuaded to steer clear of the web's most noxious content.

The experiments were inspired by a field of psychological research that shares the same name: inoculation theory, which holds that people can be made resistant to persuasive arguments by being exposed to weakened doses of them in advance. Pre-bunking applies that idea to the web, giving users a taste of what online manipulation looks like so that they can recognize and resist it in the future.

In the experiments, 90-second PSA-style videos were used to inform viewers about misinformation tactics they might encounter on the platform. The videos did not focus on specific pieces of content; instead, they taught viewers to recognize the kinds of rhetoric commonly used in misinformation campaigns, warning them about tricks such as emotionally manipulative language, false dichotomies, ad hominem attacks, and scapegoating.

Study participants were then shown a variety of social media posts and asked to rate their trustworthiness. The videos seem to have worked: the researchers report that, after viewing them, participants were noticeably better at identifying manipulative rhetoric. As the recently published findings put it:

“Across seven high-powered preregistered studies including a field experiment on YouTube, with a total of nearly 30,000 participants, we find that watching short inoculation videos improves people’s ability to identify manipulation techniques commonly used in online misinformation, both in a laboratory setting and in a real-world environment where exposure to misinformation is common.”

Jon Roozenbeek, the study's lead author, said that the inoculation worked for people from all walks of life: the effect was the same for liberals and conservatives, and it held across differing levels of education. That consistency, he said, is the basis for a general inoculation against misinformation.

A Solution with Scale

Supporters of pre-bunking say it is the most effective way to fight misinformation, in large part because it scales. Fact-checking, one of the most widely used tools in the fight against online bullshit, is difficult to scale because of the sheer effort needed to debunk every single incorrect thing that gets published online. Pre-bunking instead aims to make web users aware of the broad categories of manipulative tactics and narratives circulating on the internet, so that, regardless of the specifics of a particular conspiracy theory, viewers are mentally armed to fight off that kind of content when it pops up.

The researchers said their method worked well enough that they are now launching new "pre-bunking" campaigns targeting specific types of content. A pre-bunking video campaign to counter anti-refugee narratives in Central and Eastern Europe is already in the works; the effort is meant to discourage web users from engaging with content that scapegoats refugees based on their country of origin.

Beth Goldberg, Jigsaw's head of research, said the findings are exciting because they show that pre-bunking can be scaled far and wide using ads as a vehicle, and that the pre-bunking videos are effective in an "ecologically valid environment" on social media.

Lingering Questions

Still, there are some questions here that you can't help but ponder, and plenty that could go wrong with the idea of pre-bunking.

Who decides whether a narrative is false or manipulative? The government? A corporation like Google? A panel of experts? Who gets to perform this important function? And since so much of the misinformation crisis is driven by public distrust of official narratives, how do you maintain confidence in whoever that arbiter turns out to be?

Recent examples show that this sort of thing hasn't always gone smoothly. The State Department controversially announced that Russia was planning to distribute a professionally produced propaganda video, complete with pyrotechnics and "crisis actors," which the U.S. said would be used to justify an invasion of Ukraine. An Associated Press reporter was not a fan of the State Department's claims and called out the government for spreading "Alex Jones"-style bunkum.

That video never materialized. Was that because America's "pre-bunking" deterred the Russians from releasing it? Or was it because the video never existed in the first place? There's no way to know, which also makes it impossible to gauge whether the U.S. was actually spreading its own misinformation.

In other words, because pre-bunking is distributed by authoritative institutions rather than by just anyone, it could become just another way to guide and shape online narratives. It is not the only strategy for combating misinformation, and it has to be deployed with care and sensitivity to the audience receiving it.

"We're not telling people what's true and what isn't; that's the point that we've been explicitly trying to make," Roozenbeek said.

The platforms themselves also have to be looked at, he added. YouTube has a well-documented problem with recommendation systems that tend to send people down toxic content rabbit holes. He said it was commendable that the company is attempting to do something about misinformation, but if it were to say, "Well, don't worry about our algorithms, we'll just pre-bunk everything," that would not be a good outcome. Pre-bunking, he stresses, is not the only solution.