The Constitution's preamble invokes "We the People," but we the people are more divided than ever, and when misinformation drives that divide, all eyes turn to social media. Research suggests that social media widens social gaps by motivating users to prove their commitment to identities, causes and political parties. It is harder to form connections with those on the other side when the world is watching a heated exchange. Online, the roster of every club you belong to is public: follow the wrong accounts and you invite judgments by association; follow and share the right group of people and you earn a sense of belonging.

Compare that with more traditional forms of communication: the mailbox, television ads, simple person-to-person contact. Outside the online global forum, we tend to maintain messier social circles. Audiences are mostly friends and family, people with whom you inevitably have disagreements, and these brick-and-mortar relationships leave more room for that disagreement.

Social media pulls users in a different direction. In this environment, misinformation circulating widely in the public sphere finds an opening to do damage, especially on platforms where users seem most prone to sharing content intended to hurt the other side. These new viral channels make it harder to keep our skepticism sharp and our gullibility in check.

So what do we do? It's natural to give in to the pressure to exclude outsiders. How can social media users avoid falling prey to misinformation that threatens the stability of our nation's elections and our democracy? As voters weigh accusations of malfeasance and misbehavior, that question has come into sharp relief.

Jay Van Bavel researches misinformation in public life. He has investigated how social media and other forces exploit the weak spots in our human need to belong, and to exclude, fueling the spread of the false information that has been so prevalent ahead of November's midterm elections. Van Bavel, an associate professor of psychology and neural science at New York University, spoke with Scientific American about who is most vulnerable to these lies, how they spread and how we can avoid becoming victims of those who profit from misinformation.

[An edited transcript of the interview follows.]

Why is misinformation spreading, and how is it affecting politics?

Social media users scroll through roughly 300 feet of news feed every day. It's a lot. Misinformation is designed to grab our attention in that sea of content, and we share it because it feels relevant or signals our allegiance to a party or political leader. Much of it consists of salacious allegations, and these types of messages can be very damaging to a candidate.

Older people are more likely to spread misinformation than young people: they are more committed to their identities and are typically less experienced with social media than younger users. Younger people, for their part, vote at lower rates than older people.

What factors contribute to the success of misinformation online?

It's a combination of things. The U.S. is more divided than it has been in 20 years, driven by out-group hate.

Then you have politicians. Donald Trump downplayed the risks of the pandemic during its first months. He knew it was a risk, but he didn't want to hurt the economy and have people blame him for it.

And there is an ecosystem around them. Political elites give people cues about whom to trust and what not to trust. It starts with senators who are politically active on social media, or with Fox News, and then it's people like us online, scrolling and sharing, mentioning something to a friend or passing it along in an e-mail. If you see it from a family member, you're more likely to trust it.

Often the leaders are not the source of the misinformation. If something favored Trump, he would amplify it; stories generated by conspiracy theorists are routinely amplified by elites. And the more misinformation gets shared, the more incentive there is for people to make money by producing and spreading it. There are a lot of incentives at play here.

Which platforms are the most likely to nurture misinformation?

These companies are trying to update their policies to manage misinformation, but there are far more people on Facebook than on Twitter. One study found that people who got their information from Facebook had the highest levels of vaccine hesitancy, which suggests Facebook is a big risk factor.

How do the people behind misinformation decide what to tell their audiences?

Misinformation spreads when it taps into themes people have already heard or already believe. The idea that the election was stolen is one carryover from the last election. During the pandemic there were conspiracy theories about everything from vaccines to masks, and the health risks of the disease were downplayed.

Could there be surprises in the midterms?

It's harder to predict than a presidential election. Most misinformation will likely target specific candidates in the most competitive Senate races, such as those in Georgia and Pennsylvania. It's going to be about the candidates in those contests, because that's where the architects of misinformation have the best chance of changing outcomes.

The people spreading it tend to be very politically extreme, and they are mostly right-wing political actors. There may well be some misinformation from the left, but it doesn't tend to spread as widely. Alex Jones and Roger Stone are the usual suspects, and it's people who are politically aligned with them who tend to pass it on.

How can people keep from being influenced by misinformation campaigns?

We need to be more cautious about the information we get from our political leaders. I've shared pieces of misinformation myself. In one case I didn't realize something was a parody; it tapped into the zeitgeist, and I took it down after friends on social media corrected me. Reputation matters here: I wouldn't get invited to conferences if I started sharing the kind of material Alex Jones does.

We don't have to do it all ourselves. That's the whole point of peer review in science: other scholars point out our mistakes. Scientists live in a community that helps them get smarter. To benefit from fact-checking, you need to be embedded in communities like that and open-minded about being corrected.

What else can help?

Prebunking: getting the facts out before misinformation starts to spread. It works something like a vaccine. When you later encounter the misinformation in the wild, your brain recognizes that you are being manipulated, and you have a kind of immunity to it. It seems to make people more skeptical.

The Bad News Game, for example, is an online game that teaches players the techniques used to spread misinformation, and the data show that playing it helps people recognize those techniques.

Journalists also have a role to play in identifying the misinformation likely to spread this cycle. I would encourage them to report honestly on how people are likely to be manipulated, based on what happened in the last election, and to point readers to resources and places where they can get better information on these issues.

How does someone with entrenched beliefs react to fact-checks?

Fact-checks work most of the time, but the effects found in studies are quite small. And in some cases fact-checking doesn't seem to work at all: when something is fact-checked by the other side, people who strongly identify with their group become more entrenched. Someone who drives around in a pickup with five Trump flags, or in a car covered in bumper stickers, is more susceptible to this kind of entrenchment. People who don't make politics an important part of their identity seem less likely to dig in.

This is a big issue, and it's only getting bigger. And it isn't just a matter of individuals becoming savvier: we are embedded in systems that reward misinformation spreaders with money. Alex Jones is a good case to examine. He sells conspiracy theories to his audience and makes a lot of money doing it. People don't like being manipulated, and they need a clearer understanding that they are being manipulated by people who profit off of them.