Have you ever seen ads on your social media feed for a product you recently searched for? These instances of eerily accurate advertising provide a glimpse into the behind-the-scenes mechanisms that feed an item you search for on one website, or come across while browsing, into custom advertising on a social media site.
Those same mechanisms are increasingly being used for more sinister purposes. The threat lies in how this targeted advertising interacts with the political landscape. As a social media researcher, I see how people intent on radicalizing others use these techniques to move people toward extreme views.
Advertising to a targeted audience
Advertising is powerful. The right ad campaign can help shape or create demand for a new product or rehabilitate the image of an older product. Historically, countries have used similar strategies to wage propaganda wars.
As powerful as advertising is, mass media comes with a built-in moderating force. To move a lot of people in one direction, a mass media campaign can go only as far and as fast as the people in the middle are willing to follow. Push too far or too fast, and the middle balks.
The detailed profiles social media companies build for each of their users make their advertising even more powerful. These profiles can include the size and value of your home, the year you bought your car and whether you buy a lot of beer.
Targeted advertising has no such moderating force. Because each person can be shown content tailored specifically to them, social media can expose people to new ideas as fast as they, individually, will accept them. The same mechanisms that can recommend a niche product to exactly the right person, or suggest an addictive substance just when someone is most vulnerable, can also suggest an extreme conspiracy theory just when a person is ready to consider it.
It has become more and more common for friends and family members to find themselves on opposite sides of important issues. Many people sense that social media is part of the problem, but how exactly are these powerful advertising techniques contributing to the divisive political landscape?
Breadcrumbs to the extreme
One important part of the answer is that people, including agents of foreign governments, deliberately take extreme positions in social media posts with the goal of sparking division and conflict. These extreme posts take advantage of social media algorithms, which are designed to maximize engagement and so reward content that provokes a response.
Another part of the answer is that people who want to radicalize others lay out trails of breadcrumbs leading to ever more extreme positions.
The same social media radicalization process can be used to recruit jihadis or Jan. 6 insurrectionists.
You may feel as though you are doing your own research, moving from source to source, but you are actually following a deliberate radicalization path designed to move you toward ever more extreme content. After analyzing more than 72 million user comments on more than 320,000 videos posted on 349 YouTube channels, researchers found that users migrated from milder to more extreme content.
The result is plain to see: fewer and fewer people remain in the middle of the spectrum of views.
How to protect yourself
What can you do? First, I recommend a healthy dose of skepticism about social media recommendations. Most people go to social media looking for something specific, then find themselves looking up from their phones an hour or more later with little idea why they read or watched what they just did. The whole environment is designed to be addictive.
Personally, I have been trying to chart a more deliberate path to the information I want and to avoid simply clicking on whatever is recommended to me. When I do watch or read what is suggested, I ask myself whether the recommendation is in my interest or in someone else's.
Second, consider supporting efforts to require social media platforms to offer users a choice of algorithms for recommendations and feed curation, including ones based on simple-to-explain rules.
Third, I recommend investing more of your time on social media in interacting with friends and family. If I find myself needing to forward a link to make a point, I treat that as a warning bell that I do not understand the issue well enough myself, and that I may have been following a trail toward extreme content rather than reading material that helps me better understand the world.
This article is republished from The Conversation. The original article is available there.