The major online social platforms released their plans for fighting misinformation in the weeks leading up to the US elections.

Meta will have voting alerts and real-time fact-checking in both English and Spanish, and, as it did in 2020, it will ban new political, electoral, and social issue ads ahead of Election Day. Twitter is rolling out prebunks, proactive fact-checks surfaced in users' feeds and tied to election-related search terms, along with election-themed pages in its Explore tab. YouTube is adding information panels to search results for candidates. And TikTok will continue to enforce its ban on political advertising, even after the election, while curating collections of election-related hashtags.

If you couldn't keep all of that straight in your head, you'd be forgiven. Researchers and fact-checkers feel the same way.

No platform is in good shape.

Unless Facebook has changed the core functions and design of its platform, it's hard to argue that any meaningful changes have been made.

The internet's biggest platforms don't seem to have registered that the world has changed. If polls are to be believed, 45 percent of Americans, and 70 percent of Republicans, believe in some variation of the "big lie" that the 2020 election was stolen from Donald Trump. Candidates who push conspiracy theories are more common than ever. And even with new policies banning lies about voter fraud, written in the wake of 2020, the platforms are still full of them. The conspiracy theories won the war.

The November election will be the first time many Americans vote since last year's insurrection, which was planned and broadcast on many of the same platforms now emphasizing their commitment to democracy. The current policies of the four major platform companies don't reflect this troubling political reality.

"I don't think any platform is in good shape."

So what comes next? How do we moderate the internet in a post-insurrection world? And have we reached the limit of what individual platforms can do?

A room full of computers

Katie Harbath, CEO of Anchor Change and a former public policy director at Facebook, says she hasn't seen much that's new from the platforms regarding the US elections. What concerns her is that none of the big tech companies' election policies say anything about coordinating across platforms on internet-wide conspiracy theories.

How does this misinformation spread among all these different applications? How do they interact with each other? Nobody has the ability to look cross-platform at how actors are exploiting the different loopholes or vulnerabilities that each platform has, and piece them together into a whole picture.

Facebook came up with the idea of a misinformation war room after the Cambridge Analytica scandal.

Facebook, as Meta was then known, wanted to change the narrative ahead of the 2018 US midterms and Brazilian elections, so it invited journalists to tour a nerve center the social network had set up to combat fake accounts and bogus news stories. What they saw looked like a room full of computers with a couple of clocks on the wall showing different times. Less than a month later, the war room was closed. But it lent a sense of place to the company's work of moderating an enormous website.

Harbath said the election war rooms were meant to centralize the company's rapid response teams, and that they often focused on mundane issues like fixing bugs or taking down attempts at voter suppression. One example of war room moderation: the Trump campaign ran an ad about caravans of immigrants at the border, and there was a lot of internal debate about whether it could run. The ad was eventually blocked.


She recalled receiving a phone call from a presidential candidate's team because their page had gone down; the people in the war room could respond to that immediately. They had systems in place to make sure things were headed in the right direction.

But much of that triage was happening publicly, with analysts and journalists reporting on it in real time. And the platform didn't crack down on "stop the steal" content until more than two months after the election was over.

According to a statement from Meta spokesperson Chambliss, the company's description of how it works with government, cybersecurity, and tech industry partners during elections is still accurate for this year. Chambliss wouldn't say which industry peers Meta communicates with, but he did say that the company's election operations center will be up and running ahead of Election Day this year.

Meta's own reports hint at the scale of the cross-platform problem. In a recent writeup of an influence operation takedown, the company wrote: "To support further research into this and similar cross-internet activities, we are including a list of domains, petitions, and Telegram channels that we have assessed to be connected to the operation. We are looking forward to more discoveries from the community."

More than one platform

There are other reasons to be discouraged. The bulk of the current election response involves using filters and artificial intelligence to flag false or misleading content, alongside takedowns of higher-level coordinated misinformation campaigns. But if you are a person who spends 10 hours a day consuming QAnon content in a Facebook Group, a fact-checking widget is unlikely to reach you. And there are no hard numbers on how many posts actually get flagged as misleading or false.

She said that no platform has been transparent about how much content is labeled or how long it takes to put a label on it.

People who are immersed in these alternate realities aren't just using one platform to consume content and network with other users; they're spread across several at once. And none of those platforms share the same set of standards and policies.

Many land on alternative social media platforms like Parler, Truth Social, and Gettr, some of which are wildly popular in these circles.

All of this calls for new ways of thinking about how platforms function. Social media is more than just a place to add friends and share links; it has grown into a huge universe of different platforms with different incentives. And the problems these sites are facing are much larger than any one company can handle.


A fellow at the Integrity Institute argues that it's now more useful to focus on how different apps deliver content to users, dividing them into two groups: distribution-based apps and community-based apps.

TikTok is a distribution-based app, where users mostly consume content from people they have no connection to; Facebook, organized around friends and Groups, tends to produce community-based harms.

The first class of apps, which includes TikTok, poses a significant challenge during large news events. This year's midterms will be the first US election in which TikTok, not Facebook, is the dominant cultural force: Meta lost users for the first time this year, and TikTok has pushed Facebook out of the top 10 in the Apple App Store.

Brandi Geurkink, a researcher at Mozilla, says TikTok is the least transparent of all the major platforms, which makes it harder to scrutinize from the outside than its peers.

A report published by Geurkink's team at Mozilla found that TikTok's ban on political ads is easy to circumvent, and that the platform's new tool letting creators pay to promote their content has opened another loophole. TikTok changed its policy this month, blocking politicians and political parties from using the platform's monetization tools; TikTok's parent company reached out to Mozilla after the report was published.

External scrutiny of the platforms is what Mozilla has advocated for a long time, Geurkink says, but TikTok hasn't given outside researchers the transparency that would make it possible. The other major platforms have done more.

That opacity extends beyond how the platforms moderate themselves: we also know very little about how these platforms work as a network. Thanks to Meta's own transparency data, though, we have a good idea of how connected they are.

The TikTok.com domain was the most viewed domain on Facebook, accounting for over 100 million views, which throws a wrench into the idea of a single platform controlling its own content. And it's not just content coming from other big platforms that creates weird moderation gray areas.


According to Sara Aniano, an analyst at the Anti-Defamation League's Center on Extremism, fringe right-wing websites are becoming more influential, with their content shared back onto Facebook.

Since January 6th, she said, there has been a platform migration: people realized they kept getting flagged with warnings on mainstream social media and moved to places like Telegram or Truth Social, where they could speak more freely.

Bad actors know that the larger platforms will suspend their accounts or put warnings on their posts, so they have gotten better at moving from platform to platform. And every ban or false-content flag only feeds their followers' conspiratorial mindset.

If people believe a false claim, Aniano said, they will assume the social media company is working against what they see as the truth. That is the sad reality of conspiracism, she added, not just leading up to the election but around everything: medicine, doctors, education, and all the other industries that have come under attack over and over again.

She said the recent Arizona primaries, where a conspiracy theory spread that the election was rigged, were an example of how this all works together. The theory, dubbed #SharpieGate, first went viral during the 2020 election.

On Facebook, the #SharpieGate hashtag is hidden, though right-wing publishers still write about the theory and share their articles on the platform. The hashtag is blocked on Twitter but not on TikTok, where people are still making videos about it.

TikTok is not blocking #SharpieGate content outright, but it is demoting it in search results. She said that authoritative content sits at the top when you look for "#SharpieGate 2.0"; the important thing is making sure the content isn't recommended.

Any attempt at moderation is a positive thing, she said; she wouldn't call it pointless. But the distrust that has been sown in the democratic process since 2020 needs to be acknowledged. It can't be solved in a week, and it can't be solved in a year. It might take lifetimes to rebuild the trust.