Inside Meta’s “War Room.”

Meta, the owner of Facebook and Instagram, wants you to know that it is serious about fighting lies and bogus claims in the run-up to the midterm elections.

Meta's President of Global Affairs showcased some of the ways the company is attempting to beef up security on its platforms to combat hate speech, voter interference, and foreign influence. The measures mirror larger-scale election integrity changes rolled out before the 2020 presidential election, which some researchers applauded and others found insufficient. Meta has also been criticized on multiple fronts over its handling of recent election-related misinformation in Brazil and other countries outside the U.S.

Meta says it has hundreds of staff focused on the midterms. The company said it will not allow ads on its platforms that encourage people not to vote, and it will remove misinformation about dates, locations, times, and methods of voting, as well as false statements about who is eligible to vote, whether a given vote will be counted, and calls for violence in relation to voting. The second commitment is much harder to keep than the first.

Meta says it is working with the Cybersecurity and Infrastructure Security Agency and with state and local election officials to prepare for different scenarios.

Meta says it has exposed and disrupted dozens of networks that tried to interfere with U.S. elections, and that it carries out proactive sweeps of its platforms to catch banned organizations.

Meta says it removed 2.5 million pieces of content from its platforms in the first quarter, and the company points to the $5 billion it invested in safety and security last year. Meta will not allow new political ads during the final week before the elections, and it will use its platforms' home pages to send notifications about voter registration, along with information on how and where to vote.

Recent criticism of Meta's handling of election information in Brazil could call into question the effectiveness of these safeguards. According to a new report, Facebook failed to detect explicit election-related misinformation. As part of its study, Global Witness submitted 10 Brazilian Portuguese-language ads, five of which contained blatant election misinformation. According to Global Witness, Facebook approved all of them.

Jon Lloyd of Global Witness said in a statement that Facebook knows its platform is used to spread election misinformation, and that the group was appalled to see the company approve every election ad it submitted in Brazil despite Facebook's claims to be tackling the issue.

Global Witness says that Facebook approved ads containing false information about when and where to vote. The group has raised similar criticisms of Facebook's handling of political content in several other countries.

A Meta spokesman didn't deny the findings but said the company is committed to protecting election integrity in Brazil and around the world.

The spokesman said the company has prepared extensively for the election in Brazil: it has launched tools that promote reliable information and label election-related posts, established a direct channel for the Superior Electoral Court to send potentially harmful content for review, and continued collaborating with Brazilian authorities and researchers. In Brazil's previous election, the spokesman said, those efforts resulted in the removal of 140,000 posts from Facebook and Instagram for violating its election interference policies.

Since the 2016 election of Donald Trump, Facebook, and later Meta, has had to contend with a steady stream of election misinformation. Researchers and lawmakers condemned the platform for allowing foreign actors to manipulate its news feed, and Meta executives have admitted in the past that they could have done more to safeguard the platform.

During the 2020 election, Facebook implemented an expansive list of new procedures and policies to try to limit misinformation on the site, but research shows those measures didn't stop false information from exploding in popularity. The extra steps Facebook took in 2020 were praised by some groups, but they weren't universally applauded.

One analysis estimated that if the company had not waited until October 2020 to change its algorithm, roughly 10 billion page views of pages and groups that spread misinformation could have been prevented. Between October 2020 and October 2021, those pages and groups tripled their monthly interactions on the platform.

Reporting on The Facebook Papers noted that a majority of Americans believed Facebook was at least partially to blame for the attack on the U.S. Capitol. Data collected in the immediate aftermath of the attack showed a surge in user reports of rule violations, driven in part by President Donald Trump's own account. The documents also show that Facebook employees were aware of U.S. users' fears of being exposed to election-related misinformation.