Some people have claimed that China worked with Democrats to steal the elections; Saudi Arabia, so far, has not come up.
There is no evidence that widespread fraud tipped Pennsylvania in 2020, or that electronic voting machines will manipulate results next week.
Disinformation watchdogs are concerned that YouTube's aggressive effort to confront misinformation on the platform has developed blind spots. They are particularly worried about Shorts, its TikTok-like short-video service, and about the platform's Spanish-language videos.
More than a dozen researchers told The New York Times in interviews that the situation is difficult to assess because they lack access to data and because video is hard to analyze.
Jiore Craig, the head of digital integrity at the Institute for Strategic Dialogue, or I.S.D., said research was easier to do with other forms of content, which makes those efforts easier to get off the ground.
Despite its influence, the video platform has often flown under the radar. It reaches more than two billion people and serves as the web's second most popular search engine.
YouTube banned videos claiming that widespread fraud altered the 2020 presidential election, but it has not established a similar policy for the upcoming midterm elections.
The president of Media Matters for America said that sprinkler systems should be built before a fire, not after one.
The company disputes some of the criticism of its work. A YouTube spokeswoman said the company had invested heavily in the policies and systems that help it combat election-related misinformation.
YouTube took down a number of videos flagged by The New York Times for violating its policies on election integrity. In the first half of the year, the company removed 122,000 videos that contained misinformation.
The guidelines prohibit misleading voters about how to vote, encouraging interference in the democratic process and falsely claiming that the 2020 U.S. election was rigged or stolen. The policies apply worldwide.
After the 2020 presidential election and the attack on the Capitol in January 2021, the video-sharing site intensified its stance against political misinformation, revoking the uploading privileges of people who spread the lie that the election was stolen.
The company has more than 10,000 moderators stationed around the world and has committed $15 million to hiring more than 100 additional content reviewers, according to a person familiar with the matter who was not authorized to discuss staffing decisions.
The company has also changed its recommendation system so that it does not suggest political videos from questionable sources, according to a person familiar with the matter. And YouTube is preparing to quickly remove videos and livestreams that violate its policies on Election Day, a person with knowledge of the situation said.
Still, researchers argued that YouTube could have done more to stop the spread of false narratives after the 2020 election.
One persistent false claim accuses Americans of cheating by stuffing drop boxes with multiple ballots, an idea drawn from a discredited, conspiracy-laden documentary called "2000 Mules."
I.S.D. found at least a dozen Shorts that echoed the ballot-trafficking allegations of "2000 Mules" and carried no warning labels, according to links shared with The New York Times. The videos ranged in popularity from a few dozen views to tens of thousands, and two of them linked to the film.
I.S.D. found the videos by searching YouTube. Three of its researchers wrote in a report that the Shorts were identified with relative ease. Some of the videos show men addressing the camera from a car or a home, professing their belief in the film; other videos promote the documentary directly.
Both Shorts and TikTok had spread the same types of misinformation, according to the group.
Ms. Craig said nonprofits were working hard to catch and counter misinformation that remained on platforms run by tech giants with billions of dollars and thousands of content reviewers.
She said those teams were stretched thin as they picked up the slack for the well-resourced companies.
Shorter videos are more difficult to review on YouTube than longer ones, according to two people familiar with the matter.
Artificial intelligence scans what people upload to the platform; some of the A.I. systems work within minutes, while others take hours, one of the people said. Shorter videos give those systems fewer signals to work with than longer ones, according to a person with knowledge of the situation.
YouTube has also struggled to rein in Spanish-language misinformation, according to research and analysis from Media Matters and Equis.
Almost half of Latinos turn to YouTube for news each week, more than any other social media platform, according to Jacobo Licona, a researcher at Equis. He said viewers on the platform were exposed to a great deal of misinformation and one-sided political propaganda.
False claims about dead people voting in the U.S. are among the falsehoods that have been translated into Spanish.
YouTube asked a group that tracks Spanish-language misinformation on the site for access to its data, two people familiar with the request said. The company was looking for help in policing its platform, they said, and the group worried that YouTube had not made the necessary investments in Spanish-language content.
Ahead of the midterms, YouTube communicated with experts to gain more insight, and the company said it had made significant investments in fighting misinformation in Spanish.
YouTube has a number of processes for moderating Spanish-language videos, and the company employs Spanish speakers who help train its A.I. systems. One A.I. method involves examining both a video's footage and its text, converting the transcript from Spanish to English, an employee said. That approach has not always proven accurate, the person said, because of the nuances of language. Evaluating visual signals, metadata and on-screen text has helped the A.I. learn new trends, such as evolving idioms and slang.
In English, researchers have found election fraud claims from well-known personalities with large followings, including Charlie Kirk and Tim Pool, both of whom are known for questioning the results of the 2020 election.
The deputy research director at Media Matters said that people watching ballot drop boxes were being praised and encouraged on YouTube, and that such activity moving from online to offline could cause real-world harm.