Using network mapping, the researchers found that bad actors may split their edit histories across multiple accounts to evade detection. To build reputation and status within the community, these editors mix legitimate page edits with politically sensitive ones.
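The kind of network mapping described here can be sketched in a few lines of code: build a map from accounts to the pages they edit, then flag account pairs whose edit histories overlap suspiciously. The sketch below is purely illustrative; the account names, pages, and threshold are hypothetical and not drawn from the study.

```python
from itertools import combinations

# Hypothetical edit logs: account -> set of pages edited.
# Real logs would come from a database dump or the MediaWiki API.
edits = {
    "AccountA": {"Politician_X", "Local_Sports", "Weather"},
    "AccountB": {"Politician_X", "Local_Sports", "Gardening"},
    "AccountC": {"Astronomy", "Chess"},
}

def jaccard(a, b):
    """Overlap between two accounts' edited-page sets."""
    return len(a & b) / len(a | b)

# Flag account pairs whose edit histories overlap heavily --
# one possible signal of a single actor split across accounts.
THRESHOLD = 0.4  # arbitrary illustrative cutoff
for (u, pages_u), (v, pages_v) in combinations(edits.items(), 2):
    score = jaccard(pages_u, pages_v)
    if score >= THRESHOLD:
        print(f"{u} <-> {v}: overlap {score:.2f}")
```

Real investigations would weigh far more signals (edit timing, reverts, talk-page activity), but the underlying graph-overlap idea is the same.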

The main message Miller says he has taken away from all of this is that the chief danger is not vandalism but entryism.

If the theory is correct, it means state actors are mounting disinformation campaigns that take years of patient work and are hard to detect.

It is unclear whether the benefits of such Russian influence operations would be worth the effort.

Governments also have blunter tools at their disposal: over the years, authoritarian leaders have blocked the site and taken it to court.

The encyclopedia has been fighting false information for all of its 21 years. A group of ultra-nationalists tried to take over the Croatian-language community and rewrite history to rehabilitate the country's fascist leaders. The platform has also been hit by more than one hoax, including articles that passed off invented figures as real; in one case, a Chinese Wikipedia editor was found to have spent years writing 200 articles of fabricated history of medieval Russia.

Fighting back is a self-organizing, self-governing body of 43 million registered users, which wields a collection of intricate rules, governing bodies, and public discussion forums.

Nadee Gunasena, chief of staff and executive communications at the Wikimedia Foundation, says the research covers only a portion of the article's edit history.

Wikipedia's content is protected by a combination of machine-learning tools and human oversight from volunteer editors, Gunasena says. The history of every article is public, and sources are vetted for neutrality and reliability.
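That public edit history is easy to verify: anyone can query it through the MediaWiki API. The minimal sketch below pulls the last ten revisions of an article; the article title is just an example.

```python
import requests

# Fetch recent revisions of an article via the public MediaWiki API.
# Any article title works; "Dinosaur" is just an example.
resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "prop": "revisions",
        "titles": "Dinosaur",
        "rvprop": "timestamp|user|comment",
        "rvlimit": 10,  # last 10 edits
        "format": "json",
    },
    headers={"User-Agent": "edit-history-demo/0.1"},
    timeout=10,
)
page = next(iter(resp.json()["query"]["pages"].values()))
for rev in page["revisions"]:
    print(rev["timestamp"], rev["user"], rev.get("comment", ""))
```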

The fact that the bad actors studied had already been found and rooted out may itself show that the system is working, says O'Neil. Miller describes the study as a first attempt at characterizing suspicious editing behavior so that it can be spotted elsewhere.

Victoria Doronina, a geneticist and member of the Wikimedia Foundation's board of trustees, says the encyclopedia has previously been targeted by "cabals" that aim to bias its content.

While individual editors act in good faith, she says, the combination of different points of view allows the creation of neutral content. If Miller and his fellow researchers are correct in their predictions, the next battle could be "Wikimedians versus state propaganda."

Miller says the behavior patterns identified in the analysis could be used to build models that detect and prevent the spread of misinformation on other major platforms.
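As a sketch of what such a model might look like, not the researchers' actual method: a simple classifier trained on per-account behavioral features. The feature names, numbers, and labels below are invented for illustration.

```python
from sklearn.linear_model import LogisticRegression

# Invented per-account features: [edits per day, share of edits on
# politically sensitive pages, share of edits later reverted].
X = [
    [2.0, 0.05, 0.01],  # ordinary editor
    [1.5, 0.02, 0.00],  # ordinary editor
    [8.0, 0.60, 0.25],  # known bad actor
    [7.5, 0.55, 0.30],  # known bad actor
]
y = [0, 0, 1, 1]  # labels from a hypothetical past investigation

model = LogisticRegression().fit(X, y)

# Score a new, unlabeled account: probability it resembles a bad actor.
print(model.predict_proba([[6.0, 0.50, 0.20]])[0][1])
```

A production system would need far richer features and labeled data, but the shape of the approach is the same: learn from accounts that were already caught, then score the rest.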

The English-language edition has more administrators than any other, but tracking down bad actors has typically relied on someone reporting suspicious behavior. Without the right tools, much of that behavior goes unseen, and Wikipedia's data is difficult to analyze because it contains many different versions of the same article.

A human brain simply cannot sift through hundreds of thousands of edits across hundreds of thousands of pages to spot the patterns.