Tech companies operating some of the world's biggest online platforms have signed up to a new EU code of practice for tackling online disinformation.

These firms, and others, will have to make greater efforts to stop the spread of fake news and propaganda on their platforms. The European Commission said the guidelines had been shaped by lessons learned from the COVID-19 pandemic and Russia's war of aggression in Ukraine.

The code has been shaped by COVID misinformation and Russian propaganda

Věra Jourová, the Commission's vice president for values and transparency, said that the new anti-disinformation Code comes at a time when Russia is weaponizing disinformation as part of its military aggression against Ukraine.

The code contains 44 specific commitments for companies, targeting a range of harms caused by misinformation. These include pledges to:

  • create searchable libraries for political adverts
  • demonetize fake news sites by removing their advertising revenue
  • reduce the number of bot networks and fake accounts used to spread disinformation
  • give users tools to flag disinformation and access “authoritative sources”
  • give researchers “better and wider access to platforms’ data”
  • work closely with independent fact-checkers to verify information sources

Many of these initiatives have already been adopted by US tech firms, but the EU claims its new code of practice will allow for greater oversight of these operations.

There are some notable absences from the list of signatories. Apple has not signed up, despite the code's focus on demonetizing sources of misinformation by cutting off their advertising. Telegram, which has been a major battleground for propaganda since Russia's invasion of Ukraine, is also missing.

The new rules will be enforced through the EU's new Digital Services Act (DSA)

According to the EU's commissioner for the internal market, the new Code of Practice will be backed up by the DSA, and large platforms that break the code risk fines of up to 6 percent of their global turnover.

Although the EU is presenting the code as a strong deterrent against misinformation with clear methods of enforcement, it's worth remembering how difficult it is to even gauge the impact of misinformation.

In the code's 31st commitment, signatories agree to "integrate, showcase, or otherwise consistently use fact-checkers' work in their platforms' services, processes, and contents." For each EU member state, platforms signed up to this portion of the code will report the number of fact-check articles published, the reach of those articles, and the number of content pieces reviewed.

This data will offer new insight, but it won't give the full picture of the impact of fact-checkers' work. Facebook has partnered with fact-checkers as far back as 2016, but it has also been criticized for using partisan groups to verify sources.