Illustration by Alex Castro / The Verge

The EU has agreed on another major piece of tech legislation.

The broad terms of the Digital Services Act, or DSA, which will force tech companies to take greater responsibility for content that appears on their platforms, were agreed early Saturday morning. New obligations include taking stricter action against the spread of misinformation, with penalties for non-compliance reaching as high as six percent of annual turnover.

European Commission President Ursula von der Leyen said that the DSA will upgrade the ground rules for all online services in the EU, with the responsibilities of online platforms increasing in proportion to their size.

“What is illegal offline, should be illegal online”

Margrethe Vestager, the European Commissioner for Competition who has spearheaded much of the bloc's tech regulation, said the act would ensure that platforms are held accountable for the risks their services can pose to society and citizens.

The DSA should not be confused with the Digital Markets Act, or DMA, which was agreed upon in March. The DSA deals with how companies police content on their platforms, while the DMA focuses on creating a level playing field between businesses. Of the two, the DSA is the more likely to have an immediate impact on internet users.

The effect of these laws will be felt beyond Europe. The EU's comparatively stringent regulations may become the basis for global tech companies to adopt a single strategy for policing content, and lawmakers in the US who want to rein in Big Tech with regulations of their own have already begun looking to the EU's rules for inspiration.

The final text of the DSA has yet to be released, but the European Parliament and European Commission have detailed a number of obligations it will contain.

  • Targeted advertising based on an individual's religion, sexual orientation, or ethnicity is banned. Minors cannot be subject to targeted advertising at all.
  • “Dark patterns” — confusing or deceptive user interfaces designed to steer users into making certain choices — will be prohibited. The EU says that, as a rule, cancelling subscriptions should be as easy as signing up for them.
  • Large online platforms like Facebook will have to make the workings of their recommender algorithms (used, e.g., for sorting content in the News Feed or suggesting TV shows on Netflix) transparent to users. Users should also be offered a recommender system “not based on profiling.” In the case of Instagram, for example, this would mean a chronological feed (which the app recently reintroduced).
  • Hosting services and online platforms will have to explain clearly why they have removed illegal content, as well as give users the ability to appeal such takedowns. The DSA itself does not define what content is illegal, though, and leaves this up to individual countries.
  • The largest online platforms will have to provide key data to researchers to “provide more insight into how online risks evolve.”
  • Online marketplaces must keep basic information about traders on their platform to track down individuals selling illegal goods or services.
  • Large platforms will also have to introduce new strategies for dealing with misinformation during crises (a provision inspired by the recent invasion of Ukraine).

The DSA will place greater obligations on larger companies, with firms that have at least 45 million users in the EU facing the most scrutiny. Tech companies have been lobbying hard to water down these requirements.

Although the broad terms of the DSA have been agreed upon, the legal language still needs to be finalized and the act officially voted into law, a last step widely seen as a formality. The rules will apply to all companies 15 months after the act is voted into law, or from January 1st, 2024, whichever is later.