By Liv McMahon, Technology team

Image source, Getty Images
Image caption, A drone being used for agriculture

The European Commission has proposed new rules to help people harmed by products that use artificial intelligence (AI) claim compensation.

The proposed AI Liability Directive would make it easier for those people to prove their case.

The justice commissioner said the proposals would create a legal framework fit for the digital age.

The directive would cover products such as self-driving cars, voice assistants and search engines.

The Artificial Intelligence Act would be the first law of its kind to set limits on how and when artificial intelligence can be used.

Artificial intelligence systems are trained on large amounts of data, allowing machines to perform tasks that would normally require human intelligence.

Media caption, Artificial intelligence is something to watch

The European Commission's proposals include a "presumption of causality" for those who claim to have been harmed by the use of artificial intelligence.

This means victims will not have to untangle complicated AI systems to prove their case, as long as a causal link between the system's performance and the harm suffered can be shown.

For years, social media firms have hidden behind the argument that they are merely platforms for other people's content and are therefore not responsible for it.

The EU does not want to see this happen again in other sectors: drone manufacturers, for example, should not get off the hook if their drones cause harm simply because they were not the ones at the controls.

The clear message is that if your product ends up causing distress or damage, you need to take responsibility for it, even if you were not directly operating it.

Some would say this is harsh on a new industry. But if a car crashes because of a mechanical fault rather than the driver's behavior, responsibility sits with the manufacturer.

If this draft goes through, the first test case will be the center of attention. Europe continues to pursue big tech, but is it being realistic?

According to the European Commission, high-risk uses of artificial intelligence include critical infrastructure and products that could directly affect someone's life and livelihood.

Disclosure of information about such products would give victims more insight into liability, although safeguards would protect sensitive information.

Sarah Cameron, technology legal director at law firm Pinsent Masons, said the rules would help clarify liability for AI-enabled products for consumers and businesses alike.

The opacity of AI, the so-called black box effect, has made it difficult for businesses to adopt the technology, she said, because of uncertainty over where liability sits.

Under the proposals, victims would be able to seek compensation from the provider of an AI system, or from a manufacturer that integrates an AI system into another product, where a fault has caused harm.

  • Artificial intelligence
  • European Commission
  • Drones
  • European Union