
The second-by-second history of everything that happens in an aircraft's systems, together with a record of the pilots' actions, has been invaluable in determining the causes of crashes.

There is no reason why self-driving cars and robots shouldn't have the same thing. The question is not a hypothetical one.

Federal transportation authorities are investigating a number of crashes involving cars with autopilot features. In one of them, a man changing a tire on the side of the road was struck by a car and killed.

Car companies are steadily expanding their automated driving technologies. Walmart and Ford are teaming up to test self-driving cars for home delivery, and the same companies are testing a fleet of robo-taxis.

Read: Governing AI Safety through Independent Audits

Beyond cars and trucks, robot welders work on factory floors. Japanese nursing homes use robots to provide a variety of services. Walmart is among the retailers using robots to clean store floors. At least six companies sell robot lawnmowers.

More daily interactions with such systems mean more risks. With those risks in mind, an international team of experts has published a set of governance proposals to better anticipate problems and increase accountability. One of its main ideas is a black box, much like the flight recorders carried by aircraft.

"When things go wrong, you get a lot of shoulder shrugs," says Gregory Falco, a co-author of the book. The approach would help assess the risks and create an audit trail. The main goal is to make it easier to be accountable.

The new proposals focus on three principles: preparing prospective risk assessments before putting a system to work, creating an audit trail, and promoting adherence to local and national regulations.
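To make the audit-trail principle concrete, here is a minimal sketch of what a tamper-evident event log for an autonomous system might look like. The design (an append-only, hash-chained record, with class and field names of our own invention) is illustrative only and is not drawn from the paper itself:

```python
import hashlib
import json
import time

class BlackBoxLog:
    """Append-only, hash-chained event log: each record stores the hash of
    the previous record, so later tampering breaks the chain."""

    GENESIS = "0" * 64

    def __init__(self):
        self.records = []
        self._last_hash = self.GENESIS

    def record(self, event: str, **data):
        """Append one event (e.g. a sensor reading or control decision)."""
        entry = {
            "ts": time.time(),
            "event": event,
            "data": data,
            "prev": self._last_hash,
        }
        # Hash the canonical JSON form of the entry body.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.records.append(entry)
        self._last_hash = entry["hash"]
        return entry

    def verify(self) -> bool:
        """Recompute the chain; returns False if any record was altered."""
        prev = self.GENESIS
        for rec in self.records:
            body = {k: v for k, v in rec.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if rec["prev"] != prev or rec["hash"] != expected:
                return False
            prev = rec["hash"]
        return True

log = BlackBoxLog()
log.record("sensor", lidar_ok=True)
log.record("decision", action="brake", confidence=0.92)
assert log.verify()
log.records[0]["data"]["lidar_ok"] = False  # simulated tampering
assert not log.verify()
```

In a real product, each record would be written to durable, crash-survivable storage; the hash chain simply makes it evident after the fact if anyone edited the history they hand to investigators.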

The authors do not call for government mandates. Instead, they argue that insurers, courts, and customers will push companies to adopt the approach. Insurers want to know as much as possible about potential risks before they cover them; an executive with Swiss Re is one of the paper's co-authors. Courts and attorneys need a data trail to determine who should or shouldn't be held responsible for an accident. And customers don't want to be put in danger.

Companies are already developing black boxes for self-driving vehicles, in part because the National Transportation Safety Board has put them on notice about the data it will need to investigate accidents. Falco and a colleague have mapped out what a black box for that industry could look like.

The safety issues reach well beyond cars. If a recreational drone slices through a power line and kills someone, a black box would help investigators establish what went wrong. The same is true for a robot lawnmower. The authors argue that medical devices that use artificial intelligence likewise need to record when and how they are in use.

The authors also argue that black box data, along with information obtained through human interviews, should be made public. Letting independent analysts study those records would allow other manufacturers to improve their own systems.

Black box recorders should be included even in relatively cheap consumer products, and the authors argue that risk assessment needs to be incorporated at every stage of a product's development.

Someone needs to provide information about all the things that can go wrong when an autonomous agent acts in an open environment, the authors say. Their framework creates a data trail for carrying out postmortems and gives people a road map for thinking about the risks.

Edmund L. Andrews is a writer.

This story originally appeared on Hai.stanford.edu. Copyright 2022.
