Elon Musk shot down a system that would monitor Autopilot malfunctions over fears it would slow progress, NYT report says

According to a recent report from The New York Times, Musk dismissed plans for a system that would monitor the health of Autopilot.

The publication spoke with 19 people who worked on the project. The sources, who asked to remain anonymous, said that since the program was introduced, Musk has misrepresented the software's true capabilities and sacrificed additional safety measures for the sake of speed and aesthetics.

Insider asked Tesla for comment.

According to the Times, several Tesla engineers met with Musk in 2015 to discuss plans for a second version of Autopilot. Hal Ockerse, a manager on the program, told the CEO that he wanted to include a system that could monitor the software. Two sources who attended the meeting told the Times that Ockerse's plan included a computer chip and other hardware that would act as a safeguard in case the Autopilot software stopped working.

Multiple sources told the publication that the previous version of Autopilot malfunctioned in winter, making it difficult for drivers to predict when the software would work.

Musk rejected Ockerse's idea as soon as it was proposed and dressed down the manager for suggesting it. The sources said the CEO was already angry when he arrived at the meeting because the Autopilot system in his own car had malfunctioned.

The sources said Musk worried that such a monitoring system would slow down Autopilot's progress. The CEO has taken a number of steps to speed up delivery of the software, and the company regularly pushes updates, relying on its drivers to surface bugs.

Ockerse left the company after a little over a year. He went on to work at Apple and later at Nio, a Chinese car company. Ockerse didn't reply to a request for comment.

Tesla's Autopilot system has recently come under fire. The National Highway Traffic Safety Administration began investigating the software after it was involved in at least 12 accidents, several of which involved Autopilot failing to recognize emergency vehicles.

Unlike most companies developing self-driving software, Tesla has moved away from radar and LiDAR technology. Musk has warned that the system does not replace a licensed driver. Human error causes over 5 million car accidents in the US each year, and Musk has said in the past that automated cars will one day be able to reduce that number.

The New York Times' full investigation is available on its website.