New York Times ad warns against Tesla’s “Full Self-Driving”

A full-page advertisement in The New York Times took aim at Tesla's Full Self-Driving software, calling it "the worst software ever sold by a Fortune 500 company" and offering $10,000, the same price as the software itself.

The ad was taken out by The Dawn Project, a recently founded organization aiming to ban unsafe software from safety-critical systems that can be targeted by military-style hackers, as part of a campaign to remove Tesla Full Self-Driving (FSD) from public roads until it has "1,000 times fewer critical malfunctions."

The founder of the advocacy group is also the CEO of Green Hills Software, a company that builds operating systems and programming tools for embedded safety and security systems. Green Hills said that BMW's iX vehicle uses its real-time OS and other safety software, and it also announced a new over-the-air software product and data services for automotive electronic systems.

Despite the potential competitive bias of The Dawn Project's founder, the advanced driver assistance system available to Tesla owners is under scrutiny after a series of videos showed flaws in the system.

The NYT ad came just days after the California Department of Motor Vehicles told Tesla that its test program, which uses consumers rather than professional safety operators, doesn't fall under the department's regulations. The DMV regulates the testing of self-driving cars and requires companies like Cruise and Waymo to report crashes and system failures; Tesla has never issued such reports.

Musk responded vaguely, claiming that the system has not resulted in an injury or accident since launch. NHTSA, however, is investigating a report from an owner whose car was forced into the wrong lane while making a left turn and was struck by another vehicle.

Even if that was the first crash of its kind, Autopilot, the driver assistance system that comes standard on Tesla vehicles, has been involved in a number of crashes.

The claims in the NYT ad are based on a safety analysis of FSD that The Dawn Project itself published.

The study avoided videos with strongly positive or negative titles to reduce bias. For context, the California Department of Motor Vehicles requires human drivers to pass a Driver Performance Evaluation in order to get a driver's license. To pass the test, drivers must make 15 or fewer scoring maneuver errors, such as failing to use turn signals when changing lanes or failing to maintain a safe distance from other moving vehicles, and zero critical driving errors, like crashing or running a red light.

The study found that in less than an hour, FSD v10 committed 16 scoring maneuver errors and a critical driving error. At the current rate of improvement, the analysis projects it will take another 7.8 years for FSD to match the accident rate of a human driver.

The sample size is too small to carry statistical weight, and The Dawn Project makes some bold claims that should be taken with a grain of salt. Still, if the seven hours of footage is representative of an average FSD drive, the findings could point to a larger problem with the software and speak to the broader question of whether it should be allowed to be tested on public roads at all.

The ad states that families did not sign up to be crash test dummies for the thousands of Tesla cars being driven on public roads.

Federal regulators have begun to take action against Tesla as well.

NHTSA sent two letters to the automaker in October, targeting the company's use of non-disclosure agreements for owners who gain early access to FSD beta software, as well as the company's decision to use over-the-air software updates to fix an issue without issuing a recall. Separately, Consumer Reports said over the summer that the FSD software upgrade didn't appear to be safe enough for public roads and that it would independently test it. Last week, the organization published its test results, which showed that Tesla's camera-based driver monitoring system fails to keep a driver's attention on the road. By contrast, CR found that Ford's BlueCruise issues an alert when the driver's eyes are diverted.

Since then, Tesla has released many different versions of its v10 software, with version 11 expected in February.

Some online commenters say the latest version is much better than its predecessor, while others say they don't feel confident using the tech at all. A thread on the r/TeslaMotors subreddit suggests the newest version of the software is still not ready for the general public.

One commenter said the car took so long attempting a right turn onto an empty, straight road that it turned left and accelerated instead.

Another said the system completely ignored an upcoming left turn at a standard intersection with lights, clear visibility in all directions, and no other traffic.

The Dawn Project's campaign highlights Tesla's own warning that the software "may do the wrong thing at the worst time."

The advocacy group asks how anyone can tolerate a safety-critical product on the market that may do the wrong thing at the worst time. "Isn't that the definition of defective? We need to remove Full Self-Driving from our roads."

Neither The Dawn Project nor Tesla could be reached for comment.