John Bernal, a former employee of the company, was fired after he posted candid video reviews on his YouTube channel showing how the company's Full Self-Driving (FSD) system performed in different locations around Silicon Valley.
After Bernal's dismissal, the company cut off his car's access to the system, even though he had no record of safety issues. His car still has the premium driver assistance software. No car on the market today is capable of driving itself.
FSD Beta is an option that bundles a set of new, unfinished driver assistance features. One of them, automatic steering on city streets, lets the car navigate complex urban environments without the driver needing to move the steering wheel. To get FSD Beta, customers must first buy the FSD option, which costs $12,000 up front or $199 per month in the U.S., and then obtain and maintain a high driver-safety score.
Silicon Valley companies often foster a culture of loyalty and rarely put the reasons for a dismissal in writing. Employees who criticize the company in public are viewed as disloyal.
The company did not respond to a request for comment.
In August 2020, Bernal joined Musk's electric vehicle maker as a data annotation specialist in an office in San Mateo, California. According to records he shared with CNBC, he was dismissed in the second week of February this year, after moving into the role of advanced driver assistance systems test operator.
A few months after he started working at the company, Bernal put in an order to buy a Model 3 with a long-range battery. The car was delivered on December 26, 2020.
He bought the car because of an $8,000 perk offered to employees. In exchange, employees had to agree to let the company collect internal and external vehicle data.
He started the AI Addict channel on YouTube to show what the public version of FSD could do.
Most of the videos show Bernal driving around Silicon Valley with a friend in his car, using the newest versions of the software.
Other people have also posted their experiences with the experimental software; Chuck Cook, Kim Paquette, and many others rush to review each new release on their channels.
His separation notice did not state a reason for his firing. In one of his videos, his car knocked over bollards while he was driving in San Jose.
Bernal says that before he was dismissed, managers told him he had broken a policy and that his channel was a conflict of interest. He said he was always transparent with the public and with his managers, always listed his employment on his online resume, and had never seen a policy forbidding him from using his own property to create car tech reviews.
A current employee shared a copy of the company's social media policy with CNBC. The policy states that the company encourages its employees to use social media in a responsible manner.
He said that the releases he was demonstrating were end-user products.
Some of his videos showed problems with the system.
In March 2021, for example, Bernal, posting as "AI Addict," published a video entitled "Oakland - Close calls, Pedestrians, Bicycles!" At 11 minutes and 58 seconds into the video, the system rolls the Model 3 into an intersection as another vehicle crosses in front of it, narrowly avoiding a collision.
That video has racked up a quarter million views.
Bernal said a manager from the Autopilot team tried to discourage him from posting any future content that was negative or critical of FSD Beta. He said the manager held a video conference with him but put nothing in writing.
According to a CNBC analysis of his channel, roughly ten of the 60 videos he posted revealed flaws. Three covered other topics and didn't discuss FSD Beta, while another three focused on other vehicles and weren't related to the company at all.
Users are ordinarily allowed several strikes for unsafe driving or improper use of the system before access is revoked. Bernal had received none, yet the company revoked his access after he was terminated.
Losing access on his own car has hampered his ability to review the system, but he has gained access to other vehicles and plans to continue his independent research and reviews.
He knew he could get attention by posting honest reviews. He believed that honesty would let him keep using the technology, and that he would be warned to stop before losing his job.
He told CNBC that he still cares about car safety and finding bugs.
Musk has a history of asking employees and customers not to speak publicly about problems with their cars or the business.
Like many large companies, Tesla requires its employees to sign an agreement committing to resolve disputes with the company through mandatory arbitration rather than public lawsuits. Employees have occasionally challenged the arbitration clause, won release from it, and had their day in court, but those instances have been rare.
Customers used to be required to sign non-disclosure agreements in exchange for service.
CNBC previously reported that the company asked drivers who were part of the early access program to refrain from posting to social media.
Federal vehicle safety regulators worried that the practice could have a chilling effect, and they opened a probe into the program.
Musk said the company shouldn't have any restrictions like that. Speaking with Kara Swisher at the Code Conference, he suggested that participants in the program were not really following the restrictions anyway.