Tesla's decision to test its Full Self-Driving driver-assistance software with untrained drivers on public roads had drawn criticism and scrutiny even before the latest release.
Version 10.3 rolled out late Saturday night into Sunday morning with a long list of release notes. The changes include driver profiles that let users switch between different behaviors for follow distance, rolling stops, and exiting passing lanes. The update is also supposed to better detect brake lights, turn signals, and hazard lights from other vehicles, as well as reduce false slowdowns and improve offsetting for pedestrians.
Elon Musk tweeted Sunday afternoon that Tesla was seeing issues with 10.3 and would temporarily roll back to 10.2:
Some issues with 10.3, so I'm temporarily reverting to 10.2. This is normal with beta software; it is impossible to test every hardware configuration under all conditions internally, hence the public beta.
Elon Musk (@elonmusk), October 24, 2021
To be clear, this software does not make Tesla's cars autonomous. CEO Elon Musk has said that the finished version of the software his company calls Full Self-Driving will, at best, be able to drive someone from home to work while still requiring supervision. That falls short of what would qualify as a fully autonomous vehicle.
Many drivers have shared video and impressions of their experiences with the release, though it is not clear whether that aligns with what Tesla wants participants to share on social media. Testers report that the rollback update removes FSD beta capabilities from their cars entirely.
Many posters claimed that the 10.3 update introduced phantom forward collision warnings (FCW). Other reported issues included a disappearing Autosteer option, traffic-aware cruise control (TACC) problems, and occasional Autopilot panic. While it is not clear how widespread these issues are or which ones prompted the rollback, Musk replied to a tweet about the Autosteer/TACC issues, saying the company was working on it.
Phantom FCW alone would be enough to justify a rollback if the problem is common within the test group. In 2019, Mazda recalled the Mazda3 to fix its Smart Braking System falsely detecting objects in the car's path. A vehicle that suddenly brakes without warning can cause an accident if another car is following closely. Another problem for testers: some claimed that the false FCW incidents lowered their Tesla safety score enough to keep them from remaining in the beta.
Depending on how you feel about being a non-participant in this test simply by existing near a Tesla running work-in-progress code, this episode could show either that the company addresses problems quickly or that the experiment is dangerous. For Tesla owners hoping the test will expand to those with lower safety scores, hacker @greentheonly tweeted: to those with low scores waiting for FSD, don't. Driving the way the app demands would be terrible, and the car drives even worse than that. The car is completely unusable in any kind of traffic, and the videos don't do justice to how bad it is on narrow roads.