The headline above is a direct quote from Elon Musk regarding the latest software update for my 2021 Tesla Model S Plaid (as seen on Elon’s shirt). The trim level "Plaid" is inspired by the Mel Brooks movie "Spaceballs," symbolizing a speed so ridiculous it surpasses even "Ludicrous," the name of the previous top-performance Model S. The Model S Plaid is among the quickest-accelerating production cars available, capable of going from zero to 60 mph in about 1.9 seconds.
Approximately two years ago, my wife, Nancy, and I purchased the Tesla Model S with the Plaid upgrade, opting for the Tesla FSD (Full Self-Driving) software package. We knew it was beta software, meaning it wasn’t fully developed or guaranteed to be safe. To activate the beta FSD software, Tesla required us to maintain a safe driving score of 100 for 30 consecutive days. After about a year, Tesla lowered that threshold to 95 and activated our beta FSD.
We regularly receive software updates intended to improve the self-driving feature; in the past month alone, we’ve received three. Each time, we take our Tesla Model S Plaid out to assess the reliability and safety of the newly enhanced software. The latest update included the quote that headlines this article: “Tesla Full Self-Driving may do the wrong thing at the worst time”. Unfortunately, this has proven to be accurate. Nancy and I have experienced several close calls, such as nearly being rear-ended when our Tesla abruptly applied the brakes under FSD for no apparent reason. This strong language hadn't appeared in previous software updates, though we were always reminded that we were using beta software and advised to keep both hands on the wheel and our eyes on the road.
A recurrent issue is the inability of the cameras and radar sensors to accurately detect road lines, traffic signs, construction markers, and whether pedestrians intend to cross the street. Much of this difficulty can be attributed to poor road maintenance in South Florida. However, I believe our highway infrastructure is suboptimal in nearly all states, with construction signs and orange cones often placed haphazardly by highway workers.
Tesla's rationale for allowing Nancy, me, and thousands of other Tesla drivers to ride in cars steered by an incomplete, potentially unsafe autonomous driving system is the valuable data gathered each time the software narrowly avoids causing an accident. When our Tesla behaves unpredictably, such as stopping abruptly on the highway for no reason, we are obliged to disengage the FSD software immediately. Upon doing so, a message flashes on the display screen, asking, "Why did you have to disengage the FSD?" Frankly, we've been so shaken by these incidents that we haven't yet reported the software glitches to Elon.
Wall Street has started scrutinizing Tesla Full Self-Driving, and the media have followed suit. This attention could adversely affect Tesla's sales and stock value, which may explain the increasingly stern warnings about Teslas' autonomous capabilities that accompany the software updates.
Despite these setbacks, I still believe Tesla will soon refine its software to the point that it outperforms the average motorist in driving skill and safety. I am confident that this milestone is less than five years away. But for now, Tesla Full Self-Driving isn't ready for primetime.