Tesla Defends Autopilot Safety Features in Letter to U.S. Senators

Last month, Democratic Senators Richard Blumenthal and Ed Markey wrote a letter to Tesla CEO Elon Musk sharing their “significant concerns” about the automaker’s Autopilot and Full Self-Driving (FSD) systems.
Tesla responded to the U.S. senators last week, detailing Autopilot and FSD features that allow owners to “drive safer than the average driver in the U.S.,” reports Reuters.
Tesla’s response was written by Rohan Patel, the company’s senior director of public policy and business development. He noted that both Autopilot and FSD “require the constant monitoring and attention of the driver,” and that the systems can perform “some but not all of the Dynamic Driving Tasks” that a human driver would normally handle.
Both senators told Reuters in a statement the Tesla letter was not good enough, saying it was “just more evasion and deflection from Tesla. Despite its troubling safety track record and deadly crashes, the company seemingly wants to carry on with business as usual.”
The senators stated in their letter to Musk, “complaints and investigations paint a troubling picture: Tesla repeatedly releases software without fully considering its risks and implications, creating grave dangers for all on the roads.”
Tesla “understands the importance of educating owners on the capabilities of Autopilot and FSD Capability,” wrote Patel in the letter.

When Tesla owners engage Autopilot or FSD, in-car warning messages state that drivers must keep their hands on the steering wheel and be prepared to take control at all times.
Tesla Autopilot seeks torque feedback through the steering wheel from the driver in order to keep operating, and if there is no response, the system will disengage. If there is still no feedback, Autopilot will slow the vehicle to a complete stop and turn on the hazard lights.
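As a rough sketch of how this kind of hands-on-wheel escalation could be structured in software (the thresholds, names, and two-stage flow below are illustrative assumptions, not Tesla’s actual implementation):

```python
from dataclasses import dataclass

# Hypothetical thresholds; Tesla's real timing values are not published here.
WARN_AFTER_S = 15.0   # seconds without steering torque before warning the driver
STOP_AFTER_S = 45.0   # seconds without torque before disengaging and stopping

@dataclass
class DriverMonitor:
    seconds_without_torque: float = 0.0

    def update(self, torque_detected: bool, dt: float) -> str:
        """Return the action to take this control cycle."""
        if torque_detected:
            self.seconds_without_torque = 0.0
            return "keep_operating"
        self.seconds_without_torque += dt
        if self.seconds_without_torque >= STOP_AFTER_S:
            # Driver never responded: disengage, slow to a stop, hazards on.
            return "disengage_and_stop_with_hazards"
        if self.seconds_without_torque >= WARN_AFTER_S:
            return "warn_driver"
        return "keep_operating"
```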
In Tesla’s latest Vehicle Safety Report for Q4 2021, the automaker explained, “we recorded one crash for every 4.31 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.59 million miles driven.”
“By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles,” emphasized Tesla in January.
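For readers who want those figures on a common scale, here is a quick restatement of the numbers quoted above as crash rates per million miles. This is just arithmetic on the published figures, not Tesla’s or NHTSA’s methodology, and it ignores differences in where and how those miles were driven:

```python
# Miles per reported crash, as quoted above (Tesla Q4 2021 report and NHTSA average).
miles_per_crash = {
    "Autopilot engaged": 4_310_000,
    "No Autopilot": 1_590_000,
    "NHTSA U.S. average": 484_000,
}

for label, miles in miles_per_crash.items():
    print(f"{label}: {1_000_000 / miles:.2f} crashes per million miles")

# Relative spread: by these figures, Autopilot-engaged driving covers roughly
# 4,310,000 / 484,000 ≈ 8.9 times as many miles between crashes as the NHTSA average.
```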
“Autopilot enables your car to steer, accelerate and brake automatically within its lane. Current Autopilot features require active driver supervision and do not make the vehicle autonomous,” explains Tesla’s website.
I’d like them to list all those incidents behind the “troubling safety track record and deadly crashes.” I pay really close attention to this, and I haven’t heard of anything at all since Tesla released FSD Beta; with over 60k testers now on the roads, that is beyond remarkable. I am also pretty confident that, with all the media salivating in advance at even a hint of Tesla involvement, they would be all over it and the entire world would hear about that “self-driving Autopilot Tesla car that killed 100+ people,” when with a little real fact checking they might’ve discovered it was really just a waddling of ducks, and FSD saw them and let them cross the road safely. /s Ha! Anecdotes and innuendos without specific claims are harder to deal with, except of course if they just get tossed because of their inanity. That would be easy enough.
The letter cited here references both FSD and Autopilot. NHTSA is currently investigating something like a dozen instances of Tesla vehicles on Autopilot crashing into emergency vehicles at first-responder scenes, and lawsuits filed by first responders involved in those accidents are working their way through the legal system. We did hear about it from the media, if you were watching all of this unfold over the last handful of years.
The company now acknowledges these systems require constant supervision, but it was selling a different story not that long ago, starting with the 2016 demo video stating that the Tesla shown was driving itself and the driver “was in the seat only for legal reasons.” The system names are still misleading at best today (FSD is not “Full Self-Driving,” because it requires and will continue to require constant supervision), and driver attentiveness was not encouraged, with cabin-camera monitoring only recently, and badly, implemented.
When I bought my Tesla Model 3 with FSD back in Sept 2020, the salesperson already told me it was only partial FSD, and I never expected it to be full anyway. Only fools would think it can fully self-drive.
Keep in mind that emergency vehicles are in a _lot_ of accidents a year, thousands of accidents with hundreds of deaths. That a dozen over 5 years, with no deaths, involved Teslas is not particularly bad – that’s fewer than you’d expect statistically given Tesla’s market share.
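A back-of-the-envelope version of that argument might look like the sketch below; every input is a placeholder assumption rather than a sourced statistic, so treat the output as illustrating the shape of the reasoning, not as a real estimate:

```python
# All inputs are illustrative placeholders, not sourced statistics.
emergency_vehicle_crashes_per_year = 5_000  # assumed order of magnitude ("thousands" per the comment)
tesla_share_of_us_fleet = 0.005             # assumed roughly 0.5% of vehicles on U.S. roads
years = 5

expected_tesla_involved = emergency_vehicle_crashes_per_year * tesla_share_of_us_fleet * years
print(expected_tesla_involved)  # 125.0 under these assumptions, versus roughly a dozen under investigation
```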
Tesla has always said that FSD and Autopilot require constant supervision. They have never said it didn’t require supervision – their claim is that in the future it will become “full self-driving.” Driver attentiveness was always required, enforced through hands-on-steering-wheel checks and alerts. They added cabin cameras that monitor where you’re looking as an additional layer of attention monitoring.
Yes, I’m aware of those particular incidents; that is why I wanted them identified, because they can’t be debunked or dealt with until they are. The software used at the time has all been upgraded, and I believe the chances of those events happening again have been eliminated, or are at least well in the process of being so.

Yes, we all know that it is only the general public who are confused by the names “Autopilot” and “Full Self Driving,” as anyone who is willing to shell out $12k will have read all the warnings and fine print or they won’t be able to access it. This shouldn’t be about what HAS happened (if there are legitimate claims, I’m sure the courts will deal with them) but about what they ARE doing now. There are over 60k testers on the road now, more than a thousand of them with over a year of Beta testing under their belts, and not one accident, with millions of miles driven in all conditions in every state and city in the US and now in Canada too.

Musk and Tesla’s first concern is safety, which is why they have the 4 safest vehicles NHTSA has ever tested, and their approach to FSD isn’t any less committed. The narrative that these vehicles are killing machines with horrible accident rates is just not true; it’s completely exaggerated FUD. Tesla is doing this the only way it can be done to reach complete Full Self Driving: as a Beta version with growing numbers of users and a deepening AI learning/teaching structure.

It’s okay to be skeptical of new tech and investigate the crap out of it if you want, but we need to be reasonable about it. If the data shows they are doing it, let’s stop calling out old stuff that is mostly inapplicable anyway, like that unfortunate incident in Texas a while back that killed two guys who had been drinking and sped into a tree; the on-site sheriff and media called it an obvious case of Autopilot gone wrong, when it was later determined that Autopilot was not even turned on. I’m a biased fanboy, I admit, but in this case the facts need to be reviewed by the regulatory bodies responsible and not by a couple of ignorant politicians or a FUD-filled media.