California Senator Wants DMV to Investigate Tesla’s Full Self-Driving Beta

Lena Gonzalez, chair of the California Senate's Transportation Committee, on Tuesday sent a letter to California Department of Motor Vehicles (DMV) Director Steve Gordon raising concerns over Tesla's Full Self-Driving (FSD) beta and urging him to investigate, The Los Angeles Times reports.

Full Self-Driving is Tesla’s suite of advanced driver assistance features, available as a $10,000 USD add-on or a $199 USD/month subscription and currently in a public beta.

A number of government institutions and regulators have recently questioned the safety of letting experimental autonomous-driving software loose on American streets, as well as Tesla's branding of the feature as "Full Self-Driving."

“I have seen a number of videos of Tesla vehicles operating with FSD engaged where it appears that serious driving errors were made and collisions were avoided only because of swift action by the driver,” Gonzalez wrote in her letter.

Gonzalez went on to describe FSD's apparent poor performance, and said that while her only insight into the beta's safety and performance comes from videos of the technology in action, the DMV “has the knowledge to assess these situations.”

The California Legislature made the DMV the state's chief autonomous-vehicle regulator in 2012.

Gonzalez went on to ask Gordon and the DMV for answers to the following questions:

  • “What is your assessment of the FSD beta trials?”
  • “Is there a danger to the public?”
  • “If the DMV finds the beta program unsafe, how does the DMV plan to address any potential concerns?”

The DMV has said that it is reviewing Gonzalez’s letter.

Tesla’s FSD and Autopilot technologies have landed the electric vehicle (EV) pioneer in hot water of late. Earlier this year, the U.S. government opened an investigation into Tesla Autopilot over multiple crashes involving first-responder vehicles.

The FSD beta has fortunately not resulted in any reported deaths or serious injuries, but Jennifer Homendy, head of the National Transportation Safety Board (NTSB), the federal government’s crash investigator, said, “It shouldn’t require a fatality for regulators and politicians to take action” on Tesla’s FSD deployment.

Tesla on Saturday started rolling out Full Self-Driving beta 10.6.1 to the public, wrapped up in firmware update 2021.36.8.10.

Beta testers are required to maintain full control of the vehicle at all times, as any miscues can lower their Safety Scores, which could mean being booted from the software trial.


2 Comments

Threader
4 years ago

They don’t need to scrutinize the FSD beta, since only a very small sample of drivers is currently using it. They should focus instead on Vision-only Autopilot and Adaptive Cruise Control, which are in use in every Tesla sold since last summer. Extreme phantom braking and constant slowdowns for no apparent reason put Tesla drivers and passengers at risk of rear-end collisions and other very sketchy highway situations.

Bill Johnson
4 years ago

It seems to me (I’m just a poor, semi-ignorant little schmuck) that this all comes down to continued Tesla FUD and hatred. Most of us agree that FSD is coming and will be here sometime in our lifetimes (depending on how old you are). With that said, how many of us expect it to just pop out of a box of Cracker Jacks, and how many of us expect it to have a fairly long learning curve?

There is only one way to achieve FSD that will operate anywhere on any road, and that is with AI. Hand-coding doesn’t work because there is an endless amount of coding to keep up with the endless number of edge cases, not to mention constantly shifting geo-mapped areas. So AI, à la Tesla, will teach the neural nets to figure out what to do, and that is currently what all the beta drivers are helping the system do. Take them off the road and the AI learning process is killed, and it’ll be back to the sandbox versions that everyone but Tesla seems to think are the best way to go.

If Tesla were experiencing accidents at any scale in FSD mode, I might be more sympathetic to pouring taxpayers’ money into what is basically a witch hunt against them. But the fact is that they aren’t experiencing an increase in accidents; quite the opposite seems to be true, with Tesla’s own data suggesting the beta reduces accidents by over 30%. There can and will be an endless array of “what ifs” and “it could be dangerous and unsafe,” but the reality is that these are infantile excuses for not pursuing a better future. Tesla will figure it out or the lawsuits will close them down. (Actually, I believe Musk is far too safety-conscious to let it go that far; if he thought it was unsafe, he’d shut it down himself.)
