Tesla’s Full Self-Driving Software Under NHTSA Investigation
The National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla’s Full Self-Driving (FSD) software, covering 2.4 million vehicles.
The review follows reports of four crashes, one of them fatal, in which FSD was engaged during low-visibility conditions such as sun glare, fog, or airborne dust. Tesla instructs drivers to remain ready to take over at all times while FSD is engaged.
“The Tesla vehicle fatally struck a pedestrian. One additional crash in these conditions involved a reported injury,” the NHTSA said.
Tesla emphasizes on its website that FSD requires active driver supervision and is not yet intended to function as fully autonomous technology.
The probe covers Tesla models from 2016 through 2024, including the Model S, Model X, Model 3, Model Y, and the new Cybertruck. NHTSA will assess whether FSD’s engineering controls effectively detect and respond to reduced-visibility conditions. The agency is also asking whether Tesla has rolled out software updates addressing these conditions, and it will evaluate their impact on safety, according to Reuters.
This inquiry is part of the agency’s standard process of gathering data before determining whether a recall is necessary. Tesla, however, has already taken proactive steps, “recalling” more than 2 million vehicles in December 2023 to add new safeguards to its Autopilot system, all delivered through over-the-air software updates.
Yesterday, Tesla launched its second free 30-day trial of FSD for customers in Canada and the USA.