Tesla Full Self-Driving Beta ‘Lacks Safeguards’, Says Consumer Reports

Tesla released its long-anticipated Full Self-Driving (FSD) beta software version 9 last week. The update removes radar and relies instead on camera-based sensors, a change that some say puts other drivers on public roads at risk.

Consumer Reports said on Monday that its auto experts were “concerned” about the use of Tesla FSD beta version 9 on public roads, adding that pedestrians, cyclists, and others have become part of an ongoing experiment that “they did not consent to.”

Jake Fisher, senior director of the publication’s Auto Test Center, says the software lacks the necessary safeguards for beta testers, despite the fact that they’re the ones footing the bill.

“Videos of FSD beta 9 in action don’t show a system that makes driving safer or even less stressful,” Fisher said. “Consumers are simply paying to be test engineers for developing technology without adequate safety protection.”

Even Tesla CEO Elon Musk warned FSD 9 users to be cautious, encouraging them to “be paranoid,” since there are bound to be “unknown issues.” Still, Tesla’s FSD has always run in beta on public roads, even in its radar days, and the more drivers use the system, the more data the neural network has to learn from, improving the software over time.

While the publication has yet to run its own independent test of the software, it plans to once its test Model Y receives the update. In the meantime, it has reviewed a number of videos of the software in action, citing instances of vehicles scraping bushes, missing turns, and even heading toward parked cars, each of which underscores the driver’s need to remain vigilant when using the system.