Tesla Autopilot Can Be ‘Tricked’ by Phantom Images: Researcher
Tesla’s driver-assistance system, Autopilot, has had its share of safety issues in the past, though the company plans to release its full self-driving beta sometime soon.
Now, Tesla Autopilot cameras are mistaking momentarily flashed images of stop signs and other objects for real obstructions, causing Teslas to stop completely, as reported by Wired.
Yisroel Mirsky, a researcher at Ben-Gurion University and Georgia Tech, said, “The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that’s dangerous.” He continued, “The driver won’t even notice at all. So somebody’s car will just react, and they won’t understand why.”
Mirsky worked with an Israel-based research team that used phantom images to try to fool semi-autonomous driver-assistance systems like Tesla’s Autopilot. The group found that simply adding a few frames to an electronic billboard was enough to deceive the software, causing multiple Teslas to stop unexpectedly.
Most of the tests were performed on HW3, the newest version of Autopilot’s hardware. In most cases, an image needed to appear for only 0.42 seconds to trick the Tesla.
Hopefully, the tests can help Tesla address these problems and prevent them from affecting future versions of the software. Some owners have also been dealing with a phantom-braking issue, which CEO Elon Musk says has already been fixed.