Tesla’s Autopilot system relies on cameras rather than LIDAR, which leaves it open to being fooled by imagery a human driver would ignore: messages flashed on billboards or objects projected onto the road by attackers. Security researchers have demonstrated that Tesla’s Autopilot driver-assistance system can be tricked into changing speed, swerving, or stopping abruptly simply by projecting fake road signs or virtual objects in front of the car. The attacks worked both on a Tesla running HW3, the latest version of the company’s Autopilot hardware, and on the previous generation, HW2.5.

The most concerning finding is that a fake road sign needs to be displayed for less than half a second to trigger a response from Tesla’s system. In one example the researchers cite, a ‘Stop’ sign hidden inside a fast food commercial caused a Tesla in Autopilot mode to stop, even though the sign flashed on screen for only a fraction of a second.
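The sub-half-second trigger suggests the perception stack acts on detections without requiring them to persist over time. The sketch below is purely illustrative (it is not Tesla’s code, and the frame rate, window size, and `PersistenceFilter` class are assumptions): it contrasts a naive single-frame policy, which a briefly flashed “phantom” sign fools, with a simple persistence check that ignores detections lasting less than about a second.

```python
class PersistenceFilter:
    """Accept a detection only after it has appeared in `n` consecutive
    frames (roughly one second at an assumed 30 fps)."""

    def __init__(self, n: int = 30):
        self.n = n
        self.streak = 0  # consecutive frames with a detection

    def update(self, detected: bool) -> bool:
        self.streak = self.streak + 1 if detected else 0
        return self.streak >= self.n


naive_triggered = False      # reacts to any single-frame detection
filtered_triggered = False   # reacts only to persistent detections
pf = PersistenceFilter(n=30)

# Simulated 90-frame stream (~3 s at 30 fps); a phantom stop sign is
# spliced into frames 40-49, i.e. visible for about a third of a second.
for frame in range(90):
    sign_in_frame = 40 <= frame < 50
    if sign_in_frame:
        naive_triggered = True           # single-frame policy fires
    if pf.update(sign_in_frame):
        filtered_triggered = True        # persistence policy never fires

print(naive_triggered, filtered_triggered)  # True False
```

The trade-off is latency: a persistence window long enough to reject split-second phantoms also delays reaction to genuine signs by the same window, which is presumably one reason such filtering is not trivial to deploy.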

Source: Newsweek