Hackers Can Use Fake Road Signs on Digital Billboards to Spoof Tesla's Autopilot

Researchers say that phantom images and road signs flashed for a split second can fool Tesla's Autopilot and Mobileye's driver-assistance systems into hitting the brakes

Malicious actors keep coming up with novel tricks to hack devices and harm their victims. While hacking self-driving cars has largely been limited to the silver screen, researchers have now found a way to potentially cause traffic jams or even accidents using nothing but digital billboards.

Imagine a car with self-driving capability suddenly stopping on its own in Times Square, or anywhere else with massive digital billboards. Not only could the car be slammed by another vehicle, you might also be severely injured.

How Did They Do It?

Apparently, fooling the autopilot mechanism of self-driving cars is not very difficult. According to Yisroel Mirsky, a researcher at Ben Gurion University of the Negev in Israel, all a hacker needs to do is take over a digital billboard and display a giant stop sign for just a split second. That is enough to make the car's autopilot hit the brakes.

A Tesla Model X's Autopilot can be fooled with fake images on a hacked billboard (Wikimedia Commons)

"The attacker just shines an image of something on the road or injects a few frames into a digital billboard and the car will apply the brakes or possibly swerve, and that's dangerous," he told Wired, adding that the split-second display won't be noticed by the human driver but it will be caught by the car's autonomous system.

"The driver won't even notice at all. So, somebody's car will just react, and they won't understand why," said Mirsky. He will present the research paper at the ACM Computer and Communications Security conference.

How to Fool Tesla's Autopilot

Mirsky and his team had previously experimented on Tesla cars with limited self-driving capabilities. In those tests, they spoofed Tesla's Autopilot by flashing stop signs on digital billboards for 500 milliseconds, and the Tesla Model X used in the testing stopped after seeing the image.

They had a similar experience with Intel's Mobileye 630 driver-assistance devices. Both Tesla's Autopilot and Mobileye could be spoofed with fake road signs, speed limits and even human figures. A phantom image needed to stay on screen for only 0.42 seconds to fool the Tesla, and only 0.13 seconds to fool the Mobileye.
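To put those durations in perspective, a quick back-of-the-envelope calculation shows how few video frames such a phantom needs to occupy. The 60 Hz refresh rate below is an assumed typical figure for a digital display, not one taken from the research.

```python
# How many video frames do the reported phantom durations occupy?
# The 60 Hz refresh rate is an assumption, not a figure from the study.
BILLBOARD_FPS = 60  # assumed typical refresh rate of a digital display

for system, seconds in [("Tesla Autopilot", 0.42), ("Mobileye 630", 0.13)]:
    frames = round(seconds * BILLBOARD_FPS)
    print(f"{system}: {seconds:.2f} s is about {frames} frames at {BILLBOARD_FPS} fps")

# Tesla Autopilot: 0.42 s is about 25 frames at 60 fps
# Mobileye 630: 0.13 s is about 8 frames at 60 fps
```

In other words, an attacker would need to alter only a couple of dozen frames of a billboard's video loop to stop the Tesla, and fewer than ten for the Mobileye.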

At night, when projections are clearly visible, the team projected fake human figures onto the road to mimic pedestrians, and both the Tesla and Mobileye cars stopped. They could also trick the Mobileye system with a fake speed-limit sign, Wired reported.

For the second round of research, Mirsky proposed a scenario in which a hacker hijacks an internet-connected digital billboard for malicious ends. If the attacker can flash a stop sign or a fake speed limit, Tesla's autonomous system can be spoofed. The team ran a similar experiment against Tesla's latest HW3 version of Autopilot, and it yielded similar results.
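To make the idea concrete, here is a minimal sketch of what "injecting a few frames" into a billboard's video loop might look like, assuming the attacker can replace the file a compromised billboard plays. The file names, frame index and 25-frame window are hypothetical illustrations, not details from the research.

```python
import cv2

# Hypothetical file names; assumes the attacker can overwrite the
# video loop that a compromised, internet-connected billboard plays.
SOURCE = "billboard_loop.mp4"
OUTPUT = "billboard_loop_phantom.mp4"
PHANTOM = cv2.imread("stop_sign.png")  # the phantom image to inject
INJECT_AT = 300                        # frame index where injection starts
INJECT_FRAMES = 25                     # roughly 0.42 s at 60 fps

cap = cv2.VideoCapture(SOURCE)
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter(OUTPUT, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

phantom = cv2.resize(PHANTOM, (w, h))
idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Swap in the phantom for a split-second window; every other
    # frame of the original loop is left untouched, so a human
    # glancing at the billboard is unlikely to notice.
    if INJECT_AT <= idx < INJECT_AT + INJECT_FRAMES:
        frame = phantom
    out.write(frame)
    idx += 1

cap.release()
out.release()
```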

LiDAR sensors are also vulnerable to spoofing, but they are harder to fool than cameras with fake images (representational image, Pixabay)

Is LiDAR Safe from Attacks?

Unlike most of Tesla's competitors, such as General Motors (GM), Ford, Uber and Waymo, CEO Elon Musk isn't in favor of relying on LiDAR (Light Detection and Ranging) for self-driving capabilities. Instead, his company is developing a neural network that processes on-road objects and signs using cameras and various other sensors.

Since LiDAR uses lasers to measure distance, however, showing it fake road signs won't work. "It's measuring distance and velocity information. So, these attacks wouldn't have worked on most of the truly autonomous cars out there," said Charlie Miller, lead autonomous vehicle security architect at Cruise, which is owned by GM.

LiDAR sensors aren't foolproof either, though. Because LiDAR depends on emitting light to calculate distance, hackers can spoof the system by shining light back at it and compromising its readings. But it's still more difficult to spoof LiDAR than cameras, because the attacker has to time the light signals with nanosecond precision, GCN reported.
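A rough calculation shows why that nanosecond timing matters. LiDAR infers range from the round-trip time of a laser pulse, and light covers about 30 cm per nanosecond, so each nanosecond of error in a spoofed return shifts the apparent distance by roughly 15 cm. The sketch below illustrates that arithmetic; the 200 ns example pulse is a hypothetical figure, not one from the reporting.

```python
# Why spoofing LiDAR demands nanosecond timing: range is computed
# from the round-trip time of a laser pulse, so each nanosecond of
# timing error shifts the apparent distance by about 15 cm.
C = 299_792_458  # speed of light, m/s

def apparent_distance(round_trip_ns: float) -> float:
    """Distance the LiDAR infers from a round-trip time in nanoseconds."""
    return C * (round_trip_ns * 1e-9) / 2

real = apparent_distance(200.0)     # a pulse returning after 200 ns -> ~30 m
spoofed = apparent_distance(201.0)  # attacker's fake return lands 1 ns late
print(f"real: {real:.2f} m, spoofed: {spoofed:.2f} m, "
      f"error: {(spoofed - real) * 100:.1f} cm")
# real: 29.98 m, spoofed: 30.13 m, error: 15.0 cm
```

An attacker who mistimes the fake return by even a few nanoseconds produces a range reading that is inconsistent with the sensor's other measurements, which is why fooling LiDAR is much harder than flashing an image at a camera.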
