Split-Second ‘Phantom’ Images Can Fool Tesla’s Autopilot

Safety concerns over automated driver-assistance systems like Tesla’s usually focus on what the car can’t see, like the white side of a truck that one Tesla confused with a bright sky in 2016, leading to the death of a driver. But one group of researchers has been focused on what autonomous driving systems might see that a human driver doesn’t—including “phantom” objects and signs that aren’t really there, which could wreak havoc on the road.

Researchers at Israel’s Ben Gurion University of the Negev have spent the last two years experimenting with those “phantom” images to trick semi-autonomous driving systems. They previously revealed that they could use split-second light projections on roads to trick Tesla’s driver-assistance system into automatically stopping without warning when its camera sees spoofed images of road signs or pedestrians. In new research, they’ve found they can pull off the same trick with just a few frames of a road sign injected into a billboard’s video. And they warn that if hackers hijacked an internet-connected billboard to carry out the trick, it could be used to cause traffic jams or even road accidents while leaving little evidence behind.

“The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that’s dangerous,” says Yisroel Mirsky, a researcher at Ben Gurion University and Georgia Tech who worked on the study, which will be presented next month at the ACM Conference on Computer and Communications Security. “The driver won’t even notice at all. So somebody’s car will just react, and they won’t understand why.”

In their first round of research, published earlier this year, the team projected images of human figures onto a road, as well as road signs onto trees and other surfaces. They found that at night, when the projections were visible, they could fool both a Tesla Model X running the HW2.5 Autopilot driver-assistance system—the most recent version available at the time, now the second-most-recent—and a Mobileye 630 device. They managed to make a Tesla stop for a phantom pedestrian that appeared for a fraction of a second, and tricked the Mobileye device into communicating the incorrect speed limit to the driver with a projected road sign.

In this latest set of experiments, the researchers injected frames of a phantom stop sign on digital billboards, simulating what they describe as a scenario in which someone hacked into a roadside billboard to alter its video. They also upgraded to the most recent version of Tesla’s Autopilot hardware, known as HW3. They found that they could again trick a Tesla, or cause the same Mobileye device to give the driver mistaken alerts, with just a few frames of altered video.
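The injection itself needs surprisingly little footage. As a rough, hypothetical illustration (not the researchers’ code), the Python sketch below splices a stop-sign image into less than half a second of a billboard clip using OpenCV; the file names, overlay placement, and timing are all assumptions made for the example.

```python
# Hypothetical sketch of a frame-injection attack on a billboard video.
# Assumes two local files, "billboard.mp4" and "stop_sign.png", exist.
import cv2

PHANTOM_SECONDS = 0.42  # duration the study reports as reliably tricking the Tesla

cap = cv2.VideoCapture("billboard.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("spoofed.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

# Arbitrary size and corner placement, chosen only for illustration.
sign = cv2.imread("stop_sign.png")
sign = cv2.resize(sign, (w // 4, h // 4))

start = int(2.0 * fps)                        # begin splicing 2 s into the clip
n_frames = int(round(PHANTOM_SECONDS * fps))  # ~13 frames at 30 fps

i = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if start <= i < start + n_frames:
        # Overwrite a corner block with the sign for the phantom interval only.
        frame[:sign.shape[0], :sign.shape[1]] = sign
    out.write(frame)
    i += 1

cap.release()
out.release()
```

At 30 frames per second, 0.42 seconds works out to roughly 13 frames of altered video, which is the sense in which “just a few frames” suffice.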

The researchers found that an image that appeared for 0.42 seconds would reliably trick the Tesla, while one that appeared for just an eighth of a second would fool the Mobileye device. They also experimented with finding spots in a video frame that would attract the least notice from a human eye, going so far as to develop their own algorithm for identifying key blocks of pixels in an image so that a half-second phantom road sign could be slipped into the “uninteresting” portions. And while they tested their technique on a TV-sized billboard screen on a small road, they say it could easily be adapted to a digital highway billboard, where it could cause much more widespread mayhem.
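The article doesn’t spell out that block-selection algorithm, but the underlying idea can be sketched with a simple stand-in: score fixed-size pixel blocks of a frame for visual detail and pick the flattest one as a low-attention spot for the overlay. The variance-based scoring, block size, and file name below are assumptions for the sake of the sketch; the researchers’ actual method may differ substantially.

```python
# Hypothetical stand-in for "uninteresting block" selection: rank fixed-size
# blocks of a frame by local detail (pixel variance) and return the flattest
# one as a candidate spot for the phantom overlay.
import cv2
import numpy as np

def least_interesting_block(frame, block=64):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    best, best_score = (0, 0), np.inf
    for y in range(0, gray.shape[0] - block + 1, block):
        for x in range(0, gray.shape[1] - block + 1, block):
            # Low variance approximates low visual detail / low attention.
            score = gray[y:y + block, x:x + block].var()
            if score < best_score:
                best, best_score = (y, x), score
    return best  # top-left corner of the flattest block

frame = cv2.imread("billboard_frame.png")
y, x = least_interesting_block(frame)
print(f"candidate overlay spot: ({x}, {y})")
```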

The Ben Gurion researchers are far from the first to demonstrate methods of spoofing inputs to a Tesla’s sensors. As early as 2016, one team of Chinese researchers demonstrated that they could spoof objects to Tesla’s sensors, and even hide objects from them, using radio, sonic, and light-emitting equipment. More recently, another Chinese team found they could exploit Tesla’s lane-following technology to trick a Tesla into changing lanes just by planting cheap stickers on a road.

But the Ben Gurion researchers point out that unlike those earlier methods, their projections and hacked-billboard tricks don’t leave behind physical evidence. Breaking into a billboard, in particular, can be done remotely, as plenty of hackers have previously demonstrated. The team speculates that the phantom attacks could be carried out as an extortion technique, as an act of terrorism, or for pure mischief. “Previous methods leave forensic evidence and require complicated preparation,” says Ben Gurion researcher Ben Nassi. “Phantom attacks can be done purely remotely, and they do not require any special expertise.”
