Self-driving cars don’t see the world the way humans do. For navigation, they rely on LIDAR (light detection and ranging), a laser-based sensing system that builds a 3-D map of the surroundings rather than a field of vision.
Like all computer systems, self-driving cars face the threat of network attacks: hackers could subvert the control system by distorting the vehicle’s sensors, making vehicles appear closer or farther away than they actually are and potentially creating mayhem on the roadways. Many have made peace with that fact. But now there’s another potential threat.
This week, the world was introduced to a new vulnerability unique to self-driving cars: ghost objects. Security researcher Jonathan Petit boasts that his budget laser-and-pulse-generator rig, which can be built for only $60, can spoof the echoes of a car, pedestrian, or wall in a LIDAR’s 3-D map, making the autonomous system hallucinate things that aren’t there. The device can spoof objects within a 383-yard (350-meter) range and simulate multiple objects simultaneously.
“It’s kind of a laser pointer, really. And you don’t need the pulse generator when you do the attack,” he told Spectrum. Petit will present the full details of the spoofing device at the Black Hat security conference in November.
If self-driving cars are ever to enter the commercial mainstream, manufacturers will have to find a way to ensure that mischievous teens armed with laser pointers aren’t capable of summoning a traffic jam, or worse, at will.
A number of countermeasures to these spoofing tactics have been proposed, and self-driving car developers, who keep a tight lid on their technical advances, have likely already put some of them into practice.
“One way to counter this might be to employ multiple LIDARs. If you look at one LIDAR and you look at the other and you don’t see it, you can ignore [the phantom object],” said Ryan Gerdes, a researcher at Utah State University with extensive experience testing autonomous driving technology.
The phantom objects generated by the laser pointer are closer to a 2-D panel than to a detailed hologram, so another sensor or two would do the job.
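To make the idea concrete, here is a minimal sketch in Python of the cross-validation Gerdes describes. The function name, the point-cloud format, and the matching tolerance are all hypothetical; the point is simply that a detection from one LIDAR is kept only when the second LIDAR reports an object at roughly the same position, and is otherwise discarded as a possible phantom.

```python
import numpy as np

def cross_validate_detections(lidar_a_points, lidar_b_points, tolerance_m=0.5):
    """Keep only detections from LIDAR A that LIDAR B also sees.

    Both inputs are (N, 3) arrays of object positions expressed in the
    same vehicle coordinate frame. A point reported by A with no
    counterpart in B within `tolerance_m` is treated as a possible
    phantom and dropped. All values here are illustrative.
    """
    lidar_b_points = np.asarray(lidar_b_points)
    confirmed = []
    for point in np.asarray(lidar_a_points):
        if lidar_b_points.size == 0:
            break  # no corroborating sensor data at all
        distances = np.linalg.norm(lidar_b_points - point, axis=1)
        if distances.min() <= tolerance_m:
            confirmed.append(point)  # both sensors agree: keep it
        # otherwise: only one sensor sees it, so ignore it
    return np.array(confirmed)
```

How tight to set the tolerance is a design choice: too loose and a spoofed echo near a real object slips through, too strict and ordinary calibration error between the two sensors throws away genuine detections.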
But LIDARs don’t come cheap (the Velodyne units used by Google run about $70,000 apiece), and multiplying the number of sensors per vehicle could dramatically undercut its commercial appeal. A more practical solution would be a filter system that cuts out the light frequencies produced by the laser pointers, assuming, optimistically, that they fall within a narrow band.
A more robust sensory system would integrate stereoscopic cameras in addition to LIDAR, as VisLab did with its experimental autonomous vehicle, but both the cameras and the software needed to fuse the disparate data channels would cost a hefty amount. Indeed, the astronomical cost of building a plausibly secure autonomous driving system is a looming problem automakers will inevitably have to reckon with.
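As a rough illustration of what such a fusion check could look like, the sketch below uses the standard pinhole-stereo relation (depth = focal length × baseline / disparity) to confirm that the cameras see something at about the same depth as a LIDAR return along the same line of sight. The function names, camera parameters, and tolerance are assumptions for illustration, not any manufacturer’s actual pipeline.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Classic pinhole-stereo relation: depth = f * B / d."""
    return focal_length_px * baseline_m / disparity_px

def corroborated_by_stereo(lidar_range_m, disparity_px,
                           focal_length_px=1400.0,   # hypothetical camera focal length (pixels)
                           baseline_m=0.5,           # hypothetical spacing between the two cameras
                           tolerance_m=2.0):
    """Treat a LIDAR return as real only if the stereo pair also sees an
    object at a comparable depth along the same line of sight."""
    stereo_depth_m = depth_from_disparity(disparity_px, focal_length_px, baseline_m)
    return abs(stereo_depth_m - lidar_range_m) <= tolerance_m

# Example: a LIDAR "object" at 30 m, while the cameras measure roughly 10 m,
# fails the check and is flagged as a possible phantom.
print(corroborated_by_stereo(lidar_range_m=30.0, disparity_px=70.0))  # False
```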
“They really need to consider the adversarial viewpoint. These autonomous vehicles in general are not just cybervehicles, they’re cyberphysical systems,” Gerdes said. “They have to know something about the environment they’re in. If they don’t, it doesn’t matter how good the tracking algorithms are, you’ll have crashes.”
Autonomous systems could also take a leaf out of the book of existing collision avoidance systems, which track the velocity and distance of objects around a vehicle and discount objects that appear out of thin air.
“It doesn’t work if it just falls out of the sky; if the object just appears, that’s not a real object [to the collision avoidance system],” Gerdes said.
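A minimal sketch of that plausibility check, again with hypothetical class names and thresholds: a real object should close in gradually from scan to scan, so a reading that first appears at point-blank range, or that jumps by an impossible amount between consecutive scans, is discounted as a phantom.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectTrack:
    """Minimal track record: range to the object (meters) at each scan."""
    ranges: list = field(default_factory=list)

    def update(self, new_range_m,
               max_range_jump_m=15.0,   # hypothetical per-scan plausibility limit
               min_first_seen_m=10.0):  # hypothetical "out of thin air" cutoff
        """Accept a new range reading only if it is physically plausible.

        A real object approaches gradually between scans; one that first
        appears right next to the vehicle, or whose range changes by an
        impossible amount in a single scan, is rejected as a phantom.
        """
        if not self.ranges:
            if new_range_m < min_first_seen_m:
                return False  # appeared "out of thin air" at close range
        elif abs(self.ranges[-1] - new_range_m) > max_range_jump_m:
            return False      # implausible jump between consecutive scans
        self.ranges.append(new_range_m)
        return True
```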
Whatever countermeasures automakers build into their future self-driving systems (and they’re coming soon; Google wants to make it happen within five years), the first commercial products will likely be offered at exorbitant prices, just like the first electric vehicles offered by Tesla. However soon self-driving cars become available, the average driver will have to wait a little longer to afford one.