troll. secure. lather, rinse, repeat –
Semi-autonomous driving systems don’t understand projected images.
Six months ago, Ben Nassi, a PhD student at Ben-Gurion University advised by Professor Yuval Elovici, carried off a set of successful spoofing attacks against a Mobileye Pro Driver Assist System using inexpensive drones and battery-powered projectors. Since then, he has expanded the technique to experiment — also successfully — with confusing a Tesla Model X, and he will be presenting his findings at the Cybertech Israel conference in Tel Aviv.
The spoofing attacks largely rely on the difference between human and AI image recognition. For the most part, the images Nassi and his team projected to troll the Tesla would not fool a typical human driver — in fact, some of the spoofing attacks were nearly steganographic, relying on the differences in perception not only to make the spoofing attempts successful but also to hide them from human view.
This is a frame from an ad you might see on a digital billboard, with a fake speed-limit sign inserted. It's only present for an eighth of a second, and most drivers would miss it — but an AI image-recognition system registers it.
Humans wouldn’t fall for a fake road sign projected into tree leaves. But AI image recognition generally will.
Humans would definitely notice these projected lane markers but would be unlikely to honor them. The Autopilot in a Tesla Model X took them as legit and swerved to follow them.
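Neither Tesla nor Mobileye publishes its detection logic, but the billboard example above is easy to illustrate in miniature. The toy sketch below, in plain Python, is our own illustration and not any vendor's code; every name and threshold in it is an assumption. It feeds a stream of per-frame detector outputs to two policies: one that trusts any single frame, and one that requires a sign to persist across several consecutive frames before acting.

```python
from collections import deque

# Assumed persistence threshold: 8 frames is roughly 0.27 s at 30 fps,
# comfortably longer than the 1/8 s (~4 frames) phantom in the billboard ad.
REQUIRED_CONSECUTIVE_FRAMES = 8

def naive_policy(detections):
    """Acts on the first per-frame detection it sees. A phantom sign
    spliced into a handful of frames is enough to change the limit."""
    for limit in detections:
        if limit is not None:
            return limit
    return None

def temporally_filtered_policy(detections):
    """Accepts a sign only after it persists for N consecutive frames,
    so a sub-second flicker is ignored."""
    recent = deque(maxlen=REQUIRED_CONSECUTIVE_FRAMES)
    for limit in detections:
        recent.append(limit)
        if (len(recent) == REQUIRED_CONSECUTIVE_FRAMES
                and recent[0] is not None
                and all(r == recent[0] for r in recent)):
            return recent[0]
    return None

# A phantom "90" visible for 4 frames in an otherwise sign-free ad:
per_frame = [None] * 20 + [90] * 4 + [None] * 50
print(naive_policy(per_frame))                # 90 -- the phantom wins
print(temporally_filtered_policy(per_frame))  # None -- flicker rejected
```

Temporal persistence alone is not a complete defense, since an attacker with a projector can keep a phantom on screen for as long as they like, but it illustrates the kind of cheap consistency check a purely per-frame perception pipeline can lack.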
Nassi created a video outlining what he sees as the danger of these spoofing attacks, which he calls "Phantom of the ADAS," and a small website
offering the video, an abstract outlining his work, and the full reference paper itself. We don’t necessarily agree with the spin Nassi puts on his work — for the most part, it looks to us like the Tesla responds pretty reasonably and well to these deliberate attempts to confuse its sensors. We do think this kind of work is important, however, as it demonstrates the need for defensive design of semi-autonomous driving systems.
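One concrete flavor of that defensive design is cross-sensor corroboration: a projected phantom is visible to a camera, but it returns no radar or ultrasonic echo. The sketch below is a minimal illustration of that idea, assuming hypothetical sensor readings and thresholds of our own invention rather than any vendor's actual fusion logic.

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    kind: str          # e.g. "pedestrian" or "vehicle"
    distance_m: float  # range estimated from the camera image

def radar_corroborates(det: CameraDetection, radar_ranges_m: list,
                       tolerance_m: float = 2.0) -> bool:
    """True if any radar return lies near the camera's range estimate.
    A projected phantom produces no physical echo, so this fails."""
    return any(abs(r - det.distance_m) <= tolerance_m
               for r in radar_ranges_m)

def should_emergency_brake(det: CameraDetection,
                           radar_ranges_m: list) -> bool:
    """Brake for an obstacle only when camera and radar agree on it."""
    if det.kind not in ("pedestrian", "vehicle"):
        return False
    return radar_corroborates(det, radar_ranges_m)

phantom = CameraDetection(kind="pedestrian", distance_m=25.0)
real = CameraDetection(kind="pedestrian", distance_m=25.0)
print(should_emergency_brake(phantom, radar_ranges_m=[]))         # False
print(should_emergency_brake(real, radar_ranges_m=[24.3, 60.1]))  # True
```

A hard veto like this has an obvious cost, since radar can miss real hazards too, which is presumably why production systems weight corroborating evidence probabilistically rather than letting one sensor overrule another outright. Projected lane markings are a harder case still, as paint produces no radar return at all.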
Nassi and his team’s spoofing of the Model X was carried out with a human assistant holding a projector, due to drone laws in the country where the experiments were carried out. But the spoof could have also been carried out by drone, as his earlier spoofing attacks on a Mobileye driver-assistance system were.
From a security perspective, the interesting angle here is that the attacker never has to be at the scene of the attack and does not need to leave any evidence behind — and the attacker does not need much technical expertise. A teenager with a $551 drone and a battery-powered projector could reasonably pull this off with no more know-how than "hey, it'd be hilarious to troll cars down at the highway, right?" The equipment does not need to be expensive or fancy — Nassi's team successfully used several projectors costing $551 or less, one of which was rated for only modest resolution and brightness.
Tesla, for its part, points out that Autopilot is intended for use only with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time. While Autopilot is designed to become more capable over time, in its current form it is not a self-driving system: it does not turn a Tesla into an autonomous vehicle, and it does not allow the driver to abdicate responsibility. When used properly, Autopilot reduces a driver's overall workload, and the redundancy of eight external cameras, radar, and ultrasonic sensors provides an additional layer of safety that two eyes alone would not have.
Even the name "Autopilot" itself isn't as inappropriate as many people assume — at least, not if one understands the reality of modern aviation and maritime autopilot systems in the first place. Wikipedia references the FAA's Advanced Avionics Handbook when it defines autopilots as systems that do not replace human operators but instead assist them in controlling the vehicle, allowing them to focus on broader aspects of the operation.
Within these constraints, even the worst of the responses demonstrated in Nassi's video — that of the Model X swerving to follow fake lane markers on the road — doesn't seem so bad. In fact, that clip demonstrates exactly what should happen: the owner of the Model X — concerned about what the heck his or her expensive car might do — hit the brakes and took control manually after Autopilot went in an unsafe direction.
The problem is, there's good reason to believe that far too many drivers don't believe they really need to pay attention. A survey found that nearly half of the drivers polled believed it was safe to take their hands off the wheel while Autopilot is on, and six percent even thought it was OK to take a nap. More recently, Sen. Edward Markey (D-Mass.) called for Tesla to improve the clarity of its marketing and documentation, and Democratic presidential candidate Andrew Yang went hands-free in a campaign ad — just as Elon Musk did before him, in a 60 Minutes segment.
The time may have come to consider legislation about drones and projectors specifically, in much the same way laser pointers were regulated after they became popular and cheap. Some of the techniques used in the spoofing attacks carried out here could also confuse human drivers. And although human drivers are at least theoretically available, alert, and ready to take over for any confused AI system today, that won't be the case forever. It would be a good idea to start work on regulations prohibiting the spoofing of vehicle sensors before we no longer have humans backing them up.