
How a $300 projector can fool Tesla's Autopilot, Ars Technica


troll. secure. lather, rinse, repeat


Semi-autonomous driving systems don’t understand projected images.


Six months ago, Ben Nassi, a PhD student at Ben-Gurion University advised by Professor Yuval Elovici, carried off a set of successful spoofing attacks against a Mobileye Pro Driver Assist System using inexpensive drones and battery-powered projectors. Since then, he has expanded the technique to experiment — also successfully — with confusing a Tesla Model X, and he will be presenting his findings at the Cybertech Israel conference in Tel Aviv.

The spoofing attacks largely rely on the difference between human and AI image recognition. For the most part, the images Nassi and his team projected to troll the Tesla would not fool a typical human driver — in fact, some of the spoofing attacks were nearly steganographic, relying on the differences in perception not only to make the spoofing attempts successful but also to hide them from human viewers.

Nassi has published a website offering the video, an abstract outlining his work, and the full reference paper itself. We don't necessarily agree with the spin Nassi puts on his work — for the most part, it looks to us like the Tesla responds pretty reasonably and well to these deliberate attempts to confuse its sensors. We do think this kind of work is important, however, as it demonstrates the need for defensive design of semi-autonomous driving systems.

Nassi and his team's spoofing of the Model X was carried out with a human assistant holding a projector, due to drone laws in the country where the experiments took place. But the spoof could also have been carried out by drone, as his earlier spoofing attacks on a Mobileye driver-assistance system were.

From a security perspective, the interesting angle here is that the attacker never has to be at the scene of the attack and does not need to leave any evidence behind — and the attacker does not need much technical expertise. A teenager with an inexpensive drone and a battery-powered projector could reasonably pull this off with no more know-how than "hey, it'd be hilarious to troll cars down at the highway, right?" The equipment does not need to be expensive or fancy — Nassi's team successfully used several projectors costing no more than a few hundred dollars, one of which was rated for only modest resolution and brightness.

This is the full "Phantom of the ADAS" video. The effects of projected lane markers on a Model X in Autopilot mode are particularly interesting.

