
Tesla cars can be tricked into speeding, Recode


McAfee researchers recently tricked a Tesla into speeding while the car’s intelligent cruise control feature was engaged. This news signals, yet again, that completely safe, fully autonomous cars have still not arrived, and it suggests that they face new types of vulnerabilities. Over the course of 18 months, the researchers, whose report was published today, explored how they could get a Tesla to misread a speed limit by messing with the vehicle’s ability to see. To make that happen, the researchers placed visual distractions like stickers and tape that could trick the car’s camera system into misreading a 35-mile-per-hour speed limit sign.

                                                         
Here’s the sticker that confused the Tesla.   

While the researchers successfully spoofed the camera’s reading in several different ways, they found that just a 2-inch piece of black electrical tape across the middle of the 3 on a 35 MPH speed limit sign could cause the system to read the sign as an 85 MPH sign. In a live test with a Tesla Model S using an EyeQ3 camera from MobilEye, they found that, when the Tesla’s Traffic-Aware Cruise Control (TACC) was activated, the vehicle’s system would attempt to determine the current speed limit with help from the camera.

That’s when those visual distractions – that small piece of black tape, in one case – could cause the car to misread the speed limit and head toward the 85 MPH speed. (The researchers note that they applied the brakes before the car reached that speed and that no one was hurt during testing.)

“This system is completely proprietary (ie Black Box), we are unable to specify exactly why the order of operations is essential,” Steve Povolny, head of McAfee Advanced Threat Research, told Recode in an email. He also cautioned that the “real-world implications of this research are simplistic to recreate but very unlikely to cause real harm given a driver is behind the wheel at all times and will likely intervene.” Povolny added that cybercriminals have yet to publicly attempt to hack self-driving cars, although plenty of people are worried about the possibility.

Still, the research demonstrates how self-driving cars, or cars with some autonomous abilities, can fall short. And it’s not the first time researchers have tricked a car like this. Just last April, similar stickers were used to get a Tesla to switch lanes improperly.

Tesla did not respond to a request for comment, but a spokesperson from MobilEye argued that the stickers and tape used by McAfee could confuse the human eye, too, and therefore did not qualify as an “adversarial attack.” “Traffic sign fonts are determined by regulators, and so advanced driver assistance systems (ADAS) are primarily focused on other more challenging use cases, and this system in particular was designed to support human drivers – not autonomous driving,” said the spokesperson. “Autonomous vehicle technology will not rely on sensing alone, but will also be supported by various other technologies and data, such as crowdsourced mapping, to ensure the reliability of the information received from the camera sensor and offer more robust redundancies and safety.”

The researchers also said that they studied a 2020 Tesla vehicle with a new version of the MobilEye camera and did not observe the same problem, though they noted that testing was “very limited.” The study says that only Teslas produced from 2014 to 2016 that are equipped with the EyeQ3 model camera showed the vulnerability. The researchers also noted that neither Tesla nor MobilEye had expressed any “current plans” to address this vulnerability in their existing hardware. But this vulnerability isn’t about Tesla. It’s about the challenges raised by self-driving car technology and the growing industry that aims to make roads safer for all of us – but also requires strict testing and regulation. After all, time has shown that teaching a computer to drive is not as easy as teaching a human. As Future Perfect’s Kelsey Piper has explained:

Following a list of rules of the road isn’t enough to drive as well as a human does, because we do things like make eye contact with others to confirm who has the right of way, react to weather conditions, and otherwise make judgment calls that are difficult to encode in hard-and-fast rules.

Such a judgment call might be spotting a weird-looking speed-limit sign and noticing if the car suddenly went more than double the speed limit. As Povolny told Recode, the flaw analyzed by McAfee could be just one of many issues that a self-driving car encounters in both the “digital” and “physical” worlds, including “classic software flaws, to networking issues, configuration bugs, hardware vulnerabilities, [and] machine learning weaknesses.” So that signals a long road ahead for self-driving cars. After all, the Teslas involved in the McAfee study still require a human to be in the car and alert, though as several Autopilot accidents have shown, plenty of Tesla drivers overestimate the technology. Let’s hope when fully autonomous vehicles are finally on the highways, they won’t be so easily distracted.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.


