
Self-Driving Cars Can Be Hacked By Just Putting Stickers On Street Signs

Image credit: University of Washington

Ever since self-driving cars became a hot topic of interest, methods of exploiting them have inevitably followed. There have been reports of vulnerabilities in their software, and demonstrations of remotely hijacking a car and disabling certain features have also come to light.

However, a recent study from the University of Washington shows a far simpler way to confound a self-driving car's vision algorithm: small alterations to a road sign can cause it to be misclassified, which could lead to accidents.

In one test, the researchers covered a STOP sign with a subtly altered printed overlay, and the image classifier misread it as a SPEED LIMIT 45 sign in every trial.

The researchers also performed the same test on a RIGHT TURN sign and found that the classifier wrongly labeled it a STOP sign two-thirds of the time.

Further, they applied a handful of small stickers, easily dismissed by a passerby as harmless street art, to a STOP sign, and the AI again misclassified it.
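
The researchers' physical sticker attack is more involved, but the core idea, nudging an image's pixels in whatever direction most increases the classifier's error, can be illustrated with a minimal digital-domain sketch using the fast gradient sign method. This is a hedged illustration only: the model, class index, and epsilon value below are assumptions, not the University of Washington team's actual setup.

```python
# Minimal FGSM-style adversarial perturbation sketch (PyTorch).
# "sign_classifier", STOP_CLASS, and epsilon are illustrative assumptions,
# not the researchers' published attack code.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, true_label, epsilon=0.03):
    """Return a copy of `image` nudged to increase the classifier's loss.

    image:      float tensor of shape (N, 3, H, W), values in [0, 1]
    true_label: tensor of shape (N,) holding the correct class indices
    epsilon:    max per-pixel change (small, so the sign still looks normal)
    """
    image = image.clone().detach().requires_grad_(True)
    logits = model(image)
    loss = F.cross_entropy(logits, true_label)
    loss.backward()

    # Step each pixel in the direction that increases the loss.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()

# Usage sketch: a STOP sign image the model previously got right may now
# be assigned a different sign class entirely.
# adv = fgsm_perturb(sign_classifier, stop_sign_img, torch.tensor([STOP_CLASS]))
# print(sign_classifier(adv).argmax(dim=1))
```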

The worst part is that until the AI is trained to account for such anomalies, the vulnerability remains. Any alteration to a road sign, whether deliberate or caused by weather and wear, could potentially affect a car's judgment.
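
Broadening the training data in exactly this way is often approached with adversarial training: mix perturbed copies of each sign into every batch so the model learns to classify them correctly anyway. Below is a hedged sketch building on the `fgsm_perturb` helper above; the model, optimizer, and batch variables are assumed placeholders, not a vetted defense.

```python
# Sketch of "broadening the scope" of training: train on both clean and
# perturbed copies of each sign so the model learns to ignore small changes.
# Relies on the fgsm_perturb sketch above; names here are illustrative.
import torch.nn.functional as F

def adversarial_training_step(model, optimizer, images, labels, epsilon=0.03):
    model.train()
    # Generate perturbed variants of the current batch on the fly.
    perturbed = fgsm_perturb(model, images, labels, epsilon)

    optimizer.zero_grad()
    # Penalize mistakes on the clean signs and on the perturbed ones.
    logits_clean = model(images)
    logits_adv = model(perturbed)
    loss = F.cross_entropy(logits_clean, labels) + F.cross_entropy(logits_adv, labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```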

“Attacks like this are definitely a cause for concern in the self-driving-vehicle community,” said Tarek El-Gaaly, a senior research scientist at autonomous vehicle startup Voyage. “Their impact on autonomous driving systems has yet to be ascertained, but over time and with advancements in technology, they could become easier to replicate and adapt for malicious use.”