Slashdot is powered by your submissions, so send in your scoop

Submission + - You Can Trick Self-Driving Cars by Defacing Street Signs (bleepingcomputer.com)

An anonymous reader writes: A team of eight researchers has discovered that by altering street signs, an adversary could confuse self-driving cars, causing their machine-learning systems to misclassify signs and make wrong decisions, potentially putting passengers' lives in danger. The idea behind this research is that an attacker could (1) print an entirely new poster and overlay it on an existing sign, or (2) attach smaller stickers to a legitimate sign, fooling the self-driving car into thinking it's looking at another type of street sign.

While scenario (1) will trick even human observers and there's little chance of stopping it, scenario (2) looks like ordinary street sign defacement and will likely affect only self-driving vehicles. Experiments showed that simple stickers placed on a Stop sign fooled a self-driving car's machine-learning system into misclassifying it as a Speed Limit 45 sign in 67% to 100% of test cases. Similarly, gray graffiti stickers on a Right Turn sign tricked the self-driving car into thinking it was looking at a Stop sign.
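The 67%-to-100% figure is a misclassification rate over a set of perturbed test images. A minimal sketch of that evaluation loop is below; `classify` is a toy stand-in for the researchers' actual traffic-sign model (not their code), so the loop is runnable on its own:

```python
def classify(image):
    # Toy stand-in for a trained sign classifier (hypothetical behavior):
    # we pretend any image carrying a sticker patch flips the label,
    # mimicking the attack's effect on the real model.
    return "speed_limit_45" if "sticker" in image else "stop"

def misclassification_rate(images, true_label="stop"):
    """Fraction of images the classifier labels as something other than true_label."""
    wrong = sum(1 for img in images if classify(img) != true_label)
    return wrong / len(images)

# Two defaced Stop signs and one clean one:
perturbed = ["stop+sticker", "stop+sticker", "stop"]
rate = misclassification_rate(perturbed)  # 2 of 3 misclassified
```

The real experiments ran a convolutional network over photos of physical signs from varying angles and distances; the rate is computed the same way, just over that image set.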

Researchers say that authorities can fight such potential threats to self-driving car passengers by using anti-stick materials for street signs. In addition, car vendors should take contextual information into account in their machine-learning systems: some signs simply don't belong on some roads (e.g., a Stop sign on an interstate highway).
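That contextual check could be as simple as a plausibility lookup keyed on road type. A minimal sketch, where the road types and sign sets are illustrative assumptions rather than anything from the paper:

```python
# Illustrative mapping of road types to signs that plausibly appear on them.
PLAUSIBLE_SIGNS = {
    "interstate": {"speed_limit_65", "exit", "merge"},
    "residential": {"stop", "yield", "speed_limit_25", "school_zone"},
}

def accept_classification(sign, road_type):
    """Accept the model's predicted sign only if it makes sense on this road type."""
    return sign in PLAUSIBLE_SIGNS.get(road_type, set())

# A "Stop" prediction on an interstate gets flagged as suspicious,
# while the same prediction on a residential street is accepted.
on_highway = accept_classification("stop", "interstate")      # False
on_street = accept_classification("stop", "residential")      # True
```

A production system would fuse this with map data and GPS rather than a hard-coded table, but the principle is the same: a misclassification that contradicts the driving context can be rejected or deferred to other sensors.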

