The safety of autonomous cars up for debate: stickers that can confuse a Stop sign

Although the autonomous car is currently being marketed as safer than a traditional one, researchers at the University of Washington (USA) have shown that a few stickers can be enough to confuse an autonomous vehicle if a hacker sets out to do so.

In their study, the researchers showed that an autonomous vehicle recognized a Stop sign as a 45 mph speed limit, and that it could interpret a right-turn sign either as a Stop (bringing the car to a halt) or as an added lane.

In this way, the researchers cast doubt on the widespread idea that an autonomous car is safer than a traditional one, and they urge manufacturers to put more effort into vehicle cybersecurity, since an autonomous car can become a genuinely dangerous weapon if it does not read road signs as it should.

How is an autonomous car hacked?

The research also explains how they got the test car to fail to match signs with their correct meaning. To begin with, you should know that an autonomous car relies on a detector that collects information about what is happening around it: pedestrians, road markings and traffic signs, among other things. That information is then classified by a system that analyzes the situation and makes the most appropriate decision.
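To make that pipeline more concrete, here is a minimal sketch in Python. The labels, class names and decision rules are illustrative assumptions, not taken from the study or from any real vehicle software; it only shows how a camera frame might pass from detection to classification to a driving decision.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    """A candidate traffic sign found in a camera frame."""
    label: str         # e.g. "stop", "speed_limit_45", "right_turn" (assumed labels)
    confidence: float  # how sure the classifier is about that label

def detect_and_classify(frame: List[List[int]]) -> List[Detection]:
    """Stages 1-2: find candidate signs in the frame and assign each a label.
    Stubbed with a fixed result so the sketch runs end to end."""
    return [Detection(label="stop", confidence=0.97)]

def decide(detections: List[Detection]) -> List[str]:
    """Stage 3: translate each recognized sign into a driving action."""
    actions = {
        "stop": "brake to a full stop",
        "speed_limit_45": "limit speed to 45 mph",
        "right_turn": "prepare to turn right",
    }
    return [actions.get(d.label, "no action") for d in detections]

if __name__ == "__main__":
    frame = [[0] * 640 for _ in range(480)]    # placeholder camera frame
    print(decide(detect_and_classify(frame)))  # -> ['brake to a full stop']
```

The attack described next targets the middle stage: if the classifier assigns the wrong label, the decision stage faithfully carries out the wrong action.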

The researchers point out that if a hacker manages to access the system and add the image of a sign with strategically placed stickers, paired with a meaning contrary to the one it should originally have, the autonomous car will behave incorrectly, as happened in their study. In this way, they managed to confuse the vehicle every time the doctored Stop sign was tested, and 73 percent of the time when the right-turn sign with the stickers was shown.
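As an illustration of what those success rates mean, the toy evaluation loop below (a sketch under assumed names, not the researchers' actual code) counts how often a classifier's prediction on doctored sign images differs from the true label:

```python
from typing import Callable, List, Tuple

def attack_success_rate(
    classify: Callable[[str], str],
    doctored_samples: List[Tuple[str, str]],
) -> float:
    """Fraction of doctored sign images that the classifier gets wrong.

    Each sample pairs a sticker-modified image (here just a placeholder
    string) with the sign's true label; a prediction that differs from
    the true label counts as a successful attack."""
    fooled = sum(1 for image, true_label in doctored_samples
                 if classify(image) != true_label)
    return fooled / len(doctored_samples)

if __name__ == "__main__":
    # A classifier that always reads the doctored Stop sign as a 45 mph
    # limit is fooled every time, i.e. a success rate of 1.0 (100%),
    # matching the result reported for the Stop sign in the study.
    samples = [("doctored_stop_image", "stop")] * 10
    always_fooled = lambda image: "speed_limit_45"
    print(attack_success_rate(always_fooled, samples))  # -> 1.0
```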
