Driverless Cars Cannot Differentiate Between Projections and Objects

Cybercrime is a growing concern as driverless cars become more common. Automated vehicles controlled by computers present opportunities for hackers to exploit the system and cause accidents. Cybersecurity experts have noted that modern cars present an urgent national security issue, as attackers from hostile states can use them as weapons. According to a new study, these attackers may be able to control both semi-autonomous and fully autonomous vehicles through projected images, which the car cannot differentiate from physical objects.

Distinguishing Between Real and Fake Objects

The study, published by the International Association for Cryptologic Research, shows that driverless cars can be tricked by projected images. This presents yet another threat, one that is potentially more widespread because it does not require the same level of technological skill or knowledge as hacking into a car’s computer system. Researchers from Ben-Gurion University of the Negev’s (BGU) Cyber Security Research Center in Israel found that driverless cars stopped when they detected projections of objects such as street signs and lane markers. Their research mirrors that of researchers in South Carolina, who in 2016 tricked a Tesla’s autopilot sensors using off-the-shelf radio, sound, and light-emitting tools.

The study shows how attackers can exploit driverless vehicles without physically being on the scene, either by projecting a phantom from a drone equipped with a portable projector or by hacking a digital billboard near the road to display one. The experiments show that a car’s advanced driving assistance system (ADAS) treats the phantoms as real objects, triggering the brakes, steering, and notification/alert systems. The researchers demonstrated how such an attack could succeed against even the most advanced driver-assistance systems.

Flawed Object Detectors

Until now, developers of driverless vehicles have not addressed these concerns. According to the study’s lead author, the flaw is not attributable to bugs or poor coding, but rather to inadequate training; the computer systems are trained to err on the side of safety and therefore treat even two-dimensional projections as real obstacles, which can lead to costly mistakes.

Researchers are now developing a system that can distinguish between real and fake objects. The Ben-Gurion team proposes a model that analyzes an object’s context, surface, and reflected light. They hope that deploying these new and improved systems will reduce attackers’ ability to conduct phantom attacks on driverless vehicles.
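To illustrate the idea in the simplest possible terms, the sketch below shows a toy "committee" check in the spirit of the approach described above: a detected object is judged on several independent aspects (context, surface, reflected light), and a projected phantom that looks plausible in one aspect can still be rejected because it fails the others. The aspect names, scores, weights, and threshold here are hypothetical illustrations, not values from the study.

```python
# Toy sketch of multi-aspect phantom detection (illustrative only; the
# aspect scores and threshold are hypothetical, not from the BGU study).

def is_real_object(scores, threshold=0.5):
    """Each aspect model returns a score in [0, 1], where 1 means
    'consistent with a physical object'. The object is accepted only
    if the average score across all aspects clears the threshold."""
    aspects = ("context", "surface", "reflected_light")
    avg = sum(scores[a] for a in aspects) / len(aspects)
    return avg >= threshold

# A projection may fit its surroundings (context) yet betray itself
# through a flat surface and inconsistent lighting:
phantom = {"context": 0.7, "surface": 0.2, "reflected_light": 0.1}
real = {"context": 0.9, "surface": 0.8, "reflected_light": 0.9}

print(is_real_object(phantom))  # low average score: rejected as fake
print(is_real_object(real))     # high average score: accepted as real
```

The point of combining several independent cues is that an attacker who fools one check (for example, shape and context) is unlikely to simultaneously fool checks on surface texture and light reflection.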

Delaware Car Accident Lawyers at McCann Dillon Jaffe & Lamb, LLC Advocate for Those Injured by Autonomous Vehicles

If you were injured in a car accident, contact a Delaware car accident lawyer at McCann Dillon Jaffe & Lamb, LLC. Several fatal cases have demonstrated the need for clearer laws and policies regarding self-driving technology; however, the laws surrounding driverless vehicles and liability are still evolving. For a free consultation, contact us online or call us at 302-888-1221. Located in Wilmington, Delaware, we represent clients throughout the state, including Dover, Middletown, and Newark.