Self-driving technology is a technological wonder: using radar sensors and cameras to scan its surroundings, it keeps a car within its lane, brakes, accelerates, and passes other cars without the driver's assistance. However, like most technology, it can occasionally fail or be misused by drivers, causing major car accidents.
Uber and Google lead the self-driving car industry and claim that fully automated vehicles will be available to the public in the near future. However, the law has yet to establish who would be liable for a self-driving car accident: the manufacturer or the individual behind the wheel?
Although Level 5 automated vehicles have yet to be released to the public, autopilot cases can give us a sense of how the law might treat self-driving car accidents in the future.
Autopilot Traffic Accident Cases
A driver was found passed out in his stopped car on the San Francisco Bay Bridge early this year. The man was later arrested for driving under the influence of alcohol, although he told police officers that he was not driving because his car was on autopilot. The California Highway Patrol reported that his blood alcohol level was 0.16 percent, double the legal limit of 0.08 percent.
Joshua Brown didn't have as much luck as the San Francisco driver. In May 2016, Brown made history as the first driver to die in a self-driving car. With his vehicle on autopilot, the system failed to detect a tractor-trailer as it crossed an intersection ahead. The car collided with the trailer, killing Brown.
Who Is Liable for Self-Driving Accidents?
Federal and state regulators clearly state that drivers must remain alert at all times so they can respond to potential hazards. Even when a vehicle is in autopilot mode, the driver should always be ready to take control in an emergency.
A Tesla spokesperson says the company's autopilot feature is designed to be used only when a driver is fully capable of controlling the vehicle; under no circumstances is it meant to be operated under the influence of alcohol. Still, there may be unique cases in which the automated system itself could be held responsible, and even criminally liable, for a traffic violation.
Unfortunately, current federal regulations have not adapted to this new technology. Current regulations treat drivers and vehicles as separate entities, but in a self-driving vehicle, the driver is the vehicle. Responsibility needs to apply to both human drivers and automated technology.
Without clear regulations for drivers of self-driving cars, it is far from obvious who can be held liable for a self-driving car accident. The Obama administration called for "conscious and intentional" decisions while operating a vehicle, but unlike humans, self-driving cars are incapable of intentional decision-making. This standard is problematic because self-driving cars can only follow the algorithms they are programmed with. The new technology can also confuse drivers, like the San Francisco driver who believed he was not liable for drinking and driving.
If you're injured in a self-driving vehicle accident, call a Pennsylvania car accident attorney as soon as possible. We're on the edge of a new frontier, and you'll need a legal guide to help you navigate it.