Self-driving technology is a modern marvel: using radar sensors and cameras to scan its surroundings, an automated car keeps itself within its lane, brakes, accelerates, and passes other vehicles without the driver’s assistance. However, like most technology, it can fail on occasion or be misused by drivers, causing major car accidents.
Uber and Google lead the self-driving car industry and claim that fully automated vehicles will be available to the public in the near future. However, the law has yet to establish who would be liable for a self-driving car accident: the manufacturer or the person behind the wheel?
Although Level 5 automated vehicles have yet to be released to the public, autopilot cases can give us a sense of how the law might treat self-driving car accidents in the future.
Autopilot Traffic Accident Cases
Early this year, a driver was found passed out in his stopped car on the San Francisco Bay Bridge. He was arrested for driving under the influence of alcohol, although he told police officers that he was not driving because his car was on autopilot. According to the California Highway Patrol, the driver’s blood alcohol level was 0.16 percent, double the legal limit of 0.08 percent.
Joshua Brown was not as lucky as the San Francisco driver. In May 2016, Brown made history as the first driver to die in a crash involving self-driving technology. With his vehicle on autopilot, the car failed to detect a tractor-trailer crossing its path at an intersection. The vehicle collided with the trailer, killing Brown.
Who Is Liable for Self-Driving Accidents?
There is little doubt that each state will treat driverless accidents and insurance somewhat differently, as they currently do. However, it will be hard to argue that a person “behind the wheel” should be held liable when an autonomous car runs into another vehicle. For a car to truly be considered driverless, there must be no driver at all. If no driver exists, it stands to reason that everyone inside an autonomous car is a passenger. The car itself is the driver; therefore, the car is the thing held liable for the accident.
Federal and state regulators clearly state that drivers must remain alert at all times to respond to potential hazards. Even when a vehicle is on autopilot, the driver should always be ready to take control in an emergency.
According to a Tesla spokesperson, the autopilot feature was designed to be used only when a driver is fully capable of operating the vehicle; it was certainly never designed to be used while under the influence of alcohol. Still, there may be unique cases where the automated system itself is held responsible, even criminally liable, when traffic laws have been violated.
Unfortunately, current federal regulations have not adapted to this new technology. Current regulations treat drivers and vehicles as separate entities, but in a self-driving vehicle, the driver is the vehicle. Responsibility needs to apply to both human drivers and automated technology.
Without clear regulations for self-driving cars, it is much harder to say who can be held liable for a self-driving car accident. The Obama administration’s guidance called for “conscious and intentional” decisions while operating a vehicle, but self-driving cars are incapable of intentional decision-making the way humans are. This standard is problematic because self-driving cars can only follow the algorithms they are programmed with. The new technology can also confuse drivers, as it did the San Francisco driver who thought he was not liable for drinking and driving.
Cars Cannot Be Held Liable Through Personal Injury Claims
The obvious truth is that cars cannot be held liable. However, every car accident claim must be assigned to someone, so if a car is the reason an accident occurred, in today’s world the manufacturer is the party held liable. For example, say a new car arrives at a dealership lot. If a potential buyer gets in, drives around the block, and gets into an accident because the car’s brakes were faulty, the driver would not be liable; the responsible party would be the manufacturer, for sending a defective car to the lot. Autonomous car liability will work similarly, but with one key difference: the addition of software companies.
The Responsibility of Car Manufacturers & Software Companies
At this point in time, car manufacturers are not the only corporations trying to create autonomous vehicles. A number of well-established software companies, Google among them, have thrown their hats into the ring by developing self-driving software, and small start-ups are working on the technological advances that will help driverless cars break the barrier between science fiction and reality.
If and when fully autonomous cars arrive, personal injury claims will fall on one of two parties: the manufacturer or the software company. Even in automated cars, parts will break, and when a passenger is injured because a part fails, the liable party will be the car’s manufacturer.
The other entity that could be held liable in self-driving accident cases is the software company that created the car’s autonomous capabilities. Driverless software will glitch, and when it malfunctions, passengers and insurance companies will come after the developers.
Still, self-driving cars are sure to include a “driver override” function that lets a human take over. If a person takes control and crashes the car, they would be responsible, even if the car was otherwise autonomous. This is clearly a major shift from the insurance and personal injury claims of today.
If you’re injured in a self-driving vehicle accident, call a Pennsylvania car accident attorney as soon as possible. We’re at the edge of a new frontier, and you’ll need a legal guide to help you navigate it.