
Driverless vehicles have been a vision of the future for generations. From the earliest radio-controlled experiments of the 1920s, to the cartoon vision of flying cars in The Jetsons, to the first promising prototypes of the 1980s from DARPA and Mercedes-Benz, we've been searching for a way to remove the human element from vehicle navigation almost from the day we learned how to drive.
Now that envisioned future is on the visible horizon. Insurers are assessing the risks this new industry will create, legislators are considering the implications for roadway safety, and auto manufacturers are racing to be first to bring the product to market.
Fully autonomous vehicles promise enhanced safety and reduced roadway risk, but they also open uncharted risk territory. From our current vantage point, what do we know about the legal challenges of a driverless commute?
Defining Automation
Automation is defined by Merriam-Webster as “the technique of making an apparatus, a process, or a system operate automatically,” or through the actions of a machine rather than a person. The National Highway Traffic Safety Administration has adopted automation levels ranging from 0 to 5.
We’ve been enjoying the lower levels for some time: cruise control and anti-lock brakes are single-task assistance features at Levels 0 and 1. Level 2 is where most of today’s safety technology lives, with cars that can automatically brake, warn you if another vehicle is too close, or safely parallel park. The automation is greater here, but it works in tandem with an active driver. Level 3 adds conditional automation, where the car drives itself in limited situations and hands control back when it reaches its limits. Level 4 allows for human intervention but rarely requires it, so the human driver remains part of the equation mainly as a supervisor.
Geofencing, or restricting the use of autonomous driving features to designated areas such as specific highways, will be used to create safe zones for the automatic features in vehicles at this level. Level 5 vehicles are fully autonomous, able to function independently of human interaction in any scenario or environment.
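Geofencing itself is conceptually simple: before enabling autonomous features, the vehicle checks its GPS position against a list of approved zones. A minimal sketch of that check in Python follows; the zone coordinates, radii, and function names here are purely illustrative assumptions, and real systems use detailed HD-map polygons rather than simple circles.

```python
import math

# Illustrative approved zones as (center_lat, center_lon, radius_km).
# These coordinates are hypothetical, not from any real deployment.
APPROVED_ZONES = [
    (33.4484, -112.0740, 10.0),  # hypothetical highway corridor
    (37.7749, -122.4194, 5.0),   # hypothetical urban test zone
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def autonomy_permitted(lat, lon):
    """Return True if the vehicle is inside any approved geofence."""
    return any(
        haversine_km(lat, lon, zlat, zlon) <= radius
        for zlat, zlon, radius in APPROVED_ZONES
    )
```

A vehicle at an approved zone's center would pass the check, while one far outside every zone would have its autonomous features locked out.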
Driverless Cars Today
Driverless cars are still primarily in a testing phase, but they are already being used on public roadways, in some cases in direct contradiction to the wishes of local governments and at great risk to public safety. In Arizona, a pedestrian was killed in a collision with a self-driving Uber test vehicle.
In California, autonomous vehicles unable to accurately account for pedestrians and cyclists operated on San Francisco roadways against the wishes of state and local governments. Twenty-one states currently have laws on the books regulating, to some extent, the operation of automated vehicles. Arizona’s rules were perhaps the least strict in the nation, a permissiveness that arguably contributed to the Uber pedestrian death, even if the state itself is likely protected from tort lawsuits.
The Arizona Uber case raises difficult liability questions. Who was responsible for the crash? The safety driver who did not stop the vehicle in time to avoid the collision? Uber, for developing the vehicle? The state, for its underwhelming regulation?
Where can the victim’s family turn for restitution, if anywhere? California and Nevada have already weighed in, determining that the operator is ultimately liable for any loss related to an automated vehicle’s use.
Nevada and Florida have determined that a manufacturer is immune from liability on any aftermarket adaptation that transforms a standard vehicle into an automated vehicle. California, Nevada, and Florida all require manufacturers of automated vehicles to carry $5 million in liability insurance.
Legal Questions for the New World of Automated Driving
The American Bar Association asserts that the number one obstacle to putting more autonomous vehicles on the road is the set of legal questions surrounding liability in case of an accident. Here are just a handful of the questions that may come up in court, with more waiting to be revealed in time.
Is It a Service or a Product?
Some argue that an automated driving system is a service rather than a manufactured product, and thus should not be subject to the protections of the civil justice system; contract law would limit liability instead. That framing would give manufacturers greater control over how their products are used, along with ongoing profits. For now, product liability law will likely remain the governing rule set used to settle cases.
Is No-Fault Insurance the Right Choice for Autonomous Vehicles?
To address growing concerns that manufacturers will face mounting liability costs as these vehicles spread, the RAND Corporation published a guide encouraging the use of no-fault auto insurance, which would streamline compensation for victims while minimizing litigation and reducing financial risk for manufacturers.
No-fault is not without its own issues, but it could still prove to be an effective solution to autonomous vehicle liability due to the unique nature of the advancement.
Are Manufacturers Responsible for Crashes Caused by Hackers?
Now consider cyberattacks: a scenario in which a driverless vehicle is commandeered remotely by a malicious actor and causes an accident. In the courtroom, the manufacturer will face intense scrutiny over its IT security practices, from patching schedules to access controls.
If a vulnerability is discovered on a Monday and the patch is delivered on a Friday, is the manufacturer responsible for any injuries in the intervening window? Can a manufacturer reasonably be expected to prevent any and all digital intrusions into its high-tech vehicles?
How Is the Automobile Industry Preparing?
You’ll see plenty of examples of car manufacturers refining their Level 2 and Level 3 automation in 2018 and 2019 models, but the leap to Level 4 has been rife with legal and technical challenges. Those challenges have forced manufacturers like Volvo to push back their autonomous vehicle plans and to shy away from reaching prematurely for Level 5 before mastering the previous tier.
Still, Level 4 is not far off. Ford plans to bring a self-described Level 4 vehicle to market by 2021. Volvo, despite delaying its autonomous program, is marketing its 2021 release with the claim that you can nap instead of supervising, a claim that seems ill-advised for a first-generation Level 4 vehicle.
In the end, manufacturers will likely treat autonomy the same way they treat their other product liability concerns, weighing the potential sales of a feature against the possible damages from lawsuits. When the first suits do occur, expect defense teams to call expert witnesses on programming and IT security in anticipation of the plaintiffs doing the same.