
When your driverless car crashes, who will be responsible? The answer remains unclear

In the era of self-driving cars, insurance will be radically transformed, shifting to cover the tech that powers the vehicles. But when a driverless car gets in a wreck, who's at fault?


This graphic shows what Google's self-driving car "saw" through its sensors just before it collided with a city bus.

Image: Screenshot by Wayne Cunningham/Roadshow

By 2021, major automakers, as well as tech companies like Google, Baidu, and Apple, are likely to unveil some form of autonomous driving capability. In Pittsburgh, Uber is now offering the public a chance to hail a ride in its self-driving fleet. And as more driverless cars reach the road, the burden of driving will shift away from human drivers and onto the machine. The question of liability will play an important role in the advance of driverless vehicles, shaping how fast and how widely the technology spreads.

When it comes to safety, the move toward fully autonomous technology is a positive development. A report from KPMG predicts an 80% drop in accident frequency by 2040, which means insurance is about to undergo a major transformation, and many companies are scrambling to reevaluate their business strategies.

But what happens when something goes wrong?

"Liability is a poorly understood word," said Bryant Walker Smith, one of the leading experts in the legal aspects of autonomous driving. "It can refer to criminal liability (who is convicted of a felony or misdemeanor), quasi-criminal liability (who gets the speeding ticket), and civil liability (who has to pay for the harm they caused to someone else). In addition, only rarely are any of these forms of liability binary: Just because one actor is liable doesn't mean that another actor isn't."

It's important to note that although we're seeing driverless cars like Uber's taxis on the road, all of the vehicles with this technology that are currently available to the public have an "operator" behind the wheel.

In the case of Tesla's Autopilot, for example, the driver has all of the legal obligations and liabilities of an ordinary human driver, Smith said.

SEE: Uber's self-driving car to hit the streets of Pittsburgh (ZDNet)

And, to make it more complicated, different states have different rules. "Each state is regulated by a state insurance commissioner and the National Association of Insurance Commissioners," said Joe Schneider, insurance analyst at KPMG. "It's a coordinating body, but the actual regulatory environment is dictated on a state-by-state basis."

In the case of an accident in Pennsylvania, which is a "soft no-fault state, the driver might have some civil liability in a crash," said Smith. "However, if they didn't act in a negligent way, they are unlikely to be criminally prosecuted."

If there were an Autopilot accident, victims would likely "be much more interested in suing Tesla. Tesla would be civilly liable for the driver's actions," said Smith. "The company would also be civilly liable for any direct negligence—by, for example, deploying a system without sufficient testing."

SEE: Autonomous driving levels 0 to 5: Understanding the differences (TechRepublic)

Additionally, if Tesla were found to have deployed an unsafe system, it could potentially be held criminally liable, Smith said.

And as we approach full autonomy, the burden of liability will likely shift entirely to the software that drives the vehicle.

"What you're [currently] insuring is driver risk," said Schneider. "Now we're talking about insuring the sensors and the algorithms and the software that is operating the vehicle because the person is now no longer really in charge in a fully autonomous world." Currently, Schneider said, we have private passenger auto insurance—but in the future, it will shift to product liability insurance coverage.

"We're entering a whole new world of assessing who's at fault in an accident and where the ultimate liability and risk ultimately falls," he said.

And, so far, there aren't many precedents. Mike Ramsey, an analyst at Gartner, pointed out that there is currently only a single known instance in which the robot actually caused the accident: when a Google prototype pulled out into a lane and hit a city bus. In that case, Google was responsible.

SEE: Video reveals the moment Google's self-driving car slams into a bus (ZDNet)

But other situations are murkier. In the fatal Tesla Autopilot crash earlier this year, for instance, the NHTSA is investigating whether Autopilot failed. But there are other considerations: the driver did not heed Tesla's instruction to remain alert, and did not apply the brakes before hitting the semi-truck. The truck driver may also be at fault for cutting across a highway where cars were traveling at high speed. "It's very hard to understand how they could hold Tesla responsible for doing something wrong in this case," said Ramsey.

Smith thinks that states will have to decide how to adapt to these changes. "Some states and developers will enact new laws to provide more clarity." The NHTSA, which had planned to unveil guidelines this summer, has pushed back its announcement, a delay that will likely have a big impact on states and developers.

"The public will play an important role in shaping both social and legal expectations for these vehicles," said Smith. "That's why companies like Uber should be publicly sharing their safety philosophies—how they define, measure, document, and monitor the reasonable safety of their vehicles now and into the future."

SEE: How driverless cars will transform auto insurance and shift burden onto AI and software (TechRepublic)

"Until you completely remove that human element from the front seat, it's not a black-and-white situation," said Schneider. "If you're in the back seat of the car and no one's in front of the car, when you get in an accident, it's highly improbable that it would be your fault."

Ramsey sees two schools of thought when it comes to insuring driverless vehicles: "The car will be insured based on some sort of metric of a liability, or it will be a no-fault insurance assigned to the vehicle."

"It's kind of hard to know," Ramsey said. "What will likely happen is you'll see different insurance companies take different routes."

Schneider agreed that it is unclear how insurers will respond to a driverless car ecosystem.

"As a whole," he said, "the industry is not prepared for autonomous vehicles."


About Hope Reese

Hope Reese is a Staff Writer for TechRepublic. She covers the intersection of technology and society, examining the people and ideas that transform how we live today.
