Who Bears the Weight of Driverless Decisions?

Accountability in an autonomous world

True autonomous vehicles will likely be safer than humans on the road. There's a lot of hype around producing this science-fiction-turned-reality, with more than 50 companies approved for on-the-road testing.

But they're not going to be perfect, so what happens when the inevitable occurs?

Let's say, in a worst-case scenario, an autonomous vehicle hits and kills a pedestrian who was legally crossing the road.

If a human were at the wheel, that would be vehicular manslaughter. But in this scenario, with a truly self-driving car (not Tesla's glorified autopilot), no person would be behind the wheel.

There's a critical disconnect here between passenger and digital driver: most decisions the car makes are now out of human control. It would be as if I were driving while you sat in the back seat as a passenger.

If I cause an accident, you're not liable for negligence. Except now I've been replaced by an algorithm. So who is?

There are four immediate options:

  1. The car
  2. The company that manufactured the self-driving car
  3. The government
  4. You

The Car

Let's immediately discount the vehicle. You cannot hold the car responsible because you cannot punish it or extract recompense in any way.

Perhaps, one day, if the AI powering the car is sentient, and granted moral status and legal rights and responsibilities, then we can talk. But until then…

The Company

Next up, the company. Cars run by top self-driving companies such as Waymo, Cruise, Nuro, and Argo AI are generally still in beta testing, so the passenger is not currently held personally responsible.

Yet in the long term, the manufacturer is still more likely to be at fault than the passenger, as they designed, developed, and tested the car that caused the accident.

However, some changes may need to be made to the way cases are handled if product liability claims become a common substitute for auto collision claims. Product liability is expensive, difficult to pursue (it requires expert testimony), and can be mitigated by comparative and contributory negligence laws.

While the law may need to evolve with technology, it's not unreasonable to think injured parties may pursue legal action against the company that produced the defective autonomous vehicle.

Alternatively, the manufacturer could automatically assume liability, but fold that cost into the price of the car.

For example, Volvo said it would be responsible for accidents caused by vehicles in Volvo's fully autonomous mode. While this would resolve the question of responsibility, the resulting price hike (potentially $10,000 to $15,000) would also make driverless cars less desirable.

It is probable that the company that produced the car will take on greater responsibility for what happens after it leaves the dealership, but to what extent remains unclear.

The Government

Our government has several incentives to subsidize and support the self-driving industry.

Over 3,100 people were killed and about 424,000 injured in distracted-driving crashes in 2019. Driverless cars could reduce these numbers significantly, leading to safer travel for ordinary citizens and fewer accidents for law enforcement to field.

Additionally, autonomous vehicles could bring the economy an estimated $488 billion in annual savings from reduced accidents and another $158 billion from reduced fuel costs.

Companies like Nuro have been pushing for new laws and altered requirements that better fit the new road order of AI-powered cars. For example, a driverless car may not need rearview mirrors.

Some safety rules have already been passed and put into effect: as of June 2021, manufacturers of vehicles with automated driving systems must promptly report any and all crashes.

It's possible that the government could pass legislation that helps with liability for self-driving cars. However, I currently view this as unlikely. Rather, they'll probably continue to invest heavily in autonomous vehicle research and policy.

You

Finally, you could be responsible. You didn't directly cause the accident, but you did choose to ride in a driverless car. 'Negligence' could mean using self-driving features in inappropriate situations, failing to follow safety rules, or not installing important updates.

It's possible there will be 'terms of use' that assign liability for traffic accidents to the passenger.

However, in any scenario in which self-driving cars are mainstream, I doubt people will be held responsible for car accidents to the same extent as if they had been at the wheel. Adoption would be severely hampered if using such technology could leave you facing homicide charges (as happened to an Uber safety driver after a fatal 2018 crash).

This may lead to an entire industry surrounding a new kind of car insurance for using autonomous vehicles.

The Society of Automotive Engineers (SAE) defines six levels of driving automation, from Level 0 (an ordinary, human-driven car) to Level 5 (a fully self-driving car). Level 1 includes adaptive cruise control, Level 2 adds advanced driver-assistance systems, and Levels 3-5 progressively increase the car's control over acceleration, braking, and steering.
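
To keep those levels straight, here's a minimal sketch in Python. The level names paraphrase SAE J3016; the helper function is my own illustrative shorthand, not anything from the standard itself:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, summarized."""
    NO_AUTOMATION = 0           # ordinary, human-driven car
    DRIVER_ASSISTANCE = 1       # e.g., adaptive cruise control
    PARTIAL_AUTOMATION = 2      # advanced driver-assistance systems
    CONDITIONAL_AUTOMATION = 3  # car drives; human must take over on request
    HIGH_AUTOMATION = 4         # no human fallback within a limited domain
    FULL_AUTOMATION = 5         # fully self-driving, anywhere

def human_is_fallback(level: SAELevel) -> bool:
    """Below Level 4, a human must still be ready to drive."""
    return level <= SAELevel.CONDITIONAL_AUTOMATION

print(human_is_fallback(SAELevel.PARTIAL_AUTOMATION))  # True
print(human_is_fallback(SAELevel.FULL_AUTOMATION))     # False
```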

According to Progressive, cars with partial automation such as Tesla's Autopilot fall under Level 3: 'Conditional Driving Automation'.

Insurance premiums for these semi-autonomous vehicles remain similar to, or above, ordinary rates: the technology is new, and the potential for damages is high. Elevated mandatory minimums for liability and property damage may even push premiums up, at least temporarily.

Once fully autonomous vehicles can be privately owned, car insurance could drop in cost as the manufacturer assumes more liability. This in turn could lead to faster adoption: 35% of people surveyed responded they'd buy a self-driving car if it came with lower insurance rates.
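
As a toy illustration of that dynamic (every number and the linear relationship here are invented, purely to show the direction of the effect):

```python
def estimated_premium(base_rate: float, manufacturer_share: float) -> float:
    """Toy model: the more crash liability the manufacturer assumes,
    the less risk the passenger's policy must price in.

    The linear relationship and all numbers are invented for illustration.
    """
    passenger_share = 1.0 - manufacturer_share
    return base_rate * passenger_share

# If the manufacturer assumed 80% of liability, a hypothetical
# $1,200/year policy would fall to $240/year under this toy model.
print(estimated_premium(1200.0, 0.80))  # 240.0
```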

Companies such as Koop Technologies, Rivian, and Direct Line are already working to fill the emerging market for autonomous-vehicle insurance. One in particular, Avinew, offers usage-based coverage tied to how a car's autonomous features are actually used.

The human in the vehicle has always been the primary candidate for liability up until this point. It will be interesting to see if that changes tremendously.

So, who's responsible?

Ultimately, there may be a decision tree integrating a combination of possible entities to be held responsible based on context.

Based on my (admittedly limited) knowledge of the space, this is my current prediction for what will happen when an accident occurs:
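
Roughly, the decision logic might look like this (a simplified Python sketch; every branch condition and outcome below is my own guess, not settled law):

```python
def predict_liable_party(autonomy_level: int,
                         passenger_negligent: bool,
                         known_defect: bool) -> str:
    """Illustrative guess at how liability might be assigned after a crash.

    Every branch here is a speculative assumption, not current law.
    """
    if autonomy_level <= 3:
        # Up through Level 3, a human is still the fallback driver,
        # so fault largely stays with the person at the wheel.
        return "driver"
    if passenger_negligent:
        # Misusing self-driving features, ignoring safety rules, or
        # skipping required updates could shift blame to the passenger.
        return "passenger"
    if known_defect:
        # A demonstrable design or software defect points to
        # a product liability claim against the manufacturer.
        return "manufacturer (product liability)"
    # Default: the manufacturer assumes responsibility (the Volvo model),
    # likely backed by a new class of autonomous-vehicle insurance.
    return "manufacturer (covered by AV insurance)"

print(predict_liable_party(5, passenger_negligent=False, known_defect=False))
```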

While this is fairly high-level, our policies need to cover more than just fatal accidents, from a fender-bender to a clipped mailbox. Assigning blame after the fact is troublesome if there are no regulations already in place.

A clear understanding of responsibility will be necessary for autonomous vehicles to become mainstream. What we end up seeing implemented (soon?) could be incredibly different from what I outlined.

Much like the original automobile, autonomous vehicles have the ability to change the world for the better. As we innovate in our technology, we must also innovate in related policy.