Who’s to Blame When Self-Driving Cars Kill?


The first U.S. felony case involving a driver using a partially automated driving system began recently in California. In 2019, limousine driver Kevin George Aziz Riad had his 2016 Tesla Model S on Autopilot when it ran a red light, crashed into a Honda Civic, and killed two people.

Edward Walters, an adjunct professor at the Georgetown University law school, posed the key question in a recent article in Automotive News: “Who is at fault, man or machine?”

According to Walters, an expert in self-driving vehicles and the law, the prosecutors will have difficulty proving the driver is guilty because “some parts of the task are being handled by Tesla.”

Yet Tesla is not facing charges in the case. Legal experts say that’s because it’s even harder to make a criminal case stick against a company.

Autopilot claims face global pressure

Instead, legal authorities worldwide are moving to curb Tesla’s marketing of the Autopilot system. In 2020, a court in Munich barred Tesla from using the phrases “full potential for autonomous driving” and “Autopilot exclusive” in its German advertising.

I agree with this ruling, and so does the Society of Automotive Engineers. SAE developed the auto industry’s accepted Levels of Driving Automation, a 0-to-5 scale.

Level 5 is a completely self-driving vehicle. Think of KITT, the Pontiac Firebird Trans Am in the 1980s series Knight Rider, which could plan a route and drive it all by itself.

By comparison, Tesla’s Autopilot is at Level 2. At that level, the technology package can control the vehicle’s speed, braking, and steering. However, the driver must pay attention and be ready to take over if something goes haywire.

A big caveat for car buyers & investors

Tesla also offers more extensive Autopilot systems called Enhanced Autopilot and Full Self-Driving Capability. However, all its systems come with the caveat, as stated on Tesla.com: “The currently enabled Autopilot, Enhanced Autopilot, and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous.”

In other words, the system is assisting, not replacing, the driver.

Tesla’s self-driving claims are also facing scrutiny from the U.S. Department of Justice. Much like the German court’s reasoning, the federal investigation centers on whether the company misled the public, namely, both car buyers and Tesla investors.

Right now, most of us are not Tesla owners and probably pay little attention to what Tesla calls its system, whether Autopilot, Full Self-Driving, or robo-car. And our interest in the tragic Riad case centers mainly on its novelty. However, the serious charges Riad faces should give us pause. In the near future, many of us may be driving vehicles with partially automated features, get into a fender bender or a more serious crash, and face the blame game ourselves. Then it will be up to the courts and insurance companies to decide who’s at fault: man or machine.


Working together to create a world where everyone walks away from a crash.