A Lesson in Automation: Woman Beats Machine


A federal study nearly two decades ago found that human drivers were flawed. In fact, the study said 94 percent of all crashes had some degree of human error, though it avoided saying humans “caused” the accidents.

This study spurred safety experts and carmakers to push for automation. The idea is that self-driving cars will make fewer mistakes than human drivers.

But a recent crash in Texas suggests otherwise. It has the industry asking, “What Would a Human Do?”

Smart Vehicle, Deadly Choices

In late February, a woman driving at night in San Antonio saw a 1999 Honda CR-V stopped in the middle of a three-lane highway. The car’s lights were off. The woman swerved into the right lane to avoid hitting the Honda.

The next vehicle to come upon the parked car wasn’t as smart. It was a 2022 Mustang Mach-E, driven by a 44-year-old who was using Ford’s BlueCruise self-driving system. The Mustang rear-ended the Honda with enough force to overturn it. The driver of the Honda died; the Mustang driver walked away with minor injuries.

The National Transportation Safety Board (NTSB) is investigating. The Board studies crashes like this one to determine the cause, the role of automation, and what can be done to improve such systems. In this case, they’ll also look at why the human driver could avoid the stopped vehicle, but the self-driving Mustang could not.

Expect the Unexpected

Ford markets BlueCruise as a hands-free, driver-assist technology for highway use. Commercials show drivers, even a priest, sitting behind the wheel and enjoying a hands-free ride. The message: “Relax, the car will drive itself.”

However, Ford does restrict where the system can be activated, based on the vehicle’s location on the road network. Automation engineers use such limits to reduce the likelihood of unexpected events.
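In software terms, that restriction is an activation gate: before hands-free mode engages, the system checks that the vehicle is inside its approved operating area. Here is a minimal Python sketch of the general pattern; the field names and the speed threshold are hypothetical, not Ford’s published activation rules.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    on_mapped_highway: bool   # inside a pre-mapped, approved highway zone
    speed_mph: float          # current speed
    driver_attentive: bool    # driver-monitoring camera sees eyes on the road

def can_activate(state: VehicleState, max_speed_mph: float = 80.0) -> bool:
    """Allow hands-free mode only inside the system's design limits.

    Illustrative only: these checks and the threshold are assumptions,
    not Ford's actual logic.
    """
    return (
        state.on_mapped_highway
        and state.speed_mph <= max_speed_mph
        and state.driver_attentive
    )

# Off the mapped zone, the gate refuses to engage hands-free mode.
print(can_activate(VehicleState(on_mapped_highway=False,
                                speed_mph=65.0,
                                driver_attentive=True)))   # False
```

The narrower that gate, the fewer surprises the system has to handle.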

Automated systems hate unexpected events, known in the industry as edge cases. These situations rarely occur but, when they do, they pose a risk to people and property. Examples include a skateboarder who zooms into an intersection, a utility pole that falls across the road, and a sudden haboob.

A stopped, unlit vehicle at night in the middle of a highway qualifies as an edge case. Ford engineers will need to figure out why the human outperformed BlueCruise.

The NTSB report will provide them with vital information. Did the BlueCruise camera or radar system fail to see the stopped car? Was the Mustang traveling too fast? Or too close? Did the system recognize the obstacle but lack the time or space to react? And why was the Honda stopped and unlit in the middle of the road?
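Those time-and-space questions come down to simple physics. Here is a rough, back-of-the-envelope sketch in Python with assumed numbers (a 70 mph closing speed, half a second of system latency, hard braking at 7 m/s²); none of these figures are BlueCruise specifications.

```python
# Can a vehicle stop before reaching a stationary obstacle?
# All numbers below are illustrative assumptions, not BlueCruise specs.

MPH_TO_MPS = 0.44704

speed = 70 * MPH_TO_MPS   # closing speed in m/s (about 31.3 m/s)
detect_range = 120.0      # distance at which the obstacle is flagged, m
latency = 0.5             # delay before hard braking begins, s
decel = 7.0               # hard-braking deceleration, m/s^2

# Distance covered while the system (or driver) reacts.
reaction_dist = speed * latency

# Distance covered while braking from `speed` to a stop: v^2 / (2a).
braking_dist = speed ** 2 / (2 * decel)

total = reaction_dist + braking_dist
print(f"Needs {total:.0f} m to stop; {detect_range:.0f} m available.")
print("Stops in time." if total <= detect_range else "Impact.")
```

Under those assumptions, the car needs about 86 meters and has 120, so it stops with room to spare. Cut the detection range to 60 meters, plausible for an unlit car on a dark highway, and the same arithmetic ends in an impact. That is why the sensor question comes first.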

Blending Human with Machine Learning

Machines learn from data. The Texas tragedy will undoubtedly become another test case in the database used to train future automated systems.

On the other hand, humans learn from experience. Opponents of vehicle automation will argue that the technology is not ready for the road. Poor performance in real, on-the-road edge cases provides their evidence.

Failures like the crash in Texas happen when we replace human error with machine error. Success will come when we blend the best of human experience with the best of automation to make our roads safer.

(This blog appeared as an opinion piece in Ahwatukee Foothill News.)

 
