November 6, 2019 12:30 PM, EST

Self-Driving Uber in Crash Wasn’t Designed to See Jaywalkers

(Seth Wenig/Associated Press)


Uber Technologies Inc.’s self-driving test car that struck and killed a pedestrian last year wasn’t programmed to recognize and react to jaywalkers, according to documents released by U.S. safety investigators.

The U.S. National Transportation Safety Board on Tuesday released more than 400 pages of reports and supporting documents on the March 2018 crash that killed 49-year-old Elaine Herzberg as she walked her bicycle across a road at night in Tempe, Ariz.

The documents painted a picture of safety and design lapses with tragic consequences but didn’t assign a cause for the crash. The safety board is scheduled to do that at a Nov. 19 meeting in Washington.

“We deeply value the thoroughness of the NTSB’s investigation into the crash and look forward to reviewing their recommendations once issued after the NTSB’s board meeting later this month,” Uber said in a statement. The company said it regrets the incident and has made critical improvements to prioritize safety.

The case is being closely watched in the emerging industry of self-driving vehicles, a technology that has attracted billions of dollars in investment from companies such as General Motors Co. and Alphabet Inc. in an attempt to transform transportation.

The NTSB data highlights the need for more rigorous standards on how self-driving vehicles can be tested on public roadways, Jason Levine, executive director of the Center for Auto Safety advocacy group, said in an interview. Currently, there are no federal rules, and individual states have set a variety of criteria.

“These are life and death consequences, not video game reset buttons for software developers,” Levine said. “I think they were playing fast and loose with people’s lives, and Elaine Herzberg has paid the price.”

The report said the car’s sensors detected Herzberg and her bicycle, but its computer failed to recognize the hazard.

“The system design did not include a consideration for jaywalking pedestrians,” it said.

Herzberg was crossing the road outside of a crosswalk.

The Uber vehicle’s radar first detected Herzberg about 5.6 seconds before impact, while she was still outside its lane of travel, and initially classified her as a vehicle. The system then reclassified her as different types of objects several times and failed to predict that her path would cross that of the self-driving test SUV, according to the NTSB.

Uber made extensive changes to its self-driving system after several reviews of its operation and findings by NTSB investigators. The company told the NTSB that the new software would have correctly identified Herzberg and triggered controlled braking to avoid her more than 4 seconds before the original impact, the NTSB said.

The safety driver behind the wheel of the car was watching a video on a mobile device and didn’t see Herzberg in time. Less than five months before the accident, Uber had cut back to a single safety driver in its test vehicles. Other companies, such as GM’s Cruise affiliate, use two.

Uber vehicles operating in autonomous mode had been involved in 37 crashes prior to the fatal accident, the NTSB said. The vast majority weren’t the fault of the Uber technology. But in one case, the car struck a bent post marking a bicycle lane and in another, it didn’t react to a rapidly approaching vehicle, and the safety driver swerved and hit a parked car, NTSB said.

The safety driver involved in the accident told investigators that “sometimes the vehicle would swerve towards a bicycle.”

The Uber Advanced Technologies Group unit that was testing self-driving cars on public streets in Tempe didn’t have a stand-alone safety division, a formal safety plan, standard operating procedures or a manager focused on preventing accidents, according to NTSB.

Instead, Uber had company-wide values it promoted to its employees, such as “do the right thing,” the NTSB said. The company, in its statement, said that it had also had safety policies and procedures though not a formal safety plan.
