The Uber Technologies Inc. self-driving test vehicle that killed a pedestrian in Arizona earlier this year may have been able to avoid the crash had the ride-hailing company not disabled Volvo Cars’ safety system, according to a safety group.
In a report Aug. 7, the Insurance Institute for Highway Safety criticizes Uber for turning off Volvo’s collision-avoidance technology in the XC90 sport utility vehicle that struck and killed a woman in Tempe on March 18. The insurer group’s chief research officer, David Zuby, vouched for the effectiveness of Volvo’s system, saying it would have prevented or mitigated the crash.
“I think it’s possible that, had the system been able to intervene, the fatality may not have occurred,” Zuby said in a phone interview. “I would argue that if developers of self-driving technology really intend to make our roads safer, they had better make sure they have the best crash-avoidance systems in place before they go out on the road.”
The fatality spurred Uber’s suspension of public road testing with its self-driving vehicles and raised questions about the safety of the company’s technology and its protocols for the use of human backup drivers. Police said in June that the woman behind the wheel of the Uber SUV was streaming the television show “The Voice” on her mobile phone in the moments before the crash.
The U.S. National Transportation Safety Board investigated the incident and said in a preliminary report in May that sensors on Uber’s SUV detected the pedestrian, who was crossing a street at night outside a crosswalk. But Uber told NTSB investigators that automatic emergency braking maneuvers weren’t enabled while the vehicle was under computer control, a measure intended to reduce the potential for “erratic vehicle behavior.” The company left braking up to the safety driver and didn’t design its system to alert the human operator.
“Uber decided to forgo a safety net in its quest to teach an unproven computer-control system how to drive,” Zuby said in IIHS’s report.