Self-Driving Cars Must Meet 15 Benchmarks in US Guidance

(Image: Daimler AG)

The Obama administration’s proposed guidelines for self-driving cars, to be formally unveiled Sept. 20, include 15 benchmarks automakers will need to meet before their autonomous vehicles can hit the road.

The companies will have to show how their virtual drivers will function, what happens if they fail and how they’ve been tested, according to a preview by the U.S. Transportation Department. The automakers must make vehicle performance assessments public so regulators and other companies can evaluate them.

“It’s in their vested interest to be as upfront and transparent as possible,” Transportation Secretary Anthony Foxx said Sept. 19 on a call with reporters. “There’s market risk in putting a product out there that doesn’t meet the expectations of the public.”

RELATED: Self-driving trucks? Truckers say: Hit the brakes

Companies that have invested in developing the vehicles, including Tesla Motors Inc., General Motors Co. and Google parent Alphabet Inc., say federal leadership is needed to keep states from passing their own contradictory laws. The Self-Driving Coalition for Safer Streets, whose members include Uber Technologies Inc. and Lyft Inc., supports standardizing automated car policies among the states, counsel and spokesman David Strickland said in a statement.

At the same time, companies have urged regulators to use a light touch so as not to kill off innovation, a plea the administration appears to have heeded.

Ford Motor Co. called the administration’s guidelines “thoughtful” and an attempt to ensure the United States continues to innovate.

RELATED: Volvo tests autonomous truck in underground mine

“Importantly, the guidance will help establish the basis for a national framework that enables the safe deployment of autonomous vehicles,” according to a statement from the Dearborn, Michigan-based company. “Strides in this technology have the potential to improve safety on our roads and reduce congestion in urban areas.”

Questions about self-driving car safety were elevated in July, when a fatal crash involving a Tesla vehicle was made public. The incident happened May 7, when the Model S was being driven by the car’s “autopilot” system. The car failed to distinguish between a white truck blocking the road and the brightly lit sky, Tesla said.

The new guidelines include recommendations for states to pass legislation governing the safe introduction of self-driving cars on their highways. They suggest that states should continue to license human drivers, enforce traffic laws, inspect vehicles for safety and regulate insurance and liability. The federal government, according to the guidelines, should set standards for equipment, including the computers that could take over the driving function. It will also continue to investigate safety defects and enforce recalls.

RELATED: Otto’s autonomous trucks drive almost 24/7 to refine technology of tomorrow

President Obama wrote in an op-ed in the Pittsburgh Post-Gazette that automated vehicles have the potential to dramatically reduce the number of people who die on the roads. The administration’s guidance will promote safety, he wrote.

“If a self-driving car isn’t safe, we have the authority to pull it off the road,” Obama wrote. “We won’t hesitate to protect the American public’s safety.”

Portions of the proposed guidelines will be effective immediately. Other elements will go into effect after public comments are received and analyzed. The government said it will update its self-driving car guidelines annually.

Earlier this year, the Transportation Department said it would allow automakers that can demonstrate they have developed a safe autonomous vehicle to apply for exemptions from certain rules. It marked a new approach to auto regulations designed to ensure the government doesn’t stand in the way of technological progress.

Regulators promised a quick response to companies that ask for interpretations of safety regulations applied to new autonomous features that seem to fall through the cracks of current rules. In one of the first applications of that policy, the National Highway Traffic Safety Administration said in February that Google’s artificial intelligence system would be considered a driver under federal rules.

“We’ve envisioned a future where you can take your hands off the wheel, and the wheel out of the car,” said Jeff Zients, director of the White House’s National Economic Council. “Your commute becomes productive and restful rather than exhausting.”

Mark Rosekind, NHTSA’s administrator, has said the self-driving car plan would be key to the agency’s attempts to reduce human error, which the agency estimates is a factor in 94% of fatal car crashes. Those crashes killed more than 35,000 people in the United States last year.

Earlier this year, Foxx proposed spending $4 billion over 10 years to fund pilot projects with automated vehicles. That proposal hasn’t gone anywhere in Congress, which would have to approve the funds.

The guidelines issued Sept. 20 attempt to clarify how current rules and regulations, first written in the 1960s, will be applied to emerging technology. The Transportation Department plans to issue interpretation letters explaining how emerging technologies can comply with current law, promising to respond to company requests within 60 days.

The new rules include a path for going fully driverless by removing the requirement that a human serve as a backup, according to two people familiar with the rulemaking. Bryan Thomas, an NHTSA spokesman, declined to comment on that issue before the formal announcement.

The development is important because some state regulators, including California’s, have proposed that humans must be ready to take over in robot cars at a moment’s notice. Google’s self-driving car project and others have objected, saying that limitation could stifle development of the technology because it would require the cars to have steering wheels and gas and brake pedals, at least in the test phase.

Federal safety regulators appear ready to follow the precedent they set for Google earlier this year, when the agency recognized the company’s self-driving software as the “driver” of its fully autonomous test vehicles. The new federal rules are just proposals, and much could change, said the people, who asked not to be identified discussing internal deliberations. But the change would be welcomed by companies such as Google, Uber and Ford, which plan to deploy fully autonomous vehicles within the next five years.

General Motors expressed support for the effort to speed the deployment of the vehicles, which it said could dramatically improve safety.

"We welcome the effort, will review the guidance and look forward to continuing the constructive dialogue on how to safely deploy AVs as quickly as possible," the company said in a written statement.