NTSB: Distracted Operator, Uber Blamed for Fatal Self-Driving Crash



The Detroit News

By Keith Laing

WASHINGTON -- Distraction from a personal cellphone was the probable cause of a 2018 crash in which an Uber self-driving car struck and killed a pedestrian, the National Transportation Safety Board ruled this week.

But Uber’s safety policies also are to blame, the federal agency said.

Elaine Herzberg was hit by a 2017 Volvo XC90 SUV that was being operated autonomously by Uber in Tempe, Ariz., around 10 p.m. on March 18, 2018, according to police in the Phoenix suburb.

Police said the vehicle was in self-drive mode with a safety operator behind the wheel when Herzberg, who was walking a bicycle outside of a crosswalk, was struck. The 49-year-old woman died after being taken to a local hospital.

The crash was believed to be the first fatal pedestrian accident involving an autonomous vehicle.

The NTSB voted unanimously Tuesday that the probable cause of the crash was “the failure of the vehicle operator to monitor the environment and the operation of the automated driving system because she was visually distracted throughout her trip by her personal cellphone.”

Three shortcomings by Uber contributed to the crash, the board concluded: the company’s inadequate safety risk assessment procedures; ineffective oversight of vehicle operators; and lack of adequate mechanisms to address complacency by operators as the cars drove themselves.

The NTSB also said the pedestrian was impaired by drugs, which played a role in the crash.

Uber did not immediately respond to a request for comment.

The Tempe crash roiled the debate about self-driving cars in Washington. Uber suspended all testing of self-driving cars for four months, shuttered its self-driving testing program in Arizona and laid off close to 300 workers there in May 2018, then resumed testing in Pittsburgh in July 2018.

Consumer advocates seized on the incident to urge tougher regulations of self-driving cars.

NTSB members applauded Uber for cooperating with its nearly two-year-long investigation, but they cited an “ineffective safety culture” at the company that played a central role in the fatal crash. And they said that should serve as a warning.

The crash was about more than Uber’s self-driving testing in Arizona, NTSB Chairman Robert Sumwalt said: “This crash was about testing the development of automated driving systems on public roads. Its lessons should be studied by any company in any state.”

NTSB board member Jennifer Homendy took the White House to task for resisting calls from safety advocates to make it mandatory for automakers to submit safety assessments of their self-driving test programs. She said it was “laughable” that the Trump administration has argued it does not have the legal authority to force carmakers to make safety assessments public.

“I mean you might as well say, ‘We would like your assessments, but really you’re not required to do it,’ so why do it?” she said.

The Trump administration released an updated version of self-driving guidelines in December that doubled down on the idea that safety assessments should be voluntary.

Homendy, a former Democratic staffer for the U.S. House Transportation and Infrastructure Committee, said the Trump administration should make it mandatory for carmakers to submit safety assessments before they begin testing self-driving cars on public roads.

“I wrote laws for over 14 years; there’s a big difference between the words ‘should,’ ‘encouraged to’ and ‘shall,’” she said.

Jamie Court, president of Consumer Watchdog, a nonprofit public interest group, said, “When you have a car that’s not programmed to deal with jaywalkers, you certainly need an alert system.”

“The problem is we don’t have a standard under the law to hold corporations and their computer programmers accountable when people die,” he continued. “This case is a metaphor for whenever there’s a human being in a car, they’re going to be blamed when there’s a death because you can’t blame a corporation or a computer program. We need better laws if we’re going to have corporate computers driving cars.”

The Tempe Uber crash was the latest in a series of self-driving accidents that have raised questions about the safety of the technology in its current state of development.

In 2017 in Tempe, a self-driving Volvo operated by Uber crashed into a car that failed to yield. That crash also prompted Uber to temporarily suspend its self-driving testing, although the Uber vehicle was found not to be at fault and there were no injuries.

In 2016, a Tesla Model S that was operating with its automated driving system activated crashed into a semi-trailer rig turning left in front of it, killing the driver in what was believed to be the first U.S. death in a vehicle driving in semi-autonomous mode.

(c)2019 The Detroit News

Visit The Detroit News at www.detnews.com

Distributed by Tribune Content Agency, LLC.

