NTSB: Tesla Autopilot, distracted driver caused fatal crash

WASHINGTON (AP) — Tesla’s partially automated driving system steered an electric SUV into a concrete barrier on a Silicon Valley freeway because it was operating under conditions it couldn’t handle and because the driver likely was distracted by playing a game on his smartphone, the National Transportation Safety Board has found.

The board made the determination Tuesday in the fatal crash and issued nine new recommendations aimed at preventing crashes involving partially automated vehicles. Among them is a call for tech companies to design smartphones and other electronic devices so they won’t operate while within a driver’s reach, unless it’s an emergency.

Chairman Robert Sumwalt said the problem of drivers distracted by smartphones will keep spreading if nothing is done.

“If we don’t get on top of it, it’s going to be a coronavirus,” he said in calling for government regulations and company policies prohibiting driver use of smartphones.

Much of the board’s frustration was directed at the National Highway Traffic Safety Administration and Tesla, neither of which has acted on recommendations the NTSB issued two years ago. The NTSB investigates crashes but has authority only to make recommendations. NHTSA can enforce the advice, and manufacturers also can act on it.

But Sumwalt said if they don’t, “then we are wasting our time. Safety will not be improved. We are counting on them to do their job.”

For Tesla, the board repeated previous recommendations that it install safeguards to stop its Autopilot driving system from operating in conditions it wasn’t designed to navigate. The board also wants Tesla to design a more effective system to make sure the driver is always paying attention.

If Tesla doesn’t add driver monitoring safeguards, misuse of Autopilot is expected “and the risk for future crashes will remain,” the board wrote in one of its findings.

Tuesday’s hearing focused on the March 2018 crash of a Tesla Model X SUV, in which Autopilot was engaged when the vehicle swerved and slammed into a concrete barrier dividing freeway and exit lanes in Mountain View, Calif., killing Apple engineer Walter Huang.

Just before the crash, the Tesla steered to the left into a paved area between the freeway travel lanes and an exit ramp, the NTSB said. It accelerated to 71 mph and crashed into the end of the concrete barrier. The car’s forward collision avoidance system didn’t alert Huang, and its automatic emergency braking did not activate, the NTSB said.

Huang also did not brake, and no evasive steering was detected before the crash, the board’s staff said.

NTSB staff members said they couldn’t pinpoint exactly why the car steered into the barrier, but it likely was a combination of faded lane lines, bright sunshine that affected the cameras, and a closer-than-normal vehicle in the lane ahead of the Tesla.

The board also found that Huang likely would have lived if a cushion at the end of the barrier had been repaired by California transportation officials. That cushion had been damaged in a crash 11 days before Huang was killed.

Recommendations to NHTSA included expanded testing to make sure partially automated systems can avoid running into common obstacles such as barriers. The board also asked NHTSA to evaluate Autopilot to determine where it can safely operate, and to develop and enforce standards for monitoring drivers so they pay attention while using such systems.

NHTSA has told the NTSB it has investigations open into 14 Tesla crashes and would use its safety-defect enforcement authority to take action if needed.

The agency issued a statement saying it will review the NTSB’s report and that all commercially available vehicles require human drivers to stay in control at all times.

“Distraction-affected crashes are a major concern, including those involving advanced driver assistance features,” the statement said.

Sumwalt said at the start of Tuesday’s hearing that systems like Autopilot cannot drive themselves, yet drivers continue to use them without paying attention.

“This means that when driving in the supposed ‘self-driving’ mode, you can’t read a book, you can’t watch a movie or TV show, you can’t text and you can’t play video games,” he said.

Under questioning from board members, Robert Molloy, the NTSB’s director of highway safety, said NHTSA is taking a hands-off approach to regulating new automated driving systems like Autopilot. Molloy called the approach “misguided” and said nothing is more disappointing than seeing recommendations ignored by Tesla and NHTSA.

“They need to do more,” he said of the federal highway safety agency.

Autopilot is designed to keep a vehicle in its lane and keep a safe distance from vehicles in front of it. It also can change lanes with driver approval. Tesla says Autopilot is intended to be used for driver assistance and that drivers must be ready to intervene at all times.

Sumwalt said the board made recommendations to six automakers in 2017 to address the problem, and that only Tesla has failed to respond.

Teslas can sense whether a driver is applying force to the steering wheel, and if that doesn’t happen, the cars issue visual and audio warnings. But monitoring steering wheel torque “is a poor surrogate measure” for monitoring the driver, Ensar Becic, the NTSB’s human performance and automation highway safety expert, told the board.

Messages were left Tuesday seeking comment from Tesla.

Sumwalt said the NTSB had called for technology more than nine years ago to disable distracting functions of smartphones while the user is driving, but no action has been taken.

Don Karol, the NTSB’s project manager for highway safety, told the board that the staff is recommending that cell phone companies program phones to automatically lock out distracting functions such as games and phone calls while someone is driving. The staff also recommends that companies enact policies preventing the use of company-issued cell phones while workers are driving.

Tesla has said Autopilot was put out initially in “beta,” meaning it was being tested and improved as bugs were identified, Karol told the board.

That brought a response from Vice Chairman Bruce Landsberg, who said if the system has known bugs, “it’s probably pretty foreseeable that somebody’s going to have a problem with it. And then they (Tesla) come back and say ‘oh, we warned you.’”
