Arab Times

Autopilot, distracted driver caused ‘crash’

NTSB makes recommendations


WASHINGTON, Feb 27, (AP): Tesla’s partially automated driving system steered an electric SUV into a concrete barrier on a Silicon Valley freeway because it was operating under conditions it couldn’t handle and because the driver likely was distracted by playing a game on his smartphone, the National Transportation Safety Board has found.

The board made the determination Tuesday in the fatal crash, and provided nine new recommendations to prevent partially automated vehicle crashes in the future. Among the recommendations is for tech companies to design smartphones and other electronic devices so they don’t operate if they are within a driver’s reach, unless it’s an emergency.

Chairman Robert Sumwalt said the problem of drivers distracted by smartphones will keep spreading if nothing is done.

“If we don’t get on top of it, it’s going to be a coronavirus,” he said in calling for government regulations and company policies prohibiting driver use of smartphones.

Much of the board’s frustration was directed at the National Highway Traffic Safety Administration and at Tesla, which have not acted on recommendations the NTSB passed two years ago. The NTSB investigates crashes but only has authority to make recommendations. NHTSA can enforce the advice, and manufacturers also can act on it.

But Sumwalt said if they don’t, “then we are wasting our time. Safety will not be improved. We are counting on them to do their job.”

For Tesla, the board repeated previous recommendations that it install safeguards to stop its Autopilot driving system from operating in conditions it wasn’t designed to navigate. The board also wants Tesla to design a more effective system to make sure the driver is always paying attention.

If Tesla doesn’t add driver monitoring safeguards, misuse of Autopilot is expected “and the risk for future crashes will remain,” the board wrote in one of its findings.

Tuesday’s hearing focused on the March 2018 crash of a Tesla Model X SUV, in which Autopilot was engaged when the vehicle swerved and slammed into a concrete barrier dividing freeway and exit lanes in Mountain View, Calif., killing Apple engineer Walter Huang.

Just before the crash, the Tesla steered to the left into a paved area between the freeway travel lanes and an exit ramp, the NTSB said. It accelerated to 71 mph and crashed into the end of the concrete barrier.

The car’s forward collision avoidance system didn’t alert Huang, and its automatic emergency braking did not activate, the NTSB said.

Also, Huang did not brake, and there was no steering movement detected to avoid the crash, the board’s staff said.

NTSB staff members said they couldn’t pinpoint exactly why the car steered into the barrier, but it likely was a combination of faded lane lines, bright sunshine that affected the cameras, and a closer-than-normal vehicle in the lane ahead of the Tesla.

The board also found that Huang likely would have lived if a cushion at the end of the barrier had been repaired by California transportation officials. That cushion had been damaged in a crash 11 days before Huang was killed.

Recommendations to NHTSA included expanded testing to make sure partially automated systems can avoid running into common obstacles such as a barrier. The board also asks that NHTSA evaluate Autopilot to determine where it can safely operate, and develop and enforce standards for monitoring drivers so they pay attention while using the systems.

NHTSA has told the NTSB it has investigations open into 14 Tesla crashes and would use its enforcement of safety defects to take action if needed. The agency issued a statement saying it will review the NTSB’s report and that all commercially available vehicles require human drivers to stay in control at all times.

“Distraction-affected crashes are a major concern, including those involving advanced driver assistance features,” the statement said.

Sumwalt said at the start of Tuesday’s hearing that systems like Autopilot cannot drive themselves, yet drivers continue to use them without paying attention.

“This means that when driving in the supposed ‘self-driving’ mode, you can’t read a book, you can’t watch a movie or TV show, you can’t text and you can’t play video games,” he said.

Under questioning from board members, Robert Molloy, the NTSB’s director of highway safety, said the NHTSA is taking a hands-off approach to regulating new automated driving systems like Autopilot. Molloy called the approach “misguided”, and said nothing is more disappointing than seeing recommendations ignored by Tesla and NHTSA. “They need to do more,” he said of the federal highway safety agency.
