For years, U.S. investigators have been calling for more automation on motor vehicles, such as sensors that slam on the brakes to prevent a crash.
At the same time, the National Transportation Safety Board, in its probes of transportation mishaps, has warned that such devices may also have a downside: poorly designed technology can confuse operators or lead to complacency that breeds its own hazards.
Now, for the first time in a highway accident, those two potentially contradictory themes will be put to the test as the NTSB opens an investigation into a fatal accident involving a Tesla Motors Inc. sedan that was driving with a feature called Autopilot enabled.
“It’s very significant,” said Clarence Ditlow, executive director of the Center for Auto Safety advocacy group in Washington. “The NTSB only investigates crashes with broader implications.”
The Safety Board will be sending a team of five investigators to Florida next week, agency spokesman Christopher O’Neil said Friday.
While the U.S. National Highway Traffic Safety Administration is conducting its own review of the May 7 incident, the NTSB wants to take a more comprehensive look at whether the crash reveals any systemic issues with driverless car technology, O’Neil said. NHTSA is a regulatory agency and the NTSB is an independent investigative body that only has the power to make policy recommendations.
“It’s worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible,” O’Neil said.
A 40-year-old Ohio man died when his 2015 Tesla Model S struck an 18-wheeler on a highway near Williston, Florida, according to a Florida Highway Patrol statement. The Model S drove under the truck’s trailer, shearing off its top.
Autopilot, a semi-autonomous feature that can guide the vehicle in certain conditions, failed to detect the white side of the tractor-trailer as it turned in front of the car against a brightly lit sky, so the brakes weren’t applied, according to Tesla. The system may have mistaken the trailer for an overhead highway sign.
The crash was the first with a known fatality in more than 130 million miles of Autopilot driving, according to the carmaker.
Ditlow said that the NTSB rarely opens investigations into highway accidents, so the announcement that it was looking at the Tesla crash is significant.
“They’re not looking at just this crash,” he said. “They’re looking at the broader aspects. Are these driverless vehicles safe? Are there enough regulations in place to ensure their safety?”
“And one thing in this crash I’m certain they’re going to look at is using the American public as test drivers for beta systems in vehicles. That is simply unheard of in auto safety,” he said.
Tesla has installed the Autopilot software on all 70,000 of the cars it has built since October 2014, even though it remains a so-called beta version. Tesla said in a June 30 blog post that vehicle owners must acknowledge that the system is new technology “still in a public beta phase” before it will switch it on.
“Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” the company said. “Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.”
The NTSB opens 25 to 30 highway investigations a year, according to O’Neil. By comparison, it is required by law to investigate the more than 1,000 aviation accidents that occur each year.
The NTSB has for decades called on vehicle manufacturers to install more features that automatically prevent accidents. After a Wal-Mart Stores Inc. truck struck a limo van carrying comedian Tracy Morgan, the NTSB examined why the truck’s automatic braking system wasn’t switched on at the time of impact.
Accidents involving automation also have been a growing issue in aviation and other transportation modes, according to NTSB case files.
The board concluded that an autopilot feature on the Boeing Co. 777 that crashed short of a San Francisco runway on July 6, 2013, contributed to the accident, which killed three. The pilots didn’t realize they had inadvertently shut off a function that normally maintained a safe speed, allowing the plane to slow dangerously.
“We have learned that pilots must understand and command automation, and not become over-reliant on it,” NTSB Chairman Christopher Hart said after the board reached its conclusions. “The pilot must always be the boss.”