A Tesla Model 3 car involved in a fatal crash with a semitrailer in Florida was operating on the company’s semiautonomous Autopilot system, federal investigators have determined.
The car drove beneath the trailer, killing the driver, in a March 1 crash that is strikingly similar to a 2016 crash on the other side of Florida that also involved the use of Autopilot.
In both cases, neither the driver nor the Autopilot system stopped for the trailers, and the roofs of the cars were sheared off.
The crash, which remains under investigation by the National Transportation Safety Board and the National Highway Traffic Safety Administration, raises questions about the effectiveness of Autopilot, which uses cameras, long-range radar and computers to detect objects in front of the cars so as to avoid collisions. The system’s features also include keeping a car in its lane, changing lanes and navigating freeway interchanges.
Tesla Inc. has maintained that the system is designed only to assist drivers, who must pay attention at all times and be ready to intervene.
In a preliminary report on the March 1 crash, the NTSB said that data and video from the Tesla car show that the driver turned on Autopilot about 10 seconds before the crash on a divided highway with turn lanes in the median. From less than eight seconds before the crash until the time of impact, the driver’s hands were not detected on the steering wheel, the NTSB report stated.
Neither the data nor the videos indicated the driver or the Autopilot system braked or tried to avoid the trailer, the report stated.
The Model 3 was going 68 mph when it hit the trailer on U.S. 441, and the speed limit was 55 mph, the report said. Jeremy Beren Banner, 50, was killed.
Tesla said in a statement Thursday that Banner did not use Autopilot at any other time during the drive before the crash. Vehicle logs show that he took his hands off the steering wheel immediately after activating Autopilot, the statement said.
Tesla also said that it’s saddened by the crash and that drivers have traveled more than 1 billion miles while using Autopilot.
“When used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance,” the company said.
The circumstances of the Delray Beach crash are much like one that occurred in May 2016 near Gainesville, Fla. Joshua Brown, 40, of Canton, Ohio, was traveling in a Tesla Model S on a divided highway and using the Autopilot system when he was killed.
Neither Brown nor the car braked for a tractor-trailer, which had turned left in front of the Tesla and was crossing its path. Brown’s Tesla also went beneath the trailer and its roof was sheared off. After that crash, Tesla Chief Executive Elon Musk said the company made changes in its system so radar would play more of a role in detecting objects.
David Friedman, who was acting head of the NHTSA in 2014 and is now vice president of advocacy for Consumer Reports, said he was surprised the agency didn’t declare Autopilot defective after the Gainesville crash and seek a recall. The Delray Beach crash, he said, reinforces that Autopilot is being allowed to operate in situations it cannot handle safely.
“Their system cannot literally see the broad side of an 18-wheeler on the highway,” Friedman said.
Tesla’s system was too slow to warn the driver to pay attention, unlike systems that Consumer Reports has tested from General Motors Co. and other companies, Friedman said. GM’s Super Cruise driver assist system operates only on divided highways with no median turn lanes, he said.
Tesla needs a better system to more quickly detect whether drivers are paying attention and warn them if they are not, Friedman said.
“Tesla has for too long been using human drivers as guinea pigs. This is tragically what happens,” he said.
To force a recall, the NHTSA must do an investigation and show that the way a vehicle is designed falls outside industry standards. “There are multiple systems out on the roads right now that take over some level of steering and speed control, but there’s only one of them that we keep hearing about where people are dying or getting into crashes. That kind of stands out,” Friedman said.
The NHTSA said Thursday that its investigation is continuing and its findings will be made public when it’s completed.
The Delray Beach crash also casts doubt on Musk’s statement that Tesla will have fully self-driving vehicles on the roads next year. Musk said last month that Tesla had developed a powerful computer that could use artificial intelligence to safely navigate the roads with the same camera and radar sensors that are now on Tesla cars.
“Show me the data,” Friedman said. “Tesla is long on big claims and short on proof. They’re literally showing how not to do it by rushing technology out.”
In a 2017 report on the Gainesville crash, the NTSB wrote that design limitations of Autopilot played a major role. The agency said that Tesla told Model S owners that Autopilot should be used only on limited-access highways, primarily interstates. The report said that despite upgrades to the system, Tesla did not incorporate protections against use of the system on other types of roads.
The NTSB found that the Model S cameras and radar weren’t capable of detecting a vehicle turning into its path. Rather, the systems were designed to detect vehicles they are following to prevent rear-end collisions.
An earlier version of this article identified the car involved in the March 1 crash as a Tesla Model S. It was a Tesla Model 3.