The NTSB can’t change federal or local policy, but it can make recommendations. It had plenty of those. The board reiterated two recommendations it issued during a 2017 investigation of an Autopilot-related death—to which Tesla has not formally responded. It asked that Tesla limit drivers to using Autopilot in road and weather situations where it can be used safely. And it urged the company to update its driver monitoring system so it can “more effectively sense” how engaged the person behind the wheel is with driving.
Tesla did not respond to a request for comment. But just a few days after the crash, the company released details about the incident—a serious no-no in NTSB investigations, which typically take months or years to complete—and effectively blamed Huang for the event. The company said Huang had not touched the wheel for six seconds before the vehicle slammed into the concrete lane divider and that “the driver had about five seconds and 150 meters of unobstructed view of the concrete divider … but the vehicle logs show that no action was taken.” Because Tesla released that information, the NTSB took the unusual step of formally removing Tesla as party to the investigation.
The NTSB also said that California’s highway agencies contributed to the incident by failing to fix the concrete barrier—something it concluded would have saved the driver’s life.
The NTSB had recommendations for the NHTSA, too. It urged the agency to come up with new ways to test advanced driver assistance features like forward collision warning, and to look into Tesla’s Autopilot system in particular. The NTSB also asked the NHTSA to work with industry to create performance standards for driver assistance features, which critics say can lure humans into a sense of complacency even though they’re not intended to take over the driving task.
Board member Jennifer Homendy slammed federal regulators for their approach to new tech like Autopilot, which the NHTSA has championed as an effort to keep new vehicles reasonably priced and accessible to more drivers. “NHTSA’s mission isn’t to sell cars,” she said.
In a statement, an NHTSA spokesperson said the agency would “carefully review” the NTSB’s report, the final version of which will be issued in coming weeks. It also pointed to agency research setting out industry best practices for driver assistance technology.
The NTSB also laid some blame for Huang’s crash at the feet of his employer, Apple. Employers should have distracted-driving policies for their workers, the board said, prohibiting them from using devices while operating company-owned vehicles and from using their work devices while operating any vehicle. (Huang was operating his own vehicle at the time of the crash, but his phone was Apple-owned.) And the panel called on mobile device makers, including Apple, Google, Lenovo, and Samsung, to create tech that would lock drivers out of distracting applications while driving. The ask “is not anti-technology,” NTSB chair Robert Sumwalt said in his opening statement. “It is pro-safety.” Apple said it expects its employees to follow the law.
After a 2016 crash involving Autopilot, Tesla changed how the feature works. Now, if a driver using Autopilot doesn’t put pressure on the wheel for 30 seconds, warnings will beep and flash until the car slows itself to a stop. Last year, the company updated Autopilot again with new warnings for red lights and stop signs.
In response to a video that went viral last year appearing to show a driver sleeping while using Autopilot on a highway, US senator Ed Markey (D–Massachusetts) recommended Tesla rename and rebrand Autopilot to make clear its limitations and add a back-up system to make sure the driver remains engaged while behind the wheel. In response to Markey, Tesla said it believes that drivers who misuse Autopilot are “a very small percentage of our customer base” and that many online videos of drivers misusing the feature “are fake and intended to capture media attention.”
By comparing crash data from Tesla drivers with Autopilot engaged against data from those who use only more basic safety features like forward collision warning and automatic emergency braking, Tesla has concluded that its customers are 62 percent less likely to crash with Autopilot engaged.
Still, the NTSB wanted to emphasize one incontrovertible fact. “You do not own a self-driving car,” Sumwalt said Tuesday. “Don’t pretend that you do.”