Tesla’s Autopilot software, an advanced driver assistance feature, is in the news again. And not in a good way.
Over the weekend, a short video of a person sitting in the back seat of a driverless Tesla operating on public roads in California caught the internet’s attention. The six-second video shows a man staring out the window from the back of the Tesla that’s driving down the road. There’s nobody in the driver’s seat. California Highway Patrol said it was searching for the man behind the “unusual incident.” The man was later arrested.
This latest viral video may not even be the first time this particular person has pulled the stunt, and it's especially concerning because two people died in a Tesla crash in Texas last month. Following the crash, local authorities suggested that no one was in the driver's seat, prompting speculation that the vehicle was being operated through its driver assistance feature, Autopilot, a claim that Tesla CEO Elon Musk and other executives at the company disputed. A preliminary report released Monday by the National Transportation Safety Board (NTSB) said that, in a test, Autopilot's Autosteer feature was not available on that part of the road. The NTSB is continuing to investigate the accident, and the Department of Transportation's National Highway Traffic Safety Administration (NHTSA) is conducting its own investigation.
Still, the event highlights the dangerous, ongoing confusion over Tesla’s autonomous driving capabilities and how people are using them. All new Tesla vehicles come with all the sensors and cameras the company says it needs to deliver autonomous driving features, though the technology is not quite the same as more elaborate setups you might see in self-driving cars from companies like Waymo. In fact, Tesla drivers can buy both the Autopilot and Full Self-Driving features as software upgrades.
There even seems to be some confusion between Musk and Tesla over what the self-driving features can do. A newly published public records report shows Tesla officials saying that Elon Musk has been overpromising the autonomous abilities of Tesla cars. Musk said in January that he’s “extremely confident” that Tesla cars will reach full autonomy by the end of this year. He’s made similar statements over the past five years.
Ongoing concerns about Tesla highlight how lawmakers and regulators are struggling to keep up with self-driving technology that's showing up in cars that aren't quite fully autonomous. While states make their own rules for the testing of self-driving vehicles, federal standards for commercially available vehicles are set by the NHTSA. The agency can also exempt a certain number of vehicles from these standards for the purpose of testing self-driving cars.
But there’s still ongoing debate about how the government should approach the increasingly autonomous features popping up in our everyday cars. Now some members of Congress are pushing the Transportation Department to do more, and through new proposed legislation, lawmakers are broadening the agency’s role in order to evaluate the safety and efficacy of new features, like pedestrian avoidance and driver monitoring. Last week, Rep. Bobby Rush (D-IL) proposed new legislation that would force the agency to study crash avoidance tech, following up on legislation reintroduced this year that would force companies with advanced driver assistance tech to monitor that drivers are actually paying attention.
But as long as car companies, like Tesla, continue to push out new, ever-more-autonomous features — without clear regulatory standards — people will be driving in a potentially dangerous gray zone.
Self-driving car tech, briefly explained
While fully autonomous cars that don’t need a human driver behind the wheel are still in development, plenty of semi-autonomous features are already available in the vehicles that are on the road. These tools use different types of sensors to observe what’s happening on the road, and then employ sophisticated computing power to make decisions for the vehicle.
The transition to fully autonomous vehicles isn’t happening all at once. It’s happening gradually as individual features that require the driver to do less and less get rolled out. The NHTSA sorts autonomy into six levels, where Level 0 has no autonomous features and Level 5 is fully autonomous and doesn’t require a driver.
“Right now, the automation systems that are on the road from companies such as Tesla, Mercedes, GM, and Volvo, are Level 2, meaning the car controls steering and speed on a well-marked highway, but a driver still has to supervise,” explained Vox’s Emily Stewart in 2019. “By comparison, a Honda vehicle equipped with its ‘Sensing’ suite of technologies, including adaptive cruise control, lane keeping assistance, and emergency braking detection, is a Level 1.”
Sorting out and enforcing the dividing line between these various levels of autonomy has proven complicated and can give people a false sense of security in these cars’ capabilities. Tesla’s Autopilot feature, in particular, has been a source of confusion.
Autopilot allows the car to operate itself within a given lane, combining a cruise control feature and an auto-steering feature. In the recently published documents that showed the gap between what Elon Musk has said in public about Autopilot’s capabilities and what the feature can actually do, the California Department of Motor Vehicles said that “Tesla is currently at Level 2.” Since at least 2016, Musk has been saying that every new Tesla could drive itself, a claim he’s repeated many times. Tesla officials have said privately that what Musk says about Autopilot and full self-driving capabilities for Tesla’s vehicles does not “match engineering reality.” (Waymo, which is owned by Google’s parent company Alphabet, dropped the term “self-driving” earlier this year and committed to using “more deliberate language” in its marketing.)
Autopilot currently requires drivers to pay attention and keep their hands on the steering wheel. But drivers can end up overrelying on the tech, and it appears some have figured out ways to circumvent Tesla's related safety features. There have been multiple videos showing people riding alone in the back seat of Tesla vehicles, and people have been caught asleep at the wheel, presumably with Autopilot engaged. There is also a growing list of Autopilot-related crashes.
At the same time, Tesla has moved to beef up Autopilot’s autonomous capabilities by adding a feature for automatic lane changing and is now rolling out the full self-driving feature in beta mode to a small group of drivers. The company promises to make its cars fully autonomous and plans a broad release later this year. But it’s not clear that Autopilot is entirely safe. The NHTSA is investigating 23 crashes that may have involved Tesla Autopilot. Tesla, which dissolved its PR department last year, did not reply to Recode’s request for comment.
Federal agencies like the NHTSA are supposed to be taking the lead on setting standards for evaluating autonomous features. However, in April, Sens. Richard Blumenthal (D-CT) and Ed Markey (D-MA) urged the agency to “develop recommendations for improving automated driving and driver assistance systems” and “implement policy changes that stop these preventable deaths from occurring.” They’re not alone; other members of Congress have also been thinking about creating new rules, like expanding the number of self-driving exemptions the NHTSA can give.
Even car manufacturers have signed on to the idea that the NHTSA could do more. The Alliance for Automotive Innovation, a trade group that represents carmakers like Ford and General Motors, says that forward collision warnings, automatic braking, and lane assistance tech need to be evaluated by regulators and included in the NHTSA’s new car rating system.
Lawmakers want murky standards improved
Lawmakers, safety advocates, and even representatives of the industry are demanding more discerning federal standards to govern autonomous features, including crash avoidance features and driver assistance tools built into cars that are already on the road. These critics are specifically calling for more research from the Transportation Department, a task they say is important even before fully self-driving cars are on the road.
“Before we get to autonomous technology that can do everything that people can do, there’s a real opportunity to introduce lifesaving technology into vehicles that people will still be driving,” said Jason Levine, the executive director of the Center for Auto Safety, a nonprofit focused on vehicle safety.
The NHTSA has created testing protocols for some features, like collision warnings and automatic emergency braking. It has also requested public comment on what autonomous vehicle safety rules should be. But the agency has yet to create any national standards for how well crash avoidance and driver assistance features ought to perform, according to Ensar Becic, a highway safety investigator at the NTSB.
Still, more cars are being equipped with increasingly autonomous features. As automakers debut ever more advanced driver assistance and safety features and inch toward self-driving capability, the NHTSA has recommended more and more of these tools. But there’s also growing concern that the agency isn’t providing enough information about how well these tools should actually work.
“Manufacturers are out there advertising their different versions of this technology, without any true sense of oversight,” Levine added.
Now lawmakers think the NHTSA and the Transportation Department as a whole should have a role in more stringently evaluating this tech. Last month, Sens. Markey, Blumenthal, and Amy Klobuchar (D-MN) reintroduced the Stay Aware for Everyone Act, which would require the Department of Transportation to look at how driver assistance tools, like Tesla’s Autopilot, are impacting driver disengagement and distraction, and would mandate that companies institute driver monitoring tools to make sure drivers are paying attention to the road.
“With NHTSA often slow to act and auto manufacturers rushing to put new autonomous features in cars, this bill and other congressional action that puts public and driver safety first is necessary,” Blumenthal told Recode. He’s also urging President Joe Biden to fill the vacancy for NHTSA administrator to “ensure our country’s top auto safety agency has the leadership needed as this new technology rapidly advances.”
Others also want a better system for regulating how well these autonomous features perform. The legislation Rush, the Democratic representative from Illinois, introduced last week with his Republican co-sponsor Larry Bucshon (R-IN) would order Transportation Secretary Pete Buttigieg to commission a study on the safety of crash avoidance features and how well these systems identify pedestrians and cyclists with different skin tones. The bill, called the Crash Avoidance System Evaluation Act, comes after research from the Georgia Institute of Technology finding that people with darker skin tones are less accurately detected by technology that could be used in self-driving cars.
“We certainly do not want to unleash vehicles on our nation’s streets and highways that can’t guarantee all Americans, all pedestrians, all bicyclists that they are protected equally,” Rush told Recode. “I am concerned … the technology can’t guarantee that I have the same protection against being harmed by a self-driving vehicle as someone who has a darker skin tone or a lighter skin tone.” Rush’s proposal, Levine added, would force the agency to make this key type of safety information public.
In February, the NTSB chair wrote to the NHTSA urging the agency to develop performance standards for collision avoidance features, like vehicle detection and emergency braking.
“We know that creating new motor vehicle safety standards or revising old ones to bring them up to date is very time-consuming and very resource-intensive,” said Will Wallace, the manager for safety policy at Consumer Reports. “This is an agency that is chronically underfunded. The agency doesn’t have anywhere near the resources that it needs to protect the public effectively. It’s incumbent on Congress to give the agency what it really needs.”
The lack of detailed requirements for these kinds of autonomous tools puts the US behind other parts of the world, including new car rating systems in Japan, Australia, and Europe. The US’s new car assessment program doesn’t rate these advanced technologies, explained Becic of the NTSB.
Neither automatic braking nor lane assistance features are designed to allow a car to operate without a driver’s full attention. And, again, the public availability of fully autonomous cars is still years away. Some think that moment may never arrive. Still, these features set a foundation for what regulating roads full of self-driving vehicles could eventually involve. Figuring out how to regulate autonomous car features is important not just for cars that already offer them — it’s key to building a future where the roads are safe for everyone.
Clarification: The story has been updated to include the information that, following publication, the NTSB said that its preliminary research found that Autopilot’s Autosteer function couldn’t be used during a test in the crash location and that it had not made conclusions about the crash. The story has also been updated to note that the man who operated a Tesla without someone in the driver’s seat was arrested.