Federal Investigation Launched Into Tesla's Recall of 2 Million Electric Vehicles Over Automated Driving System
Between a Rock and a Hard Place: Tesla's Autopilot Faces New Scrutiny
Tesla is once again under the microscope, this time from U.S. auto safety regulators, over its Autopilot driver assistance technology. The National Highway Traffic Safety Administration (NHTSA) is investigating Tesla's recent recall of over 2 million electric vehicles, citing concerns about the software fix and incidents where vehicles using the remedy were involved in "crash events".
This probe comes as the NHTSA wraps up a nearly three-year investigation into Autopilot that examined 956 crashes. In 467 of those incidents, the NHTSA's Office of Defects Investigation (ODI) found that Tesla's software boosted driver confidence without adequately demanding driver attention.
The regulators pointed out a "mismatch" between the system's demands and drivers' actual attention, leading to preventable crashes and foreseeable misuse. At least 13 crashes under investigation by the NHTSA resulted in one or more fatalities and "many more" severe injuries. The ODI noted that "driver misuse of the system appeared to play a role" in these incidents.
The NHTSA reported that Tesla has implemented software updates to address the regulator's concerns, but these updates haven't been made part of the recall. The NHTSA is also concerned that part of Tesla's remedy requires vehicle owners to opt in and allows drivers to reverse the update.
The recall applies to model year 2012-2024 Model Y, X, S, 3, and Cybertruck EVs in the U.S. equipped with Autopilot. Tesla recalled 3,878 Cybertruck electric pickups last week over faulty accelerator pedals.
In December, Tesla admitted that Autopilot's controls "may not be sufficient to prevent driver misuse" and could increase crash risks. Tesla disagreed with the NHTSA but pledged to "incorporate additional controls and alerts to those already existing on affected vehicles" to encourage drivers to stay focused on the road.
Consumer Reports argued in February that while the recall may have made warnings and driver notifications more visible, those alerts actually draw drivers' attention away from the road. In its report, the nonprofit stated, "When we covered the camera and kept one hand resting on the steering wheel, the vehicle did not limit Autopilot use or give any warnings to pay attention. The driver could be asleep or completely distracted, and the car wouldn't warn them as long as they are holding the wheel."
Tesla's driver assistance programs flunked a leading auto safety nonprofit's tests earlier this year. Unlike other automakers' marketing of their systems, Tesla's marketing and its CEO imply that drivers don't need to pay full attention to the road.
Tesla has faced accusations of false advertising from California's Department of Motor Vehicles and investigations by the state attorney general's office over its marketing practices. The U.S. Department of Justice has also issued subpoenas related to Tesla's Full Self-Driving technology.
This article was originally published by Quartz.
Insights:
- Numerous crashes involving Tesla vehicles on Autopilot have occurred, with Autopilot frequently disengaging vehicle control less than one second before impact.
- Tesla has been criticized for redacting important details from crash reports involving Autopilot or Full Self-Driving.
- Tesla's reliance on camera and radar systems for Autopilot has been criticized as less effective in complex environments than LiDAR-based systems.
- There are concerns that Tesla's software might not adequately monitor driver attention or intervene when necessary.
- The recent layoffs at NHTSA might delay or complicate the development of regulatory frameworks for autonomous vehicles, potentially weakening oversight capabilities.
- The National Highway Traffic Safety Administration (NHTSA) is pushing Tesla to demand more driver attention in its Autopilot technology, having found that the software boosts driver confidence without adequately demanding attention.
- The future of Tesla's Autopilot tech faces new challenges, as the NHTSA is investigating Tesla's recent recall and the crashes that followed the software fix, questioning the effectiveness of the technology in complex environments.
- Tesla is under investigation by the NHTSA over crashes involving its Autopilot technology, at least 13 of which resulted in fatalities and many more in severe injuries.
- While Tesla has made software updates to address the NHTSA's concerns, questions remain about the recall's implementation, particularly the requirement for vehicle owners to opt in and the option for drivers to reverse the update.