NHTSA asks Tesla why it didn’t initiate a recall as required when it pushed safety-related software update
A Tesla with its driver-assist system, known as Autopilot, engaged struck a police car on March 17, 2021, in Michigan, officials said in a tweet.
Michigan State Police
A federal vehicle safety authority is asking Tesla to explain why it didn’t initiate a recall as required when it pushed a safety-related software update to customers in September.
The update enabled Tesla vehicles to better detect emergency vehicle lights in low-light conditions, according to a letter from the National Highway Traffic Safety Administration to Tesla published to the government agency’s website on Wednesday.
Tesla’s “Emergency Light Detection Update” was delivered via an over-the-air software update to customers’ cars a few weeks after NHTSA initiated a probe into possible safety defects with Tesla Autopilot, the company’s standard driver assistance package.
Tesla also sells a premium version of its driver assistance system under the brand name FSD, or Full Self-Driving, for $10,000 up front or $199 per month. None of Tesla’s systems makes its cars safe for use without a human driver behind the wheel at all times. They are “level 2” driver assistance systems, not fully autonomous vehicle technologies.
As CNBC previously reported, NHTSA identified around a dozen collisions in which Tesla drivers crashed into first responders’ vehicles that were parked on the side of the road, typically at night or in the predawn hours. In each of the incidents identified by NHTSA, the drivers had Autopilot or Traffic-Aware Cruise Control engaged before the crash. One of the crashes resulted in a fatality.
NHTSA wants to know whether Autopilot defects or design issues contributed to or caused those crashes. Now the agency also wants to know whether Tesla’s software update effectively served as a stealth recall.
If the agency deems Autopilot defective, it could mandate a recall, a move that would also damage Tesla’s public image. Such a finding could inspire greater urgency around rating and regulating driver assistance systems such as Tesla’s.
Currently, NHTSA issues an annual New Car Assessment Program rating on the crashworthiness of vehicles sold in the U.S. NCAP ratings list features that are included in each vehicle, but the agency doesn’t yet rate the safety of or limit the use of driver assistance systems such as Tesla’s.
As part of its Tesla probe, NHTSA is evaluating 12 other carmakers’ comparable systems.
Gregory Magno, chief of NHTSA’s vehicle defects division, told Tesla’s director of field quality, Eddie Gates, in the new letter that automakers are required to notify NHTSA within five business days when they become aware of, or should have become aware of, safety issues in their vehicles that necessitate a fix.
Over-the-air software updates are covered by current federal recall laws, Magno said.
The agency also asked Tesla for details on its expanding FSD Beta program.
The program gives Tesla owners who are not trained safety drivers a chance to test prerelease software and new driver assistance features on public roads in the U.S. The FSD Beta software does not make Tesla vehicles driverless and has not been debugged enough for general use and wide release.
Among other things, NHTSA requested detailed records on how Tesla rates and selects participants in the experimental, early access program.
Recently, Tesla added a “beta button” that allows any customer to request access to an FSD Beta download. It also released an insurance calculator that gives drivers seeking FSD Beta access a “safety score.”
Tesla owners who maintained a safety score of 100 over a week in which they drove at least 100 miles were given access to FSD Beta this week, expanding the program by about 1,000 people, according to CEO Elon Musk, who remarked on the figure at the company’s annual shareholder meeting last week.
Vehicle safety advocates, including the National Transportation Safety Board, have called on NHTSA to regulate systems such as Tesla’s Autopilot, FSD and FSD Beta sooner rather than later.