Tesla’s Autopilot

Federal authorities say a “critical safety gap” in Tesla’s Autopilot system contributed to at least 467 collisions, 13 resulting in fatalities and “many others” resulting in serious injuries.

The findings come from a National Highway Traffic Safety Administration analysis of 956 crashes in which Tesla Autopilot was thought to have been in use. The results of the nearly three-year investigation were published Friday.


Tesla’s Autopilot design has “led to foreseeable misuse and avoidable crashes,” the NHTSA report said. The system did not “sufficiently ensure driver attention and appropriate use.”

NHTSA’s filing pointed to a “weak driver engagement system” and an Autopilot that stays switched on even when a driver isn’t paying adequate attention to the road or the driving task. The driver engagement system includes various prompts, including “nags” or chimes, that tell drivers to pay attention and keep their hands on the wheel, as well as in-cabin cameras that can detect when a driver is not looking at the road.

According to the NHTSA Office of Defects Investigation data, 13 fatal collisions evaluated in the probe resulted in the deaths of 14 people.

The agency also said it was opening a new probe into the effectiveness of a software update Tesla previously issued as part of a recall in December. That update was meant to fix Autopilot defects that NHTSA identified as part of this same investigation.

The voluntary recall via an over-the-air software update covered 2 million Tesla vehicles in the U.S., and was supposed to specifically improve driver monitoring systems in Teslas equipped with Autopilot.

NHTSA suggested in its report Friday that the software update was probably inadequate, since more crashes linked to Autopilot continue to be reported.

In one recent example, a Tesla driver in Snohomish County, Washington, struck and killed a motorcyclist on April 19, according to records obtained by CNBC and NBC News. The driver told police he was using Autopilot at the time of the collision.

The NHTSA findings are the most recent in a series of regulator and watchdog reports that have questioned the safety of Tesla’s Autopilot technology, which the company has promoted as a key differentiator from other car companies.

On its website, Tesla says Autopilot is designed to reduce driver “workload” through advanced cruise control and automatic steering technology.

Tesla has not issued a response to Friday’s NHTSA report and did not respond to a request for comment sent to Tesla’s press inbox, investor relations team and to the company’s vice president of vehicle engineering, Lars Moravy.

Following the release of the NHTSA report, Sens. Edward J. Markey, D-Mass., and Richard Blumenthal, D-Conn., issued a statement calling on federal regulators to require Tesla to restrict its Autopilot feature “to the roads it was designed for.”

On its Owner’s Manual website, Tesla warns drivers not to operate the Autosteer function of Autopilot “in areas where bicyclists or pedestrians may be present,” among a host of other warnings.

“We urge the agency to take all necessary actions to prevent these vehicles from endangering lives,” the senators said.  

Earlier this month, Tesla settled a lawsuit from the family of Walter Huang, an Apple engineer and father of two, who died in a crash when his Tesla Model X with Autopilot features switched on hit a highway barrier. Tesla has sought to seal from public view the terms of the settlement.

In the face of these events, Tesla and CEO Elon Musk signaled this week that they are betting the company’s future on autonomous driving.

“If somebody doesn’t believe Tesla’s going to solve autonomy, I think they should not be an investor in the company,” Musk said on Tesla’s earnings call Tuesday. He added, “We will, and we are.”

Musk has for years promised customers and shareholders that Tesla would be able to turn its existing cars into self-driving vehicles with a software update. However, the company only offers driver assistance systems and has not produced self-driving vehicles to date.

He has also made safety claims about Tesla’s driver assistance systems without allowing third-party review of the company’s data.

For example, in 2021, Elon Musk claimed in a post on social media, “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”

Philip Koopman, an automotive safety researcher and Carnegie Mellon University associate professor of computer engineering, said he views Tesla’s marketing and claims as “autonowashing.” He also said in response to NHTSA’s report that he hopes Tesla will take the agency’s concerns seriously moving forward.

“People are dying due to misplaced confidence in Tesla Autopilot capabilities. Even simple steps could improve safety,” Koopman said. “Tesla could automatically restrict Autopilot use to intended roads based on map data already in the vehicle. Tesla could improve monitoring so drivers can’t routinely become absorbed in their cellphones while Autopilot is in use.”
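Koopman’s first suggestion, restricting a driver-assistance feature to the road types it was designed for based on map data already in the vehicle, amounts to a map-based gate on feature engagement. The sketch below illustrates the idea; the road classes, allow-list, and function names are all hypothetical and do not describe Tesla’s actual software:

```python
# Hypothetical sketch: gate a driver-assistance feature on the road class
# reported by the vehicle's map data. Class names are illustrative only.

ALLOWED_ROAD_CLASSES = {"controlled_access_highway"}  # roads with on/off ramps

def autosteer_permitted(road_class: str) -> bool:
    """Allow the feature only on road types it was designed for,
    according to the map data for the vehicle's current position."""
    return road_class in ALLOWED_ROAD_CLASSES

# A surface street with cross traffic and pedestrians would be refused:
assert not autosteer_permitted("residential_street")
assert autosteer_permitted("controlled_access_highway")
```

The point of the sketch is that the gate needs no new hardware: it is a lookup against map data the vehicle already carries for navigation.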

Tesla’s Autopilot system has contributed to at least 467 vehicle crashes, 13 of which were fatal, killing 14 people, with many others causing serious injuries, according to federal authorities, who say there is a “critical safety gap” in the technology.

The U.S. Department of Transportation is investigating Tesla’s December 2023 recall, which covered more than 2 million vehicles, to determine whether the company’s updates to its Autopilot driving systems were sufficient to prevent driver misuse. The department analyzed 956 crashes that were alleged to involve the automaker’s driver assistance technology.
The department’s National Highway Traffic Safety Administration on April 26 posted documents to its website suggesting that at least 20 additional crashes have occurred since Tesla’s recall, which concerns investigators. The more than 2 million vehicles that Tesla recalled represent nearly all the vehicles the company had sold at that point.

Tesla’s December 2023 recall affected all of its vehicles equipped with Autopilot or driver assistance systems, including the Model 3, Model X, Model S and Model Y, as well as the Cybertruck. Autopilot’s driver monitoring system is supposed to detect torque from drivers’ hands on the steering wheel and send alerts when that input is absent.
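The monitoring behavior described above, detecting hand input on the steering wheel and alerting when it is absent, is commonly implemented as a timer-based escalation. The sketch below shows the general shape of such logic; the torque threshold, timings, and action names are invented for illustration and are not Tesla’s actual values:

```python
# Hypothetical escalation logic for a hands-on-wheel monitor.
# Threshold and timeouts are illustrative, not Tesla's real parameters.

TORQUE_THRESHOLD_NM = 0.3   # minimum wheel torque treated as "hands on"
VISUAL_ALERT_AFTER_S = 10   # show a visual "nag"
CHIME_AFTER_S = 20          # add an audible chime
DISENGAGE_AFTER_S = 30      # disengage assistance

def monitor_action(torque_nm: float, seconds_hands_off: float) -> str:
    """Map measured wheel torque and elapsed hands-off time to an action."""
    if torque_nm >= TORQUE_THRESHOLD_NM:
        return "none"  # hands detected; escalation resets
    if seconds_hands_off >= DISENGAGE_AFTER_S:
        return "disengage"
    if seconds_hands_off >= CHIME_AFTER_S:
        return "chime"
    if seconds_hands_off >= VISUAL_ALERT_AFTER_S:
        return "visual_alert"
    return "none"

assert monitor_action(0.5, 25) == "none"         # hands on wheel
assert monitor_action(0.0, 12) == "visual_alert"
assert monitor_action(0.0, 35) == "disengage"
```

NHTSA’s criticism, in these terms, is about the thresholds and escalation being too permissive, not about the absence of such a mechanism.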

The safety administration asked the automaker to recall its vehicles after a two-year investigation into Tesla’s Autopilot system. Specifically, the agency probed multiple instances of Teslas crashing into roadside emergency vehicles while using the Autopilot systems.

The automaker said the “prominence and scope of the system’s controls may be insufficient to prevent driver misuse.”

NHTSA said its investigation focused on Autopilot safety, and “involved extensive crash analysis, vehicle evaluations, and assessment of vehicle control authority and driver engagement technologies.”

The administration’s Office of Defects Investigation found 13 crashes that involved one or more fatalities and several others that caused serious injuries in which “foreseeable driver misuse” of the Autopilot system played a key role.

Tesla sent an over-the-air software update to owners of vehicles with the driver assistance technology to increase driver warnings, including when a driver’s hands leave the steering wheel. The agency found 20 additional crashes after Tesla sent the update.

Autopilot Allegedly Kills Motorcyclist

A 2022 Tesla Model S using the Autopilot driving system was allegedly linked to a motorcyclist’s death in Snohomish County, Washington, near Seattle, on April 19.

The vehicle owner told a Washington State Patrol trooper after the crash that he was using the driver assistance technology while looking at his phone and driving the car.

“The next thing he knew there was a bang and the vehicle lurched forward as it accelerated and collided with the motorcycle in front of him,” the police officer wrote in the affidavit.

The trooper arrested the 56-year-old driver for investigation of vehicular homicide after the driver said he was using the Autopilot mode while operating the car inattentively and using his cell phone as it moved forward, “putting trust in the machine to drive for him,” according to the document.

Authorities found the motorcyclist, Jeffrey Nissen, 28, underneath the car, and he was pronounced dead at the scene.

However, authorities are still investigating the crash and have not verified whether the driver was using Autopilot at the time of the incident.

The federal agency probing Tesla said it was looking for defects in Autopilot’s driver monitoring system, which is supposed to alert drivers when their hands are no longer touching the steering wheel. Some experts have alleged that the monitoring system is defective and have also criticized its limitations during night driving.

Teslas have cameras to monitor drivers on the road, but they do not possess night vision capabilities, and Autopilot is still operable when the cameras are covered.

The initial investigation began in 2021 after 11 reports surfaced of Teslas striking parked emergency vehicles while using Autopilot. Between June 8, 2022, and April 25, the Autopilot system was linked to 467 crashes and 14 deaths.

Tesla has said that neither of its driver assistance systems, Autopilot nor the more advanced “Full Self Driving,” can drive the vehicle on its own, despite what the names suggest.

The latter technology was linked to 75 crashes and one death, according to the investigation.

Tesla CEO Elon Musk has previously promised a fleet of robotaxis powered by “Full Self Driving” that would generate revenue for both the company and the vehicles’ owners, since the cars could operate as taxis when they would normally be parked. The technology is still being tested and has faced years of delays after Musk said it would be ready by 2020.

The company says drivers must always be ready to take control of the wheel when using Autopilot or “Full Self Driving” and that they do not give the vehicle complete autonomous control.

What does Tesla Autopilot do?

Autopilot is an advanced driver assistance system that enhances safety and convenience behind the wheel. When used properly, Autopilot reduces your overall workload as a driver. Each new Tesla vehicle is equipped with multiple external cameras and powerful vision processing to provide an additional layer of safety.

What is the Tesla Safety Score?

Safety Score is an assessment of the driving behavior of a Tesla vehicle’s driver, based on several metrics. Tesla publishes details on what the Safety Score measures and how it’s calculated.
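A score built from several behavior metrics generally takes the form of a weighted combination. The sketch below shows that general shape only; the metric names and weights are made up for illustration and are not Tesla’s published Safety Score formula:

```python
# Hypothetical weighted driving-behavior score. Metric names and weights
# are illustrative only, not Tesla's actual Safety Score coefficients.

def safety_score(metrics: dict) -> float:
    """Combine per-trip behavior metrics (each normalized to 0..1,
    where higher means riskier) into a 0-100 score, higher = safer."""
    weights = {
        "hard_braking": 0.3,
        "aggressive_turning": 0.2,
        "unsafe_following": 0.3,
        "forced_autopilot_disengagement": 0.2,
    }
    risk = sum(weights[name] * metrics.get(name, 0.0) for name in weights)
    return round(100 * (1 - risk), 1)

assert safety_score({}) == 100.0                  # no risky behavior observed
assert safety_score({"hard_braking": 1.0}) == 70.0
```

The design question in any such score is which behaviors to weight heavily; critics of driver-assistance scoring argue the weights should emphasize attention lapses rather than only harsh maneuvers.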

What safety features does a Tesla have?

  • Automatic Emergency Braking. Detects vehicles, pedestrians or objects in front of you and applies the brakes to mitigate impact.
  • Forward Collision Warning. Provides visual and audible warnings of impending collisions with vehicles or obstacles.
  • Blind Spot Collision Warning. …
  • Lane Departure Avoidance.

How many Teslas are on the road?

Total number of Teslas sold from the company’s founding through Q1 2023 (2008 – Q1 2023): 4,061,776 vehicles. Tesla cars carry an 8-year warranty and relatively few have been scrapped, so we can assume that roughly 4 million Teslas are still on the road.

Is Tesla’s Autopilot safe?

According to Tesla, safety metrics are emphatically stronger when Autopilot is engaged than when it is not. In the fourth quarter of 2022, the company recorded one crash for every 4.85 million miles driven in which drivers were using Autopilot technology.

What is the largest risk for Tesla?

The electric vehicle (EV) maker Tesla faces a number of key risks over the next 5-10 years. Notable risks include Tesla cars remaining too expensive even with tax breaks, and construction of its Gigafactory (battery factory) taking longer than expected.

Which is the safest car in the world?

The world’s safest cars
  • BYD Seal. The BYD Seal electric sedan newcomer has tied with its Dolphin hatch counterpart as one of the safest new cars globally. …
  • BYD Dolphin. …
  • Kia Sportage. …
  • Mitsubishi Outlander. …
  • Tesla Model Y. …
  • Mercedes-Benz C-Class. …
  • Honda Civic. …
  • Toyota Kluger/Highlander.
