NHTSA Upgrades Probe of Autopilot Teslas Colliding With Emergency Vehicles, Increasing Possibility of Recall

Tesla China-made Model 3 vehicles are seen during a delivery event at the carmaker's factory in Shanghai, China, on Jan. 7, 2020. Aly Song/Reuters
Katabella Roberts

The National Highway Traffic Safety Administration (NHTSA) has upgraded its investigation into collisions involving Tesla vehicles with the Autopilot feature engaged, moving the automobiles one step closer to a potential recall.

NHTSA initially launched its investigation into Tesla's Autopilot feature in August 2021, following a string of crashes in which Tesla vehicles operating in Autopilot mode collided with stationary in-road or roadside first responder vehicles tending to pre-existing collision scenes.

The investigation covers all four Tesla models (the Model Y, X, S, and 3), representing about 830,000 vehicles sold in the United States since the start of the 2014 model year.

On Thursday, the federal agency said it was upgrading the probe to an “engineering analysis,” allowing it to “extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations, and to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”
Tesla’s Autopilot feature is designed to “enhance safety and convenience behind the wheel,” according to the company’s website. However, the company says it should be used only with a fully attentive driver who keeps their hands on the steering wheel and is prepared to take over at any moment.

NHTSA reported that it has found 16 crashes in which Tesla vehicles struck emergency vehicles or trucks with warning signs, resulting in 15 injuries and one death. The crashes took place between January 2018 and January 2022.

The agency said its analysis showed that in the majority of the 16 crashes, drivers had received forward collision warnings (FCW) prior to impact, meaning they were alerted that they were dangerously close to another vehicle.

Automatic emergency braking (AEB) intervened in approximately half of the collisions to slow the vehicles prior to impact, NHTSA said. On average, Autopilot aborted vehicle control less than one second prior to the first impact, the agency said in documents detailing the probe.

“All subject crashes occurred on controlled-access highways,” officials said. “Where incident video was available, the approach to the first responder scene would have been visible to the driver an average of 8 seconds leading up to impact.”

For 11 of the collisions, forensic data indicated that none of the drivers took “evasive action between 2-5 seconds prior to impact,” and the agency found that in many cases, drivers had their hands on the steering wheel, as required by Tesla, leading up to the impact.

“However, most drivers appeared to comply with the subject vehicle driver engagement system as evidenced by the hands-on wheel detection and nine of eleven vehicles exhibiting no driver engagement visual or chime alerts until the last minute preceding the collision (four of these exhibited no visual or chime alerts at all during the final Autopilot use cycle),” the agency wrote.

Investigators said that in the next phase of the probe, they will evaluate additional data, vehicle performance, and “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”

The Epoch Times has contacted Tesla for comment.

In total, the agency reviewed 191 crashes involving similar crash patterns, not limited to first responder scenes, but removed 85 of them from consideration due to external factors, such as the actions of other vehicles, or a lack of data to support a definitive assessment.

The agency noted that “a driver’s use or misuse of vehicle components, or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”

NHTSA also said it’s looking into crashes involving similar patterns that did not include emergency vehicles or trucks with warning signs.

In a statement, NHTSA said there aren’t any vehicles available for purchase today that can drive themselves. “Every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for operation of their vehicles,” the agency said.

The latest update to the investigation into the vehicles produced by Elon Musk’s company comes as the businessman has put his bid to buy Twitter on hold amid a dispute with the social media company over the number of automated or “bot” accounts.

Musk has said he believes up to 90 percent of Twitter accounts may be fake and that he will not forge ahead with the deal unless Twitter can provide an accurate figure.

The Associated Press contributed to this report.
Katabella Roberts is a news writer for The Epoch Times, focusing primarily on the United States, world, and business news.