Is Tesla’s Fatal Crash a Setback for Self-Driving Technologies?

Tesla vehicles outside a Tesla showroom and service center in Brooklyn, New York, on July 5, 2016. Spencer Platt/Getty Images
Emel Akan

The U.S. safety agency expanded its investigation of Tesla Motors’ Autopilot system after a fatal crash in Florida in May. Tesla may become subject to product liability claims, and negative publicity could curb enthusiasm for Tesla cars and other self-driving technologies.

“Depending on the specific facts and the particular state’s law, Tesla could conceivably be liable to the family of the Tesla driver for the design and marketing of its autopilot system,” said Bryant Walker Smith, an assistant professor of law at the University of South Carolina, who specializes in autonomous vehicle regulations.

“Other vehicle owners might claim misrepresentation,” he said.

The National Highway Traffic Safety Administration (NHTSA) on July 12 disclosed a nine-page letter sent to Tesla asking for more information about its Autopilot automated driving system, as part of an ongoing probe. The Autopilot system was in use when 40-year-old Joshua Brown, driving a 2015 Tesla Model S sedan, collided with a tractor-trailer in Florida.

The news of a preliminary NHTSA review first came out on June 30. The agency is now requesting data and event images related to all known crashes to determine if a safety defect exists with the Autopilot system.

“We don’t have the full information at this point—although there will be much more digital data from this [fatal] crash than from many others,” said Smith.

Tesla tracks data from its cars through the internet. Tesla’s Autopilot system uses cameras, radar, and computers to detect objects and automatically brake before potential collisions.
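As a rough illustration of how such a system decides when to brake, the sketch below estimates a time-to-collision from a sensed distance and closing speed and triggers braking below a threshold. This is a minimal, hypothetical simplification, not Tesla’s actual Autopilot logic; the names and the threshold value are assumptions for illustration only.

```python
# Hypothetical sketch of a forward-collision braking check -- NOT Tesla's
# actual Autopilot logic. Names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float          # range to the object from radar/camera fusion
    closing_speed_mps: float   # positive when the gap is shrinking

BRAKE_TTC_THRESHOLD_S = 2.0    # assumed time-to-collision trigger, in seconds

def should_auto_brake(detection: Detection) -> bool:
    """Brake when the estimated time-to-collision falls below the threshold."""
    if detection.closing_speed_mps <= 0:
        return False  # object is holding distance or pulling away
    time_to_collision = detection.distance_m / detection.closing_speed_mps
    return time_to_collision < BRAKE_TTC_THRESHOLD_S

# Example: an obstacle 30 m ahead closing at 20 m/s gives a 1.5 s TTC,
# which is under the 2.0 s threshold, so braking would be triggered.
print(should_auto_brake(Detection(distance_m=30.0, closing_speed_mps=20.0)))
```

A real system fuses multiple sensors and must handle exactly the failure mode described in this crash: an object the sensors do not register, such as a white trailer against a bright sky, never produces a detection at all, so no braking check is ever triggered.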

“What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor-trailer drove across the highway perpendicular to the Model S,” Tesla said in a blog post about the accident.

“Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” the company said.

The safety agency requested data on consumer complaints and lawsuits related to Tesla vehicles. The agency also asked whether there has been any modification to the Autopilot system, including software changes, since its launch.

Tesla has to reply to NHTSA by the end of August and could face penalties of $21,000 per day, up to a maximum of $105 million, if it fails to respond promptly, according to the letter.

Is Autopilot Safe?

Brown may have been watching a Harry Potter movie on a portable DVD player when the crash occurred, according to media reports. The driver also had a YouTube channel with a number of short videos demonstrating Tesla’s Autopilot features.

Since the investigation first started, there have been at least two crashes in which Autopilot was activated, but no one was injured.

“Roughly 100 other Americans died in crashes the same day as this Tesla driver. Distraction likely played a role in some of these crashes too,” said Smith.

“Against the backdrop of a tragic status quo of carnage on our roads, no one is quite sure how to strike the right balance between caution and aggression in deploying these systems.”

Tesla officials insist that the Autopilot system is safe and that the company has no plans to disable the feature, which is built into 70,000 Tesla cars worldwide. Instead, the company blames drivers for not following instructions.

“When drivers activate Autopilot, the acknowledgment box explains … that Autopilot ‘is an assist feature that requires you to keep your hands on the steering wheel at all times,’ and that ‘you need to maintain control and responsibility for your vehicle’ while using it,” Tesla stated in a post on the company’s website.

Tesla says Autopilot has been used safely for over 100 million miles of driving worldwide and that customers who use Autopilot are statistically safer than those who don’t.

The company released the Autopilot system last fall in a “beta” phase, a computer industry term for software that is still in development or customer testing. Some experts criticize regulators for allowing the public to access the system so soon.

“It will be difficult for Tesla to be found liable given that they clearly state the human driver is ultimately responsible. And they describe their product as beta,” said Nidhi Kalra, a senior information scientist at RAND Corporation, a U.S. think tank.

Regulatory Response 

Companies are racing to develop self-driving cars. Google co-founder Sergey Brin has said the company plans to have its driverless cars on the market no later than 2018. Tesla CEO Elon Musk also expects the first fully autonomous Teslas to be ready by 2018.

BMW announced on July 1 it would launch fully autonomous cars by 2021. Some states are introducing legislation to speed the testing and development of self-driving cars on public roads.

The tragedy will not change the investment pace for autonomous vehicles, but it may be a wake-up call for regulators, according to experts. It may also affect public perception of semi-autonomous and fully autonomous vehicles.

“Depending on the findings of the investigation, the accident may change the pace of Level 3 [e.g. Tesla’s limited self-driving automation] deployment, but it is not likely to slow overall deployment of autonomous technologies,” Kalra said.

If the investigation finds a pattern, NHTSA may introduce tighter regulations both on technologies like Autopilot and on driver behavior, according to Kalra.

The U.S. Securities and Exchange Commission is also investigating whether Tesla Motors breached securities laws by failing to disclose the fatal crash to investors, according to a Wall Street Journal report on July 11.

A Tesla spokesperson said the company “has not received any communication from the SEC regarding this issue.”

Emel Akan
Reporter
Emel Akan is a senior White House correspondent for The Epoch Times, where she covers the Biden administration. Prior to this role, she covered the economic policies of the Trump administration. Previously, she worked in the financial sector as an investment banker at JPMorgan. She graduated with a master’s degree in business administration from Georgetown University.