
NTSB Reports Fault Tesla’s “Self-driving” Vehicles in Two Fatal Accidents


Although the COVID-19 pandemic has led to slowdowns in some "non-essential" parts of the federal government, the National Transportation Safety Board (NTSB) is still working to ensure that our transportation sector remains as safe as possible. Among the NTSB accident investigation reports that went largely unnoticed by the news media were two final reports on fatal accidents, one in California and one in Florida, involving automobiles manufactured by Tesla Motors that were being operated in "driverless" (autonomous) mode with the vehicles' Autopilot system engaged.

In today's post, the car accident lawyer at The Doan Law Firm reviews the findings of these NTSB investigations and then discusses how those findings could influence future product liability and wrongful death lawsuits involving Tesla vehicles operated in a "hands-off" mode.

Case 1:
March 23, 2018
Mountain View, CA
2017 Tesla Model X P100D

On the morning of March 23, 2018, a 2017 Tesla Model X P100D sport utility vehicle was being operated with its Autopilot (Traffic-Aware Cruise Control and Autosteer) systems engaged when it veered into a highway lane section that was under repair. The Tesla first struck a "crash attenuator" before reentering the flow of traffic and striking two other vehicles. The SUV's battery pack was ruptured by the force of the initial collision and the vehicle caught fire. The vehicle's driver, a 39-year-old male, died of blunt force trauma at a local hospital a short time later.

The NTSB noted that the SUV's Autopilot sensors and software were apparently unable to distinguish between a section of highway that was under repair (and thus closed to vehicle traffic) and a normal traffic lane when the vehicle detected a slower vehicle ahead and changed lanes.

Case 2:
March 1, 2019
Delray Beach, FL
2018 Tesla Model 3

At about 6:15 a.m. on March 1, 2019, a 2018 Tesla Model 3 was traveling south on U.S. Highway 441 with its Autopilot engaged when a tractor-trailer rig attempted to enter the northbound lanes of the highway by crossing the Model 3's path. According to data recorded by the Model 3's onboard systems, its driver neither applied the brakes nor took any evasive action before he was killed instantly in a "side underride" collision with the tractor-trailer.

In its report, the NTSB commented that the Model 3's Autopilot software apparently failed to identify the tractor-trailer crossing the vehicle's path as a hazard and, consequently, neither took evasive action nor warned the Model 3's driver of the danger.

Discussion

Although not specifically addressed in the NTSB accident reports, the fundamental questions raised by both of these accidents are: 1) could Tesla's highly touted Autopilot system reliably identify potential road hazards and take appropriate measures to avoid them, and 2) did Tesla's marketing materials and tactics mislead vehicle buyers into believing that Autopilot could relieve a driver of his or her responsibility for safe vehicle operation?

As to the first question, our car accident lawyer points out that the software in use at the time of each accident was notorious for its inability to distinguish between non-moving road and roadside objects (such as trees, stop signs, and overpasses) and moving objects (such as pedestrians, bicyclists, and other vehicles). This problem was not unique to Tesla: other manufacturers and autonomous vehicle software developers were experiencing similar difficulties. These shortcomings bear directly on our second question.

Although Tesla's marketing and promotional materials do not specifically state that its vehicles were fully "self-driving," they certainly imply that the autonomous driving features of both the Model X P100D and the Model 3 could be trusted to essentially replace a vehicle's driver, relegating that driver to the role of a navigator who selects a destination and then engages Autopilot once the vehicle is on an appropriate highway. In the opinion of our car accident lawyer, a strong case can be made that Tesla's advertisements deliberately misled consumers by concealing Autopilot's inability to identify potentially hazardous road conditions, and that Tesla continued to sell its vehicles even though it was aware of these serious safety issues.

***

At The Doan Law Firm, our car accident and defective products lawyer will continue to monitor both the news media and professional engineering journals for future developments regarding autonomous/self-driving vehicles, and we will post updates as they become available.