
TL;DR:
- Tesla held 33% responsible for a 2019 fatal crash involving Autopilot.
- Jury awarded $200 million in punitive damages against Tesla.
- Driver was found two-thirds at fault and sued separately.
- Verdict may reshape regulation and liability around self-driving tech.
- Tesla plans to appeal, calling the verdict a legal error.
Jury Rules Against Tesla in Landmark Autopilot Liability Case
A federal jury in Miami has ruled that Tesla shares liability in a deadly crash involving its Autopilot driver-assistance system. The crash occurred in 2019 when a Tesla operating on Autopilot failed to stop at an intersection, resulting in a collision that killed 22-year-old Naibel Benavides Leon and severely injured her boyfriend, Dillon Angulo.
Tesla was assigned one-third of the blame, with the remaining two-thirds falling on the driver, who was sued in a separate proceeding. The jury awarded $200 million in punitive damages against Tesla, according to CNBC’s report, with Tesla’s roughly $42.5 million share of compensatory damages bringing its total liability to approximately $242.5 million.
Tesla Accused of Misrepresenting Autopilot Capabilities
The plaintiffs argued that Tesla irresponsibly allowed Autopilot to be used outside its intended operating environment, on roads other than controlled-access highways, without restrictions. In his closing remarks, lead attorney Brett Schreiber claimed Tesla knowingly put public safety at risk for profit.
“Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology,” said Schreiber in a statement to TechCrunch. “This verdict represents justice for Naibel’s tragic death and Dillon’s lifelong injuries.”
Schreiber added that Elon Musk promoted Autopilot as “better than a human driver,” despite internal awareness of its technical shortcomings.
Tesla Calls Verdict “Wrong,” Plans to Appeal
Tesla responded to the verdict by stating it will appeal the decision, citing what it described as “substantial errors of law” and trial irregularities.
“No car in 2019, or today, would have prevented this crash,” the company argued in its official statement. “This was never about Autopilot—it was a fiction concocted by plaintiffs’ lawyers.”
Tesla maintained that the driver admitted responsibility from the start and reiterated its belief that the Autopilot system was not the direct cause of the crash.
Safety Agencies Have Warned of Autopilot Misuse
The National Transportation Safety Board (NTSB) previously warned that Tesla’s Autopilot design encouraged overconfidence, citing multiple incidents. In its 2020 report on a 2018 crash, the NTSB concluded that driver inattention while using Autopilot was a key factor in the accident.
That driver, Walter Huang, died after crashing into a barrier while reportedly playing a game on his phone. Following the investigation, the NTSB accused Tesla of ignoring its safety recommendations, including calls to add features that monitor driver engagement.
Even Elon Musk, during a 2018 Tesla earnings call, acknowledged that drivers could become too reliant on Autopilot. He described user “complacency” as a critical risk.
Verdict Arrives as Tesla Launches Robotaxi Rollout
The verdict comes at a pivotal moment for Tesla, which is in the process of launching its Robotaxi network in Austin, Texas. These autonomous vehicles are powered by Tesla’s Full Self-Driving (FSD) system, an even more ambitious evolution of Autopilot.
Tesla’s marketing has long leaned on self-driving promises, a strategy that helped build the company’s trillion-dollar valuation. However, this legal defeat could serve as a significant precedent for future litigation regarding driver-assist systems and AI-powered mobility.
The Data
| Metric | Detail | Source |
| --- | --- | --- |
| Crash Date | 2019 | CNBC |
| Punitive Damages | $200M | TechCrunch |
| Total Verdict Value | $242.5M | CNBC |
| Autopilot Blame Assigned | 33% to Tesla | TechCrunch |
| NTSB Warning | 2020 report on overreliance | NTSB |
Conclusion
The Miami jury’s decision marks one of the most significant legal defeats for Tesla regarding its driver-assist software. As Tesla ramps up its self-driving ambitions with Robotaxis, the case sets a powerful precedent around how Autopilot failures are viewed through the lens of corporate liability and public safety.