The family of a driver who died in a 2023 crash has filed a lawsuit against Tesla, claiming the company's “fraudulent misrepresentation” of its Autopilot technology was to blame.
Genesis Giovanni Mendoza Martinez died when the Tesla Model S sedan he was driving crashed in Walnut Creek, California. His brother Caleb, a passenger at the time, was seriously injured.
Mendoza's family filed a lawsuit against Tesla in October in Contra Costa County, but in recent days, the case was moved from state court to federal court in the Northern District of California. The Independent first reported on the change of venue. Plaintiffs generally face a higher burden of proof in federal court for fraud allegations.
The accident involved a 2021 Model S, which collided with a parked fire truck while the driver was using Tesla's Autopilot, a partially automated driving system.
Mendoza's lawyers claimed that Tesla and its CEO, Elon Musk, exaggerated or made false claims about the Autopilot system for years in order to “create excitement about the company's vehicles and thereby improve its financial position.” They pointed to tweets, company blog posts, remarks on earnings calls and press interviews.
In their response, Tesla's lawyers said the driver's “negligent acts and/or omissions” caused the collision, and that “reliance on any representation made by Tesla, if any, was not a material factor” in causing harm to the driver or passenger. They also argued that Tesla's cars and systems have a “reasonably safe design,” in compliance with state and federal laws.
Tesla did not respond to requests for comment on the case. Brett Schreiber, the attorney representing the Mendoza family, declined to make his clients available for an interview.
There are at least 15 other active cases centered on similar claims, involving Tesla crashes where Autopilot or FSD — Full Self-Driving (Supervised) — was in use just before a fatal or injurious accident. Three of those have been moved to federal courts. FSD is the premium version of Tesla's partially automated driving system: while Autopilot comes standard on all new Teslas, owners pay an up-front premium, or subscribe monthly, to use FSD.
The accident at the heart of the Mendoza-Martinez lawsuit was also part of a broader investigation into Tesla Autopilot by the National Highway Traffic Safety Administration, which began in August 2021. During that investigation, Tesla made changes to its systems, including a number of over-the-air software updates.
The agency has opened a second investigation, which is currently ongoing, to evaluate whether a “recall remedy” introduced by Tesla to resolve issues with Autopilot behavior around stationary first responder vehicles was effective.
NHTSA has also warned Tesla that its social media posts may mislead drivers into thinking its cars are robotaxis. Separately, the California Department of Motor Vehicles has sued Tesla, alleging that its claims about Autopilot and FSD amounted to false advertising.
Tesla is currently rolling out a new version of FSD to customers. Over the weekend, Musk instructed his more than 206.5 million followers on X to “demonstrate a self-driving Tesla to a friend tomorrow,” adding: “It feels like magic.”
Since around 2014, Musk has been promising investors that Tesla cars would soon be able to drive themselves autonomously, without a human at the wheel. While the company has shown a design concept for a two-seat self-driving vehicle called the Cybercab, Tesla has yet to produce a robotaxi.
Meanwhile, competitors including WeRide and Pony.ai in China, and Alphabet's Waymo in the U.S., already operate commercial robotaxi fleets and services.