
Tesla sued by Texas cops after a Model X on Autopilot slammed into five officers

Another incident involving Autopilot and emergency vehicles

A group of Texas law enforcement officials is suing Tesla after a Model X with Autopilot engaged crashed into five police officers. The suit was first reported by KPRC 2 in Houston.

It is the latest legal headache for the automaker as it seeks to roll out its controversial driver-assistance software to more customers. And it comes as Tesla faces renewed scrutiny over several crashes involving Autopilot and emergency vehicles.

The crash took place on February 27, 2021, in Splendora, a small town in Montgomery County in eastern Texas. According to the lawsuit, the Model X SUV struck several police officers while they were conducting a traffic stop on the Eastex Freeway. “All were badly injured,” the lawsuit says.

The plaintiffs claim that “design and manufacturing defects known to Tesla” are responsible for the crash, as well as “Tesla’s unwillingness to admit or correct such defects.” Autopilot, they argue, “failed to detect the officers’ cars or to function in any way to avoid or warn of the hazard and subsequent crash.”

The plaintiffs also note that “this was not an isolated instance,” citing “at least” 12 other crashes involving a Tesla vehicle using Autopilot. Separately, the National Highway Traffic Safety Administration is investigating 12 crashes in which Tesla owners using the company’s Autopilot features crashed into stationary emergency vehicles, resulting in 17 injuries and one fatality.

The lawsuit cites several tweets by Tesla CEO Elon Musk commenting on crashes involving Autopilot or incidents of Tesla owners misusing the system as evidence that the company is aware of these defects and has failed to recall or correct them:

Tesla’s blatant refusal to adopt additional safeguards or to fix the issues with its Autopilot system demonstrate a lack of supervision and oversight of Tesla’s Autopilot system. Tesla has intentionally decided not to remedy these issues and must be held liable and accountable, especially when it has detailed knowledge of the risks and dangers associated with its Autopilot system.

The officers are also suing a local restaurant owner, claiming the driver of the Model X was served too much alcohol prior to the crash. They are seeking damages for injuries and permanent disabilities, listed in excess of $1,000,000 with a maximum of $20,000,000.

Tesla has been hit with lawsuits regarding crashes involving Autopilot in the past. In 2019, Tesla was sued by the family of Jeremy Banner, a 50-year-old man who died in a crash while using Autopilot. Earlier that year, the company was sued by the family of 38-year-old Wei Huang, who died in 2018 after his Model X crashed into an off-ramp divider with Autopilot engaged.

Last week, Tesla opened access to the beta of its “Full Self-Driving” (FSD) software to more customers via a “request” button on their cars’ dashboard screens. FSD is marketed as a more advanced version of Autopilot that extends features like steering control and adaptive cruise control to local roads.

Safety officials have criticized the rollout. Jennifer Homendy, chair of the National Transportation Safety Board, said last week that Tesla should address “basic safety issues” before expanding FSD, calling the company’s use of the term “full self-driving” “misleading and irresponsible.”