Tesla is being sued by the family of a 50-year-old man who died in a crash while using the company’s Autopilot advanced driver assistance system. The family of Jeremy Beren Banner is suing for wrongful death and seeking damages of more than $15,000. A lawyer for the family announced the lawsuit on Thursday, though it has apparently not yet been filed with the Palm Beach County Clerk.
Banner is the fourth known person to die while using Autopilot, and his family is the second to sue Tesla over a fatal crash involving the technology. In May, Tesla was sued by the family of 38-year-old Wei Huang, who died in 2018 after his Model X crashed into an off-ramp divider with Autopilot engaged.
Banner was killed while driving along a Florida highway at 68 miles per hour on March 1st of this year. His Tesla Model 3 collided with a tractor-trailer that was crossing his path, which tore the roof off of the car. The vehicle ultimately came to a stop about 1,600 feet away from the site of the impact.
In May, the National Transportation Safety Board revealed in a preliminary report that Banner turned Autopilot on about 10 seconds before the collision. The agency said that the vehicle “did not detect the driver’s hands on the steering wheel” between about 8 seconds before the crash and the time of impact.
Tesla’s account of the crash differed slightly. The company said it told the NTSB and the National Highway Traffic Safety Administration that the vehicle’s data logs showed Banner “immediately removed his hands from the wheel.” That would mean Banner didn’t comply with the company’s instructions that drivers keep their hands on the wheel while using Autopilot. (Though CEO Elon Musk is often seen on broadcast news demonstrating the opposite behavior.)
But the NTSB’s language — “did not detect the driver’s hands” — leaves room for the possibility that Banner had his hands on the wheel when he crashed. Autopilot users often receive a warning to apply pressure to the wheel even when they’re already gripping it, so the exact sequence of events remains unclear. The NTSB also said that “[n]either the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers.”
The NTSB’s full investigation is likely to take as long as another year to be completed. A lawyer for Banner’s family said in a press conference Thursday that Tesla has video of the accident from the car’s onboard cameras, but it’s unclear if the family has been given access to that footage.
Tesla also often reminds drivers that they need to supervise Autopilot at all times, though the company still markets and sells an Autopilot package it calls “full self-driving.” Musk has said in the past that serious crashes involving Autopilot are often the result of the “complacency” of “inexperienced user[s].”
“They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do,” he said in 2018.
The circumstances of Banner’s crash closely resemble those of the first high-profile fatality involving Autopilot. In 2016, 40-year-old Joshua Brown collided with a tractor-trailer that was crossing his path on a Florida highway. Brown was also using Autopilot at the time of his death. Tesla said in 2016 that its camera system failed to recognize the white broadside of the truck against the bright sky. NHTSA eventually concluded that Brown was not paying attention to the road, though the NTSB said a lack of safeguards contributed to his death.
The car Brown was driving had a completely different version of Autopilot, one that relied on tech from Israeli company Mobileye. But the similarities between the crashes suggest that Tesla never fixed the problem with Autopilot’s ability to recognize a crossing tractor-trailer, regardless of any fault on the drivers’ part.