A Tesla Model Y in “Full Self-Driving” (FSD) beta mode allegedly crashed on November 3rd in Brea, a city southeast of Los Angeles, in what may be the first crash involving the company’s controversial driver assistance feature. No one was injured, but the vehicle was reportedly “severely damaged.”
The crash was reported to the National Highway Traffic Safety Administration, which has multiple, overlapping investigations into Tesla’s Autopilot system. The incident report appears to have been filed by the owner of the Model Y.
According to the report:
The Vehicle was in FSD Beta mode and while taking a left turn the car went into the wrong lane and I was hit by another driver in the lane next to my lane. the car gave an alert 1/2 way through the turn so I tried to turn the wheel to avoid it from going into the wrong lane but the car by itself took control and forced itself into the incorrect lane creating an unsafe maneuver putting everyone involved at risk. car is severely damaged on the driver side.
Tesla’s decision to test its “Full Self-Driving” driver assistance software with untrained vehicle owners on public roads has attracted a massive amount of scrutiny and criticism. Throughout the beta, the company has repeatedly rolled out, and sometimes retracted, software updates meant to improve the system and fix bugs.
There have been many video clips uploaded online showing Tesla owners using FSD beta, with varying degrees of success. Some clips show the driver assist system confidently handling complex driving scenarios, while others depict the car drifting into the wrong lane or making other serious mistakes.
Despite its name, FSD is not an autonomous driving system. Drivers are required to stay vigilant, keeping their eyes on the road and their hands on the steering wheel. Systems that automate some driving tasks but still require constant human supervision are classified as Level 2 under the Society of Automotive Engineers’ taxonomy. (Level 5 describes a vehicle that can drive anywhere, under any conditions, without any human supervision.)
The US government has taken a renewed interest in Tesla, recently announcing that it was investigating incidents in which Tesla vehicles operating on Autopilot crashed into parked emergency vehicles.
NHTSA is also seeking more information from Tesla about the growing public beta test of FSD, the recently launched “Safety Score” evaluation process for entering the program, and the nondisclosure agreements Tesla made participants sign until recently.
A spokesperson for Tesla did not respond to a request for comment, which is unsurprising given that the company disbanded its press department in 2020. A spokesperson for NHTSA confirmed that the agency is investigating the claim.
Updated November 16th, 8:28AM ET: Updated to include response from NHTSA.