A Tesla owner in Utah says that he returned to his parked Model S to find that it had crept forward and crashed into the trailer parked ahead of it. The man claims that this all went down without explanation; Tesla disputes that claim, saying that after reviewing the car's logs, it determined that the Summon feature had been activated, which autonomously moves the car to (or from) the waiting driver without anyone behind the wheel.
The owner says that he was answering questions about the car from a passerby, so it's entirely possible that he opened the app and accidentally activated Summon (for what it's worth, he claims he didn't). But here's the thing: it doesn't matter. Under no circumstances should a production vehicle autonomously collide with a parked vehicle mere feet ahead of it. I think that's common sense, right? Am I saying anything controversial?
As a refresher, here's Tesla's video explaining how Summon works:
A letter to the owner from a regional Tesla service manager obtained by KSL says that "the incident occurred as a result of the driver not being properly attentive to the vehicle's surroundings while using the Summon feature or maintaining responsibility for safely controlling the vehicle at all times." Perhaps! But cars, by virtue of the fact that people of all walks of life and levels of attentiveness use them, are supposed to be far more bulletproof than this by the time they reach production. It's why FCA just recalled over a million cars with confusing shift levers: a driver who isn't paying attention could fail to put the car in park.
In other words, Summon — and similar features that will inevitably come from other manufacturers — can't rely on an attentive owner to keep the car from smashing into something. It's one of the reasons why Google and a half-dozen automakers are moving more slowly with commercial autonomous deployments. Tesla will be quick to note that Summon is "in beta," but owners are going to use it either way, and unfortunately, we can't always expect them to use good judgment.
We can't be required to be smarter than the software
The notion of safety-critical "beta" software deployed on production vehicles is a relatively new one, and regulatory agencies aren't equipped to deal with it. I suspect that in a decade or two, there will be a well-established framework of tests that systems like Summon will have to pass before being allowed on a production car. But for now, it's the Wild West.
In a statement to KSL, Tesla says that Summon "may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling." In this incident, the trailer sits high off the ground: its lowest point is roughly level with the midpoint of the Model S's windshield, well above the fascia-level sensors Tesla's statement describes. But we've seen autonomous vehicles solve much harder problems than this (just watch any of Nvidia's keynotes on computer vision in self-driving cars from the past couple of years). And coincidentally, the Model S is equipped with a forward-facing camera mounted at rearview mirror height.
That is to say: "beta" or no, owner liability disclaimer or no, a car capable of autonomously crashing itself into a stationary object in normal operating conditions should never have been sold. Tested, yes — sold, no.
We humans just aren't smart enough to handle something like that.
Update May 11th, 3:02PM ET: We saw the letter that Tesla sent to the vehicle's owner, which details what happened. You can read our full story here.