Honestly, it's a design issue. The way Summon was activated (double tap on P) is very easy to trigger by accident. The screen where you cancel I've occasionally seen take 1-3s to pop up, depending on what the rest of the SoC is doing.
I could totally see a scenario where this happened: the screen popped up while he was exiting, and he wasn't able to hear/notice that Summon was engaged.
The better fix here is to have a CONFIRM on the touchscreen rather than a CANCEL. It wouldn't hinder the experience, since you already select forwards/back, and it catches this accidental-activation case.
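The difference between the two designs comes down to what happens when the driver does nothing. A minimal sketch (all names here are illustrative, not Tesla's actual software) of cancel-by-default versus confirm-by-default:

```python
import threading

# Hypothetical sketch comparing the two activation flows discussed above.
# SummonController and its methods are made up for illustration.

class SummonController:
    def __init__(self):
        self._confirmed = threading.Event()
        self._cancelled = threading.Event()

    def activate_cancel_style(self, timeout_s=6.0):
        """Current behavior: the car moves unless the driver cancels in time."""
        cancelled = self._cancelled.wait(timeout_s)
        return "idle" if cancelled else "moving"   # inaction => car moves

    def activate_confirm_style(self, timeout_s=6.0):
        """Proposed behavior: the car moves only after an explicit confirm."""
        confirmed = self._confirmed.wait(timeout_s)
        return "moving" if confirmed else "idle"   # inaction => nothing happens
```

With the cancel-style flow, a missed chime or a slow popup means the car moves anyway; with the confirm-style flow, the failure mode of "driver never saw the dialog" is that nothing happens.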
For the record, I love the car and almost everything that Tesla does, but I really hope they revisit this and design it a bit more defensively.
"Unfortunately, these warnings were not heeded in this incident. The vehicle logs confirm that the automatic Summon feature was initiated by a double-press of the gear selector stalk button, shifting from Drive to Park and requesting Summon activation. The driver was alerted of the Summon activation with an audible chime and a pop-up message on the center touchscreen display. At this time, the driver had the opportunity to cancel the action by pressing CANCEL on the center touchscreen display; however, the CANCEL button was not clicked by the driver. In the next second, the brake pedal was released and two seconds later, the driver exited the vehicle. Three seconds after that, the driver's door was closed, and another three seconds later, Summon activated pursuant to the driver's double-press activation request. Approximately five minutes, sixteen seconds after Summon activated, the vehicle's driver's-side front door was opened again. The vehicle's behavior was the result of the driver's own actions and as you were informed through multiple sources regarding the Summon feature, the driver is always responsible for the safe operation and for maintaining proper control of the vehicle."
Basically, they designed an autonomous-operation mode that was easy to activate by accident and incapable of reliably avoiding crashing into things. It appears someone did activate it accidentally, his shiny Tesla crashed into a trailer as a result, and they responded by accusing him of intentionally activating the feature and misusing it.
Why is this being downvoted? The highly detailed log of the driver's every action is crazy creepy.
I get that the data is likely useful for debugging, and may very well be a function of the feature's beta status (can someone confirm? Or is this something that Teslas do all the time?), but it's still insanely creepy that every single action this guy took in his own car was remotely logged and accessible. This guy is basically driving a Telescreen from 1984 to work.
Looks like someone is carpet-downvoting everything in the subthread (my root post dropped ~3 points just as these were downvoted).
Yeah, it's a double-edged sword. On one hand it's a ton of data; on the other, there are multiple cases where you don't need to bring the car into the dealer for them to diagnose something.
Oh wow. I have been anti-Tesla due to their creepy "we still own your car" auto-update craziness, but that just takes it up another level. No, Tesla, I will not buy your cars, not now, not ever, because you don't trust me and therefore I do not trust you.
But that seems to mostly be speed and throttle information stored in a black box in the car that logs in the event of an accident and isn't remotely accessible. That sort of thing is a far cry from "our server logs show you opened the driver side door at 5:53 PM" like Tesla is doing. If other manufacturers are recording that sort of granular data, too, then yikes.
I don't think the car's logs are automatically transmitted to Tesla. They reside on the car, and Tesla can log in remotely to view them if they have a valid reason to.
Who decides if it's a valid reason, and who authorizes such access?
If it's not the owner... then they aren't really the owner.
With the number of cameras/sensors on a Tesla, it's a privacy nightmare... I won't buy one until the answers to these questions are the ones that I want them to be.
What I also found irritating in this article was the sentence "you remain prepared to stop the vehicle at any time using your key fob or mobile app or by pressing any door handle". Stopping the car via a mobile app?!? IMHO, stopping the car in such situations is safety-critical, which absolutely requires hard real-time behavior. I don't see any way to achieve this through consumer hardware/software or a Bluetooth/WiFi connection.
>Unfortunately, these warnings were not heeded in this incident.
The way cars chime at people so often, for such BS reasons, it's a wonder anyone would design a UX where it's safety-critical for someone to pay attention to a chime and a pop-up. That UX designer needs to have a good talking to, or be fired.
Heh. In contrast to most GPS systems, my Audi doesn't outright block the ability to enter destinations whilst en route, but instead says "don't do this while driving", and proceeds to let you do it if you so choose.
I agree with you in this case. A Tesla is a mass-market item. It should be held to very high safety standards.
Just ask the designers of that Chernobyl plant.
I somewhat disagree here. It was 1960s technology. Even now, complex systems like that inevitably have a plethora of ways that humans can screw them up. It's very hard to completely prevent determined idiots from destroying the equipment.
1-3s? That could really lead to serious problems for such features. Normally a prompt like this must be guaranteed to appear in under ~200ms or so.
I'm currently wondering if this is a safety-relevant feature (according to ASIL/ISO 26262), and whether it would even be allowed to run such a feature on a component that is not designed for safety-related environments requiring real-time behavior (a Qt UI running on Linux certainly doesn't provide that, and even many other automotive software stacks, including AUTOSAR, give only limited guarantees).
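One defensive pattern for this kind of prompt is to treat the render itself as having a deadline, and abort the pending action if the deadline is missed. A rough sketch (illustrative only; the 200 ms budget is the ballpark figure mentioned above, not a value from any standard):

```python
import time

# Hypothetical sketch: a safety-relevant prompt must appear within a hard
# deadline, or the pending action is aborted (fail safe: the car never moves
# without the driver having had a real chance to see the dialog).

PROMPT_DEADLINE_S = 0.200  # assumed budget, not a standardized value

def request_with_prompt(render_prompt):
    """Render the dialog; abort the queued action if the render ran late."""
    start = time.monotonic()
    render_prompt()                      # e.g. draw the CANCEL/CONFIRM dialog
    elapsed = time.monotonic() - start
    if elapsed > PROMPT_DEADLINE_S:
        return "aborted"                 # dialog appeared too late to be useful
    return "armed"                       # driver can now confirm or cancel
```

Of course, a non-real-time OS can't actually guarantee the render finishes in time; this only detects the miss after the fact, which is exactly why running such a feature on a best-effort UI stack is questionable.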
Yeah, I did a bit more testing and I could actually get ~5s if I kicked off a navigation right before Summon (I've also seen similar slowdowns on a cancelled nav).
The hazards do flash at the same time but that also happens when you lock the car.
You do need to opt in, but I'm hoping they make a change to have it a bit more defensive. In my opinion, the right thing to do here is to admit it is possible to accidentally kick off and remedy it (much like with the battery shield).
>The hazards do flash at the same time but that also happens when you lock the car.
When activated from the stalk, the hazard lights don't flash continuously like they do with the key fob or mobile app. They only flash once, because the car immediately and "automagically" selects the forward direction, transitioning away from the flashing state. And the only auditory indication when the driver is still in the car is a single chime.
>You do need to opt-in but I'm hoping they make a change to have it a bit more defensive.
The Autopark dialog is opt-out, not opt-in, unlike other Tesla automatic features, which require manual confirmation on the touchscreen to activate.
A single additional press of the park button brings up the Summon dialog, with arrows to move the car forward and backward. The flaw is that forward is the default; you don't have to press it. The default should be "do nothing," making the driver confirm their intent to Autopark.