
Tesla Model S autonomously crashes into a parked trailer while in ‘summon’ mode - davidiach
http://www.theverge.com/2016/5/11/11656496/tesla-model-s-autonomous-summon-mode-crash
======
Animats
Tesla: _" the incident occurred as a result of the driver not being properly
attentive to the vehicle's surroundings while using the Summon feature or
maintaining responsibility for safely controlling the vehicle at all times."_

That's the "deadly valley" I've written about before - enough automation to
almost work, not enough to avoid trouble, and expecting the user to take over
when the automation fails. That will not work. Humans need seconds, not
milliseconds, to react to complex unexpected events. Google's Urmson, who
heads their automatic driving effort, makes this point in talks.

There is absolutely no excuse for an autonomous vehicle hitting a stationary
obstacle. If that happened, Tesla's sensors are inadequate and/or their vision
system sucks.

~~~
jakob223
It bothers me, though, that the standard for automation is that it must
/never/ hit a parked car, not "at least as good as the average human" or "at
least as good as the 95th percentile human" etc.; I don't know enough to judge
what's going on in this situation, but if the technology saves more
lives/property/etc than it damages, IMO it's worth adopting.

~~~
taneq
Agreed. Zero tolerance (or 100% reliability) necessarily has infinite cost
and/or takes infinite time. We need to be reasonable about our expectations
for autonomous systems.

That said, what's the likely accident rate for a 95th percentile human,
starting in a parked car, hitting another stationary vehicle parked directly
in front of them? There must be a few "accidentally put it in drive instead of
reverse" type incidents but I'd except it to be exceedingly rare.

~~~
mpnordland
Also, such accidents are not because the driver failed to identify a
stationary object in front of the car.

~~~
mokus
To be fair, an autonomous vehicle will probably also never accidentally put
the vehicle in drive instead of reverse. The particular failure modes are
likely to be radically different in many cases, so it seems reasonable to
gloss over their individual differences and talk about them in aggregate.

~~~
mantas
Unless there's some bug. Errr, software impairment...

~~~
taneq
What if the vehicle is really drunk?

------
givinguflac
Read the 'updated' version, it explains that the user was doing nearly
everything they shouldn't regarding this feature.
[http://www.theverge.com/2016/5/11/11658226/tesla-model-s-
sum...](http://www.theverge.com/2016/5/11/11658226/tesla-model-s-summon-
autopilot-crash-letter)

That said, sure, it should be able to stop on its own, but I think they
couldn't have been more clear that this is a beta and the driver is still
always responsible.

In my view the driver is just as liable if they put on cruise control and
don't pay attention. Is it the manufacturer's fault the car slammed into a
vehicle in front of them while cruise control was on? No, I think any
reasonable person would say it's the driver's fault.

~~~
ececconi
I think the liability shifts a little bit because when you're in cruise
control you're literally behind the steering wheel. In this case you're
outside the vehicle.

I think as more autonomous features get developed it's going to be complicated
for some time regarding who is culpable for a given accident.

~~~
justina1
Even outside the vehicle you're still in control. By default, Summon can only
be used with the mobile app and with a 'dead man switch': lift your finger off
the button in the app and the car stops. The driver had to specifically
disable that protection to use the feature the way he did, and now claims no
responsibility. Also, pressing any button on the key stops the car. Seems like
a whole host of bad decisions.
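The interlock described above amounts to a hold-to-run control. As an illustrative sketch only (function and signal names are invented, not Tesla's actual code), the difference between the stock behavior and the disabled interlock looks like this:

```python
def summon_step(button_held: bool, key_button_pressed: bool) -> str:
    """One tick of the dead-man-switch loop: the car creeps only
    while the app button is actively held, and any press of a key-fob
    button is an immediate stop."""
    if key_button_pressed:
        return "stop"   # any button on the key halts the car
    if not button_held:
        return "stop"   # finger lifted off the app button
    return "creep"      # hold maintained: keep moving slowly


def summon_step_no_interlock(key_button_pressed: bool) -> str:
    """With the hold requirement disabled, the car keeps creeping
    unattended unless someone actively intervenes."""
    return "stop" if key_button_pressed else "creep"
```

With the interlock on, walking away (releasing the button) stops the car; with it off, the car continues until someone notices.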

~~~
CydeWeys
> The driver had to specifically disable that protection to use the feature
> the way he did

It shouldn't be possible to disable safety features. I can't press a button to
disable the brakes on my car, for instance.

~~~
kamaal
>>It shouldn't be possible to disable safety features.

Here it really needs to be defined what qualifies as safety. The act of
driving (in certain circumstances) is in itself unsafe by many a definition;
by relinquishing control to the driver you open up all the risks that are
likely to occur. It's hard to draw the boundary between what is safe and what
isn't. Should cars carry mandatory breath analyzers, and disable the ignition
if the analyzer finds traces of alcohol in the breath? Should there be a
detector to check that the driver at the wheel was well rested the night
before? A detector to check that the driver isn't in a rage? These scenarios
multiply endlessly, but remember: when you relinquish control to the driver,
you open the door to all of these risks equally. You can't handle all the
thousands of situations; instead, what you do is remind the user of the
responsibility, double-check the user's decision, and then relinquish control
to the user.

Therefore, in all seriousness, if you take control of the car, regardless of
the situation, you really are responsible. Even if you control the car through
your phone.

~~~
CydeWeys
There's a qualitative difference between allowing disabling of a safety
feature (what I'm talking about) and adding additional ones (every example you
just mentioned). In that light, do you have any response to my statement that
it shouldn't be possible to disable safety features? Note that requiring the
addition of other safety features is a separate issue.

------
mdorazio
If Tesla's response to this is actually what the article says, then that's
somewhat worrying. It's never a good idea to blame the user for a failing of
the product like this, especially on something like a car, beta version or
not. If the car can't reliably not collide with obstacles in Summon mode, then
the mode shouldn't be available to the public yet.

This also points out a failing with Tesla's "we don't need LIDAR" strategy for
sensors. Ultrasonic/IR sensors around the body might be reasonable for most
driving situations, but clearly there are going to be incidents like this one
if the car can't see at the full height of the body at close distance.

~~~
vvanders
Tesla owner w/ AP here.

Honestly, it's a design issue. The way Summon is activated (a double tap on P)
makes it very easy to trigger by accident. And I've occasionally seen the
cancel screen take 1-3s to pop up, depending on what the rest of the SoC is
doing.

I could totally see a scenario where this happened: the screen popped up while
he was exiting the car, and he wasn't able to hear/notice that Summon was
engaged.

The better fix here is to have a CONFIRM on the touchscreen rather than a
CANCEL. It wouldn't hinder the experience, since you already select
forwards/back, and it catches this accident case.

For the record, love the car and almost everything that Tesla does but I
really hope they revisit this and design it a bit more defensively.
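To make the opt-out vs. opt-in distinction concrete, here's a hypothetical sketch (not Tesla's code; the function names are invented): with a CANCEL popup, silence means "go"; with a CONFIRM popup, silence means "stay parked".

```python
def engages_with_cancel_popup(user_tapped_cancel: bool) -> bool:
    """Opt-out design: Summon engages unless the user taps CANCEL
    before the timeout. A missed or slow popup silently activates it."""
    return not user_tapped_cancel


def engages_with_confirm_popup(user_tapped_confirm: bool) -> bool:
    """Opt-in design: Summon engages only on an explicit CONFIRM.
    A missed or slow popup is safe by default."""
    return user_tapped_confirm


# Accidental double-tap, driver never sees the popup:
# engages_with_cancel_popup(False)  -> True  (car moves anyway)
# engages_with_confirm_popup(False) -> False (car stays put)
```

The opt-in version fails safe precisely in the scenario described: a laggy screen and a driver who's already stepping out.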

~~~
makomk
The follow-up at [http://www.theverge.com/2016/5/11/11658226/tesla-model-s-
sum...](http://www.theverge.com/2016/5/11/11658226/tesla-model-s-summon-
autopilot-crash-letter) seems to confirm this, though naturally Tesla are
trying to spin this as his fault:

"Unfortunately, these warnings were not heeded in this incident. The vehicle
logs confirm that the automatic Summon feature was initiated by a double-press
of the gear selector stalk button, shifting from Drive to Park and requesting
Summon activation. The driver was alerted of the Summon activation with an
audible chime and a pop-up message on the center touchscreen display. At this
time, the driver had the opportunity to cancel the action by pressing CANCEL
on the center touchscreen display; however, the CANCEL button was not clicked
by the driver. In the next second, the brake pedal was released and two
seconds later, the driver exited the vehicle. Three seconds after that, the
driver's door was closed, and another three seconds later, Summon activated
pursuant to the driver's double-press activation request. Approximately five
minutes, sixteen seconds after Summon activated, the vehicle's driver's-side
front door was opened again. The vehicle's behavior was the result of the
driver's own actions and as you were informed through multiple sources
regarding the Summon feature, the driver is always responsible for the safe
operation and for maintaining proper control of the vehicle."

Basically, they designed an autonomous-operation mode that was easy to
activate by accident and incapable of reliably avoiding crashing into things;
it appears someone activated it by accident and his shiny Tesla crashed into a
trailer as a result; and they responded by accusing him of intentionally
activating the feature and misusing it.

~~~
Phlarp
Does anyone else find it creepy that Tesla can track every door and button
press across their entire fleet in real time?

Does Tesla release reports on how often this usage information is requested by
law enforcement agencies and or how many requests are complied with?

~~~
w4
Why is this being down voted? The highly detailed log of the driver's every
action is _crazy_ creepy.

I get that the data is likely useful for debugging, and may very well be a
function of the feature's beta status (can someone confirm? Or is this
something that Teslas do all the time?), but it's still insanely creepy that
every single action this guy took in his own car was remotely logged and
accessible. This guy is basically driving a Telescreen from 1984 to work.

~~~
TYPE_FASTER
Throttle position and other data have been logged for a long time:

[http://www.caranddriver.com/columns/your-car-as-a-witness-
fo...](http://www.caranddriver.com/columns/your-car-as-a-witness-for-the-
prosecution)

[http://www.edmunds.com/car-technology/car-black-box-
recorder...](http://www.edmunds.com/car-technology/car-black-box-recorders-
capture-crash-data.html)

Now it's standardized:

[http://www.cbsnews.com/news/new-law-mandates-black-boxes-
in-...](http://www.cbsnews.com/news/new-law-mandates-black-boxes-in-all-cars-
by-2015/)

~~~
w4
But that seems to mostly be speed and throttle information stored in a black
box in the car that logs in the event of an accident and isn't remotely
accessible. That sort of thing is a far cry from "our server logs show you
opened the driver side door at 5:53 PM" like Tesla is doing. If other
manufacturers are recording that sort of granular data, too, then yikes.

~~~
cynix
I don't think the car's logs are automatically transmitted to Tesla. They
reside on the car, and Tesla can login remotely to view them if they have a
valid reason to.

~~~
brokenmachine
Who decides if it's a valid reason, and who authorizes such access?

If it's not the owner... then they aren't really the owner.

With the number of cameras/sensors on a Tesla, it's a privacy nightmare... I
won't buy one until the answers to these questions are the ones that I want
them to be.

------
kylec
I feel that pressing "park" should be idempotent. If I press "park" twice in
my car, I don't want to drive away once I get out. Tesla really needs a
dedicated "start autopilot" button to make the intention to use the feature
explicit.

Yes, apparently the fact that this feature was activated was messaged on the
instrument cluster, but that shouldn't be sufficient to absolve Tesla from the
liability of this poor UI decision.

~~~
FireBeyond
Especially when considering, as mentioned upstream, Tesla's UI can have
significant latency issues. "Several seconds" to display a confirmation (or
actually a "Cancel") easily means the difference between a catching of your
breath and several thousand dollars damage or worse.

------
iamleppert
Is it just me, or if you can't detect obstacles via sensors within the
complete bounding box of the car (except perhaps the top and bottom), can you
really even have this feature work reliably?

From a technical perspective, you just don't have all the data necessary, and
therefore any solution will be guesses, hacks, and "best efforts", and cannot
be improved via any manner of software update. This voids the "beta" claim
made by the company, as no software update could remedy the situation.

Tesla has got to know this, and I think it's negligent for them to release a
feature (even in "beta") when they know there are hard technical limitations
(sensors, not software) that prevent it from working properly. It puts
property and people's lives at risk unnecessarily.

At the minimum, Tesla cars equipped or enabled with these features represent a
higher risk to the public, and the owners of these vehicles should be required
to carry high risk insurance.

------
CydeWeys
> In a statement to KSL, Tesla says that Summon "may not detect certain
> obstacles, including those that are very narrow (e.g., bikes), lower than
> the fascia

May as well rewrite that to read "May run over bikers or children". If you
can't implement a feature properly, then don't implement it at all. If that
means current Teslas can't do it because they lack the proper sensors, then
they shouldn't do it.

------
jvm
There is a grave danger that Tesla's precocious push of autonomous features
could result in a PR disaster for self-driving technology if it actually ends
up killing someone.

We shouldn't have this in the wild until we're sure it's ready.

~~~
ivraatiems
I think if autonomous technology ends up killing somebody, PR is the last
angle we should worry about. Let's first worry about pushing a technology
that, y'know, kills people.

~~~
positr0n
That line of thinking is erroneous, in my opinion. Autonomous technology only
needs to kill a few fewer people than the existing manual technology to be
worth debating, and it's a no-brainer if it kills orders of magnitude fewer
people.

~~~
tinalumfoil
US traffic deaths run at roughly 1 fatality per 100 million vehicle-miles
travelled (VMT) [0], and the number's declining. So if Tesla sells a thousand
cars, each travelling a thousand miles, that kill 2 people in the first year,
they're already above the mark. If there was a person between the truck and
the Tesla, they'd be way above the mark. Even more so when you consider
"Summon Mode" isn't full self-driving and is probably used less than 0.1% of
the time the car is turned on.

More problems are caused by taking things too fast than by taking them too
slow. Full self-driving cars are probably farther off than you guys seem to
think.

[0]
[https://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_i...](https://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in_U.S._by_year)

EDIT: per VMT
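A quick back-of-the-envelope check of these numbers (the ~1 fatality per 100 million VMT baseline is from the linked Wikipedia page; the thousand-car fleet is the comment's hypothetical):

```python
US_BASELINE_MILES_PER_DEATH = 100_000_000   # ~1 US fatality per 100M VMT

cars, miles_each, observed_deaths = 1_000, 1_000, 2
fleet_miles = cars * miles_each             # 1,000,000 miles total

# about 0.01 deaths expected at the national rate
expected_deaths = fleet_miles / US_BASELINE_MILES_PER_DEATH

# roughly 200x the baseline fatality rate
ratio = observed_deaths / expected_deaths
```

In other words, 2 deaths over a million fleet-miles would be around two hundred times worse than the national average, which is why even one fatality in this scenario would be "way above the mark".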

~~~
todd8
There are somewhere around 100,000 Tesla cars on the road (as of the end of
2015). If each is driven, say, 10,000 miles, that's a total of 1 billion miles
driven. At the current US average of over 1 fatality per 100 million miles
driven, that works out to 10 fatalities per year expected among Tesla drivers.

Not all Tesla cars are going to be driven in self-driving mode at first, so we
can expect the numbers to look much worse early on. If only 20 percent of
Tesla drivers let the cars self-drive (and of course only the newer Teslas
will be capable of self-driving, because of sensors, etc.), we are down to 2
deaths per year expected from human drivers in that 20 percent.
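Spelled out, that arithmetic looks like this (fleet size, per-car mileage, and the 20 percent share are the comment's assumptions, not measured figures):

```python
MILES_PER_FATALITY = 100_000_000      # current US average: ~1 per 100M miles

fleet = 100_000                       # Teslas on the road, end of 2015
miles_per_car = 10_000
total_miles = fleet * miles_per_car   # 1 billion miles per year

expected_deaths = total_miles / MILES_PER_FATALITY        # ~10 per year
self_driving_share = 0.20
expected_in_share = expected_deaths * self_driving_share  # ~2 per year
```

So a self-driving fleet that merely matched the human baseline in that 20 percent slice would still produce a couple of fatalities a year, purely from the mileage involved.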

I used to think that these kinds of numbers would act as a barrier to the
development of self-driving cars, but each time one car has an accident all of
the cars will learn how to avoid it the next time. Every human driver has to
learn what to do around icy roads, what to do when cut off by a car in a
neighboring lane, what to do when approaching a neighborhood where kids and
dogs are playing ball in the front yards, but a self-driving car only has to
learn once and all the other self-driving cars will know what to do too.

When I was growing up, there were around 6 or 7 times as many people killed
per vehicle-miles traveled. I hope that self-driving cars won't be as
dangerous as the cars of the 50's and 60's when they first hit the road. In
the longer run, as thousands and eventually millions of self-driving cars
begin driving, I expect them to improve rapidly through their shared
experiences.

~~~
brokenmachine
That is dependent on them actually being software-fixable problems.

In this case, the sensors do not actually monitor the complete space taken up
by the vehicle, so this kind of accident would be impossible to prevent by
modifying software.

------
Unklejoe
How hard is it to disable the “dead man's switch” for this feature? Can it be
done without searching the forum for hours? Is it documented in the owner's
manual?

The direction of my blame here kind of depends on the answer to those
questions. Of course, it's technically the owner's fault, but a feature like
this really needs to be 100% idiot proof.

This is a new feature to many people, and it's exactly the type of feature
that people are going to “test” outside of the ideal operating conditions.
It's not Tesla's responsibility to account for every stupid decision of its
customers, but Tesla should have at least done everything in their power to
ensure that critical safety features couldn't be disabled (which they may have
done; I don't know).

Most critical safety features on cars can't be trivially disabled (ABS,
airbags, automatic seatbelt locks, etc...). The only safety feature that I can
think of that can be trivially disabled is traction/stability control, but
there's a real reason for this (getting out of deep snow/mud). Also, disabling
traction/stability control is a multi-stage process on many cars. On late
model BMWs at least, pressing the “DTC” button once will partially reduce
traction/stability control, but not completely disable it. To the average
person, it appears to be completely disabled. However, if you do a little
research, you'll find that if you hold it down for another 5 seconds, it
disables completely (sort of). Even with it completely disabled, certain
aspects of the system remain on. The only way to completely disable those
portions would be to flash custom software to the car (which is well beyond
the ability of the average person).

~~~
makomk
Single toggle in the normal Summon settings screen with a help message about
the great convenience features it enables, apparently:
[https://youtu.be/Cg7V0gnW1Us](https://youtu.be/Cg7V0gnW1Us) It's like Tesla
want people to disable it. (Their original version didn't even have a dead
man's switch; they added one after Consumer Reports raised concern about its
safety.)

~~~
Unklejoe
Wow. If they were going to be that lackadaisical about it, it really should
have been 100% idiot proof.

------
mikeash
I see a lot of comments about safety here. Note that Tesla's Summon mode
limits the car to 1MPH and 39ft of movement. It also is very sensitive to
resistance, to the point that I actually had to construct ramps for my car to
climb the 1-inch lip at the entrance to my garage, otherwise it would stop at
that point and refuse to go further. Using this feature to kill somebody is
going to take a _lot_ of effort. There are many interesting discussions to be
had here about software, UX, corporate responsibility in the face of user
error with bad UX, etc., but I don't think there's much room to discuss safety
here. This is a risk to property, not life.

~~~
tclmeelmo
> It also is very sensitive to resistance, to the point that I actually had to
> construct ramps for my car to climb the 1-inch lip at the entrance to my
> garage, otherwise it would stop at that point and refuse to go further.

The pictures of the car in an article linked elsewhere in this discussion
([https://www.ksl.com/?sid=39727592&nid=148&title=utah-man-
say...](https://www.ksl.com/?sid=39727592&nid=148&title=utah-man-says-tesla-
car-started-on-its-own-crashed-into-trailer)) show that the windshield was
smashed. I'm finding it challenging to reconcile your personal example with
the images of the smashed windshield; in my own experience, it takes some
effort to break laminated safety glass. I would think (perhaps wrongly) that
it would take more effort to break safety glass than would be stopped by a
1in. step.

> but I don't think there's much room to discuss safety here.

I couldn't disagree more. I don't understand why it should be possible to
disable any safety interlock (at least, that's how I'm interpreting the
feature description) in a consumer product, especially persistently.

~~~
phaemon
> I'm finding it challenging to reconcile your personal example with the
> images of the smashed windshield

Really? It seems fairly obvious to me. The sensors are in the front of the
car, lower down. So anything on the ground in front of the car registers as an
obstacle and the car stops. Such as a small step.

The front of this trailer (they keep referring to it as the back, but it's
clearly the front) was too high off the ground to register as an obstacle.
You'll note that the front of the car has plenty of room - there's nothing
blocking it - but the windscreen doesn't!

So it needs to be fixed; you could imagine a Tesla running into, say, a small
truck with timber sticking out the back when the speed was low enough that the
safe distance between vehicles was small. But it hardly seems life-threatening
in any way.

~~~
mikeash
The lip isn't detected by the sensors. The car stops on it when the tires hit
it because of the physical obstacle it presents and the extra force needed to
climb it.

It's weird, usually the car actually passed over the lip with the front tires,
but then stopped when the rear tires got to it. The threshold for stopping
must be very close to what it actually encounters there.

~~~
tclmeelmo
Someone on Youtube had what I thought was a clever solution: a piece of trim
(looked like shoe molding) to bridge the 1" step. Cheap, and they claimed 100%
effective.

------
aresant
The "parked trailer" description is confusing.

A picture is worth a thousand words - check out the crash here:

[https://www.ksl.com/?sid=39727592&nid=148&title=utah-man-
say...](https://www.ksl.com/?sid=39727592&nid=148&title=utah-man-says-tesla-
car-started-on-its-own-crashed-into-trailer)

This is not a sensor failure; there simply is no forward-facing sensor mounted
high enough to measure object height in close-contact situations.

~~~
FireBeyond
This could easily have been a garage door. And in Tesla's own words, Summon
can be used to park in a garage, and handle opening and closing the garage
door.

It may not be a sensor failure, but for Tesla's advertised (and this) use
case, it definitely is a _design_ failure.

~~~
Dylan16807
The bottom of my garage door is rubber. I'd expect little to no damage.

~~~
FireBeyond
If the car couldn't sense it to avoid it in the first place, what makes you
think it's going to stop when it hits your door, versus continuing on plowing
through?

------
freehunter
From an updated version of the story
([http://www.theverge.com/2016/5/11/11658226/tesla-model-s-
sum...](http://www.theverge.com/2016/5/11/11658226/tesla-model-s-summon-
autopilot-crash-letter)) it sounds like the driver was standing next to the
car when it crashed. Tesla says that Summon mode started operating three
seconds after he got out of the car.

~~~
kylec
Electric cars are quiet; it's conceivable that he was walking away, unaware
his car was moving behind him.

~~~
freehunter
But enabling Summon mode is not quiet nor automatic. It's a manual process and
it dings and shows a light on the dashboard. Tesla is saying the guy turned on
Summon mode in a place where he wasn't supposed to, in a situation it was
never designed to work in, and that's why the driver is at fault. The software
worked as designed, but the driver didn't pay attention to its limitations.

~~~
dimino
The software _did not_ work as designed, because it is designed, above all
else, to not _hit_ stationary objects.

~~~
mikeash
That's not true. The software is designed to work under human supervision, and
the autonomous collision avoidance is a _supplement_ to that supervision, not
a substitute.

You could certainly argue that the software _should be_ designed to, above all
else, not hit stationary objects. But it is not actually designed that way.

~~~
dimino
> That's not true. The software is designed to work under human supervision

It was not designed to work under human supervision, but even if it was, that
design was wrong, and the design was broken from the start.

~~~
mikeash
What is your basis for the statement that it was not designed to work under
human supervision? The documentation is quite explicit about needing human
supervision, and I see nothing whatsoever indicating anything otherwise.

~~~
dimino
[https://www.teslamotors.com/models](https://www.teslamotors.com/models)

"Digital control of motors, brakes, and steering helps avoid collisions from
the front and sides, and prevents the car from wandering off the road."

It's designed to avoid collisions from the front and sides.

~~~
mikeash
The use of the word "helps" makes it pretty clear that it's not intended to be
the only thing working to avoid collisions.

~~~
dimino
No it doesn't.

And you're still ignoring the part where I said it doesn't matter if it was
designed intentionally this way or not: it _shouldn't_ behave this way,
regardless.

~~~
mikeash
Really, you think that "X helps Y" is compatible with X being intended as the
sole thing performing Y? If so, we clearly speak fundamentally different
languages, despite the superficial similarities in spelling and such.

As for ignoring the other part, I explicitly acknowledged in my original reply
that this would be a valid criticism, and it's not something I feel strongly
enough about to argue.

~~~
dimino
I didn't say "x being intended as the sole thing performing Y", I said, "It's
designed to avoid collisions from the front and sides.", which it was.

~~~
mikeash
You said it's designed to work without human supervision. Unless you're
proposing some third entity besides the car and the human which would be
responsible for collision avoidance, then what you said is that the car is
fully responsible for it.

~~~
dimino
> You said it's designed to work without human supervision.

When?

~~~
mikeash
"It was not designed to work under human supervision...."

I don't know how else to understand that other than that it was designed to
work without human supervision. If that's not what you meant, perhaps you
could elaborate.

~~~
dimino
I meant what I said. Collision detection was not designed to work under human
supervision, which means when it runs into something, it has failed its
design.

Tesla cars are not designed to run into things.

~~~
simoncion
> Collision detection was not designed to work under human supervision, which
> means when it runs into something, it has failed its design.

From your quote upthread:

> "Digital control of motors, brakes, and steering _helps avoid_ collisions
> from the front and sides, and _prevents_ the car from wandering off the
> road."

(Emphasis mine.)

Notice the shift in language from "helps avoid" to "prevents" when describing
the two different aspects of the car's thrust and positioning systems. The
difference is _significant_ and _important_. It's clear that the systems are a
front and side collision _assistance_ system, not a front and side collision
_prevention_ system.

You see the distinction?

~~~
dimino
I see the distinction you're trying to make, but it's not referencing the
design.

Are you suggesting collision detection is _not_ designed to prevent
collisions?

~~~
simoncion
> Are you suggesting collision detection is _not_ designed to prevent
> collisions?

Lol. Okay, dude.

------
sigmar
[http://kxan.com/2016/05/11/man-says-tesla-started-on-its-
own...](http://kxan.com/2016/05/11/man-says-tesla-started-on-its-own-got-into-
crash/)

"[Tesla] is just assuming that I just sat there and watched it happen and I
was okay with that." (in the video)

The article notes: "A worker at the business met him at the side of the road,
Overton said, and asked him multiple questions about his car."

Tesla said 3 seconds passed from the door closing to the car moving.

It sounds to me like he was showing off some features to someone. It failed to
stop as he expected it to. But he decided to blame the incident on Tesla.

~~~
sickbeard
This is also assuming that Tesla's software is infallible and has no bugs. I
wouldn't automatically discount bugs just because of the data recorder.

------
sschueller
I find it more worrying that Tesla will connect to your car to pull logs to
manage their PR.

In future incidents, what prevents Tesla from forging logs and who could prove
it?

~~~
kylec
I suspect that the "connect to your car" step wasn't even necessary - that the
activity logs are constantly streamed to Tesla's service. And yes, it's
worrying to me that they collect information to that level of detail, that the
information is tied to your identity, and that they have no problem
publicizing that data if you say anything bad about them.

If I buy a Tesla, the first thing I'll do is snip the antenna cable.

~~~
roywiggins
Imagine if Google routinely published details of people's search history if
they complained about Google making crappy software?

"I searched for 'friendly women pictures', and Google showed me porn!"

Google: "no, you searched for X,Y,Z porn repeatedly at 3:30 PM every day for a
year"

------
newman314
Maybe it's just me but I find the fact that Tesla has ready access to such
detailed logs to be extremely creepy and pretty much a showstopper for me ever
owning a Tesla.

~~~
hellcow
Particularly knowing that they use those logs against you, openly and to the
press.

~~~
greglindahl
Not the first, by any means: [http://www.geek.com/mobile/toyota-acceleration-
case-shows-wh...](http://www.geek.com/mobile/toyota-acceleration-case-shows-
why-we-fear-black-boxes-1361357/)

~~~
FireBeyond
That's not comparable. Toyota used black box data in _court cases_ where they
were being accused of negligence, and people were attempting to hold them
civilly and financially liable.

This is Tesla managing their PR and saying "if you say anything bad about us,
we'll publish to the world your driving history" (or in other cases, as
they've already done, disable features and functions in what is supposedly
your property, or downgrade the software thereof).

------
jtchang
Tesla needs to have some sympathy for the user. People "will" accidentally
press a button, sometimes more than once. Sometimes you press it because it is
there. This happens. The feature needs to be robust in the face of failure and
it seems like this did not happen.

------
centizen
Tesla's response to this issue is staggering to me. Summon mode is not a
remote control. The car is controlling itself, and I don't think it's
reasonable to expect the driver to be responsible for its actions when that is
happening.

~~~
Buttons840
If you read the linked Verge article at the end, you'll see that Tesla's
manual instructs that the user of Summon mode must monitor the vehicle and be
prepared to stop it at any time. If this is the case, which according to
Tesla's manual it is, then simple remote control operation would be better
overall. If the user must monitor the car the whole time, they might as well
just take control. No need for any smarts from the car itself.

~~~
Super_Jambo
There's no indication that the guy was aware he put it in Summon mode. He
double tapped something he was trying to single tap and then didn't respond to
a modal popup as he was exiting the vehicle.

This is not a UI interaction that should result in the car driving off by
itself. Humans are fallible.

~~~
alphapapa
Seriously, imagine an episode of Star Trek where the computer hears the
captain asking the engineer a question about the self-destruct sequence, and
the computer hears the magic words "self-destruct sequence" and pops up a
dialog box on a nearby console saying, "Self-destruct sequence activated.
Press CANCEL to cancel." Well guess what, nobody expected the computer to
activate the self-destruct sequence just because someone happened to mention
it, so no one was looking at the console, so no one canceled it, so the ship
blew up.

Way to go, Tesla. Great UI design there. I'll be glad to entrust my life to
your software on the highway...

------
stcredzero
The traditional Silicon Valley development culture that developed under web
apps isn't up to the complexity that results when software meets the real
world in safety-critical contexts. Heck, as formulated, it wasn't ready to
deal with smartphone apps!

You have a feature where the car navigates itself through parking situations,
and no hardware or software developer ever paid attention to overhanging
obstacles? That wasn't even a concern!? To me, this smacks of the same kind of
arrogance and shortsightedness that caused Nest to release thermostats that
deactivated without WiFi signal.

~~~
sz4kerto
Elon Musk probably understands this, as in the other half of his days, he's
running SpaceX that can successfully return first stages in one piece.

------
TheSpiceIsLife
It's not clear to me that 'beta' is, or should be, an allowable thing with
regards to vehicles.

As far as I'm aware you're not, legally, allowed to put out-of-spec tyres or
brakes on your car, but somehow a beta feature that can autonomously control a
vehicle is okay. I'm not convinced.

Personally I'm not at all fond of the idea of trialing beta features. If you
ride the bleeding-edge expect to get cut.

------
lelf
Update: Driver whose Tesla Model S crashed while using Summon was breaking all
the rules

[http://www.theverge.com/2016/5/11/11658226/tesla-model-s-
sum...](http://www.theverge.com/2016/5/11/11658226/tesla-model-s-summon-
autopilot-crash-letter)

~~~
bagacrap
OK, so the main rule appears to be "the operator must be prepared to stop
Summon if an object isn't detected". This exact problem has been predicted
by others, including Google: if you create machines that are really good but
still need rare supervision, you're not going to be able to convince people to
provide that supervision.

~~~
FireBeyond
"Autonomously summon your car whilst providing continuous supervision" is a
lot less headline-worthy.

Indeed, Tesla themselves ([https://www.teslamotors.com/blog/summon-your-tesla-
your-phon...](https://www.teslamotors.com/blog/summon-your-tesla-your-phone))
like to describe it as you pulling up in your driveway, hitting a button and
the car going off and parking itself in your garage while you walk inside.

------
amluto
Tesla says:

> As such, Summon requires that you continually monitor your vehicle's
> movement and surroundings while it is in progress and that you remain
> prepared to stop the vehicle at any time using your key fob or mobile app or
> by pressing any door handle.

Let's consider the choices. Key fob? Might be in your pocket. App? Has anyone
from Tesla tried to _use_ the Tesla app? It frequently takes literally minutes
to respond. What if you try to press the door handle and the [expletive
removed] door handle sensor doesn't notice? (The latter happens all the time
with my car. It usually works when I press very hard on it, which might be a
challenging thing to do when the semi-autonomous car is _moving_.)

This crap makes me glad my Tesla is too old to support their beta autopilot.
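
A fail-safe alternative to those flaky stop mechanisms is a dead-man's
switch: the car moves only while it keeps receiving a fresh "keep going"
signal, so a laggy app or an unnoticed door-handle press defaults to
stopping. A minimal sketch, with all names and timing values hypothetical
(this is not Tesla's actual protocol):

```python
import time

HEARTBEAT_TIMEOUT_S = 0.5  # stop if no operator signal within this window


class DeadMansSwitch:
    """Fail-safe gate: motion is allowed only while the operator keeps
    sending heartbeats. A laggy app or a missed button press means the
    default (stopped) state wins."""

    def __init__(self, timeout_s=HEARTBEAT_TIMEOUT_S):
        self.timeout_s = timeout_s
        self.last_heartbeat = None  # no signal yet -> not allowed to move

    def heartbeat(self):
        # Called whenever the fob/app confirms "keep going".
        self.last_heartbeat = time.monotonic()

    def motion_allowed(self):
        if self.last_heartbeat is None:
            return False
        return (time.monotonic() - self.last_heartbeat) < self.timeout_s


switch = DeadMansSwitch()
assert not switch.motion_allowed()  # no signal yet: stay parked
switch.heartbeat()
assert switch.motion_allowed()      # fresh signal: may creep forward
time.sleep(0.6)
assert not switch.motion_allowed()  # signal went stale: stop
```

The point of the design is that every failure amluto lists (pocketed fob,
unresponsive app, dead door-handle sensor) becomes a missed heartbeat, and a
missed heartbeat means the car stops.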

------
abhi3
Tesla's official statement:

 _Safety is a top priority at Tesla, and we remain committed to ensuring our
cars are among the absolute safest vehicles on today's roads. It is paramount
that our customers also exercise safe behavior when using our vehicles -
including remaining alert and present when using the car's autonomous
features, which can significantly improve our customers' overall safety as
well as enhance their driving experience. Summon, when used properly, allows
Tesla owners to park in narrow spaces that would otherwise have been very
difficult or impossible to access. While Summon is currently in beta, each
Tesla owner must agree to the following terms on their touch screen before the
feature is enabled: This feature will park Model S while the driver is outside
the vehicle. Please note that the vehicle may not detect certain obstacles,
including those that are very narrow (e.g., bikes), lower than the fascia, or
hanging from the ceiling. As such, Summon requires that you continually
monitor your vehicle's movement and surroundings while it is in progress and
that you remain prepared to stop the vehicle at any time using your key fob or
mobile app or by pressing any door handle. You must maintain control and
responsibility for your vehicle when using this feature and should only use it
on private property._

~~~
koja86
The trailer "was" hanging from the ceiling. [http://kxan.com/2016/05/11/man-
says-tesla-started-on-its-own...](http://kxan.com/2016/05/11/man-says-tesla-
started-on-its-own-got-into-crash/)

------
prmph
Maybe it's early days yet, but I'm not comfortable with the approach taken by
Tesla. Either give the car full control, or else the operator must be in full
control with the technology playing an assistive role. I cannot be expected to
sit doing nothing behind the wheel for hours and then suddenly be called upon
to take over the driving in a split second.

Obviously for the former option Tesla is not there quite yet (and that is an
understatement), but I wonder if better sensor tech will not help here. Sensor
tech is reasonably robust; thus, for example, even if the autopilot is no
longer able to properly make out the markings on the road, a sensor override
should be able to determine (using radar?) the locations of nearby vehicles
within say a 100 meter radius, thus ensuring collisions are avoided, and give
the human driver several seconds or even a minute to take over.

No wonder this [1] effort from Volvo focuses on, among other things, radar
tech. I think that is a core technology for the success of self-driving cars.

[1] [http://spectrum.ieee.org/cars-that-
think/transportation/self...](http://spectrum.ieee.org/cars-that-
think/transportation/self-driving/volvos-selfdriving-program-will-have-
redundancy-for-everything)
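
The kind of radar override described above could be as simple as comparing
the nearest detected object against a stopping-distance envelope. A rough
sketch, with all numbers illustrative rather than taken from any real
vehicle:

```python
def min_safe_gap_m(speed_mps, decel_mps2=4.0, latency_s=0.3):
    """Distance needed to stop: travel during detection/actuation latency
    plus braking distance v^2 / (2a). Parameter values are illustrative."""
    return speed_mps * latency_s + speed_mps ** 2 / (2 * decel_mps2)


def override_should_brake(speed_mps, nearest_object_m, margin_m=1.0):
    # Radar-style override: brake as soon as the nearest return is
    # inside the stopping envelope plus a fixed safety margin.
    return nearest_object_m <= min_safe_gap_m(speed_mps) + margin_m


# At Summon's walking pace (~1.5 m/s) the stopping envelope is under a
# metre, so an object one metre ahead should trigger braking, while one
# five metres out should not.
assert override_should_brake(1.5, 1.0)
assert not override_should_brake(1.5, 5.0)
```

At parking-lot speeds this envelope is tiny, which is why a collision with a
stationary trailer is so hard to excuse: the physics leaves plenty of margin
if the sensors actually see the obstacle.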

------
abalone
To be clear: this was a UX failure on the part of Tesla, not merely a sensor
failure. And they are super wrong to blame their faultless user for their
screwup.

The UX failure is that this happens if you simply double-tap the park button
and exit the vehicle. That's it! It starts moving forward. No fob interaction,
no confirmation, nothing. I'm not making this up; it's insanely bad UI. [1]

A double tap vs. a single tap is super easy to do mistakenly. And there we get
to the really sh*tty thing: Tesla must know this, and they are selling out
their customer to cover their ass. The whole "the logs prove the user is in
the wrong" line is disingenuous. All the logs show is that he double-tapped
the park button. He probably meant to do a single tap!

Shame on Tesla for disingenuously blaming the user for their design error with
such a safety critical feature. I hope they are more careful than this as they
move forward. (No pun intended.)

[1] video of this "feature" in action (thanks to user schiffern)
[https://www.youtube.com/watch?v=t-JoZL9edlA](https://www.youtube.com/watch?v=t-JoZL9edlA)
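
A safety-critical activation like this is usually designed default-deny: an
ambiguous input (a possible mis-tap) only raises a prompt, and anything
other than an explicit, affirmative confirmation cancels. A minimal sketch
of that pattern, with hypothetical names (not Tesla's actual logic):

```python
def request_summon(confirm_response):
    """Default-deny activation: only an explicit, affirmative confirmation
    starts the maneuver. A timeout (None), a dismissed prompt, or anything
    ambiguous cancels, so an accidental double tap alone can never set the
    car in motion."""
    if confirm_response == "CONFIRM_SUMMON":
        return "summon_started"
    return "cancelled"  # covers None (no response) and dismissals


assert request_summon(None) == "cancelled"            # user walked away
assert request_summon("dismissed") == "cancelled"     # closed the popup
assert request_summon("CONFIRM_SUMMON") == "summon_started"
```

The contrast with the behavior abalone describes is the default: here the
car stays put unless told otherwise, rather than moving unless told
otherwise.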

------
tclmeelmo
This feature has me wondering what the SAE standards for automatic
transmissions have to say on the matter, if anything.

I'm not familiar with them, nor do I have access, but I'm hoping someone here
might be helpful on either aspect. Specifically, I'm wondering if the
standards for automatic transmission controls recommend against having the
Park setting do anything beyond activating the parking pawl and parking brake
(when electronically controlled).

------
ahadley
Isn't the real issue here that the trailer doesn't appear to have an ICC bar?
I'd expect the object detection to see that. Example:
[http://www.morgancorp.com/images/08_options/01_bumpers/optio...](http://www.morgancorp.com/images/08_options/01_bumpers/options_large/03-optional-
bumpers.jpg)

~~~
chc
ICC bars go at the back of the trailer. The car crashed into the front.

------
tempestn
I'm not trying to take this to extremes, but my first thought reading this,
especially the part about it not necessarily detecting objects close to the
ground, is how long will it be before a Tesla in summon mode runs over a kid
playing in the driveway? It doesn't seem inconceivable, and you would think it
should be literally inconceivable that something like that could happen before
releasing the feature. If it was a critical safety feature that could have a
deadly side effect - like an airbag - that's a different thing entirely. But a
convenience feature more akin to comfort locks should be held to a higher
standard of safety.

It's a similar situation to determining what side effects are acceptable in
medication. If it cures a deadly disease, serious side effects including
chance of death are acceptable. If it's a cure for, say, male pattern
baldness, that level of risk would obviously be unacceptable.

------
yalogin
Why would tesla refute this? It's a bug. They can accept it and fix it and not
be pricks about it.

~~~
cynix
> It's a bug.

It's a limitation of the system, for which they have clear warnings in the
user manual as well as next to the button that enables this feature. It's not
their fault a user failed to understand the system's limitations.

~~~
yalogin
It's even worse, then. I don't see it as a limitation. It should never have
made it into production at all if that is how it works. I wouldn't have added
that feature.

------
TYPE_FASTER
They need lidar, and some UI improvements.

GM built an ignition switch that killed people, I'm ok with Tesla releasing a
semi-autonomous beta that you have to use correctly.

------
throw7
This is really bad UX on Tesla's part, and I'm not surprised they got it
wrong. You can bet this is being looked at and analyzed in detail.

~~~
alphapapa
Yes, but what's worrying is their response. Instead of taking responsibility
for the fail-unsafe UI design and their failure to avoid obstacles above a
certain height, they are completely blaming the user.

I can't see myself trusting their cruise control or autopilot features. It's
like they're saying, "Hey, our cars can drive themselves! But not really. They
can maintain speed and lanes and avoid other cars on the highway! But you
can't take your hands off the wheel. You can relax and let the car steer for
you! But you'd better be ready to take control in a millisecond if it beeps at
you. Oh, and if anything bad happens, it's your fault for using it wrong.
Thanks for participating in the Tesla Autopilot Early Access Beta Program!"

No thanks.

------
Ameo
>We can't be required to be smarter than the software

Jesus - that's the kind of stuff that makes me afraid for the future of
humanity. Yeah I get that this may very well have been caused by human
stupidity and also that the software that powered the car should have been
smarter than to do that, but that quote just gave me the chills.

~~~
throwawaykeno
I read it as meaning "we can't be expected to understand the car's algorithms
well enough to be able to predict what the car is going to do and react to the
output of its computations in real time."

Which as a general sentiment, abstracted from the particularities of this
case, is entirely reasonable.

~~~
simoncion
Sure.

But in _this particular case_, the quote is saying: "One can't _possibly_ be
expected to understand the _two_ sets of clearly written dire warnings and
strident instructions one was required to acknowledge before enabling and
using this feature. And one can't then be expected to bear the blame for
proceeding to fail to follow the instructions one was given to ensure the
safe operation of one's car while using this feature."

 _That_ is why the quote is so disheartening.

------
rideralong
"Autonomous Vehicles Could ‘Change Everything,’ But ‘Growing Pains’ Are
Likely"

[http://www.wbur.org/2016/04/29/traffic-future-driverless-
car...](http://www.wbur.org/2016/04/29/traffic-future-driverless-cars)

------
sickbeard
How come the NHTSA doesn't have rules concerning this scenario? Regardless of
how it was operated the car should not run into obstacles when no one is
behind the wheel and it is guiding itself.

------
nunofgs
It's too bad the driver stated that he didn't use the summon feature. It gave
Tesla an excuse to shift the conversation away from the real issue.

------
YeGoblynQueenne
>> we can't be required to be smarter than the software

And that, kids, is what happens when you overhype the state of the art.

------
curiousgal
Well, given the shape of the obstacle, could it be that the Model S doesn't
have sensors for that? i.e. there's clearance at bumper height but not higher
up.
------
colordrops
Move fast and kill people.

------
bunnymancer
Beta is Beta.

Expect not to see any new features being offered before they've been run
through the entire gamut at this point.

Some might feel that's good.

Others may want to see what's going on.

Release cycle will be changed from bimonthly to each new car model.

Congratulations everyone.

~~~
dimino
"Move fast and break things." probably didn't literally mean bones.

~~~
chc
There were literally no bones broken here.

~~~
dimino
This time.

~~~
chc
This is a generic FUD-ism that doesn't seem especially relevant to the topic
at hand. If somebody is flying five feet over the ground, a car windshield
bumping them at extremely low speed is probably not in their top 10 biggest
concerns.

I worry about protruding objects in garages, but I don't see how anyone's
bones are in danger here.

~~~
dimino
Protruding objects? Okay, so "move fast and break through skin" is easier for
you to swallow, then?

~~~
chc
Can you explain the actual danger you're imagining rather than trying to be
pithy? Because I have no idea how you envision this "breaking through skin."
The car isn't even moving fast.

~~~
dimino
What kind of behavior do you expect out of software in "beta"?

