
Model X Crash – Open Letters Between Driver and Tesla - zaroth
http://www.teslarati.com/model-x-crash-montana-open-letter-musk/
======
zaroth
I editorialized the title a bit to remove the clickbait.

This line of Tesla's letter back to the driver caught my eye;

    
    
      The diagnostic data shows that the driver door was later
      opened from the outside...
    

That is pretty impressive: the diagnostics aren't just logging that a door was
opened, but which handle was used to open it! I guess Tesla has had enough
issues with their retracting outer handles to warrant some additional
diagnostics in this area, but those logs sure do come in handy.
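Purely as an illustration of what such per-handle diagnostics might look like (Tesla's actual log schema is not public, and every field name below is invented), a door-open event could be recorded roughly like this:

```python
import json
import time

# Hypothetical sketch only: Tesla's real log format is not public,
# and all field and event names here are invented for illustration.
def door_open_event(door, source):
    """Build one diagnostic record noting which mechanism opened a door."""
    allowed = ("inside_handle", "outside_handle", "key_fob", "mobile_app")
    if source not in allowed:
        raise ValueError("unknown open source: %s" % source)
    return {
        "timestamp": time.time(),   # when the latch released
        "door": door,               # e.g. "driver_front"
        "event": "door_opened",
        "source": source,           # which handle/mechanism triggered it
    }

# The letter's "driver door was later opened from the outside" would
# then correspond to a record like:
record = door_open_event("driver_front", "outside_handle")
print(json.dumps(record, indent=2))
```

The point is simply that once the latch mechanism reports *which* trigger fired, logging it is cheap; the interesting part is that Tesla apparently bothered to wire that signal into telemetry at all.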

~~~
bsdetector
In this crash they give details like the door handle opening from the outside.
In the Pennsylvania crash they say autopilot turned itself off 11 seconds
before the crash.

In the Florida case they say 'who knows?'. How many seconds did the guy in
Florida not have his hands on the wheel? How many alarms were sounded? What
did the radar detect? What did the sonar say? Nothing.

If Tesla wants to be trusted they need to give out all the relevant
information for all crashes, including timestamps, not pick and choose crafted
statements that make their PR department happy.

~~~
curiousgal
Maybe that particular data didn't get transmitted / got corrupted?

~~~
Alupis
Would they not have said something to that effect, then?

It does appear, so far, that these "log dumps" are being carefully sifted
through and cherry picked for release.

------
pwinnski
Tesla says that only the passenger side was opened at first, and the driver
side was only opened from the outside. And yet the accident was on the right
side of the vehicle, so if either door would have had trouble opening, it
would be that one. If Tesla's information is correct--and isn't a sensor
malfunction or similar--then my hypothesis would be that the driver was not even
in the driver's seat. However, we're also told that steering and braking
happened before the car came to a stop, so is the idea that the driver climbed
over to exit the damaged passenger side rather than the untouched driver side?
Possibly while running from the noise? Alternatively, the logs are incorrect.

Tesla also says that there was an "abrupt steering action" that preceded
hitting the first post, and that the car was not auto-steering at all for the
other 11 posts. There's some fuzziness in Tesla's description. The implication
is that the driver hit the brakes immediately, but it's only an implication.
No timestamps are provided, nor clarifying language beyond the broad
implication.

I'm generally thinking Tesla's information is probably correct, BUT I'm
really, really, really tired of hearing about how two seconds of no-hands is
somehow unreasonable, when the car itself allows for much, much longer. If you
want to say that someone has to keep a hand on the wheel, have it beep and
prepare to disengage after two _seconds_ of no-hands, not 15 minutes. If you
allow for 15 minutes, don't be shocked when people use all 15 minutes.

~~~
zaroth

      As road conditions became increasingly uncertain, the vehicle
      again alerted you to put your hands on the wheel. No steering
      torque was then detected until Autosteer was disabled with an
      abrupt steering action. Immediately following detection of the
      first impact, adaptive cruise control was also disabled, the
      vehicle began to slow, and you applied the brake pedal.
    

The order of events actually isn't clear from Tesla's letter, but I read this
to mean the abrupt steering action came after the first impact.

------
mandeepj
It is impressive that Tesla is doing this in-depth logging. I don't think any
of the other, non-electric vehicles log this much.

~~~
dragontamer
Impressive? Or immoral?

When I purchase a $100,000 vehicle, I want the vehicle to act under __my__
control, not under Tesla's control.

The sensor data belongs to Tesla, not to the customer. And Tesla uses the
sensor data to serve their public-relations interests instead of helping the
individual customer.

~~~
Alupis
Is there any mechanism that will disable all this logging?

> When I purchase a $100,000 vehicle, I want the vehicle to act under my
> control. Not under Tesla's control.

I feel the same way, and kind of find it intrusive that Tesla can seemingly
obtain this data at will without my explicit permission.

~~~
guitarbill
I'd bet it's mentioned in the paperwork somewhere when you buy it. Alas I
haven't had the good fortune of being able to go through the purchasing
process, but give me $100,000 and I'll find out.

~~~
Alupis
> I'd bet it's mentioned in the paperwork somewhere when you buy it.

It probably is - but that doesn't make it feel better. I want to explicitly
allow log collection upon request, or disable it completely.

Where my car goes, how fast it was driving, which window I had open, whether I
was listening to AM or FM radio - frankly, none of that is Tesla's business.
As it is, these logs are seemingly used only to defend Tesla's PR, which, as a
simple customer, is not something I'm interested in.

------
trhway
>As road conditions became increasingly uncertain, the vehicle again alerted
you to put your hands on the wheel.

why not publish the video from the car's Autopilot for say last 30 seconds?

Looking at the photos, the road seems straight, with proper lane markings
clearly visible. Though, given that it was 2AM, maybe Autopilot has issues in
low light: if, say, they use narrow lenses (for cost as well as depth-of-field
reasons), that would mean either a more sensitive sensor (more noise) or a
longer exposure (lower FPS), and the color relationships between objects are
different when illuminated by headlights instead of the Sun - all of which
makes it harder to discern objects. Yet these would all just be technical
issues that should have been solved before product release.

~~~
thaeli
Even if that info is saved locally, video is certainly not uploaded to Tesla's
servers over the air.

~~~
mikeash
There's no facility to record the video in the car either. The video is
processed directly in the camera module, and only a high-level summary is sent
out from there. There's no connection with the bandwidth needed for video, and
not enough storage in the camera module to save it there.

~~~
trhway
>There's no facility to record the video in the car either.

That is really hard to believe, given how useful dash cams are in accidents
and the like.

>and not enough storage in the camera module to save it there.

Then how do all these dash cams do it, storing 8 hours or more? If Tesla
didn't put an SD card in there, that seems really unwise.

To "mikeash" below: thanks for the link. It does seem like a small camera
(thus one can expect low-light issues). So Tesla uses a third-party system -
no miracles here; I had hoped they developed their own and thus would improve
it fast. Using a third-party system explains why a lot of things one would
naturally expect in the sensing system of an "autopilot" feature are missing.
It also explains why they are so defensive instead of just going ahead and
fixing issues.

~~~
mikeash
Autopilot actually does better at night. Better contrast. I think it may use
infrared, but whatever it's doing, it doesn't have issues with low light.

What functionality is missing that you'd expect it to have? For the small
number of sensors Tesla has (one camera, one radar, a dozen short-range
ultrasonics), the system is amazing, and it's the best one commercially
available right now (as verified by _many_ independent tests, that's not just
Tesla talking).

The system is from Mobileye, which is currently the best in the business. I
don't see why this would hurt improvement (the system has improved
dramatically through software updates) or why it would make Tesla defensive.
It's not like they've tried to hide the fact that Mobileye provides the camera
and image processing hardware.

(Note that Tesla has parted ways with Mobileye and the next generation of
Autopilot is going to be a Tesla product. Not sure why, but it sounded like
Mobileye delayed their next generation hardware too much for what Tesla
wanted. But not really relevant to the current hardware.)

~~~
trhway
> at night. Better contrast.

I think we have completely opposite understanding of things here.

>What functionality is missing that you'd expect it to have? For the small
number of sensors Tesla has

exactly - small number of sensors. That is one of the main deficiencies.

>why it would make Tesla defensive.

because they can't improve it.

~~~
mikeash
"I think we have completely opposite understanding of things here."

OK? I'm telling you how it actually is. The system works better at night,
because there's better contrast. At night, the lane markers are lit up by your
headlights, and the road is very dark. During the day, the difference in color
is much less distinct, and as such the system doesn't perform as well. I've
seen this in action with my own eyes during thousands of miles of Autopilot
driving.

The small number of sensors has nothing to do with using Mobileye technology.
Tesla easily could have incorporated multiple cameras or radars, they just
didn't. And that's not missing _functionality_, that's missing _hardware_.
You haven't given me any _functionality_ you expect the system to have that it
doesn't.

"they can't improve it"

Of course they can. Did you not read the part in the comment you're replying
to where I said, "the system has improved dramatically through software
updates"?

I don't mean to offend you with this question, but do you actually _know_
anything about this stuff, or are you just guessing? It's getting tiresome to
correct all of your incorrect statements.

------
tmd83
The Tesla response talks about Autosteer - is that different from Autopilot,
or has there been a name change?

If I'm reading this correctly, all the crashes into the posts supposedly
happened with the driver's hands on the wheel? If that's the case, then maybe
the owner wasn't aware of the logging when he made those claims.

I also depend on application logs being correct, but errors in logging are a
scary thought, in the sense of how hard (or near impossible) it would be to
prove that what you are saying is the truth. And even beyond weird bugs, what
happens when malicious actors change the computer records and take away all
our proof of innocence (à la The Net)? I wonder whether really solid
protection against such acts is even possible. I want things to be more
accessible, but can you truly make accessible + safe work together (not
necessarily now, but long into the future)? I sure do hope so.

~~~
freerobby
It wouldn't stop deceit if done at the engineering level, but one thing Tesla
could do to gain trust in its logs is to let its drivers digitally sign a
version of them. That way, if it ever lands in court (or some other form of
arbitration), both sides could verify that Tesla's version and the driver's
version match up -- that neither side has tampered with anything.

(I don't believe Tesla has ever tampered with or lied about what the logs say;
this would merely address folks who have such doubts).
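The tamper-evidence idea above can be sketched in a few lines. This is a minimal sketch, not freerobby's proposal verbatim: a real scheme would have each side sign the digest with a private key (e.g. Ed25519), whereas this stdlib-only version just shows how a digest over a canonical serialization lets both copies be compared later, and how any edit is detectable. All log fields are invented for illustration.

```python
import hashlib
import json

def log_digest(log_entries):
    """Return a SHA-256 digest over a canonical serialization of the log.

    Canonical form (sorted keys, no extra whitespace) ensures both
    parties compute the same digest from the same underlying data.
    """
    canonical = json.dumps(log_entries, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical log entries -- every field name here is invented.
log = [
    {"t": 1468000000.0, "event": "autosteer_disabled", "cause": "steering_torque"},
    {"t": 1468000000.4, "event": "impact_detected"},
]

# At delivery time, both the manufacturer and the driver record the digest
# (in a real scheme, each side would also sign it with a private key).
digest_at_delivery = log_digest(log)

# Later, e.g. in court, each side recomputes the digest over its own copy.
assert log_digest(log) == digest_at_delivery        # untampered copy matches

tampered = [dict(entry) for entry in log]
tampered[0]["event"] = "autosteer_engaged"
assert log_digest(tampered) != digest_at_delivery   # any edit changes the digest
```

As the parent notes, this only proves neither copy changed after delivery; it does nothing against deceit at the engineering level, before the digest is taken.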

------
smallnamespace
If the car's sensors were faulty, then the logs would be in error too.

It wouldn't be possible to say for sure whether the driver or the log is
correct unless the sensors are physically recovered and tested.

~~~
guitarbill
Not necessarily - sensors are usually picked to be independent of each other.
So e.g. if the camera was malfunctioning, the output of the accelerometers and
door sensors could (probably) still be trusted. Impossible to say without the
raw data though.

------
tylercubell
> No steering torque was then detected until Autosteer was disabled with an
> abrupt steering action. Immediately following detection of the first
> impact...

Well, which way did the driver steer? Away from the barrier to avoid more
damage, like he claims, or into it causing the accident in the first place?
This statement is ambiguous and I get the feeling from reading the whole PR
response that Tesla is either omitting or cherry-picking information to
obfuscate the truth.

~~~
madmax96
This is __very__ ambiguous:

Mr. Pang states that:

    
    
        > the car suddenly veered right
    

And later that he:

    
    
        > managed to step on the break, turn the car left and 
        > stopped the car
    

So, that's __two__ steering motions (one to veer right and another to turn
left and stop.)

Tesla only says:

    
    
        > No steering torque was then detected until Autosteer 
        > was disabled with an abrupt steering action. 
        > Immediately following detection of the first impact, 
        > adaptive cruise control was also disabled, the 
        > vehicle began to slow, and you applied the brake 
        > pedal.
    

That's one steering motion -- no indication of direction or magnitude. Also,
Tesla's language deliberately avoids ordering the events. Since the "steering
action" appears first in the paragraph, it seems as though this happened
before the impact. That information is not actually encoded in the article,
but a reader would naturally assume this is the case (I know I did).

Short of a publicly released log file or an investigation by a third-party, I
don't think we'll ever know the truth of what happened.

------
gordon_freeman
I think Tesla should be forced to hand over all the log data, rather than
picking and choosing which log data to release in order to avoid taking
responsibility for an accident.

~~~
jmcdiesel
Why? Do Dodge, Ford, Chevy, Mercedes, etc... have any obligation to step into
every crash and turn over all kinds of data? Tesla looks into more crashes
than they have to, and in turn is treated as more in the wrong simply for
looking into them... they do better than any other car company, yet are
basically punished for it by groups of people, simply for looking into wrecks
to see IF there is something there they could do better...

~~~
pwinnski
Dodge, Ford, Chevy, Mercedes, etc. neither collect nor publicly release data,
while Tesla does. Because Tesla collects and publicly releases data at
their own choosing, they have an obvious motivation to release only data that
supports limited liability against them.

They might not be doing that, but until and unless they make _all_ data
available every time they make _any_ data available, it's a matter of faith.

~~~
jmcdiesel
Yeah... then what?

Police say, "Oh, you were doing 75mph in a 50? Ticket!"... insurance rate
hikes, etc... How much liability would they be picking up if they exposed all
of that, and how quickly would people be throwing a fit about privacy?

I know multiple people (in multiple states) who have been ticketed based on
their YouTube videos... I'm sure police would try to use the logs just the
same...

~~~
pwinnski
Let's recap this thread:

1. gordon_freeman suggests that Tesla should be forced to disclose what they
have, rather than pick and choose what to share.

2. You said that (a) other companies don't do that, and suggested (b) Tesla is
being treated unfairly as a result of doing more than any of those companies.

3. I highlighted that, as I believe gordon_freeman was saying, the difference
is that Tesla is already collecting that information _and_ using it very
selectively, putting them in a class of one. If they were to keep all of the
data to themselves, that would be fine, or if they were to release it all,
that would be fine, but as it is, nobody really has any reason to believe
them.

And now here we are. I'm not sure why you're jumping to liability questions
about logs of speeding and bringing up privacy (amusing, that one, when it
comes to Tesla's public descriptions of events in their cars). I think the
point is clear. Tesla, either provide evidence for your claims, or stop making
claims. Simple.

------
curiousgal
The devil is in the details.

------
mastermojo
When will people learn to stop lying about the behavior of a car that keeps
logs of everything it does?

~~~
canthonytucci
So how does this work? Do we trust the car or the man?

If the car does not detect any steering input, even though it has been given
(due to bad software or hardware), or logs a warning being sounded, even
though none manifested in the real world, what do we do? How do we know?

Bugs are by definition human assertions of computer failure; the computers
can't lie (yet), but they can be wrong.

Are inward-facing dash cams going to be required so that we have evidence of
compliance with the terms of service of our vehicles?

I think it is rather unlikely that a false steering-input detection, a false
warning log, and a false door-opening log all happened at once... perhaps not
in this case, but subtle, cascading bugs can and do happen.

~~~
jmcdiesel
I'd say trust the car...

It would be easy to correlate a single sensor failure with one part of the
crash (hands detected on the wheel, Autopilot kept engaged - because that
sensor failure would lead to the crash).

But to say that the sensors failed, then the warning sounds failed, then the
failsafes failed, becomes rather unbelievable.

Also, note the high-drama content of the customer's letter: there is no
50-foot drop-off; I just checked Street View for that entire section of Route
2. There is a railroad, then a river, but no steep drop. Even then, railroads
pretty much stop cars dead trying to cross them (don't ask how I know; let's
just say a 1986 Ford Tempo can go from about 45mph to a dead stop real fast).

~~~
madmax96
Agreed that the car data is more reliable than the human.

But since we don't have the logs themselves, every conclusion we come to is
pure conjecture and obscured by cloudy language.

