
Tesla Was Kicked Off Fatal Crash Probe by NTSB - Element_
https://www.bloomberg.com/news/articles/2018-04-12/tesla-withdraws-from-ntsb-crash-probe-over-autopilot-data-flap
======
Animats
Tesla has totally botched this. What Tesla has run into is the approach to
safety that makes commercial air travel safe - figure out exactly what went
wrong and fix it so it stays fixed. The aviation community is used to this.
"Flying" magazine publishes crash reports in most issues.[1] Pilots read
those. Manufacturers read those. People in aerospace read those. NASA collects
near-miss reports to keep tabs on why close calls happened.

The aviation industry has experience with aircraft automation leading pilots
into trouble, and so does the NTSB. Here's a former head of the NTSB
commenting in 2014 on the 2013 Asiana crash at SFO:[2] “How do we design this
whole
human-machine system to work better so we don’t lose the professionalism in
the humans who are doing this?” Crash investigators are very aware of the
problems of a poor man-machine interface leading pilots into a bad situation.
That is the kind of thinking needed to make these kinds of systems work.

It's not about blame. Most air crashes have multiple causes. That's because
the single points of failure were fixed long ago. This is somewhat different
than the traffic law enforcement approach. The NTSB's previous investigation
of a Tesla collision resulted in Tesla changing their system to enforce the
hands-on-wheel requirement. That may not be enough.

Playing hardball with the NTSB is not going to work. The NTSB rarely uses its
full powers, but it has broad authority in investigations. It can get
subpoenas. It can inspect Tesla's facilities, hardware, and software: "An
officer or employee of the National Transportation Safety Board ... during
reasonable hours, may inspect any record, process, control, or facility
related to an accident investigation under this chapter." The NTSB even has
the authority to publicly disclose corporate trade secrets if necessary.[3]
Aviation companies routinely cooperate, knowing this, and the NTSB seldom has
to compel disclosure.

Incidentally, lying to the NTSB is a federal crime. The CEO of Platinum Jet
Services tried to cover up some safety violations that resulted in a crash. He
became Federal Prisoner #77960-004 from 2007 to 2013.[4]

[1]
[https://www.flyingmag.com/tags/aftermath](https://www.flyingmag.com/tags/aftermath)

[2]
[https://www.aopa.org/news-and-media/all-news/2014/october/21/acting-ntsb-chief-says-automation-affects-professionalism](https://www.aopa.org/news-and-media/all-news/2014/october/21/acting-ntsb-chief-says-automation-affects-professionalism)

[3]
[https://www.law.cornell.edu/uscode/text/49/1114#b](https://www.law.cornell.edu/uscode/text/49/1114#b)

[4]
[http://www.cnn.com/2010/CRIME/11/16/new.jersey.charter.conviction/index.html](http://www.cnn.com/2010/CRIME/11/16/new.jersey.charter.conviction/index.html)

~~~
skywhopper
I’m confused about how “hands on the wheel” is enforced. Just chimes and
blinking lights? Or does the car actually shut off auto-driving mode when you
remove your hands?

~~~
ryjxfk
Audible and visual alerts, followed by bringing the car to a safe stop.

------
mikerathbun
I get frustrated when Tesla blames the driver for these crashes by saying that
they aren't touching the wheel or are ignoring messages. It is extremely
common for my car to warn me to keep my hands on the wheel WHILE I have my
hands on the wheel. Even shaking it a little will sometimes not cancel out the
warnings. It doesn't seem to matter if my hands are on the top or bottom of
the wheel. Anything short of a deathgrip and constant jiggling of the wheel
doesn't seem to consistently register as keeping your hands on the wheel.
Anyone who has used Autopilot for a significant amount of time has been beeped
at for not having their hands on the wheel even though they are. And yes, the
car will make crazy maneuvers at times that no driver in their right mind
would make. I still like it and am glad I opted for it, but it is far from
earning its name.

~~~
intherdfield
That is very significant!

Your comment suggests that Tesla's claims about the driver having his hands
off the wheel for x seconds before a crash could be wrong. If the sensors
report that the driver's hands are not on the wheel when they actually are,
then the logged data about how long the driver's hands were off the wheel
should be considered suspect.

I hope that's being investigated.

Personally, I'm having trouble believing the driver who died in the recent
accident ignored the warnings for six seconds before the head-on collision,
especially when he knew Autopilot didn't work well at that section of road.

I'm not familiar with how the system works. If the warning engages, is there a
guarantee that the driver will have to take over shortly? Or are there
scenarios when the warning turns off by itself and so the driver could have
been waiting to see if the car would correct?

[Edit: lolc and Vik1ng pointed out that the warning isn't related to unsafe
conditions as I implied. It's used whenever the sensors think the driver's
hands are off the wheel.]

~~~
MertsA
Tesla didn't claim that he didn't have his hands on the wheel; they claimed
that it wasn't detected. Tesla knows how inaccurate that detection is. Tesla
has always been shamefully misleading with its victim-blaming PR pieces. I get
that there are plenty of "unintended acceleration" cases where the driver
straight up lies about hitting the wrong pedal, but they included the
information about what the driver did earlier in the trip and the time he
would have had a view of the barrier solely to imply that it's his fault that
Autopilot killed him. That's never an okay thing to do, yet it seems to be
Tesla's modus operandi whenever there's a high-profile accident.

Who cares if he had 5 seconds to see the barrier if he only had 0.5 seconds to
realize that the car went into casual murder mode right as it veered into the
barrier? They make it sound like it had to have been driver error, using facts
that are irrelevant to the question at hand. Your assumption that Tesla
claimed he didn't have his hands on the wheel at the time of the accident is
exactly what I'm talking about; Tesla's PR release is filled with weasel words
to create exactly that impression.

I'm not even trying to suggest that Autopilot is less safe than an alert human
driver, but one thing is clear, we certainly can't trust Tesla to determine
how safe it is.

~~~
userbinator
_Who cares if he had 5 seconds to see the barrier if he only had 0.5 seconds
to realize that the car went into casual murder mode right as it veers into
the barrier._

This video is quite relevant:

[https://www.youtube.com/watch?v=6QCF8tVqM3I](https://www.youtube.com/watch?v=6QCF8tVqM3I)

A fully alert driver trying (and succeeding) to reproduce this behaviour.
Observe how long the barrier is visible, how long it takes him to react, and
how close the car comes to hitting the barrier.

~~~
brandonjm
This video is horrifying. I would assume even the sensors on cars with
automatic braking should be able to identify a barrier that large in front of
the car. I am way oversimplifying this, but I'm sure there is some level of
prioritisation in Tesla's Autopilot software to decide what action the car
should take. Is the Autopilot software prioritising staying in lane (the car
appears to stick to the left white line) over collision avoidance?

~~~
URSpider94
It can’t, nor can most cars with automatic braking. The Tesla will happily run
headfirst into a brick wall all day long. The reason is that the driving world
is chock-full of objects with zero velocity relative to the terrain - the
trivial cases being a rise in the road ahead, or an at-grade bridge where the
travel lanes dip below grade to pass under it. Therefore, Autopilot and many
auto-braking algorithms filter out completely static objects (see the previous
story about a Tesla ramming a fire truck stopped on the highway).
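
A minimal sketch of the kind of stationary-return filter being described
here - the field names and threshold are invented, and this is an assumption
about how such systems behave in general, not Tesla's actual code:

    from collections import namedtuple

    # Hypothetical radar return: distance to target and closing speed.
    RadarReturn = namedtuple("RadarReturn", ["range_m", "closing_speed_mps"])

    def plausible_obstacles(returns, ego_speed_mps, min_target_speed=1.0):
        """Keep only returns that appear to move relative to the ground."""
        kept = []
        for r in returns:
            # The target's ground speed is ego speed minus closing speed.
            target_speed = ego_speed_mps - r.closing_speed_mps
            # Returns that are stationary relative to the terrain (overhead
            # signs, bridges, barriers, a parked fire truck) are dropped to
            # avoid braking for every roadside object.
            if abs(target_speed) >= min_target_speed:
                kept.append(r)
        return kept

    # A barrier dead ahead closes at exactly ego speed, so it is dropped:
    print(plausible_obstacles([RadarReturn(80.0, 25.0)], ego_speed_mps=25.0))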

Musk has commented publicly before (though somewhat obliquely) about this
flaw. He indicated that the company is trying to build a map of reference data
so that known static objects can be filtered out automatically and real
hazards can be detected.

~~~
skywhopper
Mapping can never solve this problem. If these cars don’t yet have the ability
to detect stationary physical barriers that represent a crash risk, then they
are further away from being practical than I thought, and I’m extremely
pessimistic. It is clearly a hard problem to solve with naive tech.
Unfortunately we have a huge industry and tens of billions of dollars and
politicians, corporations, governments, and media who all believe this naive
tech is close to perfect. But it can’t see a wall or a fire truck in its path?
If this isn’t solved then this whole house of cards will fall apart. The
sooner the better, IMO. Let’s start building cars that supplement driver
awareness instead of numbing it.

------
ilamont
_“We believe in transparency, so an agreement that prevents public release of
information for over a year is unacceptable.”_

As noted in the article, the information being released includes blaming the
victim and other PR spin. This doesn't serve the public or further highway
safety; it furthers Tesla's commercial goals.

If Tesla's withdrawal hampers the investigation, let's hope the agency uses
its subpoena power to get the info it needs to determine the cause of the
accident and what Tesla needs to do to prevent similar incidents from
happening in the future.

~~~
valuearb
Is there any indication that the information that Tesla released wasn't true?

And Tesla said they'd continue to cooperate with the investigation, so why do
you think a subpoena is suddenly needed?

~~~
rohit2412
Have you looked at that information? Here, I'll get it for you from
[https://www.tesla.com/blog/update-last-week%E2%80%99s-accident](https://www.tesla.com/blog/update-last-week%E2%80%99s-accident)

> Over a year ago, our first iteration of Autopilot was found by the U.S.
> government to reduce crash rates by as much as 40%.

Why don't they talk about Autosteer alone instead of Autosteer+AEB (and other
features combined)? Why not discuss how many accidents could have been caused
by Autopilot if the driver had not corrected it? Why is there no commentary on
whether their active driving system would be better run as a passive driver-
error-correction system, as other manufacturers do?

> In the US, there is one automotive fatality every 86 million miles across
> all vehicles from all manufacturers. For Tesla, there is one fatality,
> including known pedestrian fatalities, every 320 million miles in vehicles
> equipped with Autopilot hardware. If you are driving a Tesla equipped with
> Autopilot hardware, you are 3.7 times less likely to be involved in a fatal
> accident.

Ahh, so a Tesla is safer than a ten-year-old beaten-down car with no airbags
driven on country roads; I never expected that. Tesla is safer than even a
motorbike, what a surprise. And just having Autopilot in your car (even
disabled) makes you 3.7 times less likely to die. Does that say anything
about Autopilot, or are they just touting non-Autopilot premium safety
features?

> There are about 1.25 million automotive deaths worldwide. If the current
> safety level of a Tesla vehicle were to be applied, it would mean about
> 900,000 lives saved per year.

So, motorbikes driven in the Himalayas are more deadly than a Tesla on
California highways, who would have guessed. What does this say about
Autopilot again?

> We expect the safety level of autonomous cars to be 10 times safer than non-
> autonomous cars.

Elon Musk also expected driverless cars to be everywhere in 2017. And
autonomous doesn't need to mean hands-off driverless cars; they keep
conflating assistive-feature benefits with their hands-off technology.

~~~
valuearb
\- The average US fatality is not in a ten-year-old car on a country road.

\- US crash rates are lower than worldwide rates, so Tesla’s estimate of lives
saved worldwide is low.

\- If Autopilot is only used part of the time, that implies its safety benefit
is significantly more than 3.7x. A Model S is not 3.7x better at passive
safety than other cars.

Of course, none of this data is very meaningful. You would have to compare the
rates of Autopilot users to non-Autopilot users of similar demographics, and
have all crashes recorded, not just fatalities, to get close to a comparative
sample.
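
As a rough check on the arithmetic: the mileage figures below come from
Tesla's quoted post, while the Autopilot-engagement fraction f is purely an
assumed value for illustration:

    # Figures from Tesla's blog post quoted above.
    avg_miles_per_fatality = 86e6     # all vehicles, all manufacturers
    tesla_miles_per_fatality = 320e6  # Autopilot-equipped Teslas

    r_avg = 1 / avg_miles_per_fatality      # fatalities/mile, US average
    r_tesla = 1 / tesla_miles_per_fatality  # fatalities/mile, Teslas
    print(f"headline ratio: {r_avg / r_tesla:.1f}x")  # -> 3.7x

    # Suppose a fraction f of Tesla miles are on Autopilot (assumed value),
    # and the remaining miles were at the US-average rate. The implied
    # Autopilot-only rate would then be:
    f = 0.3
    r_autopilot = (r_tesla - (1 - f) * r_avg) / f
    print(r_autopilot)  # negative - impossible, so the non-Autopilot Tesla
                        # miles cannot be at the average rate; the comparison
                        # is confounded by vehicle and driver demographics.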

~~~
idoubtit
> US crash rates are lower than worldwide rates, so Tesla’s estimate of lives
> saved worldwide is low.

Road fatalities per 100,000 inhabitants per year (2013, WHO):

        UK      2.9
        Spain   3.7
        Germany 4.3
        France  5.1
        USA    10.6

Among rich countries, the US crash rate is one of the worst. With a similar
set of vehicles, this rate varies considerably, per country and per year.
Hint: the problem may not be purely mechanical.

Seriously, would Tesla's autopilot lower the crash rate in the UK? In
Thailand? I guess it wouldn't change much in European countries, and any good
car would do the same in poor countries.

~~~
kps
> _I guess it wouldn't change much in European countries_

Why not?

Via
[https://en.wikipedia.org/wiki/List_of_countries_by_traffic-related_death_rate](https://en.wikipedia.org/wiki/List_of_countries_by_traffic-related_death_rate),
here is another table (same countries, same order) giving road fatalities per
10⁹ km in 2015:

        UK      3.6
        Spain   7.8
        Germany 4.9
        France  5.8
        USA     7.1

------
hyperrail
The headline has been updated to say that:

> _Tesla Was Kicked Off Fatal Crash Probe by NTSB_

> The National Transportation Safety Board told Tesla Inc. on Wednesday that
> the carmaker was being removed from the investigation of a fatal accident,
> prior to the company announcing it had withdrawn from it, according to a
> person familiar with the discussion.

> NTSB Chairman Robert Sumwalt relayed the decision in a call to Tesla’s Elon
> Musk that was described as tense by the person because the chief executive
> officer was unhappy with the safety board’s action. NTSB is expected to make
> a formal announcement in a release later Thursday, said the person, who
> spoke on the condition of anonymity.

~~~
mcguire
" _Tesla, in a statement issued late Wednesday, suggested the company chose to
leave the probe._ "

Technically, that's sorta true. By releasing information and opinions about
the ongoing investigation, Tesla decided not to be a part of it.

~~~
avs733
That is, terrifyingly, the same logic my friends use with their young
children.

~~~
8bitsrule
Only in this case, it's like a parent giving a kid 'a good shaking' to 'wake
them up'.

Any 'artificially intelligent' car that can't avoid running straight into
something solid right in front of it is dangerously stupid. Elon's age is
starting to show.

~~~
avs733
The age comment manages to be both a non sequitur and an ad hominem, so let's
ignore that.

The issue at play here is far larger than a car having a crash. The issue I
was referring to is Tesla's actions as it responds to the crash. That is
cultural, and it affects both the process of creating and releasing products
and the process of responding to their failure. If your product can kill
people - and let's be clear, cars can easily kill people, SDV or not - you
need to act like professionals when something happens. Playing the blame game
in public and violating the expectations of a body like the NTSB shouldn't
garner trust, no matter who is right at a technical level.

EDIT: fixed my abysmal grammar.

------
dsabanin
> Tesla said that the Model X driver’s hands weren’t on the steering wheel for
> six seconds prior to the fatal crash

Call me a sceptic, but stating that with absolute certainty is troubling. As
an engineer who knows how a lot of software and hardware systems are actually
implemented, six seconds is such a minuscule time frame. Data about hand
position could be sitting in a write cache somewhere waiting to be flushed:
inside a monitoring service, the OS, a RAID controller, a storage controller,
or even de-prioritized by a busy IO scheduler. At the moment of a crash, that
data might never make it to storage for recovery.
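
A toy illustration of the buffering concern, using generic Python file IO
(nothing here is claimed about Tesla's actual logging stack):

    import os

    # Buffered writes sit in user-space and OS caches until they are
    # flushed. If the process (or the whole vehicle) dies first, the tail
    # of the log is simply gone.
    log = open("hands_on_wheel.log", "w")  # hypothetical log file
    log.write("t=0.0 hands=ON\n")
    log.write("t=0.5 hands=ON\n")
    # Without these two calls, a crash right now could lose both records:
    log.flush()             # push Python's buffer down to the OS
    os.fsync(log.fileno())  # ask the OS to push its cache to the device
    log.close()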

I'm not saying that it is, but if the hand-monitoring service is glitching
out, it will take a lot of time to resolve the issue completely, mostly
because to a remote observer just looking at the data (without a camera
installed in the car), it would simply look like the hands were not on the
wheel.

Also, it doesn't help that for Tesla this seems to be a convenient way out of
these kinds of crashes - they can just say the driver's hands were not on the
wheel and that's that. Who even has the data and can decode it? How can we be
sure the data was not tampered with?

This needs to be investigated properly and not just by Tesla, they are a
biased party.

~~~
jjoonathan
> As an engineer, knowing how a lot of software and hardware systems are
> actually implemented, 6 seconds is such a minuscule time frame.

It's a geological amount of time to a real-time system, which anything related
to controlling a car absolutely will be. "Multiple seconds of unpredictable
latency are OK" is an attitude that doesn't get you far when interacting with
the physical world, and that's why there is an entire discipline dedicated to
dealing with it that has to use special chips (e.g. the ARM R series), special
OSes (e.g. VxWorks), and special tools (special IDEs, oscilloscopes, special
cameras) to rise to the occasion.

If they let a "latency doesn't matter" system near any of these data streams,
_that 's_ the scandal.

~~~
dsabanin
I do understand that, and that may be the case, but I don't think it will
surprise anyone if there's a piece of nodejs somewhere handling something it
shouldn't have, because someone just needed to ship. We don't know the quality
of their code base and hardware or of all the code bases and hardware modules
provided to them by 3rd parties.

~~~
sidlls
I would be surprised if nodejs were anywhere near the real-time, safety-
critical systems here. I'd also not buy a Tesla in that case.

~~~
jaggederest
I've seen so many pieces of safety-critical hardware BSOD at this point that I
would not be at all surprised.

~~~
radix07
I don't think you know what actual safety-critical hardware is if you are
running Windows or even Linux on it...

~~~
andrewflnr
Well, obviously the maker of said hardware doesn't know what safety critical
hardware means, but that doesn't mean it doesn't exist.

------
jaclaz
It is interesting (to me at least) to compare the "first update":

[https://www.tesla.com/blog/what-we-know-about-last-weeks-accident](https://www.tesla.com/blog/what-we-know-about-last-weeks-accident)

with the "last update" (the one that triggered the NTSB reaction):

[https://www.tesla.com/blog/update-last-week’s-accident](https://www.tesla.com/blog/update-last-week’s-accident)

as it makes it easy to identify the "boilerplate" parts. But, more than that,
the "first update" closes with:

"Out of respect for the privacy of our customer and his family, we do not plan
to share any additional details until we conclude the investigation.

We would like to extend our deepest sympathies to the family and friends of
our customer."

So, _logically_, either the respect for the family vanished or Tesla
concluded their investigation (in three days' time).

~~~
slivym
I think, as mentioned in the article, the family started making public
statements and Tesla figured "Oh, you don't want PRIVACY (read: to STFU)? Then
we're going to be TRANSPARENT (read: slander your dead relative to boost our
share price)."

------
icc97
This goes from bad to worse for Tesla. It's all related to the original PR
statement, but I'm guessing that Tesla has taken further actions to anger the
NTSB.

I don't understand the change. They already co-operated with the NTSB on the
first Autopilot death (back in 2016) [0], with a much simpler PR statement [1]
that didn't upset the NTSB.

[0]: [https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF](https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF)

[1]: [https://www.tesla.com/blog/tragic-loss](https://www.tesla.com/blog/tragic-loss)

~~~
Someone1234
Tesla has made three statements about this accident:

\- Original: [https://www.tesla.com/blog/what-we-know-about-last-weeks-accident](https://www.tesla.com/blog/what-we-know-about-last-weeks-accident)

\- Update: [https://www.tesla.com/blog/update-last-week%E2%80%99s-accident](https://www.tesla.com/blog/update-last-week%E2%80%99s-accident)

\- Newest: [http://abc7news.com/automotive/exclusive-tesla-issues-strongest-statement-yet-blaming-driver-for-deadly-crash/3325908/](http://abc7news.com/automotive/exclusive-tesla-issues-strongest-statement-yet-blaming-driver-for-deadly-crash/3325908/)

None of these address the multiple videos of people reproducing the accident
at the same and different locations, and none takes any responsibility for the
likely bug (lane centering in non-lanes) introduced in an over-the-air update.

Even Reddit's Tesla Motors sub, which used to be a 24/7 Tesla celebration, has
been pretty negative about Tesla's handling of this incident, and I'm talking
regular posters and verified Tesla owners.

------
danepowell
Tesla's terrible response to this and the detrimental effect on its image must
be pretty obvious to most of the HN crowd. I have to wonder why they persist.
Is their PR spin actually having a positive effect on the public at large?

~~~
JBReefer
Culture (like shit) runs downhill.

Mr. Musk seems like he's becoming increasingly erratic and controlling, and
there's a record of him being unable to take criticism. This is a pretty
intense form of criticism.

~~~
craig1f
Your comment seems like trolling.

~~~
rohit2412
You didn't get the narcissism over the Twitter spat with that urban planner?

------
NelsonMinar
NTSB does an excellent job in its investigations in getting to the root cause
of accidents. The only way they can do that is in a depoliticized atmosphere
where they can focus on getting to the truth of something, not worrying about
one of the parties to the accident spinning things for their commercial
advantage. Tesla trying to corrupt an NTSB investigation like this is the
worst kind of bad faith. Particularly for a technology as important as self-
driving cars.

~~~
unityByFreedom
> NTSB ... The only way they can do that is in a depoliticized atmosphere
> where they can focus on getting to the truth of something

And, if there is evidence Tesla has been negligent, the only way _they_ can
win is by not being forced to release this evidence, perhaps by attacking the
NTSB as being biased against Tesla.

------
timewarrior
I see many comments wondering why Tesla does not recognize stationary objects
when it has a forward-looking radar.

The radar can't distinguish between overhead signs, trees, and things on the
side of the road. It only tracks moving things; otherwise there would be too
many false positives and the car would keep braking every few minutes. Other
Level 2 autonomous systems have the same limitation.

You might recall that a Tesla hit a stationary fire truck in January [0]. This
was caused by a similar limitation in their software. The Tesla was following
a car at a slow speed. When that car suddenly moved out of the way to avoid
hitting the fire truck, Autopilot accelerated to match the set speed and hit
the fire truck. To summarize: if a moving car suddenly moves out of the way, a
Tesla will almost always hit whatever is stopped further down the lane. They
even call this out in their user manual.
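
A simplified sketch of the cruise logic being described - generic, assumed
adaptive-cruise behavior, not Tesla's code:

    from collections import namedtuple

    # Hypothetical tracked lead vehicle: its speed and whether radar sees
    # it as a moving target (stationary returns were already filtered out).
    Lead = namedtuple("Lead", ["speed_mps", "is_moving"])

    def cruise_target_speed(set_speed_mps, lead):
        # Follow a tracked moving lead vehicle; otherwise resume set speed.
        if lead is not None and lead.is_moving:
            return min(set_speed_mps, lead.speed_mps)
        # A stopped truck ahead was dropped as a stationary return, so from
        # the controller's point of view the lane is clear: accelerate.
        return set_speed_mps

    print(cruise_target_speed(30.0, Lead(10.0, True)))  # following: 10.0
    print(cruise_target_speed(30.0, None))              # cuts away: 30.0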

Tesla is hoping to get its vision capabilities to the point where they can
figure out obstructions using cameras working in conjunction with the radar.
But there is no committed timeline for this.

LIDAR avoids this by using directed lasers to build a 3D outline of all the
surrounding objects. A system relying on LIDAR would have challenges in rain
and falling snow. However, under normal conditions (where most Tesla deaths
with Autopilot have occurred - hitting a semi from the side, hitting the
barrier recently), LIDAR would be able to completely avoid these accidents.

However, if and when Tesla's vision capabilities are able to detect
obstructions, they might be much cheaper and more effective than LIDAR in all
kinds of weather conditions. I hope they can get there soon. I also wish they
would fix their marketing to more accurately reflect the capabilities of the
driver-assist system and stop misleading the public.

[0] [https://www.mercurynews.com/2018/01/22/tesla-on-autopilot-slams-into-parked-fire-truck-on-freeway/](https://www.mercurynews.com/2018/01/22/tesla-on-autopilot-slams-into-parked-fire-truck-on-freeway/)

~~~
wilburTheDog
If this is true, then how have we had collision avoidance systems[1] in cars
for years that are able to automatically apply the brakes to prevent this sort
of accident? And if the answer is "autopilot doesn't work that way", then why
not use both systems, with the collision avoidance system taking precedence as
it deems necessary?

1\.
[https://en.wikipedia.org/wiki/Collision_avoidance_system](https://en.wikipedia.org/wiki/Collision_avoidance_system)

~~~
timewarrior
All the systems which work on radar have similar limitations. More details:
[https://www.wired.com/story/tesla-autopilot-why-crash-radar/](https://www.wired.com/story/tesla-autopilot-why-crash-radar/)

~~~
wilburTheDog
After reading about this for a while just now, I'm much less comfortable with
the idea of trusting an automotive autopilot system. Just imagine if your taxi
driver told you, "I will drive perfectly safely, except if there is a truck
parked in the road. Then I will run into it at speed." It seems like such a
glaring omission. If I were an engineer, I don't think I would be comfortable
releasing an autopilot system with this kind of safety issue.

~~~
azernik
Which is why most companies aren't willing to call these things "autopilots".
Mobileye, for example, pitches its systems as assistants which will catch some
of the driver's mistakes, similar to the "collision avoidance" language. This
also comes with the corollary that they often aren't willing to give these
systems as much control of the car as Tesla does.

------
thisisit
I am surprised at the PR Silicon Valley companies are doing these days. First
it was Facebook and now Tesla. They keep trying to avoid responsibility by
blaming someone other than themselves. How hard is it to just say "we are
looking into it" and do a comprehensive test before making a statement?

~~~
skwirl
What if Tesla isn't responsible? I assume you don't expect them to sit quietly
for a year+ while they are ravaged in the media in that case. What is the
correct course of action?

I feel like some of the criticism of tech companies has gone from thoughtful
to kneejerk lately.

~~~
greedo
The correct course of action is to follow the NTSB guidelines for
participating in a probe, and to not continually attack the accident victim in
the press.

Tesla could survive this accident easily if they handle it properly. Keep
mishandling it, and Tesla could be in a world of hurt.

------
dekhn
Tesla's actions in this case, and before, have convinced me they are not a
company I would want to buy a car from.

~~~
gkya
Tesla has done one really important and useful thing: demonstrate that fully
electric cars are not only possible, but also feasible. The rest has been a
shitshow. They should've focused on range instead. But oh, this is one of the
companies meant to fund the incredibly useful mission of sending speaking
monkeys to Mars to see if Mr. Musk can go there too.

------
jvolkman
Letter from NTSB to Tesla regarding its removal:
[https://www.ntsb.gov/investigations/Documents/HWY18FH011-Tes...](https://www.ntsb.gov/investigations/Documents/HWY18FH011-TeslaPartyRemovalNotificationLetter-041218.pdf)

------
_Codemonkeyism
Essentially, a Tesla tries to kill you by following the wrong road markings
over and over again (as remarked by the wife of the dead driver). Tesla puts
the onus on you to detect, every time, when the car wants to kill you and to
act fast enough. If you like that, buy a Tesla.

------
Semirhage
If there was any doubt that Tesla is a bad actor in the self-driving space,
this settles it. Imagine if Boeing or Lockheed did this nonsense after a fatal
plane crash!

~~~
joshstrange
Yes, what a shame it would be if Boeing made public comments about something
being investigated by the NTSB... Oh wait, they did [0], and the NTSB decided
not to kick them off...

> The agency stopped short of restricting Boeing’s access to its investigation
> ... In some previous cases, the NTSB has gone so far as to throw
> participants off an investigation for violating its rules. In December 2010,
> the safety board removed American Airlines Inc. from an investigation into a
> runway accident in Jackson Hole, Wyoming, two days earlier.

[0] [http://www.heraldnet.com/business/ntsb-scolds-boeing-over-787-comments/](http://www.heraldnet.com/business/ntsb-scolds-boeing-over-787-comments/)

~~~
dingaling
Yes, they did; once. Which is far, far outweighed by the hundreds of
investigations in which they did not.

The exception which proves the rule.

What's Tesla's batting average? 50%?

~~~
bloodorange
So many keep quoting this but I fail to see how exceptions prove rules. In
fact they contribute to disproving whatever it is that they are an exception
for.

~~~
stordoff
I believe the original meaning is more like if there is an exception, there
must be a rule (otherwise it wouldn't _be_ an exception/you wouldn't need an
exception). It's been watered down a lot, and is often used to suggest that
any counterexample makes a rule stronger (which is obviously not always the
case -- enough counterexamples just proves the rule is wrong/doesn't exist).

------
RcouF1uZ4gsC
Given how Tesla has handled this fatality, how has your opinion of Tesla
changed? Has anybody here decided that they are never going to buy a Tesla? Or
do you view this as just overblown, and would you still buy a Tesla if you
could?

~~~
Analemma_
> Given how Tesla has handled this fatality, how has your opinion of Tesla
> changed.

Just one data point here, but it absolutely has tanked Tesla's reputation in
my mind. Even though I was always skeptical of them from a business
perspective, I thought they were basically well-meaning and their only crime
was being too enthusiastic. These last few months have convinced me that
they're no better than Uber: they're being reckless with unproven and inferior
technology (Elon's spin notwithstanding, there are good reasons why every
other autonomous car program is using LIDAR), and they're trying to use
disingenuous PR to pass off the inevitable deaths as someone else's problem.

I was vaguely considering getting a Model 3 when the waitlist eased up a
little, but now that's definitely not going to happen.

------
bjl
Misleading title. Tesla was forcibly kicked off the investigation by the NTSB.

~~~
Element_
That was the original title of the article at the time of submission; after
more details came out, Bloomberg updated the title.

------
inverse_pi
Tesla's Autopilot system is no different from a self-driving car with a safety
driver and should be subjected to the same regulation. In a self-driving car
(Waymo, Cruise, etc.), the driver turns on autonomous mode when the driver
thinks it's safe to do so. All drivers are thoroughly trained for the safety
of themselves and other people on the road. Companies must obtain permits to
have cars on the road and report annually on the safety of the system. I'm not
comfortable driving on the road knowing Tesla's drivers are not trained and
the company is not subjected to the same laws as self-driving systems, even
though the drivers can turn on Autopilot ANYTIME they want, and the car can do
WHATEVER.

~~~
fastball
Drivers actually cannot turn on Autopilot anytime they want: it prevents you
from doing so in noticeably adverse conditions.

------
282883392
While Tesla's response to the accident has been very aggressive (to the point
of being rude), transparency is also important. Not being able to talk about
an accident for an entire year would be a disservice to the community and
would most likely lead to rampant speculation, considering how high-profile
these accidents are.

In this case Tesla has handled the entire incident rather poorly, but it makes
sense that they would want to withdraw from the investigation in order to be
more transparent about what happened. Keep in mind that a year from now, the
data about the accident will be mostly irrelevant.

~~~
Analemma_
> transparency is also important.

It's pretty misleading to defend Tesla in the name of transparency: so far,
Tesla has been nothing but opaque regarding this incident. There's a whole lot
of spin, unsupported claims, blaming the driver, and refusing to acknowledge
Autopilot problems like veering into crash barriers post-update. That's the
whole reason _why_ the NTSB "fired" them: they were trying to create confusion
around a transparent investigation.

------
mtgx
It's frustrating to see that car companies see themselves as "partners" to the
NTSB or other investigators.

Self-driving car companies should have only one obligation: provide all the
relevant crash data to the investigators, then step out of the way and keep
quiet until the investigation concludes.

I've been warning for a couple of years that this was going to be an issue if
not solved early on, and that self-driving car companies need to be _liable_
for car crashes (whenever they're the party found at fault by investigators).
But we haven't even established this.

Also, I think it's inevitable that any "self-driving" that isn't certified by
a government body to be Level 4 or higher will have to be banned from public
roads. Sub-Level 4 self-driving is just not safe, and the very situations in
which carmakers argue it would help reduce accidents are incompatible with
these companies' own ToS.

For instance, let's say someone is drunk and has a Level 2 car (like a Tesla).
According to Tesla, such a person should enable self-driving, as that will _on
average_ reduce the likelihood of crashes compared to the drunk person driving
themselves home.

But Tesla's own ToS says such a person would need to be constantly vigilant,
with their hands always on the wheel, not falling asleep, and so on. How
likely is that with a drunk person? My guess is not very likely.

If Autopilot malfunctioned as it did in the latest crash, that drunk person
would die, because there's no way a drunk person could take the wheel and
drive safely when even sober people can't seem to do it once they stop paying
attention for a while.

~~~
neurotech1
> It's frustrating to see that car companies see themselves as "partners" to
> the NTSB or other investigators.

That is how the NTSB "party" system is run[0]. Contrary to what is shown in
various movies, e.g. Flight (2012) and Sully (2016), an NTSB investigation is
not an adversarial process, and it is not a criminal investigation.
Additionally, the NTSB's determined cause (an opinion of the board) cannot be
used as evidence in civil litigation.

Asiana 214 [2] is a good example of the NTSB being somewhat divided between
the opinion that the 777 autopilot had a design flaw and "pilot human
performance" as the cause of the crash.

IMHO both the autopilot design and the pilot performance were causal factors
in the crash. It is probable the NTSB will determine both the Tesla driver and
the autopilot were causal factors in this crash.

[0]
[https://www.ntsb.gov/investigations/process/Pages/default.aspx](https://www.ntsb.gov/investigations/process/Pages/default.aspx)

[1]
[https://www.ntsb.gov/legal/Documents/NTSB_Investigation_Party_Form.pdf](https://www.ntsb.gov/legal/Documents/NTSB_Investigation_Party_Form.pdf)

[2]
[https://en.wikipedia.org/wiki/Asiana_Airlines_Flight_214](https://en.wikipedia.org/wiki/Asiana_Airlines_Flight_214)

------
iamleppert
Riding in a Tesla with autopilot turned on is like driving with a new teenage
driver: you’re terrified at any minute it’s going to do something crazy like
steer you into a concrete wall.

There are certain situations that you can quickly get into via steering inputs
to the car that cannot be recovered from, so keeping your hands on the
steering wheel as an argument for legal liability is moot: before you even
realize what's happening and can take over to apply corrective action, it will
oftentimes simply be too late.

Additionally, most people, save for professional drivers, do not have enough
experience to know _what_ corrective steering inputs to perform during an
emergency situation. In many cases, any corrective action via steering, brake
or acceleration control input from the human driver will be incorrect and thus
lead to an unpredictable and dangerous situation.

By this logic, Tesla needs to be held accountable for any accidents where they
argue human intervention would help.

~~~
fastball
Literally moving the wheel or stepping on the gas stops Autopilot; I'm not
sure how it can be unknown whether such an action will be "correct".

------
avs733
I always see this framed as between Tesla and the driver of whatever vehicle
has crashed. That framing is irresponsible and should stop.

When a Tesla owner chooses to use Autopilot, it has a potential impact on
other drivers and pedestrians around them, not just the driver. Part of the
reason we have regulations about seatbelts and airbags isn't just that they
keep passengers safe; it's that they make crashes safer for others on the
road. With a human driver, others are on the same page and can do things like
make eye contact or assess the behavior of another driver. In an accident,
they can hold that driver responsible. It is not other drivers' or
pedestrians' role to accept the risk for a Tesla driver who may or may not
understand the specific nuances of a specific software patch. That's why we
regulate vehicles and license drivers. As a society we choose to do that
because the organizations that do so are an enormous efficiency and capability
improvement in assessment compared to asking each individual driver to do the
same.

I don't really want to make this political, but frankly the seeming lack of
understanding strikes me as a very software/app/tech mindset and very much a
product of its intermingled libertarian leanings. Cars are not software.

~~~
azernik
> Cars are not software.

Software is also not software (in this framing). I think the Facebook
elections fiasco has shown that even something as innocuous as social media
has consequences beyond the strictly user<->company relationship.

------
wilburTheDog
So I can't help but wonder: if the Tesla Autopilot system has forward-looking
radar with a 160 m range, why wouldn't it notice it was about to drive into
something and apply the brakes?

I've read that these systems work by assigning a probability to each of the
sensor inputs and choosing the most likely guess at what's really happening
(or something like that). But shouldn't even a low probability of "you're
about to crash and die" carry more weight than a high probability of "it looks
like the road goes this way"?
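
A toy sketch of that cost-weighted view (all numbers invented for
illustration):

    # Expected-cost view: a low-probability catastrophic hypothesis can
    # and should outweigh a high-probability benign one.
    hypotheses = [
        # (name, probability, cost of ignoring it if true)
        ("road curves left",    0.90,      1.0),
        ("solid barrier ahead", 0.10, 100000.0),
    ]
    for name, p, cost in hypotheses:
        print(f"{name}: expected cost of ignoring = {p * cost:.1f}")
    # -> 0.9 vs 10000.0: even a 10% belief in a barrier dominates the
    #    decision, arguing for braking despite the "most likely" reading.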

------
JustJay613
I neither like nor dislike Tesla, but most of you are missing the point. The
cars are semi-autonomous, not fully autonomous. They are hardware-ready and
will one day get a firmware upgrade to match. It is very, very clearly stated
that you are still responsible for driving the vehicle. If you choose to focus
on your phone or watch a movie or read a book, etc., well, bad things can
happen. Slamming on the brakes or taking evasive action is not appropriate for
a semi-autonomous car to do. What if the Tesla driver had survived but an
innocent person in another car had been killed? Wow, would everyone be
screaming. We always like to blame something and not take responsibility. Last
year a woman made a left-hand turn in front of a slow-moving transport as I
was driving in the lane beside the transport. She drove right into the
driver's door and then tried to tell the police it was my fault. Until
something is proven otherwise, this looks like someone taking the tech for
granted and not watching the road, in spite of instructions and warnings.
Tesla is in no way at fault at this time.

------
fastball
Many people in these threads are stating very loudly "well if Tesla didn't
want to be culpable they shouldn't call it Autopilot".

Two things:

1\. Regardless of the name, Tesla has never marketed this as a product which
allows you at any point to stop paying attention to the road or take your
hands off the wheel. At least to my knowledge.

2\. The name. "Autopilot". The only time I'd ever heard the term "autopilot"
before Tesla used it was in airplanes. I'm not entirely sure how everyone is
forgetting this, but planes have had "autopilot" for decades and _all_
passenger planes still have pilots. Commercial jets have TWO pilots (at least)
who are highly paid and have a huge amount of flight time, in order to ensure
that _someone_ is paying attention _at all times_. All of these aforementioned
jets have "autopilot". Tesla is absolutely not holding their drivers to a
higher standard than they should, given the name "autopilot". I'm not sure why
this is so often parroted, but it's nonsense.

~~~
productiveacct
Autopilot means self-piloting. How many average people know the extent to
which autopilot on airplanes is used, and what exactly it does? I would
imagine a good number of people think that autopilot does everything and that
the pilot is just there in case of an emergency. If you translate that to
cars, given the number of people with driver's licenses (not exactly a
specialty like being a pilot is), I think it's dangerous to use the name
Autopilot.

~~~
fastball
"Autopilot" has never been used to describe a system which performs the
entirety of control functions on its own. Ever. Autopilot is an aeronautics
term, and every system ever described as autopilot has meant the ability to
maintain a course. That is all.

In actual fact, because Tesla's Autopilot has lane change capability, it
actually goes above and beyond the conventional definition[1] of "Automatic
Pilot".

Anyone with half a brain doesn't even expect a "hoverboard" in 2018 to
actually hover, and that _is_ a bastardization of the term. I'm honestly
struggling to understand why people are having such a hard time with this.

1: [http://www.dictionary.com/browse/automatic-pilot](http://www.dictionary.com/browse/automatic-pilot)

------
danso
A little OT, but given the nature of this bug, I'm surprised we haven't heard
of many more of these types of accidents, considering the number of highways
on which road lines may be ambiguous. Is it a problem of underreporting? Are
all the other Tesla drivers just that much more attentive when using
Autopilot?

------
tomohawk
Why would a party like Tesla, with an obvious conflict of interest, be part of
the investigation? They should be on the receiving end of the investigation,
not part of it. It's like having Enron be part of the investigation.

It really doesn't matter what spin Tesla puts on this. Their car drove into a
concrete wall. They can talk about hands on the wheel or off. They can talk
about how it's only called autopilot, but it really isn't. It doesn't change
the fact that their software drove the vehicle into a concrete wall. Was it
asleep? Perhaps it had an attention lapse? Perhaps it mistook a solid,
immovable object for something else?

If you have something called autopilot, the one thing it should never, ever do
is drive into a solid, immovable object that is easy not to hit.

------
hirundo
> Companies that no longer have formal status as a party to an NTSB
> investigation can lose access to information uncovered in the probe and the
> ability to shape the official record of the incident ...

It's absurd that the normal process does give companies the ability to shape
the official record of the incident. They should never be a party to the NTSB
investigation to begin with, because they cannot be neutral with respect to
their own acts and omissions. Their role should always have been what it is
now: technical assistance.

------
thetruthseeker1
I like Tesla. I hope they succeed as a company. That said, their claim that
the driver had his hands off the steering wheel for 6 seconds, implying that
they are absolved of any wrongdoing, seems hard to buy. They may eventually be
absolved of wrongdoing legally; however, I think that statement is bad from a
PR point of view, in addition to casting a bad light on the Autopilot tech (it
got confused within 6 seconds of him taking his hands off)!

------
Robotbeat
Here's the thing about autopilot all the way up through self-driving cars:

A perfect system would never have a fatality. EVERY problem that ends in
fatality can ALWAYS be traced to some flaw or inadequacy. Every. Single. One.

That is why self-driving cars will eventually be far safer than any human
driver (because every fatality can be stopped), and why the natural human
tendency will be to crucify any company that attempts to enter the
market[0][1], which will mean millions more unnecessary deaths at the hands of
human drivers, because we'll delay deploying self-driving technology until
it's perfect.

[0]Even if they, accurately, point out the driver is still responsible and
that overall safety is improved. Accuracy doesn't matter, emotional resonance
does. That's victim blaming!

[1]At the same time, this intense focus--while it's "unfair" given the masses
of car deaths every day--is also what will drive the massive improvements in
safety. So the inaccurate outcry can actually be a good thing, provided the
original player doesn't give up or go out of business first. This dynamic
helps explain why airlines in the developed world are so _ridiculously_ safe
(...but also perhaps why airliner technology has been stagnant for half a
century, with only safety and incremental efficiency/cost improvements).

~~~
cma
> A perfect system would never have a fatality. EVERY problem that ends in
> fatality can ALWAYS be traced to some flaw or inadequacy. Every. Single.
> One.

I've been in a wreck on the highway from a random tire blowout. It happens all
the time. While an AI will ultimately be able to handle that situation far
better than a human, I don't see it eliminating all risk without just always
driving slower.

~~~
Robotbeat
>...random tire blowout...

I'm not saying your point isn't valid, however...

[http://spacecraft.ssl.umd.edu/akins_laws.html](http://spacecraft.ssl.umd.edu/akins_laws.html)

Akin's Laws of Spacecraft Design:

"25\. (Bowden's Law) Following a testing failure, it's always possible to
refine the analysis to show that you really had negative margins all along."

Your tire blowout wasn't "random" from this perspective; it occurred due to a
failure at some point: inadequately tight manufacturing tolerances, a
regulator or system integrator that accepted such tolerances, a random thing
in the road (which AI could've picked up on), a temperature spike (which AI
could've picked up on), a change in tire dynamics immediately preceding the
incident which you didn't notice (which AI could've picked up on), incomplete
inspections, etc.

Even if that specific failure is NOT directly something that AI would solve,
the elimination of basically every human driving error would bring focus to
any other problems. Right now, random tire blowouts are not the predominant
cause of death and so people don't focus much on it, but if all other causes
were eliminated, then all of a sudden everyone would be focusing on better
engineered tires.

This is the same dynamic as in airline safety. Every airliner failure in
developed countries is plastered over the media for months simply because it's
so rare, therefore any remaining deficiency is brought into tight focus.

I can see the headline: "Tesla owner DIES in random tire blowout! Shoddy
quality control to blame; sharks circle as Tesla tanks!"

(High media coverage is a double-edged sword for Tesla...)

> I don't see it eliminating all risk without just always driving slower.

...it's worth noting that airliners use tires at hundreds of miles per hour,
yet fatalities basically never happen. Things can be engineered to be
arbitrarily safe.

~~~
cma
If you are talking about improving the crash safety of cars in general by
making structural changes and inventing new tire chemistry and manufacturing
processes, that is a separate issue from self-driving, to the extent it would
improve safety for regular drivers too.

> ...it's worth noting that airliners use tires at hundreds of miles per
> hour, yet fatalities basically never happen. Things can be engineered to be
> arbitrarily safe.

Aircraft are protecting all their eggs in one basket. If you want to talk
about roadways being maintained and cleaned at the level of runways, you're
crediting self-driving for something that isn't self-driving. Aircraft tires
may also have different structural scaling laws in play (roughly 500 twenty-
second touchdowns before replacement/refurbishment; short periods of high
stress).

And how far are you going to allow re-engineering everything? Would it be OK
to call slightly modified 747s that can taxi themselves self-driving cars?
Trains?

------
dingo_bat
I wish more federal agencies did this to more arrogant corps. People have
died, and Tesla is responding by blaming the dead for everything! Not even a
semblance of a good investigation.

Elon Musk should intervene and apologize if he has even a bit of humanity.

------
memeweaver
Autopilot is a terrible marketing term. It should not be trusted in any sort
of complex highway situation. Reliance on road lines is so risky...

------
gpm
Tesla's done a shitty job of handling PR related to this crash. But I think
the NTSB is in the moral (not legal) wrong on this particular issue.

The NTSB's job is to investigate accidents and, as part of that, to maintain
its own reputation. Its job is not to pressure companies into not saying
anything. By attempting to do so, it is just hampering its ability to
investigate the accident.

If Tesla had said "and the NTSB investigation is going in this direction," I
would have agreed with the decision. But as boneheaded as Tesla's PR has been,
it hasn't done anything of the sort; the NTSB has no reason to fear for its
own reputation as a result of this.

~~~
gsnedders
Their job often involves interviewing many people, and to do that and get
useful results you want them unbiased (for the same reason as you want a jury
that doesn't already know about the trial).

~~~
gpm
Sure, but how does working less closely with Tesla result in more unbiased
people?

~~~
azernik
A couple of things:

1\. Having Tesla keep their mouths shut results in more unbiased people.
Kicking Tesla off the investigation is in large part a _deterrent_, not a
corrective action.

2\. This change also involves giving Tesla less access to data that they might
leak. Information collected from the car's logs is under NTSB control (in past
cases, for example, American Airlines was punished for _making its own copy of
the black box_ before handing it over). You can reduce these issues by not
sharing information with an unreliable party.

~~~
gpm
> 1\. Having Tesla keep their mouths shut results in more unbiased people.
> Kicking Tesla off the investigation is in large part a deterrent, not a
> corrective action.

This purpose is in plain violation of the principles of freedom of speech and
the First Amendment. The government is not permitted, morally or legally, to
punish people for speaking. As such, I'm not counting it.

> 2\. This change also involves giving Tesla less access to data that they
> might leak. Information collected from the car's logs is under NTSB control
> (in past cases, for example, American Airlines was punished for making its
> own copy of the black box before handing it over). You can reduce these
> issues by not sharing information with an unreliable party.

This could be a legitimate purpose, but it seems so minor that short of the
NTSB mentioning it I'm skeptical. The gain doesn't seem to be worth the loss
of working closely with the manufacturer...

~~~
azernik
The government is absolutely allowed to limit speech on this level within the
bounds of the 1st Amendment. Fire-in-a-crowded-theater and all that.

~~~
gpm
"fire in a crowded theater" is largely repealed and now requires "imminent
lawless action" which certainly isn't the case here. Please read more about it
here [0], written by a lawyer. Ignore the headline, the author wasn't pleased
with it either [1].

Generally exceptions to free speech are extremely limited, "because we don't
want you to bias people" is _definitely_ not within an exception.

[0] [http://www.latimes.com/opinion/op-ed/la-oe-white-first-amendment-slogans-20170608-story.html](http://www.latimes.com/opinion/op-ed/la-oe-white-first-amendment-slogans-20170608-story.html)

[1] [https://www.popehat.com/2017/06/08/free-speech-tropes-in-the-la-times/](https://www.popehat.com/2017/06/08/free-speech-tropes-in-the-la-times/)

~~~
Fins
Jurors are routinely forbidden from discussing the case they are trying,
parties to a settlement can be barred from discussing it, etc., so there are
quite a few more limitations on "free speech" than that.

------
ibdf
I had the same thought after reading the news. It should be called pilot
assistant or something similar.

------
brisance
Tesla is using the orange dotard defense: "no collusion!" NTSB did right by
me.

