
Tesla's Autopilot Chief Keller Steps Down After Two Years - trimbo
https://www.bloomberg.com/news/articles/2018-04-26/tesla-says-autopilot-vice-president-keller-is-leaving-carmaker
======
nakedrobot2
I think the brunt of the "Tesla Autopilot Deaths" problem is not the tech, but
the marketing department (which has to include Elon Musk of course). This
"kind of autonomous driving" is absolutely deadly, by definition. It's never
going to work well. "The car will drive by itself, until it can't. So don't
pay any attention, until you need to save your own life. We'll give you 2
seconds' warning". I mean, really? I don't know how Tesla could possibly think
this is a good idea. By extension, anyone who actually uses or trusts
autopilot is basically forfeiting their life. I think it is cognitively harder
to use autopilot and pay attention, than to simply drive safely.

As far as the tech goes, I am in agreement with Elon. I think LiDAR is a short
term advantage, but the companies/stacks depending on LiDAR will reach a local
maximum from which they won't recover without totally starting again from
scratch. LiDAR is short range, low resolution, and extremely expensive. There
is no reason that visible light (+ IR) CMOS cameras can't do better by an
order of magnitude, at a lower cost.

~~~
NickM
_This "kind of autonomous driving" is absolutely deadly, by definition. It's
never going to work well._

Instead of just speculating, the NHTSA did an actual study on this, and they
reached the opposite conclusion. They investigated crash rates in Tesla
vehicles that had the original autopilot hardware installed, both before and
after autosteer was enabled via a software update, and found that autosteer
reduced accident rates by 40%.

When Tesla talks about how much lower their accident rate per mile is compared
to other cars, that's kind of statistical bullshit, since there are plenty of
other uncontrolled variables that are in play that could affect this besides
the cars themselves. But in the case of the NHTSA study, they were looking at
the same drivers in the same cars, and the only difference was that one day a
software update came out and enabled autosteer, and suddenly accident rates
went way down. That's pretty damn convincing if you ask me.

[https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.pdf](https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.pdf)

~~~
kwhitefoot
This agrees with my experience. Anecdotal, I know, but I feel that I am safer
with autosteer on because a moment of inattention will not cause the car to
drift out of its lane or into another vehicle. I have a 2015 S 70D,
Autopilot 1.5 I suppose.

~~~
rsync
"I feel that I am safer with autosteer on because a moment of inattention"

If one has "moments" of inattention significant enough to drift out of a lane
or "drive into another vehicle", one should not be driving.

There is a current (Toyota?) commercial where a young driver with her father
is saved from _mowing down a crosswalk full of pedestrians_ by some form of
"auto stop" and the father proudly declares "it stopped for you". If you need
that to avoid killing people, you should not be driving.

~~~
sjsjxhhc
I don’t think it’s as black-and-white as you claim. Suppose a driver is
inattentive at some moment with probability x, and conditions where
inattentiveness would lead to a crash exist with probability y; then the
probability of a crash at that moment is xy. Probably, both x and y are > 0
for all drivers, but xy could be low enough that an accident never occurs.

The driver has no reliable way of estimating x or y and therefore can’t judge
the risk of their lack of attention causing a crash. Maybe they “shouldn’t” be
driving according to this formula, but they don’t know that, and autosteer
could save them from an accident.

Most likely, the presence of autopilot increases x and decreases y. We can’t
tell which effect dominates by arguing on the internet. We need data.
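The per-moment model above can be sketched numerically. All of the values below (x, y, the number of moments) are invented for illustration, not real driving data:

```python
# Sketch of the parent comment's model: at each moment the driver is
# inattentive with probability x, and conditions where inattentiveness would
# cause a crash hold with probability y. Moments are assumed independent.

def crash_probability_per_moment(x: float, y: float) -> float:
    """Probability of a crash at a single moment: xy."""
    return x * y

def prob_no_crash(x: float, y: float, moments: int) -> float:
    """Probability of getting through `moments` moments crash-free."""
    return (1 - crash_probability_per_moment(x, y)) ** moments

# Hypothetical driver: inattentive 1% of the time, dangerous conditions
# present 0.001% of the time, over a million driving moments.
p = prob_no_crash(0.01, 0.00001, 1_000_000)  # ~0.905: likely never crashes
```

Even with x and y both strictly positive, this driver most likely never has an accident, which is the parent's point: a nonzero xy doesn't automatically mean someone "shouldn't be driving".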

------
code4tee
The “autopilot” concept isn’t going to work at scale with cars. It’s either
autonomous or it’s not.

As these accidents demonstrate drivers can’t be relied upon to ‘monitor’ the
autopilot and react if something goes wrong. It’s a deeply flawed design
concept from an otherwise very innovative company.

~~~
NickM
_As these accidents demonstrate drivers can’t be relied upon to ‘monitor’ the
autopilot and react if something goes wrong._

We don't know how many other accidents have been prevented by autopilot. A
safety system should not be judged by whether it's perfect, but by whether
having it is better than not having it.

The NHTSA did a study on exactly this issue by looking at accident rates in
cars with autopilot hardware before and after autopilot was actually enabled
via a software update. They found that accident rates went down by about 40%
after autosteer was enabled.

This seems to suggest that either people are fine with monitoring and taking
over when needed, or else autosteer was already so good in 2016 that it made
fewer mistakes than humans.

[https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.pdf](https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.pdf)

~~~
frgtpsswrdlame
>else autosteer was already so good in 2016 that it made fewer mistakes than
humans.

No no no no no! The NHTSA report says _nothing_ about whether Tesla's system
is safer than humans. The study has two buckets: total Tesla miles in
TACC-enabled cars before the update, and total Tesla miles in cars with TACC +
Autosteer after it, with airbag deployments as the crash metric. Human-driven
miles are going to dominate both of those buckets, and there's a reason the
NHTSA report makes zero claims about Tesla's safety relative to human drivers:
it's totally outside the scope of the study. On top of that, researchers who
are skeptical of the methodology have been asking NHTSA/Tesla for the raw data
and have yet to receive it.
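The two-bucket comparison can be made concrete with a toy rate calculation. The deployment and mileage figures below are hypothetical, chosen only to reproduce a reduction of roughly the 40% quoted upthread, and are not taken from the NHTSA report:

```python
# Toy version of the before/after comparison: airbag deployments per million
# miles in each bucket. All figures below are invented for illustration.

def crashes_per_million_miles(deployments: int, miles: int) -> float:
    """Crash rate normalized to deployments per million miles driven."""
    return deployments / (miles / 1_000_000)

before = crashes_per_million_miles(deployments=130, miles=100_000_000)  # 1.3
after = crashes_per_million_miles(deployments=80, miles=100_000_000)    # 0.8
reduction = 1 - after / before  # ~0.38, i.e. roughly a 40% drop

# Note: both buckets mix human-driven and Autosteer miles, so this ratio
# alone cannot attribute the change to Autosteer specifically, nor compare
# Autosteer against human drivers.
```

This is why the before/after drop, by itself, supports several competing explanations: the calculation only compares two aggregate rates.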

~~~
NickM
I didn't say it was safer than humans. I said that the study results _suggest_
that _either_ humans are fine with taking over and driving manually when
required _or_ if humans are not taking over when they should then autosteer
might just be safer than humans. I don't claim to know exactly why autosteer
would reduce accident rates, and neither does the study, but there must be
_some_ explanation and the last bit of my comment was simply speculating as to
what that explanation might be.

~~~
frgtpsswrdlame
Or it could show that Autosteer was so unsafe that people became scared and
stopped using it entirely, engaging in the much safer method of driving
manually and thereby lowering the accident rate of Teslas after Autosteer
shipped. Or it could show that drivers new to Teslas have higher initial
accident rates because of the driving differences between traditional gas cars
and EVs. Or it could show that Tesla changed the sensitivity of the airbags in
the same update.[1] Or it could show that Autosteer is actually less safe than
TACC alone, but that AEB and collision warnings more than make up the safety
difference. The point is that we don't even know that Autosteer is actually
reducing accident rates. Whether you're a fan of Tesla or not, you should
support the release of the data the NHTSA worked with so we can do an
independent analysis.

[1] It can do this, see: [https://electrek.co/2017/06/07/tesla-software-update-airbag-...](https://electrek.co/2017/06/07/tesla-software-update-airbag-model-x/)

------
loser777
Not mentioned in the Bloomberg article (at the time of reading): Keller is
apparently moving to _Intel_ (according to a quick search).

Wow, I really hope that doesn't worsen the current imbalance of the x86 and
GPU markets. Given that Keller seemed to be to Ryzen at AMD what Raja was to
Vega/Polaris, it doesn't look good.

~~~
slivym
That's really hilarious, because Intel's entire automotive division is run by
the guy who came from the Mobileye acquisition. For those who don't remember:
Mobileye was one of Tesla's tech suppliers until they split up. Then Mobileye
basically came out and said that Tesla's self-driving technology was unsafe.
It was a massive PR hit for Tesla, and a massive PR hit for Mobileye (who
wants a technology partner that'll publicly accuse you like that?).

It'll be interesting to see Keller work in that environment.

~~~
dogma1138
Tesla tried to throw Mobileye under the bus, and Mobileye basically said that
Tesla had used their platform out of spec and outside its functional domain.
There were also some cheeky remarks from Amnon, who made statements like "if
you have big balls like Tesla you can only use a single sensor type with no
redundancy".

Currently Mobileye seems to be the only vendor who doesn't oversell their
capabilities like crazy and has a solid evolutionary roadmap on all three
fronts: technology, regulation, and business model.

~~~
Tepix
Do you also consider Waymo to be a vendor?

~~~
dogma1138
If they won't be, then on some level it's an extreme waste of effort.

------
deng
Title is misleading. Keller was at Tesla for two years, but was head of
"Autopilot" for less than a year (he took that part over from Lattner, who was
head of software for less than 6 months).

~~~
andygates
That seems like an unfortunate amount of churn at the top of a long-term
development department. Trouble in Teslaville, or is this just regular valley
bed-hopping?

~~~
davedx
> Trouble in Teslaville, or is this just regular valley bed-hopping?

A bit of both I'd say. From what I've been following, Tesla's autopilot
efforts have been a particularly rocky road, comparable with the Model 3 ramp-
up troubles.

------
jacksmith21006
I've been a software engineer for the majority of my life, and if I worked on
technology that could kill people and my company was not approaching it in a
way that I thought made sense, it would be difficult for me to stay.

I do not know if that is what is happening here but might be.

IMO, you can NOT be "kind of" driving. SDC (self-driving car) technology
should NOT go directly to consumers at this point.

It should be rolled out as a commercial offering while we work through
testing.

------
loceng
I would love to see the chain of command, and specifically who made the
decisions, for the investigation of, and press releases about, that last fatal
crash (the one where the roadway safety buffer structure wasn't replaced
properly). Based on information and analysis in HN comments, it seemed like
Tesla, or someone at Tesla, was disingenuous or made a thinking error in the
information they passed up the chain of command. The PR definitely felt like
the marketing department, which we can assume doesn't have the same technical
understanding or abilities, had sway in a situation where perhaps it
shouldn't have. The only way for Elon to survive the hole that's getting dug
deeper is to analyze their process, own up to the mistake, and be very
public/open about that process and the improvements they will make to it,
highlighting whether it was brought to light by information/analysis from the
tech community. The liability also shouldn't be any higher than 50% for Tesla,
IMHO, because if that safety infrastructure had been properly replaced,
there's a very high probability the individual would have lived and would have
become a spokesperson crying foul.

------
ggg9990
What they really need to do is put the decision on when to ship in the hands
of someone other than the person managing the development of the technology.

------
JudasGoat
Here is someone who should be paid like a sports superstar. There seem to be
so few people with his skill set.

~~~
btian
You're free to pay him $50 million a year. I don't believe there are any laws
that would prohibit it.

------
tim333
I wonder how Andrej Karpathy is getting along. He was in "Tesla hires deep
learning expert Andrej Karpathy to lead Autopilot vision" headlines not long
ago.
[https://news.ycombinator.com/item?id=14599668](https://news.ycombinator.com/item?id=14599668)

