
Andrew Ng calls Tesla irresponsible for shipping an imperfect autopilot - impish19
https://www.facebook.com/andrew.ng.96/posts/1025649884157585
======
xpda
There will be better autopilots, but never perfect ones. Aircraft have been
using imperfect autopilots since their inception. Most planes have a prominent
red button you can press to disable the autopilot whenever it misbehaves.

I'm not sure why this post routes through facebook, but here is a link to the
article: [http://www.cnet.com/roadshow/news/model-s-on-autopilot-
crash...](http://www.cnet.com/roadshow/news/model-s-on-autopilot-crashes-into-
other-car-proving-yet-again-tesla-owners-arent-paying-close-enough/)

~~~
alistairSH
Part of the problem is that Tesla's "autopilot" isn't designed to be fully
automatic. It's a driver's assistance package. As a comment in the Facebook
thread mentioned, I too wonder if the problem is partial autopilot in general.
People will tend to put their trust in a system if it appears to be
automatic. We probably shouldn't be shipping partial autopilot at all. Fully
automatic or bust.

~~~
geoelectric
I have an assisted driving system in my Jeep. I'm pretty clear on its
capabilities and, for example, stay alert to the possibility I have to hit the
brakes when coming up on stopped traffic at freeway speeds because I'm
outrunning its detection range for it to do so safely automatically.

The problem is Tesla selling a system with similar limitations as "autopilot."
It's not. It's an assisted driving system, albeit a very smart one, and they
need to make it more clear that's what it is. They're setting incorrect
expectations, and the various "I climbed into my passenger seat while it
drives!" bits aren't helping.

Basically we're living the old urban legend of the dude putting on cruise
control and then crawling into his camper for a nap. The cruise control is a
little bit smarter nowadays, but it comes down to a gross misunderstanding
that Tesla is (inadvertently) encouraging.

~~~
alistairSH
Does the Jeep system function if you don't provide any inputs? The Subaru
system will brake if it has to, but won't actually drive the car for you.
Same for lane holding - it'll give some steering input and beep if you start
to swerve, but it won't actively steer around a curve.

~~~
geoelectric
Really late response, but it'll gas/brake on its own and will steer back to
the center of the lane if you go past a configurable margin.

Gas/brake is meant to be "foot off pedals." You set a desired speed, and it'll
go the lesser of that or the speed of the car you're following, and follow at
a configurable distance otherwise. It's very, very, VERY nice for Bay Area
commuting.
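
As a rough sketch of the behavior described above - travel at the lesser of the driver's set speed and the lead car's speed, backing off when the following gap closes - it might look something like this (function name and the gap model are illustrative, not any real vehicle API):

```python
def acc_target_speed(set_speed, lead_speed=None, gap=None, min_gap=30.0):
    """Return the speed the car should hold (same units as the inputs)."""
    if lead_speed is None:
        return set_speed  # open road: hold the driver's set speed
    # Follow the slower of the set speed and the lead car's speed.
    target = min(set_speed, lead_speed)
    if gap is not None and gap < min_gap:
        # Gap is too tight: ease off proportionally to reopen it.
        target = min(target, lead_speed * gap / min_gap)
    return target
```

For example, with a set speed of 65 and a lead car at 50 with a comfortable gap, the car holds 50; if the gap halves below the configured minimum, the target drops further until the gap reopens.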

The only times you really need to pay attention are A) a car cuts you off and
then slams on the brakes, B) the car in front of you locks up its brakes or
swerves out of the way at the last minute with something in front of it, or C)
you're booking it at freeway speeds and come up on stopped traffic. In all
three cases an alarm sounds telling you to brake, but if you don't want it to
be an emergency you'll be looking out for at least C) so you can start slowing
down before that alarm could go off. Hopefully the sensors get longer range in
the future.

The lane-keeping is meant to only be an assist, though, and would leave you
pinballing from one side to the other (or getting stuck on the side that's
downhill from the crown) as you hit the margin and it pushes you back center.
It'd also get you through a gentle curve the same way. You _can_ go hands free
with good alignment and tolerance for looking like you're driving drunk but I
wouldn't recommend it. Eventually an alarm will sound if you're fully hands
off the wheel.
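
A minimal sketch of that difference (names and numbers are illustrative, not any real system): a margin-based assist applies no steering until the car drifts past the margin and then pushes it back, which is what produces the pinballing; a lane-centering system instead corrects continuously toward zero offset.

```python
def lane_assist_steer(offset_m, margin_m=0.5, gain=1.0):
    """Margin-based assist: silent inside the margin, corrective push outside."""
    if abs(offset_m) <= margin_m:
        return 0.0  # inside the margin: no steering input at all
    excess = abs(offset_m) - margin_m
    # Push back toward center, proportional to how far past the margin you are.
    return -gain * excess if offset_m > 0 else gain * excess

def lane_center_steer(offset_m, gain=1.0):
    """Lane-centering: continuous correction toward the lane center (offset 0)."""
    return -gain * offset_m
```

The assist returns zero for small offsets and only kicks in at the edges, so the car oscillates between the margins; the centering controller always nudges toward the middle.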

Worth noting Tesla's system isn't much different there, though, aside from the
fact that it'll hold the center of the lane and won't pinball. You're still
expected to keep your hands on the wheel, and it has a similar alarm that goes
off if you don't.

Don't get me wrong--their system is ultimately more advanced, with speed limit
sign reading and a bunch of other cool tricks. But it's still meant to be a
hands-on assist and isn't sold that way.

Edit: to be clear, I'm just describing adaptive cruise control above, and
there's nothing particularly special about that. For all intents and purposes,
the only difference with Autopilot is the active steering for lane-keeping,
even if it is smarter under the hood. It seems to have the same limitations
otherwise, judging by their manual disclaimers.

------
bargl
First, Andrew Ng is a co-founder of Coursera and has taught several machine
learning classes, just so we all know he has the chops to say something like
this. That's how I recognized his name; a little more digging shows he really
does have a strong ML/AI background.

Second, he isn't saying the problem is an imperfect autopilot. He's saying
that 1000 to 1 is NOT an acceptable error rate. So the title is misleading,
unless he edited his post.

Third, he is correct. Lay people who see this will immediately think that
autopilot is WAY too hard and that software devs will never get it right. I'm
not saying they're right, but I am saying it's terrible PR for AI to release
a system as buggy as this and assume zero responsibility.

------
suprgeek
Tesla is innovating rapidly in the Autonomous driving space AND releasing
those inventions to the customer, which is great.

Typically these things used to take 5-7 years to "trickle down" to where a
reasonably-priced car (not that $80k is reasonable) might have them available
for end users.

However, Tesla absolutely NEEDS to do more to prevent this technology-push
model of theirs from tainting the whole autonomous-driving trend. Expecting
that a driver will read the f'ing manual about

"....Traffic-aware cruise control may not brake/decelerate for stationary
vehicles, especially in situations when you are driving over 50 mph (80 km/h)
....Depending on TrafficAware Cruise Control to avoid a collision can result
in serious injury or death."

is insane.

My suggestion: enforce a must-watch video tutorial with questions (as is
common for, say, a driving test) before enabling the partial AutoDrive. It
really is a matter of life or death - you have a very short time to react on
the road. With their distribution model, Tesla definitely has the resources to
pull this off, and it might serve as a model for the rest of the industry.

~~~
vhold
I kind of wonder if Tesla figures "Hey, once a few crashes happen people will
figure it out."

~~~
madelinecameron
It doesn't matter what Tesla does or says. Stupid always finds a way.

------
robbrown451
I agree it is irresponsible. I think Google is correct in saying that self-
driving cars probably shouldn't even have steering wheels... it is absolutely
unrealistic to think that people will hang out for an hour while the car does
things correctly, and then be alert when it happens to not do things
correctly.

Humans don't work like that.

------
raverbashing
From the Article

" However, as YouTube commenter Shaimach points out, the Model S manual calls
out this exact situation as something drivers need to be aware of:

Warning: Traffic-aware cruise control may not brake/decelerate for stationary
vehicles, especially in situations when you are driving over 50 mph (80 km/h)
and a vehicle you are following moves out of your driving path and a
stationary vehicle or object is in front of you instead. Always pay attention
to the road ahead and stay prepared to take immediate corrective action.
Depending on TrafficAware Cruise Control to avoid a collision can result in
serious injury or death."

------
sksixk
I can't get my head around a car and a human driver sharing the driving duties
at the same time. If you can't trust the car to do everything correctly, then
what am I supposed to do as a driver?

Sit there tensely with my right foot and hands hovering over the brake pedal
and steering wheel?

~~~
chadgeidel
How is this different than the automatic transmission? Antilock brakes?
Traction control? Lane departure/encroachment warnings? Adaptive cruise
control?

Or even older tech like "drive by wire" (your foot isn't directly connected to
the throttle bodies) or manually adjusting the choke in carbureted vehicles?

These are innovations that have happened in my lifetime. I'm sure there are
even older "car and driver sharing duties" examples others can come up with.

~~~
robbrown451
It's very different. But some of those things, like adaptive cruise control,
get close. But even with adaptive cruise control, if you completely stop
paying attention, you will be reminded very quickly that you are doing it
wrong.

With this, you could go for hours without having to do a single thing, and
then suddenly you are expected to jump in and take control with barely a
second's notice. That is just bad human factors.

~~~
chadgeidel
I would refer to @mikeash's answer as I feel it addresses this question.

~~~
robbrown451
Again, I don't see how that is supposed to work. How long are people going to
pay attention to "the big picture" when they still are only very rarely
required to do something? If they don't have to do some action to actually
keep the car on the road or to avoid getting honked at every 30 seconds or so,
their minds will drift off.

This is just regular human nature. Sure you can do it for a while, but after a
while you stop doing it.

I am quite sure that Tesla et al will discover soon enough you can't rely on
people to step in like that. They just need to get their cars to work better
than humans, all the time. (or as Elon Musk said, they should be at least an
order of magnitude safer)

------
archildress
This is just the cost of progress. If every imperfect piece of technology
didn't ship, we'd have no technology at all.

~~~
riffraff
Not all technologies are created equal.

If my MacBook's battery dies when it rains, it's not the same as my car
crashing into another one and injuring multiple people.

~~~
moron4hire
One might make the argument that the automobile--as originally released
without autopilot--is a deeply flawed device and that even this level of
"flawed" autopilot is superior.

~~~
riffraff
The problem here is not that the thing is flawed; it's that it's flawed in a
radical way that users are not aware of, i.e.

> Traffic-aware cruise control may not brake/decelerate for stationary
> vehicles

When you drive a car yourself, you somehow expect to crash into something if
you don't pay attention. If you let your autopilot drive, crashing into
stationary objects is not something you are prepared for.

~~~
moron4hire
The point I'm making is that we should not be concerned with making a perfect
autopilot, nor even with making an autopilot that is as good as a human
driver in every aspect in which human drivers are good.

We should be concerned with making one that reduces injury.

If that means that we have more low-speed fender benders--at the cost of zero
drunk-driving accidents--I'm all for it.

------
drivingmenuts
Well, seeing how Tesla just handed every auto insurance company in the world a
free get-out-of-paying card for accidents dealing with that car, I'd say Tesla
screwed up. Seeing as how Tesla has potentially exposed everyone involved in
the supply chain to lawsuits, then yes, they screwed up.

If I were a well-backed lawyer, right about now, I'd be girding up to file
requests for colonoscopies on the people who assembled the circuit boards in
those cars.

Seriously, right about now, some lawyer is figuring out how to suck money out
of Tesla until it's a dry, twitching husk in the noonday sun. Tesla handed
that guy a gift.

------
kbenson
Crash? That was an accident, and it was definitely the auto-pilot's fault, but
to me that looks like a fender-bender. Let's resist the media's attempt to use
more extreme wording to sell stories.

Yes, I know the driver prevented it from being worse, but that's one of the
conditions Tesla's auto-pilot is supposed to be working under: a driver paying
attention. There are problems with this model, but that doesn't mean we should
ignore what actually happened in favor of what _could_ have happened when
making a title.

~~~
27182818284
There has actually been a movement since at least 2000 to stop calling
accidents "accidents," because the word was found to make it seem like "lol,
accident, nothing could have been done." I can recall the word "accident"
being dropped by driving instructors and such back then, at least. It might
even go back further.

Collision is better, but has a lot of letters so you condense it to crash.

~~~
FireBeyond
When driving emergency vehicles, the course you take - which serves, amongst
other things, as an exemption from needing a CDL - used to be called EVAP,
Emergency Vehicle Accident Prevention, and is now called EVIP, Emergency
Vehicle _Incident_ Prevention.

------
calcsam
Tesla should take a leaf out of Airbnb's book and pay for all damages caused
by a customer using Autopilot in the manner it was intended to be used.

[http://mashable.com/2011/08/01/airbnb-
ransackgate/#K9lVMU5w_...](http://mashable.com/2011/08/01/airbnb-
ransackgate/#K9lVMU5w_iqF)

~~~
rasz_pl
you mean like this: [https://boingboing.net/2016/05/20/airbnb-stealth-updates-
ter...](https://boingboing.net/2016/05/20/airbnb-stealth-updates-terms-o.html)
?

------
shogun21
The autopilot is still in beta and warns drivers to still pay attention to the
road.

As long as drivers have control, they are responsible for the vehicle. Only
after we remove steering wheels, the accelerator, and brakes; and have a fully
autonomous system should the car manufacturers (or software manufacturers) be
blamed.

~~~
mdorazio
While that might be true purely from a legal standpoint, it just doesn't make
much sense logically. Let's compare it to something like a gas stove. If you
turn the stove on with a bunch of combustible stuff around it and then walk
away for ten minutes and your house burns down as a result, that's pretty
obviously your own fault. However, if you turn the stove on with just a normal
pot of water on top, walk away for ten minutes, and it self destructs and
burns your house down, what would you think? Is it your fault for not standing
there with a giant fire extinguisher, or is it the stove company's fault for
making a poorly QA'd product? I think Tesla's Autopilot feature is a lot more
like the latter example than the former.

------
sunstone
This 'imperfection' probably needs to be weighed against the fact that Tesla
drivers apparently drive 2.5 million miles a DAY on autopilot. Cars have other
imperfections that lead to accidents as well, like obstructed vision and blind
spots, for example. The real question is: is the autopilot, imperfections
included, statistically a better driver than a human pilot, also with
imperfections? One or two fender benders does not make the case.

------
moron4hire
Because the regular pilots are so good at the job?

~~~
coralreef
Tesla isn't responsible for how a driver drives their car.

But they are responsible for the software they ship.

~~~
moron4hire
If Tesla can ship a not-perfect autopilot that could significantly reduce
injury over not using it at all, shouldn't we abandon this pursuit of
"perfection" and release early?

~~~
dilemma
Tesla's non-perfect "autopilot" seems to be causing accidents a human
wouldn't.

~~~
moron4hire
Have you been on the roads much?

~~~
dilemma
I'm referring to Teslas hitting stationary objects. One such story was posted
here a week ago.

~~~
moron4hire
People also hit stationary objects.

------
pj_mukh
I don't understand - the ML systems he designs are not deterministic by
design! He's been letting that stuff fly (literally) for over a decade now [1]!

P.S: I realize a "shipping" scenario is different.

[1][https://www.youtube.com/watch?v=0JL04JJjocc](https://www.youtube.com/watch?v=0JL04JJjocc)

------
freddealmeida
I'm just wondering if Tesla has responded to this? Do we know it was the
Autopilot? Do we have evidence? I know Tesla captures a great deal of
information from the car.

------
27182818284
Cruise control can also lull you into not paying attention. "Autopilot" is
simply a marketing term for "advanced cruise control," which is what this
really is right now.

------
rasz_pl
Tesla is NOT shipping an autopilot in the first place. They are selling LANE
FOLLOWING driver assisted system and CALLING it autonomous AI "Autopilot".

This is the main problem.

------
WalterSear
An imperfect autopilot, running on extra cpu cycles from the automatic door
lock processor.

------
lacker
If an imperfect autopilot is better than a human pilot, we should use it!

------
hayd
The collision itself looks surprisingly minor.

------
shawndumas
He should try driving a Model T...

------
83457
Anyone have public link to video?

~~~
tyrust
[https://www.youtube.com/watch?v=qQkx-4pFjus](https://www.youtube.com/watch?v=qQkx-4pFjus)

------
gbin
I guess, given his experience with his team rigging AI competitions at Baidu,
he might not be totally confident in _real_ software :p

