
Tesla's Autopilot: Too Much Autonomy Too Soon - jacquesm
http://www.consumerreports.org/tesla/tesla-autopilot-too-much-autonomy-too-soon/
======
brokenmachine
I'd say it's too _little_ autonomy too soon.

The most dangerous thing about "Autopilot" is that the human is still supposed
to be responsible.

So after what might be hours of inactivity, the human driver is expected to
assume control of the vehicle within a split second when the Tesla suddenly
decides it's not confident enough in its own driving.

That's hardly autonomous.

------
slackstation
If we replaced every car in America with a Tesla with Autopilot, do you think
we would see more road fatalities or fewer?

Fewer by a long shot. We ignore tens of thousands of deaths per year and focus
on the one fatality of someone who was using Autopilot incorrectly.

People are much better at killing themselves and each other than Autopilot is.

------
nxzero
"Human drivers, too much autonomy, too long."

~~~
ghostbrainalpha
There is going to be a gigantic culture war over this issue.

On one side will be the technologists, familiar with rapid iterations of
software development, and willing to experience "bugs" because they realize it
means faster longer term progress and greater ultimate gains for the human
race.

On the other side will be luddites, who will see the willingness to experience
bugs that result in real human tragedy as an entirely new category of human
moral decay. If I were a conservative with a book deal, this would be the new
fear platform I would be trying to jump on.

~~~
melling
2.3 million accidents a year and 35,000 deaths in the US alone. We can't get
to autonomy soon enough. In the meantime, I wonder if we can't use "driver
assist" to greatly reduce that number within 5-10 years.

~~~
Raphmedia
It sucks to say it this way, but we really need a driving assist that can
compensate for the loss of faculties that comes with drunk driving. That's
one third of all road crashes. People are always going to be idiots.

