
Silicon Valley-Driven Hype for Self-Driving Cars - dbcooper
http://www.nytimes.com/2016/07/10/opinion/sunday/silicon-valley-driven-hype-for-self-driving-cars.html
======
Animats
It's not Silicon Valley hype that's the problem. It's Tesla hype. Tesla's
"autopilot" is way under-sensored. One TV camera, a bumper-level radar, and
some ultrasonic range sensors. That's just not enough. (Tesla is supposedly
adding more cameras in later models. They still need windshield-height radar.)
It's also not clear how much redundancy Tesla has in the control system.
Backup processors? Redundant steering motors? Redundant brake actuators?
Google has all of those. Does Tesla?

The Tesla crash and rollover on the Pennsylvania Turnpike is troublesome.[1]
That's the situation Tesla's "autopilot" is supposed to handle. Here's the
crash location - Penna Turnpike mile marker 160, eastbound.[2] Concrete center
divider, metal guard rail on the right, clear pavement markings, summer
driving conditions. There's no excuse for a crash in that situation. That may
have been an outright system failure. NHTSA and the Pennsylvania State Police
are investigating.

Autopilot bug reports are written in customers' blood. Musk seems to be in
denial about that.

Denial isn't going to work. Look how much trouble General Motors got into over
ignition switches with weak detent springs.[3] Keyrings with too many keys
could under some circumstances move the switch and turn off the vehicle. (It
wasn't just loss of propulsion; power steering, power brakes, and airbag power
were also lost.) GM has paid out $3 billion so far, recalled 800,000 cars, and
fired 15 people.

[1] http://www.nytimes.com/2016/07/07/business/us-safety-agency-investigates-another-tesla-crash-involving-autopilot.html
[2] https://goo.gl/maps/TfcuZR67oXr
[3] https://en.wikipedia.org/wiki/General_Motors_ignition_switch_recalls

~~~
aschobel
Tesla has now reviewed the logs and says that there’s “no data to suggest that
Autopilot was engaged at the time of the incident."

http://electrek.co/2016/07/06/tesla-model-x-rollover-accident-tesla-says-autopilot-not-engaged/

~~~
tristanj
> _Tesla has now reviewed the logs_

This is an extremely misleading interpretation of the article. The article
says Tesla has only reviewed the basic crash logs, which don't contain
information about autopilot state. The detailed crash logs, which have this
information, have not been received because the antenna was damaged in the
crash.

So Tesla has only reviewed cursory information that says nothing about
autopilot state; therefore, they can state they have "no data [suggesting]
autopilot was engaged". But you cannot conclude autopilot was not engaged,
because there is critical data missing.

Very misleading.

~~~
aschobel
> Tesla has now reviewed the logs and says that there’s “no data to suggest
> that Autopilot was engaged at the time of the incident."

is quoted directly from the article. It is also the title of the story.

~~~
tristanj
Quoted from the part of the article the author crossed out and rewrote because
it's inaccurate. And the title is a quote taken out of context.

~~~
aschobel
I had not seen that edit. Agreed that the title is inaccurate.

The trend on these Tesla / EV blogs has been to take Tesla's logs as gospel
and discount human observation.

There have been other reports of users claiming autopilot was engaged when it
was not, and reports of users not noticing that summon was enabled even though
one can't really miss that (it has subsequently been redesigned).

Human factors for these systems are incredibly challenging since such a wide
audience of users uses them.

~~~
tristanj
For the record, the crossed out and updated section was added two days before
you posted your comment. At that time, the author added a link right at the
top to a newer, more accurate article. The author also added the word
"[Updated]" to the title so people would notice this change.

------
rjdagost
The thing that strikes me as more than a little off-key about the hive-mind-
like push for fully autonomous cars is that most drivers don't want them:
http://www.bizjournals.com/sanfrancisco/blog/techflash/2016/05/drivers-dont-want-self-driving-cars-uber-google.html

"...only 15 percent of respondents would want a fully self-driving vehicle as
their next purchase. Meanwhile, 39 percent said they would prefer a vehicle
that has some self-driving features. Nearly all respondents (95 percent) said
they would want access to a steering wheel, gas and pedal controls even if a
vehicle were self-driving, suggesting a wariness toward the technology.
Brandon Schoettle, the study’s author, noted that public attitudes toward
self-driving vehicles haven’t shifted much over the past couple of years,
despite well-publicized developments in the space."

This sort of reminds me of Google Glass- the tech early adopters were praising
it highly and worked the hype up to a thick lather, but "normal" people
wouldn't be caught dead using it and the product belly flopped badly. The
entire exercise seems like the self driving technology is being built for its
own sake, as though the coolness of the engineering challenge will stimulate
sufficient demand to justify the endeavor.

~~~
cylinder
That really means nothing. If you asked people "do you want a machine that
dries your clothes" or "do you want a computer you carry in your pocket" a
long time ago you probably would have received tepid interest.

~~~
rjdagost
It most certainly doesn't mean nothing. It means that there are
significant trust issues that must be overcome before autonomous driving
technology can get widespread acceptance. This is not a pure engineering
problem that must be solved. Technologists can endlessly repeat "autonomous
driving is inevitable" but that alone won't make it so.

~~~
slavik81
How can you trust a thing that doesn't exist? It's certainly understandable
why people might hedge their bets. It's a lot to ask to get commitment to
something sight-unseen. I have faith that the trust will arrive soon after the
technology does.

In the article, they state that Google thinks it will be 30 years before cars
will be fully capable without steering wheels. If I were a survey respondent,
I wouldn't want to wait that long to get my next car either.

------
cpprototypes
Something that annoys me about all this self-driving focus is that they are
not targeting the obvious low-hanging fruit:

1) Focus on perfecting low speed bumper-to-bumper traffic fully autonomous
driving on freeways.

2) Focus on passing legislation to allow zero driver liability and awareness
when using the low speed bumper-to-bumper traffic self-driving mode.

The speeds are low and so the risk factor is reduced by orders of magnitude.
Even if an accident happens, it won't be fatal because of the low speeds. And
that should also help with the legislation issues. Also, this feature would be
very popular. Imagine a commercial with typical bumper-to-bumper traffic. Then
the camera zooms in and the driver is sleeping. When the car detects that
bumper-to-bumper traffic is going to end soon, it triggers a loud alert which
wakes the driver.

And yet all the car companies are ignoring this. I remember reading about
early lane keeping and adaptive cruise control systems that would not activate
at low speeds. I was incredulous when I read it. They made this not work in
exactly the most useful scenario? I don't care about self-driving
for road trips, they don't happen often. And door-to-door self driving is the
ultimate goal, but it's going to take a long time to get there. But helping to
alleviate the horrible bumper-to-bumper driving by allowing drivers to do
something else can be done with current technology. The whole self-driving
effort feels like no one has done any market research to see how to make it
help with real world situations.

~~~
omonra
The Volvo XC90 has this feature. It's basically lane keeping with adaptive
cruise control that works between 0 and 30 mph. It also requires you to keep
hands on the wheel (that's not a problem, imho).

The only drawback to me is that if the car stops completely, the autopilot
shuts down and has to be manually restarted (click button) to resume.

~~~
peteretep

        > It also requires you to keep
        > hands on the wheel (that's
        > not a problem, imho).
    

That stops you from reclaiming the time, so is probably nowhere near as
valuable/desirable as one that allows you to repost cat gifs.

------
jmspring
There was a recent story touting the success of a self-driving long-haul
trucking caravan across Europe. It was done in the spring with lots of
high-fives around it...The route was in Sweden, Denmark, and
Germany...aside from perpetual autobahn construction, each of those has very
good roads, and it was done in the spring.

"Self driving cars" still can't distinguish all driving hazards. In coverage
of the recent Tesla fatality, many stories made a point of noting that the
driver was watching a movie.

So, let's move self driving cars to California roads. They will deal well with
traffic in places like 880, etc. But, can they deal with poor road conditions,
the overall shitty level of driver competence here, etc? I think that is many
years off and not yet proven.

Another area to consider: uptake in new cars. I bought a 2016 Tacoma in the
last year. These trucks are known for lasting decades. I don't commute. I
don't see replacing it anytime soon, let alone ever having a self-driving car
for taking me places in the back country/forest roads/etc.

It is hype and maybe it will happen for some fleet situations, but I don't see
autonomous cars being the mainstay for private individuals in my lifetime.

~~~
intopieces
Uptake in new cars will be a non-issue when Uber and Lyft make self-driving
taxis.

The movers in industries are never the end consumers, but the large
corporations.

Cities will replace HOV lanes with Autonomously Driven Vehicle lanes, and that
will include the buses.

~~~
jmspring
I've seen lots of stories about "cities" and their plans.

In the bay area, Morgan Hill to Mill Valley, keeping things to 101, are part
of the same suburban gridlock. Do I see that area changing by "autonomously
driven vehicle lanes"? No. And you are an idiot if you think it will happen.
Why?

HOV lanes aren't only used by long distance commuters. You aren't going to get
HOV/Autonomous lanes through SF. Sorry, after the '89 quake they tore down
freeway infrastructure. 880 -> Richmond-San Rafael, yeah no.

Please provide me with a valid scenario where an autonomous vehicle would work
for a commute over 30 miles from any of the bay area communities. This is the
norm. If you assume anything about future plans, you lose.

Hwy 85 was supposed to help south bay commutes; 2 months after opening, it was
still faster to go 17 -> 280 -> Page Mill to get to Stanford from Santa Cruz.
Oh yeah, that was still in the 90s.

I have no idea where you live, but, at least in the bay area, maybe you need a
commute history lesson.

~~~
jmspring
And as someone who lived near what is now Almaden and 85 before 85 went in
(and my parents were well aware of road plans; 85 was on the books for
decades), the freeway turned upper middle class neighborhoods into problem
areas when it went in.

"Autonomous lanes" much like light rail or Bart lines, have a community effect
beyond the "making it easier to move people" effect.

------
aab0
"Aware of the conventional wisdom that robotic cars are about to cause an
epochal “disruption,” automakers are eager to demonstrate that they are fully
engaged. A result has been a drumbeat of announcements auguring the imminent
arrival of robotic cars, almost as though they were the next generation of
iPhones. The breathless statements are especially beguiling for members of the
public without the engineering background required to understand the
challenges that remain. In other words, most people."

I'm glad we have a "former technology reporter" to tell us this.

~~~
rdtsc
This is strangely similar to "flying cars". Those have been "just around the
corner" for, I don't know, the last 30 years probably, if not more.

------
Noseshine
Related thread on the front page right now: "Simple questions to help reduce
AI hype (smerity.com)"

[https://news.ycombinator.com/item?id=12062294](https://news.ycombinator.com/item?id=12062294)

Excerpt:

> _...good performance on a dataset also doesn't necessarily imply good
> performance in the real world. I'd wager that even trivial self-driving car
> models, given reasonable data, would be able to score in the high 90s if we
> decided on a reasonable notion of accuracy. The real stickler is the last
> few percent. When we make a mistake at high speed on a road, potentially
> under questionable conditions, the results can be disastrous. If the cost of
> a mistake is high, accuracy really doesn't mean anything, especially over a
> dataset that might not be representative in the first place._
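The excerpt's point about cost-weighted mistakes can be sketched with a toy expected-cost calculation (all numbers are hypothetical illustrations, not figures from the article):

```python
# Illustrative only: raw accuracy ignores how expensive each mistake is.
# All numbers here are hypothetical.

def expected_cost(accuracy, cost_per_mistake, decisions):
    """Expected total cost = error rate x cost per error x number of decisions."""
    return (1 - accuracy) * cost_per_mistake * decisions

# A 99%-accurate system making 10,000 driving decisions, where each
# mistake carries a cost of 1,000 units, still accrues a large expected cost:
print(round(expected_cost(0.99, 1000, 10_000), 2))    # 100000.0

# Reaching 99.99% accuracy cuts the expected cost by 100x:
print(round(expected_cost(0.9999, 1000, 10_000), 2))  # 1000.0
```

This is the "last few percent" problem in miniature: going from high-90s accuracy to near-perfect changes the expected cost by orders of magnitude, which plain accuracy numbers obscure.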

~~~
peteretep

        > When we make a mistake at
        > high speed on a road,
        > potentially under 
        > questionable conditions, the
        > results can be disastrous
    

Given that that's no less true for humans, why do we require 100% accuracy
from cars, rather than just "on average better than humans"?

------
studentrob
I found this article to be on the mark.

It's interesting that Google's project director estimates full autopilot will
arrive incrementally, between 3 and 30 years from now. Up to this point, most
news coverage and my friends seem to think it'll be here within 5 years.

~~~
agildehaus
When Google says "incrementally" they mean where it will arrive, not level of
autonomy. They'll be driving in sunny California long before snowy Minnesota.

~~~
dbcurtis
The idea of having a "summer car" and a "winter car" is pretty common in
Minnesota. The summer car is a nice one, perhaps fun to drive, that gets
stored in the winter so that "road cancer" doesn't eat the body. A "winter
car" has a good heater, a good battery, and good tires.

------
henrik_w
There is also the problem of analyzing why certain AI actions were taken when
investigating accidents:
https://www.technologyreview.com/s/601860/if-a-driverless-car-goes-bad-we-may-never-know-why/

------
jbpetersen
What surprises me is that last-mile and local transportation of non-living
cargo hasn't gotten more attention. I can't wait to get cheap delivery from
practically anywhere in town and nearly on demand.

------
studentrob
I don't think Musk really understands the AI tech that he's managing. He
thought OpenAI would be the path to true AI, and he can't accept that a
current flaw in Tesla's autopilot may be material to the company.

Musk has done a ton of great stuff. To be better, he could use someone he
trusts who is an expert in the field, like LeCun, to temper his expectations
of AI. Right now he seems so brimming with confidence about AI that he (1) is
claiming we're already living in a simulated world, (2) has started his own
true AI venture, and (3) is ignoring the relevance of his own product's
failures. I can only imagine this is because he is surrounded by yes men.

I could be 100% wrong.

