
UX Issues That Will Make Or Break Self-Driving Cars - todayiamme
http://www.fastcodesign.com/3054330/innovation-by-design/the-secret-ux-issues-that-will-make-or-break-autonomous-cars
======
Animats
Finally, somebody is thinking about this properly.

I've been railing against the "deadly valley": automatic driving systems that
are good enough that the driver stops actively paying attention, but not yet
good enough to be trusted unwatched. Tesla's "Autopilot" is in that category,
or at least it was before they put more restrictions on it. VW is trying to
figure out how to operate safely in the deadly valley.

This matters, a lot. Misunderstandings over who's doing what between cockpit
automation and pilots are a major problem.[1] There are lots of modes, often
too many. This will not work for cars; drivers don't get procedure training,
simulator time, and check rides. The number of modes must be small, and it
must be very clear to the driver what mode is engaged. VW's slightly retracted
steering wheel makes this very explicit. That's clever.

Modeling the design after riding a horse on a loose or tight rein (the proper
term is "collected") is interesting. But it's probably too much for a car. As
a longtime rider who's had some good horses, I know what that feels like. I
had a good Thoroughbred hunter teach me that. Trotting him up a switchback
trail, I could either give him a loose rein, and let him work each tight turn
his way, or collect him up, bend him around my leg, and control the turn
myself. But I had to do one or the other; if I was unclear approaching a tight
turn, he'd get through it, but not cleanly. Then he'd toss his head and snort,
annoyed that I'd made him screw up. Some people might like that kind of
relationship with car automation, but a lot of people won't be able to handle
it.

(Military aircraft, though... DARPA is trying to figure this out for the
F/A-18E Super Hornet, which is a single-seater with a serious cockpit workload
problem. Sometimes the pilot needs to focus on weapons and targeting, and
sometimes on flying the aircraft. Doing both at once causes at least one of
those tasks to suffer.)

[1]
[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.571...](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.571.7998&rep=rep1&type=pdf)

~~~
golemotron
The "deadly valley" is very real and it won't go away. It can be sidestepped
by taking human drivers out of the picture entirely.

Imagine what would be involved in doing that.

Some analysts think that trucking would be an easy foot in the door for this
technology. Freeways present fewer challenges in driving. There could even be
dedicated lanes for autonomous trucks. The problem is political. A lot of
peoples' livelihood depends on trucking. Those people won't go quietly into
the night.

Uber and Lyft want to use this technology for commercial passenger service.
They've already disrupted taxis. Dropping drivers would be less of a political
hurdle for them. I imagine Uber setting up driverless cars in pool routes like
buses. Ride services are also the best hope for completely driver-free tech
because no one is going to want to assume driving liability as a passenger.

Completely driverless is where this technology needs to go and I think that
taxi/ride-service replacement is the lever. Once that works you'll see people
buying cars that don't let you drive.

~~~
userbinator
I think you will see pilotless planes long before completely driverless cars
gain any popularity, because of the problem of human-automation handoff
latency.

In planes, when the autopilot disconnects because of some problem there's
usually plenty of warning (including a sound like
[https://www.youtube.com/watch?v=8XxEFFX586k](https://www.youtube.com/watch?v=8XxEFFX586k)
) and the pilot(s) have some time, seconds or more, to assess the situation
and take control. Despite how fast planes fly, the rules of separation and
general vastness of airspace make it much easier to smoothly transition
control. It may not sound like much time, but consider that in a self-driving
car, the latency required to avoid a collision could be in the range of
milliseconds. By the time the driver realises (s)he has to take control, it's
too late.
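To make that comparison concrete, here is a rough back-of-envelope sketch; the speeds, reaction times, and helper name below are illustrative assumptions, not figures from the thread:

```python
# Back-of-envelope: how far a vehicle travels while control is handed back
# to a human. All speeds and reaction times are illustrative assumptions.

def handoff_distance_m(speed_kmh: float, reaction_s: float) -> float:
    """Metres covered before the human has assessed and taken control."""
    return speed_kmh / 3.6 * reaction_s

# Airliner after an autopilot disconnect: ~10 s of assessment time, but
# separation rules mean kilometres of empty air around it.
print(f"plane: {handoff_distance_m(850, 10):.0f} m")

# Highway car: even a quick 2 s human response covers more distance than
# many drivers leave to the vehicle ahead.
print(f"car:   {handoff_distance_m(100, 2):.0f} m")
```

The point of the arithmetic: the airliner's thousands of metres of travel during handoff are still safe because nothing is nearby, while the car's few dozen metres can already contain the obstacle.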

~~~
staircasebug
Agreed on the latency with the driver.

Let's also think about the diminishing skills that we'll experience once
automation takes over the majority of our driving. Imagine taking 3-4 weeks
off from driving and then being asked to take the wheel under a difficult
circumstance.

~~~
CPLX
> Imagine taking 3-4 weeks off from driving and then being asked to take the
> wheel under a difficult circumstance.

This is also called "living in New York"

------
jzwinck
_" He hasn’t replaced driving with, say, watching a movie or relaxing—instead,
he’s replaced the stress of driving with something worse. He looks at the
road, he looks at the wheel, he looks at his hands."_

It was much the same when I bought my parents a Roomba. It was meant to do
housework for them, and let them do other things. Instead, what used to take
30 minutes of pushing a vacuum cleaner now takes 50 minutes of watching a
robot.

Claims that we will all read books and join video conferences from our self-
driving cars are unlikely to be realized in the first generation. The first
generation of users, that is, not of the machines.

~~~
GuiA
I've seen a Roomba in action, and it does a terrible job of cleaning a place.
The feeling I got is that it replaces 30 minutes of cleaning with 50 minutes
of watching a robot move around, plus 20 minutes of cleaning all the spots it
missed yourself.

~~~
endymi0n
You missed the 10 minutes of rough tidying so it doesn't scoop up your kid's
puzzle pieces, the 5 minutes of cleaning the Roomba itself with a real vacuum,
and the 2 minutes on average of finding it and freeing it from whatever spot
or string it's gotten stuck in. Our Roomba hasn't run since we had kids - it's
literally more work than vacuuming ourselves :P

------
spuz
There is a great EconTalk podcast about this very issue. David Mindell has
studied human interaction with automated systems and found that in practically
every case there is some vestigial human supervisor of these systems. Lots of
really interesting detail is given in the show:

[http://www.econtalk.org/archives/2015/11/david_mindell_o.htm...](http://www.econtalk.org/archives/2015/11/david_mindell_o.html)

~~~
_delirium
Mindell also recently wrote a book on that subject. There's an extended blurb
here: [https://robotics.mit.edu/not-so-fast](https://robotics.mit.edu/not-so-fast)

------
flashman
Important to point out that the Volvo hitting the men in the video _didn't
have the optional collision avoidance installed_:
[http://fusion.net/story/139703/self-parking-car-accident-no-pedestrian-detection/](http://fusion.net/story/139703/self-parking-car-accident-no-pedestrian-detection/)

~~~
dogma1138
That's exactly the root of the problem: whether your car runs over your dog
because you didn't have firmware 3.17.4b installed, or because your license
expired, isn't something any user should have to deal with.

I see autonomous driving like iOS vs. Android. When you buy an Apple device
you know what to expect (or at least you used to, before they decided to
fragment their SKUs to hell): each device comes with the same features and the
software works exactly the same. The Android ecosystem, on the other hand, is
fragmented to hell, and each device comes with completely different features,
software, and user experience.

This can easily kill autonomous driving because, unlike poor app performance,
self-driving cars can actually kill people. If the self-driving experience I
get on a BMW is drastically different from what I get on a Honda or a Volvo, I
won't be confident using any of them. Not to mention that at the moment you
can't even get a consistent experience on the same make and model, and in most
cases that's a feature, not a bug, because car manufacturers want to monetize
self-driving, so you'll have to pay extra for every (even basic) feature.

Self-driving cars need standardization, and they need it fast: a fixed set of
features that all cars have to properly support before they can be marketed as
self-driving in the US and Europe. And it has to be an all-or-nothing
standard: either you support everything in the standard to a satisfactory
level, or you don't get to turn any of those features on.

The last goddamn thing I want to worry about is whether my car's pedestrian-
avoidance app was reactivated after the weekly firmware update. Sadly, I
foresee someone getting run over because the driver of a "smart" car didn't
accept the new EULA that was pushed over the weekend, before the industry gets
any shred of sense.

~~~
Zigurd
The only thing necessary to make self-driving tech highly valuable is that it
be significantly safer than human driving.

~~~
jacquesm
People are notoriously bad at statistics and will do what they can to
interpret this in any way that serves them.

"Human killed by robot car" will definitely be lapped up, and even if it is
statistically better than "Human killed by human driver", the one is the
exception and the other the rule, so the one will get the airtime.

Marketing this kind of thing properly is going to be extremely hard once the
casualties start mounting (and they will, no matter how good the software is).

~~~
Zigurd
That's axiomatic in that we let cars kill a million people, and injure
millions more _every year_ and we haven't yet done much about that. Because we
suck at statistics and we are also convinced it won't be us because we are in
control of our vehicle.

But that's still a compelling number. Just as most people thought malaria was
some quaint disease that inconvenienced the British Empire until Bill Gates
got on it, I think road deaths are potentially at least as compelling a
target, especially once people look outside Europe and North America.

~~~
jacquesm
> Just as most people thought malaria was some quaint disease that
> inconvenienced the British Empire until Bill Gates got on it

I don't think that's even remotely accurate.

~~~
jessaustin
That wasn't what people dying of malaria thought, but it was what people
funding drug research thought, which might be more to the point.

------
slavik81
If I have to do anything, it's not self-driving. It's driving assist.

A legal definition of self-driving could help reduce consumer confusion here.
I feel like the goalposts were moved as companies rushed into this space.

------
ocdtrekkie
The 2008 Knight Rider pilot had a similar UI. When considering a risky
maneuver, the car displayed the intended route, and asked the driver if it was
okay.

A key difference, of course, is that KITT considered its driver a partner to
assist, whereas companies like Google consider the driver a flaw to be
corrected.

------
userbinator
Where have I seen something like that steering wheel before...

[https://www.youtube.com/watch?v=FlpPR_GgwEc](https://www.youtube.com/watch?v=FlpPR_GgwEc)

