
The problem with self-driving cars: who controls the code? - chei0aiV
http://www.theguardian.com/technology/2015/dec/23/the-problem-with-self-driving-cars-who-controls-the-code
======
spydum
Not sure I follow: how would these scenarios be any different if a human were
making the decision? Some humans will act with self-interest, some with
utilitarianism, and some will simply freeze. I'm not so sure it would make
much of a difference how the car chooses.

So, the philosophical question aside, the prospect of autonomous vehicle
fleets having backdoors and being abused is more concerning.

~~~
bnj
Not sure you got beyond the headline. Doctorow rejects the philosophical
question: any car that can be programmed to kill its driver can be
reprogrammed by that driver.

~~~
spydum
Which makes the point: it is indistinguishable from what happens now.

------
brownbat
I love the tee-up: "Should autonomous vehicles be programmed to choose who
they kill when they crash?!"

...because the answer provided is approximately, "that's a stupid question."

It's like the Guardian didn't even read Doctorow's editorial before publishing.

~~~
maxerickson
I think it's more pernicious than that: they aren't ignorant of the content
when they write headlines like that; they just don't care about the mismatch.

------
mnglkhn2
The main assumption Cory Doctorow makes is that autonomous cars will be
fully automated and under the control of somebody other than the driver. That
scenario, with an Uber-like owner of a point-to-point fleet of self-driving
cars, is indeed a bit concerning: you, the passenger, give up control of the
vehicle and have to trust the machine with your life to get from point A to
point B.

But this is exactly why I think that the consumer version of the self-driving
car will always have a manual drive setting that will completely disengage the
auto-pilot.

Insurance companies will never insure, at an acceptable cost, a car that
cannot yield control to the passenger who is the designated driver.

Also, passengers will want the reassurance that they can take over the car at
any point and switch off the autopilot.

iPhones and other mobile devices are a bad analogy for a self-driving car.
The more apt comparison is with modern airplanes: even though these planes
can fly themselves, the pilot is able to take over at any time.

This option is needed even more on the roads, where you don't have the
dedicated flight path a plane has in the air.

~~~
irq-1
> The insurance companies will never insure at an acceptable cost point a car
> that cannot yield control to its passenger that is the designated driver.

We insure medical devices, satellites, soldiers' lives, and many other more
complicated and important things. The insurance industry will figure out how
to price automated cars as well.

> Also, passenger will want the reassurance that at any point they can take
> over the car and switch it off the auto pilot.

Bus, train, plane, escalator, elevator, moving walkway and wider analogies say
otherwise.

------
MarcScott
Aren't these two separate questions?

Should autonomous vehicles be programmed to kill their owners in some
circumstances?

and

Should owners of autonomous vehicles be legally allowed to change the
vehicle's code?

Both have their own separate legal and moral quagmires.

~~~
tacos
Exactly. There's an awkward, obvious shift in the article where it turns an
age-old riddle that goes back way farther than 1967 into a rant that covers
everything from DRM and jailbreaks and chain of trust to "morally ambiguous"
and "exploitative overseas labour arrangements." It's not a tech article, it's
a manifesto.

Write the same article over and over for 15 years at a zine and I guess
they'll eventually let you spew it at The Guardian, provided you refer to the
subway as the "tube" and spell labor with a "u."

These are important issues that deserve far better presentation: sourcing
quotes from people actually doing the work and manufacturing the vehicles,
getting statements of policy from regulators on the record, and strengthening
the ethical argument by having someone besides the author make it.

------
mhurron
Can someone explain to me how Cory Doctorow became a spokesperson for privacy
and technology policy?

~~~
arkem
Basically because of the EFF.

He worked full time at the EFF from 2002 to 2006 and was named an EFF Fellow
when he left. That, combined with his blogging and fiction writing, made him
a reasonably high-profile privacy and technology spokesperson.

~~~
mhurron
That's all I could find. Nothing in his background or in what he does now
indicates he has any direct knowledge of the things he gives opinions on.

~~~
TeMPOraL
He's a writer with experience in, and a deep interest in, tech issues. What
kind of credentials does he need for his opinion on those issues to be "valid"?

~~~
mhurron
> writer with experience

Ok, what experience? That's what I'm asking. A deep interest is not
experience.

> What kind of credentials does he need for his opinion on those issues to
> be "valid"?

Usually actual experience in the matter. The EFF does good work because it is
an organization of lawyers, not science fiction writers.

~~~
TeMPOraL
Have you read other articles by him, or seen his talks? This one, for
example, I personally consider very insightful and important:
[http://boingboing.net/2012/08/23/civilwar.html](http://boingboing.net/2012/08/23/civilwar.html).

But since you insist: he has an honorary doctorate in computer science from
one of the top UK universities[0]. He also co-founded a software company[1].
I think that's enough credentials to give him the right to be a tech
advocate, with the fact that he's also a journalist backing up the "advocate"
part.

[0] - [http://craphound.com/bio/](http://craphound.com/bio/)

[1] -
[https://en.wikipedia.org/wiki/Opencola_%28company%29](https://en.wikipedia.org/wiki/Opencola_%28company%29)

------
makomk
Cory is, of course, completely and utterly wrong here on a technical level.
Indeed, what he's saying is the exact opposite of the truth. Locks that
prevent you from modifying the code running on your hardware don't make it
easier for an attacker to break in; it's the exact opposite: any feature that
lets you install modified code can also be used by an attacker to install
malicious code that will steer your car into a bridge on command. This is why
some experts recommend the use of locked-down devices like iPhones if your
hardware might be targeted by an attacker.

Likewise, there is no requirement that "A digital lock creates a zone in a
computer’s programmer that even its owner can’t enter. For it to work, the
lock’s associated files must be invisible to the owner." You need that for DRM
because the purpose of DRM is to protect access to files, but a properly-
locked-down system will not let you modify the code even if you can look at
every single piece of code and data on the system.
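The distinction above (readable but unmodifiable) is the essence of a signed
boot chain. A toy sketch of the idea, using deliberately tiny textbook RSA
(NOT real cryptography; all key sizes, names, and firmware strings here are
invented for illustration): the device only stores the public key, so reading
every byte on it never yields the ability to produce an image it will accept.

```python
# Toy sketch of verified boot: textbook RSA with tiny primes.
# In a real device only (N, E) is burned into ROM; D never leaves
# the manufacturer. Everything on the device can be inspected, yet
# no accepted update can be forged without D.
import hashlib

P, Q = 61, 53
N = P * Q                           # public modulus (on the device)
E = 17                              # public exponent (on the device)
D = pow(E, -1, (P - 1) * (Q - 1))   # private exponent (manufacturer-only)

def digest(firmware: bytes) -> int:
    # Truncate the hash so it fits under the toy modulus.
    return int(hashlib.sha256(firmware).hexdigest(), 16) % N

def sign(firmware: bytes) -> int:
    """Manufacturer side: requires the private exponent D."""
    return pow(digest(firmware), D, N)

def bootloader_accepts(firmware: bytes, sig: int) -> bool:
    """Device side: uses only the public (N, E) pair."""
    return pow(sig, E, N) == digest(firmware)

official = b"steer_safely_v1"
signature = sign(official)
print(bootloader_accepts(official, signature))              # True
print(bootloader_accepts(official, (signature + 1) % N))    # False: forged signature rejected
```

A modified firmware image fails the same check, because its digest no longer
matches what the signature decrypts to. The asymmetry is the whole point:
visibility of the code and keys on the device does not imply write access.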

~~~
extra88
Where did he say that locks make it easier?

> but a properly-locked-down system will not let you modify the code even if
> you can look at every single piece of code and data on the system

I think in the quote you used, Doctorow is simplifying things for a general
audience. Please give an example of a "properly-locked-down system" that can
be protected from someone with physical possession of it. It's not the
iPhone; as he said, iPhones can be "jailbroken" (though I don't know if I
would call it "easy," as he does).

------
dawnbreez
Another reason why self-driving cars should have manual controls:

There is a road near my house that is closed at random by the neighboring
school. Google Maps treats it as a normal road when choosing routes.

If a self-driving car drives up to the gate, that's at least a lot of wasted
time, and if the car can't detect the chain-link gate...

------
amelius
What I wonder about most is how we are going to certify the code. Given that
the code is probably far too difficult to analyze manually, I assume
certification will mostly be a matter of letting the car drive for X miles
and seeing whether it crashes.

The problem that arises then: after every tiny update, do we have to go
through the complete certification process again? It is not possible to
simply record and replay sensor information, since any change in the car's
behavior may change the environment it sees, and you cannot predict the
environment.
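A toy illustration of why replaying a recorded sensor log cannot certify a
changed controller (all numbers and policies here are invented): the log only
describes the world as it unfolded under the old behavior, so feeding it to a
new controller produces a verdict that disagrees with a true closed-loop run.

```python
# 1D world: car starts at x=0, obstacle fixed at 10 m.
def simulate(policy, steps=12):
    """Closed loop: the sensor reading reacts to what the car does."""
    x, log = 0.0, []
    for _ in range(steps):
        distance = 10.0 - x          # range sensor to the obstacle
        log.append(distance)
        x += policy(distance)
    return x, log

def slow(distance):   # v1: creep forward, stop 3 m from the obstacle
    return 1.0 if distance > 3.0 else 0.0

def fast(distance):   # v2, a "tiny update": drives twice as fast
    return 2.0 if distance > 3.0 else 0.0

_, recorded = simulate(slow)         # sensor log from the certified v1 run

# Open-loop replay: feed v2 the stale v1 sensor log.
x_replay = 0.0
for distance in recorded:
    x_replay += fast(distance)

x_closed, _ = simulate(fast)         # honest closed-loop run of v2

print(x_replay)   # 14.0: stale log kept saying "safe", replay sails past 10 m
print(x_closed)   # 8.0: in closed loop, v2 still sees the obstacle and stops
```

The replay flunks a controller that is actually fine, because the recorded
distances no longer correspond to where the faster car really is. The verdict
can be wrong in either direction; either way, behavior changes invalidate the
recorded environment.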

~~~
intrasight
There is a great deal of academic research on formal methods of code
validation, most of which is completely ignored by industry.

~~~
zaphar
The problem with code validation in the context of a self-driving car is that
the space of inputs to the software is essentially the entire world. The code
is _necessarily_ complex, and the invariants are nearly impossible to specify
adequately. A general statement like "does not collide with deer in road" is
in fact insufficient.

This isn't like an airplane autopilot or your typical other sorts of highly
vetted software. It's many orders of magnitude more complex.

~~~
TeMPOraL
And that's why I think questions like "will the car be programmed to kill you
or someone else" miss the point. The car _won't have an explicit decision
point like that_! Evaluating pros and cons is what humans do because it's
hard for us to do any better. A car can continuously observe, update, and
navigate the solution space, and if it ends up killing someone, it's either
because it lacked sufficient data or because it couldn't stay within its
envelope in the solution space (i.e., a situation where it's physically
impossible to avoid a collision). I don't see why a car would ever contain an
explicit if(something()) { killDriver(); } else { killTarget(); }. "Killing"
won't even be a concept in the code.
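A minimal sketch of what such a planner might look like instead (all names,
weights, and numbers here are invented for illustration, not drawn from any
real vehicle's code): candidate maneuvers are scored by a continuous cost and
the minimum is taken every control tick; harm appears only implicitly, as a
heavily weighted collision-risk term, never as a named branch.

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    collision_risk: float   # estimated probability of any collision, 0..1
    deviation: float        # distance from the intended route, metres
    discomfort: float       # e.g. peak deceleration, m/s^2

def cost(t: Trajectory) -> float:
    # Collision risk dominates by orders of magnitude; comfort and
    # route-keeping only break ties between similarly safe options.
    return 1e6 * t.collision_risk + 10.0 * t.deviation + 1.0 * t.discomfort

def plan(candidates: list[Trajectory]) -> Trajectory:
    # Re-run every tick with fresh sensor data; the "decision" is just
    # argmin over whatever maneuvers are physically reachable right now.
    return min(candidates, key=cost)

options = [
    Trajectory(collision_risk=0.30, deviation=0.0, discomfort=1.0),  # brake late
    Trajectory(collision_risk=0.01, deviation=3.0, discomfort=6.0),  # swerve + brake
]
print(plan(options).collision_risk)  # 0.01
```

Nothing in the code enumerates who gets hurt; a fatal outcome is just the
residual risk of the least-bad reachable trajectory, exactly as the comment
describes.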

------
intrasight
What I see as inevitable is a transition to software engineering becoming a
licensed profession, along with the application of formal methods of system
validation.

------
dang
Title changed from "Cory Doctorow on Software Security and the Internet of
Things". Submitters: please use the original title unless it is misleading or
linkbait, as the site guidelines ask.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

