
Volkswagen and Cheating Software - jeo1234
https://www.schneier.com/blog/archives/2015/09/volkswagen_and_.html
======
edent
Some programmer - or team of programmers - implemented this software. Sure, it
may have been a PHB who came up with the idea and told them to code this into
the algorithm - but it was programmers like you or me who actually cranked out
the code that made this possible.

Can you imagine this happening with, say, an architect? When unsafe buildings
fall down, it makes international news. By and large, professionals have a
code of conduct which they must follow or there is a very real chance that
they will lose their livelihood.

I'm a Member of the British Computer Society - it has a fairly simple code of
conduct for members.
[http://www.bcs.org/category/6030](http://www.bcs.org/category/6030)

I can certainly see that adding this functionality would probably be a breach of
...

> 1 a) have due regard for public health, privacy, security and wellbeing of
> others and the environment.

> 2 d) ensure that you have the knowledge and understanding of Legislation*
> and that you comply with such Legislation, in carrying out your professional
> responsibilities.

> 2 f) avoid injuring others, their property, reputation, or employment by
> false or malicious or negligent action or inaction.

> 3 e) NOT misrepresent or withhold information on the performance of
> products, systems or services (unless lawfully bound by a duty of
> confidentiality not to disclose such information), or take advantage of the
> lack of relevant knowledge or inexperience of others.

But, here's the kicker - if I were kicked out of the BCS for adding this code,
nothing would happen to me. Employers don't care about professional bodies -
except in terms of certification and, possibly, indemnity.

I'm quite happy being a member of a Trade Union, because I believe it offers
me the best protection against a malicious employer - I wonder how long before
more coders start joining professional bodies to help protect themselves from
being asked to act counter to their best interests?

~~~
TheCapn
In the same vein is why I advocate for "Software Engineering" being brought
into the realm of the other engineering disciplines, with the title protected
so that people who call themselves engineers are held to the same ethics and
governing regulations as all other engineers.

Holding the P.Eng. designation myself, I would be liable to lose my license
had I published this software. I really hope we can reach a point in our
profession where a higher standard of care for the public and the environment
is not only expected, but enforced.

~~~
chroma
Scott Alexander (of slatestarcodex) once wrote about medical ethics[1]:

> Listen up, this will be on the test. You need to help patients, not hurt
> them. You need to be responsible. Try to avoid being irresponsible. Try to
> act professionally, and to be a positive asset to society. Don't abuse
> nurses. If someone asks you to do something unethical, tell them you won't.
> Did I mention you need to help patients, and not hurt them?

> If reading that paragraph seemed like a waste of your time, imagine how I
> feel after attending the three hour lecture version.

Engineering ethics would be a similar waste of time. Good people don't need
courses to know that cheating emissions tests is wrong. Bad people won't be
swayed by a three hour lecture.

Here's an idea that would be more effective at reducing future harmful
software: Run git/svn/hg blame to see who wrote the emissions-dodging code.
Then arrest those people and charge them with crimes.

1\. Against medical ethics
[http://squid314.livejournal.com/285785.html](http://squid314.livejournal.com/285785.html)

~~~
erikpukinskis
I would love to teach a software ethics course, and it wouldn't be me standing
up saying "be good" over and over. We would study contemporary ethics
questions:

\- what should you do if your government demands you secretly backdoor a
specific user?

\- what if your easiest recruiting channels yield a higher proportion of white
males than other, costlier recruiting channels do?

\- if your startup is struggling to even survive, should you invest time in
infrastructure that protects users' privacy from your employees?

\- how do you find nonprofits and companies that are aligned with your values
and need your help?

Those are just four questions off the top of my head. I think there are plenty
of things to talk about to fill a semester, let alone an hour.

~~~
logfromblammo
I believe this falls under the "Louis Armstrong Rule".

If you have to take a class on ethics, you don't have them.

(Q: What's the "Louis Armstrong Rule"? A: Man, if you have to ask, you'll
never know.)

~~~
erikpukinskis
If you have to take a class on math skills, you don't have them.

I fail to see what your point is.

~~~
logfromblammo
If you already have enough ethics, you can work out on your own what would be
ethical for _any_ given situation. You don't need the class. It may save you a
bit of time in reflective contemplation, but it's not going to give you
anything you didn't already have.

If you don't have enough ethics to figure it out yourself, the class is just
teaching you how to pretend to have ethics. And like jazz, you will fail just
as soon as you are called upon to improvise.

~~~
erikpukinskis
To me, there is an ethical "operating system" which is what you are
describing, very low level principles that you apply.

But your ability to be ethical in a specific situation depends on your
database of "ethical considerations" about possible decisions you might make
that will have serious ethical ramifications and how those things are wired
up. The purpose of a course is to build up that database.

You might have a deep and earnest desire to be kind to someone, but without a
cultural framework for understanding their feelings, for example, you could be
completely unable to do that.

------
jaredhansen
Here's the far more interesting question about this situation: What if nobody
at Volkswagen really even knew this was happening? In other words: systems are
complex, and it is not beyond the realm of possibility that this cheating*
could have arisen purely as an emergent property of a set of otherwise
innocuous changes, and then stuck around through something like environmental
fitness, as it was functionally useful for the organization as a whole.

If the Volkswagen case seems too clear cut for that, then think about
PageRank, or Facebook's software that decides what to show to whom. Are we all
so sure that every engineer who works with this code really knows, ex ante,
_all_ of the effects their changes might have down the line?

There are a lot of demands for criminal liability in this thread, but I'd
suggest we proceed carefully. While it certainly looks suspicious, there are a
lot of ways that weird behavior can creep into software that don't involve
malicious intent.

Look at it this way: could Volkswagen engineers conceivably have written code
that caused the system to fail all emissions tests all the time? (Yes.) Could
they have done so without realizing that they'd made an error? (Sure.) Would
we all assume that that bad code was deliberately introduced? I'm not so sure.

edit: See also a related discussion in the thread about the OPM data leak:
[https://news.ycombinator.com/item?id=10303950](https://news.ycombinator.com/item?id=10303950)

===

* Or maybe some other, hypothetical cheating. Sure, in this particular case, maybe a particular software engineer or set of engineers knew exactly what they were doing when they wrote the code that enabled the cheating. But the thing I'm interested in is what happens when that's not the case, and how close that day is.

~~~
revelation
This probably isn't what happened, but I think in this context we need to
remember that it is perfectly benign and normal to have software in the ECU
for detecting an emissions or motor test.

Modern cars have a lot of sensors and environmental inputs. A motor or
emissions test then is quite abnormal in that some sensors will report
standstill while you are going full throttle. The ECU needs to make sure it
doesn't treat this as an abnormal system state and brake, shut off the engine,
or otherwise take corrective action that could put the test equipment or
personnel in danger.
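
To make that concrete, here is a minimal sketch of what such detection logic
could look like; all sensor names and thresholds are hypothetical, not
anything from VW's actual firmware:

    def looks_like_dyno_test(sensors):
        # Driven wheels spinning while every other signal says "parked"
        # is impossible on the road but normal on a dynamometer.
        wheels_spinning = sensors["driven_wheel_speed_kmh"] > 20
        body_stationary = (
            sensors["undriven_wheel_speed_kmh"] < 1     # other axle parked
            and abs(sensors["steering_angle_deg"]) < 2  # wheel centered
        )
        return wheels_spinning and body_stationary

    # Example: driven wheels at 60 km/h while the rest of the car is still.
    print(looks_like_dyno_test({
        "driven_wheel_speed_kmh": 60,
        "undriven_wheel_speed_kmh": 0,
        "steering_angle_deg": 0,
    }))  # -> True

The benign use is suppressing safety interventions (stability-control braking,
engine shutdown) during a legitimate test; the scandal is about feeding the
same signal into the emissions strategy instead.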

------
slasaus
> Voting machines could appear to work perfectly -- except during the first
> Tuesday of November, when it undetectably switches a few percent of votes
> from one party's candidates to another's.

Apart from the difficulty of getting manufacturers to publish their code, I
think another problem will be proving that some (open source) software is
actually loaded in some piece of hardware _and nothing else or extra_. Most
products on the market can be bought, torn down, and tested for as long as
needed. With voting machines, the time to prove some software was active
during the elections and nothing else is the time you have before you
officially publish the results. This is an extremely short window, and it is
impossible for a large part of the public to verify.

~~~
dj-wonk
Stability and controllability in terms of the full stack, from hardware to
software, is key for these kinds of systems.

~~~
slasaus
Yes, but especially controllability by the public at large and not only some
designated institute.

------
NickM
Forcing VW to release source code would not have prevented this. They could
have simply released a different set of code from what is actually running on
the cars.

This is an extremely hard problem to work around. They could let you dump the
binaries of the software running on an individual car, and then you could
compile the source code and compare the resulting binaries, but how do you
know the car isn't feeding you a fake binary dump? It seems like a catch-22: I
can't think of any way around this problem short of tearing the car apart,
cutting all the chips open, and physically verifying them under electron
microscopes.

On the other hand, if emissions testing actually tested what's coming out of
the tailpipe under normal driving conditions, that would seem pretty
foolproof.

~~~
alephnil
I agree that actually measuring the real emissions is a good idea, but there
is a way of checking whether the binary and the source match, at least in
theory.

The way is to require reproducible builds. If the source is built with a
specific compiler with known source and known parameters, and the source of
the system in question is available, then running the compiler will produce
the exact same binary every time if the compiler is sane. If all of this is
published, then it is possible to check that the binary in the car's firmware
and the binary produced by the published source match bit by bit. This isn't
just some sort of academic exercise, but something that has been demonstrated
to work in practice; in fact, Debian is trying to make this work for the whole
distribution. It may not be totally practical yet, but this kind of checking
can be done if the vendors are required to make reproducible builds and the
regulators bother to check. These are big ifs, but at least in theory it is
possible.
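
A sketch of what that check could look like in practice; the build command,
compiler pin, and file names here are illustrative assumptions, not any real
regulator's procedure:

    import hashlib
    import subprocess

    def sha256(path):
        # Hash a file so two binaries can be compared bit for bit.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Rebuild the firmware from the published source with a pinned
    # compiler and pinned flags -- the precondition for reproducibility.
    subprocess.run(["make", "firmware.bin", "CC=gcc-4.9"], check=True)

    # Compare against a binary dumped from the car's ECU. Any difference
    # means the published source is not what is running in the vehicle.
    if sha256("firmware.bin") == sha256("ecu_dump.bin"):
        print("build reproduces the deployed firmware")
    else:
        print("mismatch: deployed binary does not match published source")

Note that this assumes embedded timestamps, build paths, and the like have
been scrubbed from the binary, which is exactly the kind of problem the
Debian reproducible-builds effort works on.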

~~~
GhotiFish
is there a way to guarantee the binary we get from the control system in the
vehicle is the binary that is running on that vehicle? That was the concern
NickM had for this style of verification.

~~~
NickM
Yes, exactly. Reproducible builds can catch accidental differences, but
discrepancies introduced by a malicious actor can still easily be hidden if
they have complete control over the software and hardware.

------
com2kid
Here is an opposite viewpoint:

What if increasing government requirements are just not achievable? The laws
are written by politicians, not engineers. If car companies are being asked to
meet unreasonable performance metrics, what other choice do they have?

Cars have gotten a lot more efficient: I can get a 1.5L engine that puts out
over 150hp, and a 2L 4-cylinder engine can put out over 200hp!

These are huge gains compared to what was available just a decade ago, but
they are still apparently not good enough to meet government requirements.

I understand that part of VW's cheating was a cost-saving measure, but with
all the talk about every car manufacturer doing it, one has to wonder: if 10
different people all independently come up with the same solution, maybe there
is a problem?

~~~
csours
Disclosure: I work for GM.

All of the regulations CAN be met... but it is increasingly difficult to meet
emissions, MPG, and Safety regulations, AND satisfy customers.

Safety equipment increases weight, which lowers MPG.

More efficient engines are generally smaller and less powerful, which does not
satisfy customers.

In the VW case, reducing NOx emissions lowered MPG due to the burn-off
process.

One change to the fuel economy law that is sorely needed is changing the
standard from MPG to Gallons per Hundred Miles. Differences between GPHM
measurements are easier to understand [1].

1\. [http://billso.com/2008/06/21/gpm/](http://billso.com/2008/06/21/gpm/) \-
See chart
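
The arithmetic behind that: fuel consumed over a fixed distance is the
reciprocal of MPG, so equal MPG gains save less and less actual fuel as the
baseline rises. A quick sketch:

    def gphm(mpg):
        # Gallons per hundred miles: 100 miles / (miles per gallon).
        return 100 / mpg

    # The same +10 MPG improvement saves very different amounts of fuel:
    for low, high in [(10, 20), (20, 30), (40, 50)]:
        saved = gphm(low) - gphm(high)
        print(f"{low} -> {high} MPG saves {saved:.2f} gallons per 100 miles")

    # 10 -> 20 MPG saves 5.00 gallons per 100 miles
    # 20 -> 30 MPG saves 1.67 gallons per 100 miles
    # 40 -> 50 MPG saves 0.50 gallons per 100 miles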

~~~
com2kid
> More efficient engines are generally smaller and less powerful, which does
> not satisfy customers.

They are also less safe for freeway driving.

On a recent vacation I visited Albuquerque, NM; driving around there requires
a fast, powerful engine to get up to speed ASAP, or accidents will happen.

In comparison, when I visited Boston, I rented a Fiat 500 and loved it. It was
perfect for zipping around those old, narrow, winding city roads, in and out
of traffic, and finding parking. Heck, I didn't pay for parking in Boston once
while I was there!

Freeway driving is the norm for most of America, though. A tiny engine won't
do; it will, however, get someone run over.

------
draw_down
The problem with making it look like an accident is that, to really do that,
the testing mode would sometimes have to be "accidentally" enabled while
really driving. Which to a customer would just seem like the car occasionally
and unpredictably performs very poorly, which is not exactly going to inspire
confidence in the brand.

And if the testing mode never comes on during normal driving, well, that's not
going to look very much like an accident, is it.

------
tptacek
Worth knowing that this attitude is a bit of an about-face for Schneier
(albeit one that happened many years ago). For instance, this is what Schneier
had to say about full disclosure of software security vulnerabilities in the early 90s:

[https://www.schneier.com/blog/archives/2005/08/new_windows_v...](https://www.schneier.com/blog/archives/2005/08/new_windows_vul.html)

~~~
pjc50
That's not exactly a complaint about full disclosure, more about
sensationalist press releases, and it's from a decade ago (not the early 90s).

I'm really not a fan of the classic British press discreditation technique of
yelling "U-TURN" every time someone says something slightly different.

~~~
tptacek
That was the first hit I found for what I remembered as a strain of Schneier
comments about researchers that pissed me off (I'm friendly with some of the
old eEye crew, is part of it). Here's a clearer-cut one:

 _We shouldn't lose sight of who is really to blame for this problem. It's
not the system administrators who didn't install the patch in time, or the
firewall and IDS vendors whose products didn't catch the problem. It's the
authors of the worm and its variants, eEye for publicizing the vulnerability,
and especially Microsoft for selling a product with this security problem. You
can argue that eEye did the right thing by publicizing this vulnerability, but
I personally am getting a little tired of them adding weapons to hackers'
arsenals. I support full disclosure and believe that it has done a lot to
improve security, but eEye is going too far. As for Microsoft, you can argue
that the marketplace won't pay for secure and reliable software, but the fact
remains that this is a software problem. If software companies were held
liable for systematic problems in its products, just like other industries
(remember Firestone tires), we'd see a whole lot less of this kind of thing._

"The problem" here is "the Code Red worm".

------
rubidium
"Both transparency and oversight are being threatened in the software world.
Companies routinely fight making their code public and attempt to muzzle
security researchers who find problems, citing the proprietary nature of the
software. It's a fair complaint, but the public interests of accuracy and
safety need to trump business interests."

No one, I hope, thinks it's that simple. Businesses cannot be expected to put
all their source code on GitHub. Instead, this needs to follow the route that
all other regulation goes: institute private but third-party review of the
source code and testing, with the manufacturer paying a fee to support it. I'm
not saying this will catch 100% of the issues, but it's a lot better than what
we have now and much more likely to work for businesses.

~~~
JoeAltmaier
That's not obvious at all. Car companies could make all of their control
software open source with little impact on profitability. Who buys a car based
on the software? Isn't the sole reason it's kept secret to hide how
incompetently it's been written?

~~~
abritishguy
>Who buys a car based on the software?

I imagine in the next few years more and more people will. Tesla have just
announced that they will have self-driving cars in 3 years.

------
gjvc
this practice (of determining that the software is being tested, and thus
altering its behaviour in favourable ways) has been present, off and on, in
the anti-virus industry, for years.

Here's one of the latest occurrences:

[http://www.theregister.co.uk/2015/05/06/antivirus_testers_st...](http://www.theregister.co.uk/2015/05/06/antivirus_testers_strip_tencent_of_rankings_after_tweaks_put_users_at_risk/)

------
nmrm2
_> But transparency doesn't magically reduce cheating or improve software
quality, as anyone who uses open-source software knows. It's only the first
step. The code must be analyzed. And because software is so complicated, that
analysis can't be limited to a once-every-few-years government test. We need
private analysis as well._

I'm skeptical of whether VW would have been caught any sooner, or would have
changed their behavior, if they were forced to release source code; "and then
analyze" is far easier said than done, especially with generated code (which
is common in the automobile industry). I fear that if anything, forcing VW to
release source code would have simply resulted in uselessly obfuscated
"generated" code.

I'm skeptical of the proposition that taxpayers should take on the cost of
analyzing reams of generated code without any context or documentation.

And finally, I'm skeptical that these calls for public access to source code
are politically feasible, fair, or wise. The amount of intellectual capital
that's spent on ECU design is absolutely massive. I don't see anyone in the
tech industry calling on Congress to force Google or Microsoft to open source
core components or reveal their software to regulators, even though
vulnerabilities in their software could easily ruin or end lives.

It might make more sense to mandate that companies provide verifiable evidence
that their safety-critical or regulation-relevant systems are properly
designed, with a variety of avenues to compliance.

Releasing source code to the public and paying for at least one private
analysis (to be selected by government regulators) would be one way of
achieving this. This would probably be the easiest option for IoT companies
(e.g. run-of-the-mill smart lightbulb manufacturers) whose source code doesn't
contain any particularly valuable IP. And this would also force companies to
pay up when they release hopelessly obfuscated code.

But this also opens the opportunity for other paths to compliance which, if
designed properly, could address the safety concerns of the public as well as
the fairness/property rights concerns of private entities. For example, one
alternative path for companies whose IP concerns are legitimate could be use
of formal methods. The regulation/safety specifications could be open to the
public for criticism, and would be far more readable than a dump of generated
code. And a few regulators could double-check that a trusted formal-methods
tool verifies that the specifications hold for the software running on the
car, at minimal cost to both the car company and the general public.
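
As a toy illustration of what such a publicly reviewable specification might
look like (my own sketch, with invented predicate names), the core requirement
could be a single temporal-logic property:

    \Box\,(\mathit{EngineRunning} \rightarrow \mathit{EmissionControlActive})

That is, at every moment, if the engine is running, the emission-control
strategy must be active, including in states where the sensor inputs happen to
resemble a test cycle. A trusted model checker would then have to show the
property holds for every reachable ECU state, without the source code ever
being made public.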

~~~
JoeAltmaier
Ok on the difficulty issue.

But forcing Microsoft to reveal their crown jewels (they sell software) vs.
forcing car companies to reveal how an internal controller works (they sell
cars) is _not_ comparable in any way. It's disingenuous to imply that.

~~~
nmrm2
_> But forcing Microsoft to reveal their crown jewels (they sell software) vs.
forcing car companies to reveal how an internal controller works (they sell
cars) is not comparable in any way. It's disingenuous to imply that._

This makes two assumptions that I disagree with.

The first assumption is that software isn't a core component of cars. I think
this is already not true. And to the extent that it is true, it won't be in
2-5 years. Software is at the core of emerging differentiations, such as
self/assisted-driving features.

The second assumption is that that software doesn't or can't reveal sensitive
information about (other) core components of cars. I think you'd be surprised
at how much you can deduce about a physical system from its control software.

Finally, there's no clear bright line on which companies should or should not
get to protect their IP. For Microsoft it's open-and-shut, but e.g. Google
doesn't sell its search engine software. And what about IoT companies? "We're
a software company that sells IoT appliances, not a lightbulb/car/robot/etc.
company". So the only way to write such a regulation would be to write a
regulation for the auto industry -- which is insufficient for the same reasons
that Schneier talked about IoT and car companies in the same breath.

~~~
JoeAltmaier
There's nothing about the physical system you can't learn by buying the car,
and looking. So no secrets there to give away.

The controllers we've been talking about control efficient engine operation
and conformance to existing federal standards, not auto-driving cars (yet).
There are powerful reasons to force them to be open, and only second-order
reasons to keep them secret.

In future I'd expect auto-driving cars would _absolutely_ be completely open.
There are even stronger reasons than for emission-control issues, by far. Not
running over kids, for example. So we're likely to see a huge move in that
direction.

As for a bright line, how about: if I can breathe what you emit, or get run
over by your software, then it belongs in the open domain? Pretty clear to me.

------
_pmf_
The audacity of claiming to "search for those who are responsible", the sheer
hypocrisy of pretending that every little requirement is not fully traced in
multiple tools and databases, alongside the information about who in the chain
the stakeholders are, who wrote the test specification, and who released the
component, shows that this group of confirmed whoremongers [0] will get away
with everything here in Germany.

[0] [http://www.welt.de/wirtschaft/article1708914/Ex-VW-Betriebsrat-Volkert-muss-hinter-Gitter.html](http://www.welt.de/wirtschaft/article1708914/Ex-VW-Betriebsrat-Volkert-muss-hinter-Gitter.html)

------
kbenson
> Computer-security experts believe that intelligence agencies have been doing
> this sort of thing for years, both with the consent of the software
> developers and surreptitiously.

Whatever happened with that thing a few years back where some in the OpenBSD
community were claiming the FBI was attempting to insert a backdoor?[1][2] I
was always surprised at how little media attention that seemed to get.

1: [http://www.linuxjournal.com/content/allegations-openbsd-backdoors-may-be-true](http://www.linuxjournal.com/content/allegations-openbsd-backdoors-may-be-true)

2:
[https://cryptome.org/2012/01/0032.htm](https://cryptome.org/2012/01/0032.htm)

