

The Ken Thompson Hack - yla92
http://c2.com/cgi/wiki?TheKenThompsonHack

======
brudgers
The reference point is [Trusting Trust], his 1983 Turing Award lecture.
Honestly, it and similar materials should be required reading somewhere in
university CS and Software Engineering curricula.

[Trusting Trust]: http://cm.bell-labs.com/who/ken/trust.html

~~~
EthanHeilman
In my experience Trusting Trust is required reading in many CS programs.

~~~
Roboprog
Well, it wasn't required _reading_ back when I was in CS in the 80s, but it
was a discussion that we had in class.

I nervously laugh whenever somebody talks about "open source electronic
voting"

~~~
fnordfnordfnord
As opposed to "closed source electronic voting"?

~~~
dredmorbius
I read the comment as "open source doesn't ensure trustworthiness" rather than
"open source is inherently worse than closed".

It's that _even_ open source would _not_ be sufficient.

~~~
Roboprog
You heard right!

------
nischalsamji
I don't think he said that he actually put a backdoor in the C compiler. He
was just explaining how he could have done it. A very interesting read,
though.
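For anyone who hasn't read the lecture, the mechanism he describes can be
modeled in a few lines. Everything below is a toy sketch, not Thompson's
actual code: "binaries" are just strings, and the names (BACKDOOR, TROJAN)
are illustrative.

```python
# Toy model of the Trusting Trust attack. A compromised compiler binary
# does two things: it backdoors the login program, and it reinserts
# itself whenever it compiles the (clean) compiler source.

BACKDOOR = "ACCEPT_SECRET_PASSWORD"

def trojan_compile(source):
    """Behavior of a trojaned compiler binary."""
    if "login" in source:
        return source + "\n" + BACKDOOR   # silently backdoor login
    if "compiler" in source:
        return "TROJAN"                   # self-reproduce into the new binary
    return source                          # compile everything else honestly

def run_binary(binary, source):
    """Execute a compiler binary on some source (toy semantics)."""
    if binary == "TROJAN":
        return trojan_compile(source)
    return source  # an honest binary compiles faithfully (identity here)

login_src = "login: check the password"
compiler_src = "compiler: translate source to binary"

# Bootstrap: the trojan compiles the *clean* compiler source once...
gen1 = trojan_compile(compiler_src)
# ...and every later generation, also built from clean source, stays trojaned:
gen2 = run_binary(gen1, compiler_src)
assert gen1 == gen2 == "TROJAN"
assert BACKDOOR in run_binary(gen2, login_src)
```

The point of the demonstration: once the trojan exists in binary form, no
amount of inspecting the compiler's source code will reveal it.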

~~~
leoc
In the Eric Raymond-era Jargon File [http://www.catb.org/jargon/html/B/back-door.html](http://www.catb.org/jargon/html/B/back-door.html) there's a claim that the hacked version was distributed, and used at least once.

~~~
johnwfinigan
According to [1], the original Multics trap door that Thompson cites as his
inspiration was distributed, to an Air Force installation no less.

Tangentially, I'm more fascinated by the fact that Paul Karger, the lead
author on [1], was a principal on a Class A1 secure hypervisor for the VAX,
which DEC essentially finished, but cancelled in 1991. [2]

[1] - see paragraph 3.1 - [http://hack.org/mc/texts/classic-multics.pdf](http://hack.org/mc/texts/classic-multics.pdf)

[2] - [http://www.cs.dartmouth.edu/~ccpalmer/classes/cs55/Content/resources/vax_vmm.pdf](http://www.cs.dartmouth.edu/~ccpalmer/classes/cs55/Content/resources/vax_vmm.pdf)

~~~
hurin
Thanks for this - I'm reading the Multics retrospective - do you know of any
follow-up work? I'd be very interested to see a detailed account of why some
of these features never got implemented.

The paper kind of just leaves it at:

 _With the growth of individual workstations and personal computers,
developers concluded (incorrectly) that security was not as important on
minicomputers or single-user computers._

~~~
johnwfinigan
I don't know of any explicit follow-up, but there is a ton of interesting info
at multicians.org.

I think that, for multi-million dollar mainframe timesharing systems, it was
easier to make the cost argument for good security, since the customer would
pay for it so that they could spread out the cost of the machine on many
users, between whom there was no trust.

But once you got $100k minis and $10k single user micros, why not just buy a
second machine?

Of course, things turned out a little bit differently, but from a time before
the ubiquitous internet, I can see it making sense to many people.

It's amazing how much more diverse the hardware/OS ecosystem was in say 1985
than it is now. A lot of good stuff has happened since then, but I think a lot
of good ideas got lost, or at least are waiting to be dug up again.

------
bayesianhorse
The idea about KTH binaries being virtually everywhere is fascinating and
frightening. But I highly doubt that this is the case.

Sure, in theory, a perfect KTH scheme would be undetectable, since it suborns
every means of detection. But in practice it often wouldn't be. A KTH virus
would have to anticipate every tool that might be written to detect it, and
given the scale of the modern open- and closed-source software world, the
complexity would explode.

~~~
Someone
Not only anticipate, but also guard against. For example, someone asked about
a debugger. A specific debugger is simple: patch the debugger's source at
compile time to hide the presence of the malware. Of course, that means you
have to handle the case where the debugger is used to debug itself, too. That
may pose a challenge, but given the existence of quines, I think it is
possible. Doing that not for just _a_ debugger but every debugger in the world
is left as an exercise.

Next, someone will look at timing and notice something strange. So, add stuff
that tweaks the real-time clock to correct for this when your binary runs,
when the OS gets built with your compiler. That way, running 'time' on your
binary will not show anything suspicious.

Next, someone may notice a discrepancy between real-time clocks and their OS.
Solution? Make sure analog clocks aren't that precise, hook all digital clocks
up to a time signal, and tweak that signal if you hit slower code paths that
the user shouldn't see. Also, make sure that all other computers in the world
slow down whenever one of them hits one of these paths.

Next, someone will notice a discrepancy between clock time and solar time.

For that, the NSA made up the leap second.

~~~
slowmovintarget
The way to detect a KTH is listed in the Wiki article:
[https://www.schneier.com/blog/archives/2006/01/countering_trus.html](https://www.schneier.com/blog/archives/2006/01/countering_trus.html)

~~~
bch
It's interesting that when Bruce talks about using an older, simpler
alternative compiler as a "checksum" compiler, the first thing I thought of
was pcc[1][2], but it's from the same lab and era, so using it might not
actually be a good test.

[1] [http://pcc.ludd.ltu.se](http://pcc.ludd.ltu.se)

[2]
[https://en.wikipedia.org/wiki/Portable_C_Compiler](https://en.wikipedia.org/wiki/Portable_C_Compiler)

EDIT: in case you don't follow the links and realize the implications
yourself, from Wikipedia:

"It was very influential in its day, so much so that at the beginning of the
1980s, the majority of C compilers were based on it."

~~~
seanp2k2
Also interesting re: reproducible builds:
[http://lwn.net/Articles/630074/](http://lwn.net/Articles/630074/)

------
willvarfar
I giggle every time I think about how Ken's compilers are alive and well even
today ... I mean, who'd trust a compiler written by such a twisted mind?
Something to chew on as you use the Go toolchain ;)

------
rectang
A KTH virus doesn't have to be ubiquitous to do huge damage. All you need to
do is land a malicious compiler on a machine used to produce widely
distributed binaries.

------
Agathos
This inspired a major plot element in Ramez Naam's Nexus, a near-future
transhuman sci-fi thriller. It's risky to run other people's code on your
brain.

[http://www.goodreads.com/book/show/13642710-nexus](http://www.goodreads.com/book/show/13642710-nexus)

------
Schwolop
Couldn't you define a set of operations on your potentially compromised
hardware/software/system that should take time x if the system is
uncompromised, and verify with a separate clock whether the real time is x or
x+delta? This is based on the assumption that whatever the KTH does must
require some amount of computation time to achieve.

Yes, the clock could be compromised too - but building your own clock from
scratch is a lot easier than building your own computer, OS, lexer, compiler,
etc.

[pre-posting edit - I see others have discussed this possibility too. I thus
leave this comment purely as a testament to my own ascension to the rank of
"Flaw-spotter".]

------
lomnakkus
I'll repost something another HN poster shared with me in a previous
conversation about this. Thompson's attack _can_ be defeated:

    http://www.dwheeler.com/trusting-trust/
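Wheeler's technique, Diverse Double-Compiling (DDC), can be illustrated with a
toy model. Everything here is a sketch: real DDC compares bit-for-bit binaries
produced by real compilers, while the "binaries" below are just strings with
fixed behavior.

```python
def run(binary, source):
    """Toy executor. An honest compiler ("CLEAN") reproduces an honest
    compiler from compiler source; a trojaned one ("TROJAN") reproduces
    the trojan instead (Thompson's self-reproduction trick)."""
    if binary == "TROJAN":
        return "TROJAN" if "compiler" in source else "bin:" + source + "+backdoor"
    return "CLEAN" if "compiler" in source else "bin:" + source

def ddc_check(suspect, trusted, compiler_src):
    """Diverse double-compiling: rebuild the compiler from its source with
    a trusted, *diverse* compiler, let that result rebuild itself, and
    compare with the suspect binary's self-rebuild. With deterministic
    compilation, the two results must match exactly if the suspect is clean."""
    stage1 = run(trusted, compiler_src)  # trusted compiler builds from source
    stage2 = run(stage1, compiler_src)   # that build rebuilds itself
    regen = run(suspect, compiler_src)   # suspect binary rebuilds itself
    return stage2 == regen

assert ddc_check("CLEAN", "CLEAN", "compiler source")       # clean passes
assert not ddc_check("TROJAN", "CLEAN", "compiler source")  # trojan caught
```

Note the limit of the technique: if the trusted and suspect binaries were
_identically_ trojaned, both rebuilds would match and the check would pass.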

~~~
falcolas
I may be reading this wrong, but it appears to rely on trusting at least one
(if not more) of the other compilers to not be compromised as well.

Given that so many modern compilers can be traced back to GCC or LLVM these
days, this requirement would seem problematic.

EDIT: As pointed out below, it would take a nearly AI-level compromise to
protect against this kind of attack, but at a theoretical level it would be
possible.

Of course, if you control the GCC compiler, you control the Linux kernel, and
no programs run on Linux that don't rely on the kernel for reading and writing
files...

~~~
dllthomas
It does not rely on you trusting any of the compilers to be uncompromised. It
relies on you trusting at least one pair of the compilers to not be
_identically_ compromised.

