Declassified Cold War code-breaking manual on solving 'impossible' puzzles (phys.org)
268 points by sandebert 11 days ago | hide | past | favorite | 38 comments





It's worth saying that the book isn't entirely declassified. 18 out of 84 sections (including quite a few "Further remarks" or "Final remarks" sections) are completely whited out by the censor's box.

Also, the actual PDF is here: https://www.governmentattic.org/39docs/NSAmilitaryCryptalyti...


For anyone interested, I think these are the PDFs for the first and second books:

Part 1: https://www.nsa.gov/Portals/70/documents/news-features/decla...

Part 2: https://www.nsa.gov/Portals/70/documents/news-features/decla...


The first link, for Part 1, is great. Part 2 seems to have been published in 3 sections. The second link is to the second section of Part 2, with a REF ID of A64556. This contains parts 10 through 14 and 3 appendices.

If anyone is interested, a link to the first section, which contains parts 1-9, REF ID A64563, is at: https://archive.org/details/41752229079148 or https://www.nsa.gov/Portals/70/documents/news-features/decla...

And the third section, with appendices 4-9, REF ID A64556 is at: https://archive.org/details/41751819079101 or https://www.nsa.gov/Portals/70/documents/news-features/decla...

The Internet Archive has a large collection of documents from American cryptographer William F. Friedman, which was easier to search than the NSA's site.


How does the content of this compare to a newer (still 30-year-old) book like Applied Cryptography by Schneier? I understand these PDFs are even more "applied", but what about the concepts behind them?

This book, like the MILCRYP series, is focussed on cryptanalysis of "classical" (pre-computer) ciphers.

The "Principles of Cryptodiagnosis" chapter (which I think is the most interesting "reveal") is an update of the author's 1970 monograph "Ars Conjectandi". Around 1976/1977 when the book came out, the stronger "modern" cryptography was becoming public - DES, public key crypto, Diffie-Hellman key exchange, the paper "New Directions in Cryptography" etc.

The Schneier book is all about those - e.g. DES, RSA, GOST etc and applying known systems. It's not about cryptanalysis of pre-computer "classical systems" - at the start he says "This is not a book about classical cryptography". There's a brief note mentioning that "double transposition" is a very strong classical system, and that's about it.


Also, Schneier is trying to teach people how to implement cryptosystems, not how to break them or how to design them (which you allude to by saying "applying known systems").

One difference between classical and modern cryptography is that classical cryptography usually works at the letter or character level, while modern cryptography usually treats plaintexts and ciphertexts as uninterpreted sequences of bits and uses bit-level operations. Sometimes these two perspectives have equivalences or similarities, but mostly not if you're looking at stuff like block cipher S-boxes or block cipher modes of operation or Merkle-Damgård hash functions or whatever.
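To make the letter-level vs bit-level distinction concrete, here's a toy sketch (my own illustration, not from the manual or the book): a Caesar-style shift that only ever sees the 26-letter alphabet, next to an XOR mask that treats the message as raw bytes.

```python
def caesar(text: str, shift: int) -> str:
    """Letter-level: shift each A-Z letter within the 26-letter alphabet."""
    out = []
    for ch in text.upper():
        if "A" <= ch <= "Z":
            out.append(chr((ord(ch) - 65 + shift) % 26 + 65))
        else:
            out.append(ch)  # non-letters pass through, as in classical practice
    return "".join(out)

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Bit-level: XOR plaintext bytes with a repeating key; no notion of 'letters'."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

print(caesar("ATTACK AT DAWN", 3))            # letter-level output stays alphabetic
ct = xor_bytes(b"ATTACK AT DAWN", b"\x5a\xa5")
print(ct.hex())                               # bit-level output is arbitrary bytes
```

The Caesar output is still readable-looking text; the XOR output is an opaque byte string, which is exactly the shift in perspective described above.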


Thanks!

I love this footnote, it explains so much:

11 This is really not stealing. For the pure in heart, this should be thought of as conversion of raw data, and that the parties so generously supplying these raw data are, unknowingly, cooperating in government work.


This reads like the Devil's own penmanship.

> 6. Fundamental cryptanalytics in the solution of aperiodic systems.

That section is whited out. Is there any information on cyphers whose output is indistinguishable from absolute randomness, based on the properties of entropy?


The only cypher that satisfies your request is "XOR the signal with noise, where the noise is a shared secret".

Generally, you only require a weaker property: that a time-bounded attacker is unable to distinguish the cyphertext from random, except with negligible probability.


I think modular addition also has that property

XOR is modular addition on single bits!

For this kind of encryption, any operation combining the plaintext symbols with the random masking symbols that forms a group can be used (a commutative group is the usual choice, though commutativity isn't strictly required).

Nevertheless, modular addition, including addition modulo 2, a.k.a. XOR, is almost always chosen, because any other kind of operation is slower in hardware without having any advantage over modular addition (when the masking symbols cannot be distinguished from a random sequence).

For example, instead of using modular addition, you can use multiplication modulo 2^16+1 (a Fermat prime number), to encrypt each 16-bit plaintext symbol, but that is slower than modular addition without being better (unless your method of generating the pseudo-random mask is weak, which is easy to avoid).
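A quick sketch of that comparison (my own illustration; the keystream values are made up): masking 16-bit symbols with XOR versus with multiplication modulo 65537. Both decrypt exactly; the multiplicative version just needs modular inverses to undo.

```python
M = 2**16 + 1  # 65537, a Fermat prime, so {1..65536} is a group under * mod M

def xor_mask(symbols, stream):
    return [s ^ k for s, k in zip(symbols, stream)]

def mul_mask(symbols, stream):
    # Map each symbol 0..65535 into the multiplicative group as 1..65536
    return [((s + 1) * k) % M for s, k in zip(symbols, stream)]

def mul_unmask(cipher, stream):
    # pow(k, -1, M) is the modular inverse of k mod M (Python 3.8+)
    return [(c * pow(k, -1, M)) % M - 1 for c, k in zip(cipher, stream)]

pt = [0, 1, 42, 65535]
ks = [12345, 54321, 999, 31337]   # pretend-random keystream (nonzero mod M)
assert xor_mask(xor_mask(pt, ks), ks) == pt    # XOR is its own inverse
assert mul_unmask(mul_mask(pt, ks), ks) == pt  # multiplicative inverse works too
```

XOR costs one gate layer per bit; the multiplicative version needs a full modular multiply (and an inverse to decrypt), which is the hardware-cost point made above.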


https://en.wikipedia.org/wiki/One-time_pad

Note that for the code to be indistinguishable from absolute randomness, you need the key to be as long as the text.


In theory, yes. You might say that cryptography exists in the gap between the theoretically impossible and the practically achievable.

I mean, all working (non-broken) cyphers are indistinguishable from randomness, since you could simply run an entropy attack against a cypher that lacked this property.
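A crude version of that kind of statistical test (my sketch, not a real attack tool): byte-frequency Shannon entropy. Ciphertext from a weak cipher inherits the plaintext's skewed statistics, while good ciphertext looks close to 8 bits per byte.

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of the byte-value histogram, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

text = b"the quick brown fox jumps over the lazy dog " * 200
weak = bytes(b ^ 0x5A for b in text)   # single-byte XOR: histogram shape preserved
random_ct = os.urandom(len(text))      # stand-in for good-looking ciphertext

# Constant XOR just relabels byte values, so the entropy is identical to plaintext
assert abs(byte_entropy(text) - byte_entropy(weak)) < 1e-9
assert byte_entropy(random_ct) > byte_entropy(weak)
```

Of course, a real cipher can fail far more subtly than this, which is why modern definitions use distinguishers rather than a single entropy number.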

Good to see [0] Polya's book mentioned (and linked) in the article. For anyone interested, it provides a pretty good general outline for problem solving techniques. [1] There's also a YouTube video of him explaining some of those methods.

[0] https://math.hawaii.edu/home/pdf/putnam/PolyaHowToSolveIt.pd...

[1] https://www.youtube.com/watch?v=h0gbw-Ur_do


Yes, there are strong parallels between Polya's video and the "Cryptodiagnosis" chapter. The chapter is based on a monograph called "Ars Conjectandi" ("the art of conjecturing") and Polya's video theme is "first guess, then prove".

"In all of one's endeavors, the successful cryptanalyst's precept should be kept foremost in mind: "Try the simplest thing first." It is surprising how often this admonishment is disregarded, to the later chagrin of the cryptosinner." -- Callimahos


It might be worth mentioning that the title "Ars Conjectandi" is taken (clearly deliberately) from a much earlier (1713) work of the same title by Jacob Bernoulli, about combinatorics and probability. That title may well be a nod to an even earlier (1662) work called "Logica, sive Ars Cogitandi" -- Logic, or the Art of Thinking -- which was published anonymously and also includes some discussion of probability; probably substantial parts were written by none other than Blaise Pascal.

My grandfather worked with Lambros and shared an amusing booklet from him called "A short list of even primes," which is a single digit: 2 (with a long and humorous introduction). It is not classified (obviously) and I will try to get around to sharing it on here at some point in the future.

The OEIS isn't known for its sense of humor, otherwise I'm sure there'd be such a sequence listed there as well.

Here they are: https://imgur.com/a/YGwWpZq

Does anyone have the other 6?


It is not widely reported that US cryptanalysts--in particular, a single husband-and-wife team, the Friedmans ACKed in the preface, and their students--decrypted Enigma messages by hand as readily as Bletchley Park with all its bombes, albeit in smaller amounts because they mostly intercepted South American traffic. They also decrypted all Japanese communications leading up to Dec 7, 1941, and after.

Notably, the FBI never had any capacity for cryptanalysis. What they got, they got from the Friedmans.

"Between Silk and Cyanide", by Leo Marks, is deeply enlightening, in particular about spycraft and its failings in Britain at the time.


An excellent presentation on the work of the Friedmans is A Life in Code:

https://bookshop.org/books/a-life-in-code-pioneer-cryptanaly...

but there are several other biographies. The government treated the Friedmans very badly considering how dependent it was on them. Ultimately it was only the Friedmans' sense of duty that prevented wholesale collapse of American cryptanalytic capability.


What’s even weirder is that breaking Enigma is often attributed to British cryptologists.

The Friedmans and the Poles did, independently. Brits only automated it, based on the Poles' work.

British spycraft during WWII and well after was absolutely abysmal, with rare bright spots.


Maybe I misunderstand something, but the article text says:

> in 1992, the US Justice Department claimed releasing the third book could harm national security by revealing the NSA's "code-breaking prowess". It was finally released in December last year.

But when we move to the first page of the linked book, it says:

> This text constitutes the third in the series of six basic texts on the science of cryptanalytics.

So there are still books 4, 5 and 6?

(I'm probably on some list just for asking)


> So there are still books 4, 5 and 6?

Yes, and they are still classified (for now).

Part III has been long sought after. Glad to see it's out there (and that its release doesn't jeopardize anything).


> Part III has been long sought after. Glad to see it's out there (and that its release doesn't jeopardize anything)

Well, 21% of it is released and out there. There are still 66 sections that were censored.


Author here - actually LDC died in October 1977, the same month this book came out. So there were more texts planned in the "Military Cryptanalytics" series, but they were never published.


Only the first three were written.

Are we guessing or knowing?

It’s the Cryptonomicon!

Are there any videos showing how these would be used in a real world situation?

Yes! I linked to the recent Zodiac Killer cipher solution: https://youtu.be/iuNyQ44JYxM. Over many years, many people observed patterns in the cipher and tried to determine how likely it was that each pattern was nonrandom.

Eventually, they were inspired to try a particular combination of substitution and transposition based on a pattern people had observed at width 19. I think this observation was first made in 2015, and the cipher itself is from 1969.

So in the "Cryptodiagnosis" chapter Callimahos says ...

After finding a phenomenon - or what is thought to be a phenomenon - an evaluation should follow. It's not enough to say "I think the doublets are high." How many are there? Go count them. And, even more important, how many doublets are expected at random in a sample of this size?

Then, the team showed how that last question could be answered for the Zodiac killer ciphers with Monte Carlo sampling (i.e. modern computer methods).
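That style of check is easy to sketch (my own illustration; the dimensions below are roughly those of the Z340 cipher, 340 characters over 63 distinct symbols): simulate random ciphers of the same size and count how many adjacent repeated symbols ("doublets") they contain.

```python
import random

def count_doublets(seq):
    """Number of positions where a symbol immediately repeats."""
    return sum(1 for a, b in zip(seq, seq[1:]) if a == b)

def monte_carlo_doublets(length, alphabet_size, trials=10_000, seed=1):
    """Average doublet count over many uniformly random sequences."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        seq = [rng.randrange(alphabet_size) for _ in range(length)]
        total += count_doublets(seq)
    return total / trials

length, alphabet_size = 340, 63
mean = monte_carlo_doublets(length, alphabet_size)
# Sanity check: each of the 339 adjacent pairs matches with probability 1/63
print(f"expected doublets ~ {mean:.2f} (analytic: {(length - 1) / alphabet_size:.2f})")
```

Comparing the observed doublet count against this baseline answers exactly Callimahos's "how many are expected at random in a sample of this size?" question.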


Forwarded to AMZN and GOOGL.


