Also, the actual PDF is here: https://www.governmentattic.org/39docs/NSAmilitaryCryptalyti...
Part 1: https://www.nsa.gov/Portals/70/documents/news-features/decla...
Part 2: https://www.nsa.gov/Portals/70/documents/news-features/decla...
If anyone is interested, a link to the first section, which contains parts 1-9, REF ID A64563, is at:
And the third section, with appendices 4-9, REF ID A64556, is at:
The Internet Archive has a large collection of documents from American cryptographer William F. Friedman, which was easier to search than the NSA's site.
The "Principles of Cryptodiagnosis" chapter (which I think is the most interesting "reveal") is an update of the author's 1970 monograph "Ars Conjectandi". Around 1976/1977, when the book came out, the stronger "modern" cryptography was becoming public: DES, public-key crypto, Diffie-Hellman key exchange, the paper "New Directions in Cryptography", etc.
The Schneier book is all about those - e.g. DES, RSA, GOST, etc. - and about applying known systems. It's not about cryptanalysis of pre-computer "classical systems"; at the start he says "This is not a book about classical cryptography". There's a brief note mentioning that "double transposition" is a very strong classical system, and that's about it.
One difference between classical and modern cryptography is that classical cryptography usually works at the letter or character level, while modern cryptography usually treats plaintexts and ciphertexts as uninterpreted sequences of bits and uses bit-level operations. Sometimes these two perspectives have equivalences or similarities, but mostly not if you're looking at stuff like block cipher S-boxes or block cipher modes of operation or Merkle-Damgård hash functions or whatever.
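A tiny sketch of that contrast (my own illustration, not from the comment): the classical view does arithmetic on letters modulo 26, while the modern view applies bitwise operations to uninterpreted bytes:

```python
# Classical view: letters A-Z, arithmetic mod 26 (a Caesar shift).
def caesar(pt: str, shift: int) -> str:
    return "".join(chr((ord(c) - ord("A") + shift) % 26 + ord("A")) for c in pt)

# Modern view: plaintext as raw bytes, masked by bitwise XOR with a keystream.
def xor_mask(pt: bytes, key: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(pt, key))

print(caesar("ATTACKATDAWN", 3))   # DWWDFNDWGDZQ

# XOR is its own inverse, so masking twice with the same key recovers the text.
key = bytes(range(6))
assert xor_mask(xor_mask(b"ATTACK", key), key) == b"ATTACK"
```

The two coincide in special cases (XOR is addition mod 2, bit by bit), but as the comment says, most modern constructions (S-boxes, modes of operation, Merkle-Damgård) have no natural letter-level reading.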
> 11 This is really not stealing. For the pure in heart, this should be thought of as conversion of raw data, and that the parties so generously supplying these raw data are, unknowingly, cooperating in government work.
That section is whited out (redacted). Is there any information on cyphers that produce output indistinguishable from true randomness, based on the properties of entropy?
Generally, you only require a weaker property: that a time-bounded attacker can distinguish the cyphertext from random with only negligible probability.
Nevertheless, modular addition, including addition modulo 2 (a.k.a. XOR), is almost always chosen, because any other kind of operation is slower in hardware without having any advantage over modular addition (when the masking symbols cannot be distinguished from a random sequence).
For example, instead of using modular addition, you can use multiplication modulo 2^16+1 (a Fermat prime number), to encrypt each 16-bit plaintext symbol, but that is slower than modular addition without being better (unless your method of generating the pseudo-random mask is weak, which is easy to avoid).
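A sketch of that alternative (my own construction, not from the comment), using the convention from the IDEA cipher that the all-zero 16-bit symbol stands for 2^16, so every symbol is an invertible element modulo the Fermat prime 2^16 + 1:

```python
P = 2**16 + 1  # 65537, a Fermat prime

def mul_mask(x: int, k: int) -> int:
    """Mask a 16-bit symbol by multiplication mod 65537."""
    a = x if x != 0 else 2**16          # lift the symbol 0 to 2^16
    b = k if k != 0 else 2**16
    r = (a * b) % P
    return r if r != 2**16 else 0       # fold 2^16 back to the symbol 0

def mul_unmask(y: int, k: int) -> int:
    """Invert the mask using the modular inverse of the key symbol."""
    b = k if k != 0 else 2**16
    inv = pow(b, P - 2, P)              # inverse via Fermat's little theorem
    return mul_mask(y, inv if inv != 2**16 else 0)

# Round-trip: unmasking recovers every symbol, including 0.
for x in (0, 1, 1234, 65535):
    for k in (0, 3, 65535):
        assert mul_unmask(mul_mask(x, k), k) == x
```

As the comment says, this is strictly slower than addition or XOR in hardware (a multiply and a modular reduction per symbol) without adding security when the mask stream is already indistinguishable from random.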
Note that for the code to be indistinguishable from absolute randomness, you need the key to be as long as the text.
"In all of one's endeavors, the successful cryptanalyst's precept should be kept foremost in mind: "Try the simplest thing first." It is surprising how often this admonishment is disregarded, to the later chagrin of the cryptosinner." -- Callimahos
Does anyone have the other 6?
Notably, the FBI never had any capacity for cryptanalysis. What they got, they got from the Friedmans.
"Between Silk and Cyanide", by Leo Marks, is deeply enlightening, in particular about spycraft and its failings in Britain at the time.
But there are several other biographies. The government treated the Friedmans very badly considering how dependent it was on them. Ultimately it was only the Friedmans' sense of duty that prevented wholesale collapse of American cryptanalytic capability.
British spycraft during WWII and well after was absolutely abysmal, with rare bright spots.
> in 1992, the US Justice Department claimed releasing the third book could harm national security by revealing the NSA's "code-breaking prowess". It was finally released in December last year.
When we move to the first page of the linked book it says:
> This text constitutes the third in the series of six basic texts on the science of cryptanalytics.
So there are still books 4, 5, and 6?
(I'm probably on some list just for asking)
Yes, and they are still classified (for now).
Part III has been long sought after. Glad to see it's out there (and that its release doesn't jeopardize anything).
Well, 21% of it has been released and is out there. There are still 66 sections that were censored.
Eventually, they were inspired to try a particular combination of substitution and transposition based on a pattern people had observed at width 19. I think this observation was first made in 2015, and the cipher itself is from 1969.
So in the "Cryptodiagnosis" chapter Callimahos says ...
After finding a phenomenon - or what is thought to be a phenomenon - an evaluation should follow. It's not enough to say "I think the doublets are high." How many are there? Go count them. And, even more important, how many doublets are expected at random in a sample of this size?
Then, the team showed how that last question could be answered for the Zodiac killer ciphers with Monte Carlo sampling (i.e. modern computer methods).