
Patterson's Cipher for Jefferson – Challenge Solved After 200 Years (2009) - Petiver
http://cryptiana.web.fc2.com/code/jeffers4.htm
======
brazzy
> a specimen of such writing, which I may safely defy the united ingenuity of
> the whole human race to decypher, to the end of time

Turns out to have been rather optimistic. But I'd say that holding up for 200
years before yielding to a combination of modern mathematics and brute
computational force that would have been utterly unimaginable to Patterson is
still quite impressive.

------
nothis
Anything with probability and randomness tends to break my brain but I find it
fascinating.

If I interpret the 200-years-late solution correctly, it relied in large part
on trial and error (like one "trick" is to focus on adjacent rows, which
seems to be more of a "hunch" than a deep, mathematical strategy). Things like
this make me wonder why so many modern encryption standards rely on "simple"
mathematical concepts that are easily testable (even if testing might take a
million years). Even if that's your base, why not throw some random algorithms
on top, like swapping every 7th symbol, reversing the whole thing and then
adding a random one in the middle. Wouldn't that immediately make it more
tedious to decipher at not much of an additional cost? Is this done anywhere?

~~~
brazzy
It is done by amateurs who design "super secret encryption algorithms" that
nearly always turn out to be trivially breakable.

I'll assume you don't mean this "random algorithm" you "throw in" to be
secret (otherwise see the other response), and that you instead hope
complicating the algorithm will make cryptanalysis more difficult.

The thing is: you _don't_ want cryptanalysis to be difficult! You want the
world's cryptographic experts to look at your algorithm, immediately see the
potential ways to crack it, and quickly find that none of them actually work.

A complex encryption algorithm is _bad_ because there could be vulnerabilities
hiding in the complexity. That "swapping every 7th symbol" could make it
_easier_ to crack by creating subtle patterns in the output.
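To make the point concrete: the layer proposed upthread is a fixed, key-free
permutation, so anyone who knows the algorithm can undo it mechanically, and
it leaks structure rather than hiding it. A rough sketch (the function names
and the exact step order are my own invention, not anything from the article):

```python
def extra_layer(data: bytes) -> bytes:
    """Hypothetical 'extra layer': swap every 7th symbol with its
    neighbour, then reverse the whole thing. Note there is no key."""
    buf = bytearray(data)
    for i in range(6, len(buf) - 1, 7):
        buf[i], buf[i + 1] = buf[i + 1], buf[i]
    return bytes(buf[::-1])

def undo_layer(data: bytes) -> bytes:
    """Inverting it needs no secret at all: un-reverse, then apply
    the same fixed swaps again (a swap is its own inverse)."""
    buf = bytearray(data[::-1])
    for i in range(6, len(buf) - 1, 7):
        buf[i], buf[i + 1] = buf[i + 1], buf[i]
    return bytes(buf)

msg = b"attack at dawn"
assert undo_layer(extra_layer(msg)) == msg
```

Since every bit of the transform is public (Kerckhoffs's principle: assume the
attacker knows the algorithm), the only work it adds is one extra mechanical
step, while the swap period of 7 is exactly the kind of regularity a
cryptanalyst looks for.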

~~~
drb91
Well, it depends on your needs. I imagine a one-time encryption algorithm
(i.e. not intended for reuse) could make good use of security through
obfuscation or obscurity. But this kind of strategy is essentially meaningless
in the context of computers, the internet, and effort-free encryption that is
mostly good enough.

------
olskool
This type of transposition cipher has been easily cracked for decades.

