
Measuring Entropy and its Applications to Encryption - qubitsam
https://www.schneier.com/blog/archives/2013/08/measuring_entro.html
======
mtdewcmu
"My guess is that there wasn't enough crypto expertise on the program
committee to reject the paper." Oh, snap!

~~~
mayank
Actually, this is a problem that runs through all of academia, and is hardly
confined to crypto. Program committee members (at least in computer science)
seldom have the time to thoroughly review submissions, and computer science
papers submitted elsewhere (say, PNAS or Science) are rarely given a thorough
review by computer scientists.

As a particularly fun example, consider this computer vision paper published in Science in 2008 by two psychologists, titled "100% Accuracy in Automatic Face Recognition":

[http://www.sciencemag.org/content/319/5862/435.abstract](http://www.sciencemag.org/content/319/5862/435.abstract)

The title alone should say enough to anyone who has even the faintest idea about computer vision. Then read this very politely worded response:
[http://www.sciencemag.org/content/321/5891/912.3.full](http://www.sciencemag.org/content/321/5891/912.3.full)

~~~
mtdewcmu
That was a very patient and kindly worded response. I'd think that a psychology background would teach exactly the wrong lessons about the use of statistics in hard technology problems. In psychology research, the function of statistics is to persuade other psychologists, and persuading other psychologists of the importance of your research is the sole measure of success or failure. Statistics there are not used to find flaws in one's own work.

As for poorly-reviewed papers, it must be embarrassing when they make
headlines.

------
Jugurtha
Cool stuff. I'm using entropy (Kolmogorov entropy) to do multiphase (oil and
gas) flow pattern recognition, and to detect transitions from one flow pattern
to another.

------
memming
What is the minimum entropy referred to here? Is it -log(max(p))?

~~~
pbsd
Yes. As opposed to Shannon entropy, -sum(p log p).
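A minimal sketch of the two quantities (my own toy distribution, just for illustration): min-entropy depends only on the single most likely outcome, so it is always at most the Shannon entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    """Min-entropy in bits: -log2(max(p))."""
    return -math.log2(max(probs))

# A biased 4-outcome distribution: one outcome dominates.
probs = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(probs))  # ~1.36 bits
print(min_entropy(probs))      # ~0.51 bits, i.e. <= Shannon entropy
```

For a uniform distribution the two coincide (log2 of the number of outcomes); the more skewed the distribution, the further min-entropy drops below Shannon entropy, which is why it is the conservative measure for guessing attacks.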

------
anaphor
Yeah, when I originally read about this I thought more or less the same thing.
A low amount of entropy is already assumed in various attacks (like dictionary
attacks), or even just guessing based on your knowledge of the person who
created the password, it's the same deal. I don't see how you can assume the
same thing for RSA keypairs though...

~~~
betterunix
Slightly off topic, but low entropy during RSA key generation is a serious
problem:

[https://factorable.net/weakkeys12.extended.pdf](https://factorable.net/weakkeys12.extended.pdf)
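The core observation in that paper is that two devices with poor randomness can end up generating RSA moduli that share a prime factor, and then a single gcd recovers that prime with no factoring effort at all. A toy sketch (tiny primes of my own choosing; real RSA primes are 1024+ bits):

```python
from math import gcd

# Two devices with a weak entropy source both happen to pick the prime p,
# but choose different second primes.
p, q1, q2 = 10007, 10009, 10037
n1 = p * q1  # first device's public modulus
n2 = p * q2  # second device's public modulus

# Anyone who sees both public moduli can recover the shared prime instantly:
shared = gcd(n1, n2)
print(shared)                       # 10007
print(n1 // shared, n2 // shared)   # the remaining primes, 10009 and 10037
```

At scale the paper does pairwise gcds across millions of scraped public keys, which is what made the attack practical against real deployed keys.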

~~~
anaphor
Yeah, I meant that it's not exactly the same level. Certain implementations of
RSA were using broken sources of entropy, but it's not the same as "Bob always
uses his address as his password" which has no entropy.

~~~
betterunix
OK, sure, but if you want an example that has more to do with ciphertexts than
with key generation, consider this: "textbook" RSA is not secure. The reason is
that the "textbook" RSA algorithm is not randomized; that is also why so much
effort has gone into randomized padding schemes like this:

[https://en.wikipedia.org/wiki/OAEP](https://en.wikipedia.org/wiki/OAEP)
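To make the "not randomized" point concrete, here is a toy textbook-RSA sketch (tiny parameters I picked for illustration, nothing like real key sizes): because encryption is deterministic, an attacker who can guess the message space just re-encrypts every candidate and compares against the intercepted ciphertext.

```python
# Toy textbook RSA with tiny parameters (illustration only, not secure).
p, q, e = 101, 113, 3
n = p * q                   # public modulus
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)         # private exponent (unused by the attacker)

def enc(m):
    # No randomness: the same message always yields the same ciphertext.
    return pow(m, e, n)

ciphertext = enc(42)        # intercepted ciphertext of a low-entropy message

# Attacker's "dictionary attack" on ciphertexts: try every plausible message.
recovered = next(m for m in range(100) if enc(m) == ciphertext)
print(recovered)            # 42
```

Randomized padding like OAEP defeats exactly this: with fresh randomness folded into each encryption, re-encrypting a guess no longer reproduces the observed ciphertext.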

