
The entropy of a single object is a meaningful concept. It is usually called Kolmogorov complexity.



Kolmogorov complexity is definitely meaningful, but it isn't (Shannon) entropy, just conceptually similar. Many people picture something like high-Kolmogorov-complexity sequences when they think of "random" sequences, which is (IMO) why they have trouble thinking of entropy as a property of a probability distribution rather than of a particular outcome.
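
For reference, the standard definitions (not part of the original comment) make the contrast explicit: Shannon entropy is a functional of a distribution, while Kolmogorov complexity is a function of an individual string relative to a fixed universal Turing machine U:

    H(X) = -\sum_x p(x) \log_2 p(x), \qquad K(x) = \min\{\, |p| : U(p) = x \,\}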

The one case where they (sort of) coincide is when you believe your random sequence is generated by a randomly chosen Turing machine (a Solomonoff-style universal prior), which I've only really seen in philosophical settings.

A uniformly chosen 64-bit integer still has exactly 64 bits of entropy, regardless of how much Kolmogorov complexity the actual bits you generate have.
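
A minimal sketch of that distinction in Python (an illustration, not from the thread); zlib's compressed length is only a crude, computable upper bound on Kolmogorov complexity, and longer strings are used below because the bound is meaningless at 8 bytes:

    import math
    import secrets
    import zlib

    # Shannon entropy is a property of the distribution: a uniform choice over
    # 2**64 outcomes carries log2(2**64) = 64 bits, whichever value comes up.
    print(math.log2(2 ** 64))  # 64.0

    # Kolmogorov complexity is a property of the particular string you drew.
    # It is uncomputable; compressed length gives a rough upper bound.
    # Both strings below are equally likely draws from the uniform
    # distribution over 8000-byte strings (entropy: 64000 bits).
    boring = bytes(8000)                 # the all-zero outcome: short description
    typical = secrets.token_bytes(8000)  # a typical outcome: barely compressible

    print(len(zlib.compress(boring)))    # a few dozen bytes
    print(len(zlib.compress(typical)))   # close to 8000 bytes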



