
A brief introduction to observational entropy - kgwgk
https://arxiv.org/abs/2008.04409
======
zaph0d_
I really love the strong connection between statistical mechanics and
information theory. Before attending a statistical mechanics lecture I was not
able to comprehend why entropy was a relevant quantity, beyond its role in the
second law of thermodynamics. My statistical physics professor
motivated entropy by saying: "We need something to measure how much
information we can extract from a system. Let's call it entropy, and it should
reach its maximum value in a closed system when we are only able to extract
the minimum amount of information from the system." And this opened my eyes to
how intertwined information theory and statistical mechanics are. In my
opinion, it's one of the most beautiful connections in physics!
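That characterization matches the Shannon picture: entropy is maximized by the uniform distribution, where every outcome is equally likely and a measurement tells you the least in advance. A toy sketch of this (my own illustration, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Sharply peaked distribution: outcomes are nearly certain,
# so little information is gained by observing the system.
peaked = [0.97, 0.01, 0.01, 0.01]

# Uniform distribution: maximal uncertainty, maximal entropy.
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # small
print(shannon_entropy(uniform))  # log(4), the maximum for 4 outcomes
```

For a closed system relaxing to equilibrium, the microstate distribution spreads toward this maximum-entropy (uniform over accessible states) case, which is one way of reading the professor's remark.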

