
Information Theory and Statistical Physics - Lecture Notes - Anon84
http://arxiv.org/abs/1006.1565
======
jrp
The source is well-laid-out (start with main.tex). It's really cool that I can
go to the PDF, see something I like, and immediately look at how it was done.

------
URSpider94
Once you realize that entropy (disorder) and information capacity are one and
the same (thanks, Claude Shannon), this is a completely logical connection to
make. I hope I have a chance to read through the notes at some point. This might
be in there already, but it would be fun to see a worked example calculating
the maximum practical storage density for a magnetic medium, taking into
account the potential for thermal energy to flip bits as the domains get
smaller and smaller, but counteracting that with error correction ...
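The back-of-the-envelope version of that worked example can be sketched in a few lines. This is a rough illustration, not a real media model: it assumes a Néel-Arrhenius thermal flip rate r = f0·exp(-E_B/kT) with a single-domain energy barrier E_B = K_u·V, an illustrative anisotropy constant K_u, a 1 GHz attempt frequency, and treats the resulting flip probability as a binary symmetric channel whose usable capacity after ideal error correction is 1 - H2(p). All of the numeric constants are assumptions for illustration.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def flip_probability(barrier_j, temp_k, years, attempt_hz=1e9):
    """Probability a single-domain grain thermally flips within `years`,
    using the Neel-Arrhenius rate r = f0 * exp(-E_B / kT)."""
    rate = attempt_hz * math.exp(-barrier_j / (K_B * temp_k))
    seconds = years * 365.25 * 24 * 3600
    return 1.0 - math.exp(-rate * seconds)

def binary_entropy(p):
    """H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Sweep grain diameter; barrier = K_u * V with an illustrative
# anisotropy constant K_u ~ 2e5 J/m^3.
K_U = 2e5
for diameter_nm in (16, 14, 12, 10):
    radius_m = diameter_nm * 1e-9 / 2
    volume = (4 / 3) * math.pi * radius_m ** 3
    p = flip_probability(K_U * volume, temp_k=300, years=10)
    # Clamp at 0.5: beyond that the stored bit is effectively random.
    usable = 1.0 - binary_entropy(min(p, 0.5))
    print(f"{diameter_nm:2d} nm grain: flip prob {p:.3g}, "
          f"usable capacity {usable:.3f} bit/grain")
```

The sweep shows the superparamagnetic cliff the comment alludes to: shrinking the grain shrinks the barrier E_B linearly in volume, so the flip rate grows exponentially, and somewhere around the point where E_B/kT drops below the rule-of-thumb stability factor of ~60 the usable capacity per grain collapses to zero no matter how much error correction you spend.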

