
Fifty Years of Shannon Theory (1998) [pdf] - avastmick
https://www.princeton.edu/~verdu/reprints/IT44.6.2057-2078.pdf
======
TFortunato
For anyone interested in a quick visual overview of some important ideas in
information theory, I'd like to give a shout out to colah's "Visual
Information Theory" post: [http://colah.github.io/posts/2015-09-Visual-
Information/](http://colah.github.io/posts/2015-09-Visual-Information/)

------
glial
I read Shannon's original papers and have been trying to grok information
theory, off and on, for several years with only limited success. The book
"Information Theory: A Tutorial Introduction" by Stone really helped the
principles sink in. For anyone who'd like a textual introduction, I highly
recommend it. It's semi-technical: high-school-level math is enough to follow
most of the foundational concepts.

If you're feeling more adventurous, Cover and Thomas's textbook is the
information theory bible. It's very dense but absolutely packed with insight.

------
jatsign
Recently, I read a great book that goes into Shannon a bit. It's called "The
Information: A History, A Theory, A Flood" by James Gleick.

It starts by talking about African drum speech: how it is a highly simplified
version of speech, and how it handles error correction over great distances.
From there it goes into a lot of great historical explanations, including
Shannon.

~~~
f00_
Richard Hamming developed error-correcting codes while working alongside
Shannon at Bell Labs. His book "The Art of Doing Science and Engineering:
Learning to Learn" and the associated lectures he gave at the Naval
Postgraduate School are tremendous.

[https://www.youtube.com/watch?v=a1zDuOPkMSw](https://www.youtube.com/watch?v=a1zDuOPkMSw)

~~~
MaysonL
PDF of the book: [http://worrydream.com/refs/Hamming-
TheArtOfDoingScienceAndEn...](http://worrydream.com/refs/Hamming-
TheArtOfDoingScienceAndEngineering.pdf)

------
anttipoi
Tip: Claude Shannon's new biography [[https://www.amazon.com/Mind-Play-
Shannon-Invented-Informatio...](https://www.amazon.com/Mind-Play-Shannon-
Invented-Information-
ebook/dp/B01M5IJN1P/ref=sr_1_1?s=books&ie=UTF8&qid=1515747422&sr=1-1&keywords=claude+shannon)]
is an enjoyable read.

~~~
cbHXBY1D
I wasn't a huge fan. It was more about his life and less about his ideas.

If you want a little bit of both, I highly recommend Seife's _Decoding the
Universe: How the New Science of Information Is Explaining Everything in the
Cosmos, from Our Brains to Black Holes_ or Gleick's _The Information: A
History, A Theory, A Flood_.

------
s-c-h
Are there any online tutorials or courses that guide you step by step through
a non-trivial, fun project that uses Shannon's theory? For example, a program
that simulates encoding a message on one end and decoding it on the other.
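For a sense of what that simulation involves, here's a hand-rolled toy sketch in Python (entirely illustrative, not from any particular tutorial): a 3x repetition code sent through a simulated binary symmetric channel, then decoded by majority vote.

```python
import random

def encode(bits, r=3):
    """Repetition code: transmit each bit r times."""
    return [b for b in bits for _ in range(r)]

def noisy_channel(bits, p=0.1, seed=42):
    """Binary symmetric channel: flip each bit with probability p."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits, r=3):
    """Majority vote over each block of r received bits."""
    return [int(sum(bits[i:i + r]) > r // 2) for i in range(0, len(bits), r)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = noisy_channel(encode(message))
print(decode(received))
```

The repetition code is the simplest (and least efficient) error-correcting code; swapping in a Hamming code for the encode/decode pair would be a natural next step for such a project.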

~~~
sethgecko
This is my favourite channel on YouTube; it focuses a lot on information
theory and cryptography.

[https://www.youtube.com/user/ArtOfTheProblem](https://www.youtube.com/user/ArtOfTheProblem)

~~~
zengid
I really like the use of music in these videos (or at least in the main one
featured on the channel's homepage). There's something about simple,
atmospheric, 'textural' music playing behind people talking that I find
enjoyable.

------
mrcactu5
Claude Shannon's "A Mathematical Theory of Communication" is available online
and in bookstores.

[http://affect-reason-utility.com/1301/4/shannon1948.pdf](http://affect-
reason-utility.com/1301/4/shannon1948.pdf)

Khan Academy also has an information theory course:

[https://www.khanacademy.org/computing/computer-
science/infor...](https://www.khanacademy.org/computing/computer-
science/informationtheory)

------
messel
Focused on Information Theory in grad school at Stony Brook in 1996/97, and
enjoyed Nam Phamdo's courses. Sadly, after reading 8-9 pages I feel like much
of this review is over my head now. The words make sense but the concepts are
pretty dense.

I do remember the algorithms (Huffman coding), but the high-level concepts are
harder to remember - information being directly tied to randomness (entropy).
Information is the unknown (random?) message, which sounds more like noise.
Channel capacity makes sense: given the noise, there's an upper limit on how
fast you can reliably communicate bits and groups of bits.

~~~
varlock
Re: information is the unknown message. Think of it this way.

You read the newspaper each day. Every day it has the same news: "Today the
sun shines." That carries no useful information (zero entropy), and you can
predict it perfectly.

Second scenario: you read the newspaper, and each day it says either "Today
the sun shines" or "Today it rains," with 50/50 probability. That's maximum
entropy. That's useful information, which you can't reliably predict.
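The two newspapers map directly onto Shannon's entropy formula, H = -sum(p * log2(p)). A quick check in Python (my own illustration):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping p == 0."""
    return sum(-p * log2(p) for p in probs if p > 0)

# Always "Today the sun shines": perfectly predictable, no information.
print(entropy([1.0]))        # 0.0
# Sun or rain at 50/50: maximally unpredictable for two outcomes.
print(entropy([0.5, 0.5]))   # 1.0
# A biased 90/10 paper sits in between: some surprise, but less than a bit.
print(entropy([0.9, 0.1]))   # ~0.469
```

The 50/50 case is exactly one bit per headline, which is why each day's paper tells you something you genuinely didn't know.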

