Claude Shannon’s research laid foundations for modern communications (2020) (quantamagazine.org)
151 points by ColinWright on Jan 1, 2023 | 25 comments



Shannon’s work has already produced a whole lot, of course, but I don’t think we’re anywhere near seeing the full outcome of his contributions yet.

Currently our computers operate by filling transistors with charge and/or dumping it to ground. Who even cares about information-theoretic efficiency in that case? The cost of the actual work done is dwarfed by the ancillary cost of running the machine.

If we ever move on to something less brute-force, like reversible quantum cellular automata, I think we’ll see him as an invaluable part in the chain of formalizing what information and computation mean physically.

Kelvin/Maxwell -> Shannon -> Landauer -> maybe Bennett


Von Neumann's suggestion to use the word "entropy" is a great story:

My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'

https://mathoverflow.net/questions/403036/john-von-neumanns-...


It's a funny story, but now that we understand better what entropy is, it's clear that information and thermodynamic entropy are the same concept, so it was a very good call.
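
Concretely, the two formulas differ only by a constant factor: Shannon's H is Gibbs's S with the k_B ln 2 stripped off. A quick sketch in plain Python, with a made-up distribution just for illustration:

    import math

    def shannon_entropy_bits(probs):
        # Shannon: H = -sum(p * log2(p)), measured in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def gibbs_entropy(probs, k_B=1.380649e-23):
        # Gibbs: S = -k_B * sum(p * ln(p)), measured in J/K
        return -k_B * sum(p * math.log(p) for p in probs if p > 0)

    probs = [0.5, 0.25, 0.25]          # any distribution over microstates
    H = shannon_entropy_bits(probs)    # 1.5 bits
    S = gibbs_entropy(probs)
    # same quantity up to units: S == k_B * ln(2) * H
    assert abs(S - 1.380649e-23 * math.log(2) * H) < 1e-30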


Von Neumann was incredibly clever.


von Neumann was an alien sent to Earth when we were about to get/were in the process of getting nukes, to make sure we advanced fast enough to not blow ourselves up. And the wild thing is that you, the reader, are only like 75% sure I’m kidding and don’t actually believe this.


Yes, it's likely he also understood that they were the same concept, and his recommendation was less of a joke than his words suggest.


Certainly it seems like Shannon is turning out to be the Lindy Nikola Tesla: an equivalently impactful person, but in the realm of bits.


> ...in the realm of bits.

Especially as an OG (first author to use the word?): https://people.math.harvard.edu/~ctm/home/text/others/shanno...

> If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey.

[Edit: also, consider the section "3. THE SERIES OF APPROXIMATIONS TO ENGLISH" to be an early exploration of MLMs: Microscopic Language Models]
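
For anyone who hasn't read that section: Shannon builds up English letter by letter from n-gram statistics. A minimal sketch of a second-order (bigram) approximation, in plain Python with a toy corpus of my own rather than anything from the paper:

    import random
    from collections import defaultdict

    corpus = "communication in the presence of noise is the fundamental problem"

    # collect, for each letter, the letters observed to follow it
    follows = defaultdict(list)
    for a, b in zip(corpus, corpus[1:]):
        follows[a].append(b)

    # second-order approximation: each letter is drawn from the
    # empirical distribution conditioned on the previous letter
    ch = random.choice(corpus)
    out = [ch]
    for _ in range(60):
        ch = random.choice(follows.get(ch) or list(corpus))
        out.append(ch)
    print("".join(out))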


John Tukey also invented the FFT and was apparently the first to use "software" in a publication:

https://en.wikipedia.org/wiki/John_Tukey


Gauss invented the FFT in 1805. And "Between 1805 and 1965, some versions of FFT were published by other authors"

https://en.wikipedia.org/wiki/Fast_Fourier_transform#History

p.s. I started reading Tukey's textbook Exploratory Data Analysis just yesterday. It's wonderful so far, a pleasure to read. "It is important to understand what you CAN DO before you learn to measure how WELL you seem to have DONE it." "The greatest value of a picture is when it forces us to notice what we never expected to see."


yes but colloquially we take 'inventor' to mean whoever invented it next after Gauss


:-) It seems Cooley and Tukey were the first to realize/state that it's O(N log N), which is something.
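
The divide-and-conquer is easy to see in a toy radix-2 version (a sketch, not Cooley and Tukey's actual formulation): each level does O(N) work combining two half-size transforms, and there are log2(N) levels.

    import cmath

    def fft(x):
        # recursive radix-2 FFT; len(x) must be a power of two
        n = len(x)
        if n == 1:
            return list(x)
        even = fft(x[0::2])   # transform of even-indexed samples
        odd = fft(x[1::2])    # transform of odd-indexed samples
        out = [0j] * n
        for k in range(n // 2):
            t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
            out[k] = even[k] + t
            out[k + n // 2] = even[k] - t
        return out

    print(fft([1, 1, 1, 1, 0, 0, 0, 0]))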


Was rereading “Fortune’s Formula” this morning after dreaming about Shannon last night. A name that should be more widely recognized.


Every year I read Shannon's paper, and every year I realize I have no idea WTF he was talking about.


I learned recently from Alvy Ray Smith's _A Biography of the Pixel_ that Kotelnikov[1] discovered the sampling theorem first, before Shannon did so independently, and that Shannon made it accessible to the West. It's pretty fascinating history.

[1] https://en.m.wikipedia.org/wiki/Vladimir_Kotelnikov


Not just communication, but many other fields. E.g. deep learning is rooted in information theory.


The Kelly criterion that is widely used in finance was invented at Bell Labs. It was based on Shannon's theory.
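
For the simplest case, a single repeated bet (a sketch of the idea, not anything finance-grade): with win probability p and net odds of b-to-1, Kelly says to stake the fraction f* = p - (1 - p)/b of your bankroll.

    def kelly_fraction(p, b):
        # fraction of bankroll to wager on a bet paying b-to-1
        # with win probability p; negative means don't bet
        return p - (1 - p) / b

    print(kelly_fraction(0.6, 1.0))  # 60% coin flip at even odds -> bet 0.2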


The Kelly criterion was originally invented by Daniel Bernoulli (using logarithmic utility) as a resolution to the St. Petersburg Paradox, IIRC:

https://en.wikipedia.org/wiki/St._Petersburg_paradox


Not specifically a solution to the St. Petersburg Paradox because, well, it's not a solution to the most general formulation of it (and Bernoulli admits as much in his paper, IIRC).

At least the way I read Bernoulli's paper, it was more of a general musing on how to reduce risk and make insurance decisions.


> widely used

A nice idea but not really.


I use it every time I’m in Vegas.


Well, the notion of the bit as the unit of information, as well as digitization and digital communication, came primarily out of Shannon’s work.


Discussed at the time (of the article):

Claude Shannon Invented the Future - https://news.ycombinator.com/item?id=25507627 - Dec 2020 (68 comments)


Slightly OT: have you considered using some kind of automation to generate these messages? (or perhaps this is an automated message).


I use software to help build the lists quickly. They're still hand-picked, though, because an automated solution tends to come up with a lot of duds (e.g. threads with no interesting comments).



