
You quoted me before an edit -- what I meant was: Did things actually happen that way historically? That is, did engineers actually accept/design more heterogeneity in the physical network as a result of Shannon's ideas?

I'm talking about networks with heterogeneous physical specifications -- that happened long before the Internet. You have the same problem with just a plain analog circuit-switched phone system. No digital computers involved.

Static content isn't really distributed computing; it's more like networking. The breakthroughs I'm thinking of are more along the lines of blockchain, homomorphic encryption, differential privacy, zero-knowledge proofs, etc.

In other words, different ways of collaborating over a network.

The thing that jumps out at me is that most of these technologies are fantastically expensive computationally.
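To make "fantastically expensive" concrete, here is a toy sketch of Paillier-style additively homomorphic encryption (all parameters are hypothetical and insecurely small; Python 3.8+ for the modular inverse). Two encrypted numbers get summed without decrypting either one, but every encryption costs modular exponentiations mod n^2 that dwarf the single addition being protected:

    import math, random

    # Toy key material -- real deployments use ~1024-bit primes.
    p, q = 293, 433
    n = p * q
    n2 = n * n
    lam = (p - 1) * (q - 1)   # a multiple of lcm(p-1, q-1); fine when g = n + 1
    g = n + 1

    def encrypt(m):
        r = random.randrange(2, n)
        while math.gcd(r, n) != 1:
            r = random.randrange(2, n)
        # Two modular exponentiations mod n^2 per encryption.
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        L = (pow(c, lam, n2) - 1) // n             # L(x) = (x - 1) / n
        mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
        return (L * mu) % n

    a, b = encrypt(1234), encrypt(5678)
    assert decrypt((a * b) % n2) == 1234 + 5678    # sum computed on ciphertexts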




You name fantastic cryptographic technologies! I was an R&D engineer at a European telco, so maybe my understanding could be useful, but my ideas may not be shared by all:

>> did engineers actually accept/design more heterogeneity in the physical network as a result of Shannon's ideas?

I would say engineers have little to say about decisions in a huge company like AT&T. In a company with hundreds of thousands of employees, decisions are made three or four levels of hierarchy above them.

And those decisions are, as usual in human affairs, rarely rational from the company's point of view. More often than not, a decision is made between CEOs only, or because someone high in the food chain wants to climb higher, gain power over a related organisation, or collect kickbacks by pushing a manufacturer's technology.

That said, I think engineers never bothered much with Shannon's work. It is quite arid, and when we read it now, more than 50 years later, we may read into it things that were not intentionally written in the original paper. I must confess that when I read Shannon's seminal paper out of curiosity in the 2000s, I was hugely disappointed.


> I would say engineers have little to say about decisions in a huge company like AT&T.

The AT&T of the Bell Labs days was not like a modern telco that just takes bids for everything from Cisco et al. AT&T designed their own phones, phone switches, computers, microprocessors, programming languages, operating systems, etc.

> I must confess that when I read Shannon's seminal paper out of curiosity in the 2000s, I was hugely disappointed.

That's always the way with seminal papers. The ideas are by now so integrated into modern society that we can't even imagine what it was like for that to be a revolutionary concept.


>> The AT&T of the Bell Labs days ... designed their own ...

True, I am old enough to have seen my own employer (France Telecom) doing the same. Computers: the SM80 and SM90 [0], plus heavy involvement in other machines like the LCT3202. Operating system: a clone of Unix written in Pascal! Phones indeed: the S63 and older models. Phone switches: the E10, and the software of the 11F.

For me the 80" were the good old time, you well describe what came after that "just takes bids for everything from Cisco et al"

[0] http://www.feb-patrimoine.com/projet/unix/sps7.htm


> Did things actually happen that way historically? That is, did engineers actually accept/design more heterogeneity in the physical network as a result of Shannon's ideas?

https://en.wikipedia.org/wiki/Time-division_multiplexing

https://en.wikipedia.org/wiki/Packet_switching

Shannon's paper was 1948. TDM was in commercial use in the 1950s. Even ARPANET was in the design phase by the 1960s.

The thing about information theory is that you can use it in practice without understanding all the math. TDM existed in the 19th century. You can even study human language in terms of information theory, but language predates 1948 by a million-odd years. And we still don't understand all the math -- information theory is related to complexity theory and P vs. NP and all of that.
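For instance, a first-order estimate of the entropy of text takes a few lines (a minimal sketch; a unigram model like this ignores all inter-letter structure, which is why Shannon's later experiments put real English closer to ~1 bit per character):

    from collections import Counter
    from math import log2

    def entropy_bits(text):
        counts = Counter(text)
        total = len(text)
        return -sum(c / total * log2(c / total) for c in counts.values())

    sample = "the quick brown fox jumps over the lazy dog " * 100
    print(f"{entropy_bits(sample):.2f} bits/char")  # ~4.3 here; log2(27) ~ 4.75 if uniform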

As to whether heterogeneous networks would have dethroned AT&T, we don't know because the government broke them up just as packet switched networks were becoming mainstream. Moreover, the telecommunications market even now is not a model for how market forces work. You still can't just go to Home Depot, pick up some fiber optic cable, connect up your whole neighborhood and then go collectively negotiate for transit.

It's a lot easier, regulation wise, to go into competition with AWS than Comcast. And AWS correspondingly has a lot more real competitors.

> Static content isn't really distributed computing; it's more like networking.

It's distributed data storage, but yes, the solutions there are very similar to the known ones for networking.

> The thing that jumps out at me is that most of these technologies are fantastically expensive computationally.

The solutions we use for networking and data storage trade off between computation (compression, encryption) and something else (network capacity, storage capacity, locality), which are good trades because computation is cheap relative to those things.
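A minimal sketch of that trade in one direction, using only the Python standard library: spend more CPU (a higher zlib level) to put fewer bytes on the wire. The payload is a made-up repetitive blob; real ratios depend entirely on the data:

    import time
    import zlib

    payload = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n" * 2000

    for level in (1, 6, 9):
        t0 = time.perf_counter()
        out = zlib.compress(payload, level)
        ms = (time.perf_counter() - t0) * 1e3
        print(f"level {level}: {len(payload)} -> {len(out)} bytes in {ms:.2f} ms")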

It's much cheaper computationally to send plaintext data than encrypted data, by a factor of a hundred or more. But the comparison isn't between encryption and plaintext, it's between encryption and plaintext that still has to be secured somehow, e.g. by physically securing every wire between every endpoint. By comparison the computational cost of encryption is a bargain.

But if you have to use computation to secure computation itself, the economics are not on your side. Processor speed improvements confer no relative advantage and even reduce demand for remote computation as local computation becomes less expensive.

Where those technologies tend to get used is as an end run around an artificial obstacle, where the intrinsic inefficiency of the technology is overcome by some political interference weighing down the natural competitor.

The obvious example is blockchain, which would be insurmountably less efficient than banks if not for the fact that banks are so bureaucratic, heavily regulated and risk averse. But if banks start losing real customers to blockchain they'll respond. Hopefully by addressing their own failings rather than having blockchain regulated out of existence, but either way the result will be for blockchain to fade after they do.


>The obvious example is blockchain, which would be insurmountably less efficient than banks if not for the fact that banks are so bureaucratic, heavily regulated and risk averse. But if banks start losing real customers to blockchain they'll respond. Hopefully by addressing their own failings rather than having blockchain regulated out of existence, but either way the result will be for blockchain to fade after they do.

That's assuming it is technically possible for them to be cheaper and faster and easier to use without blockchain technology. My non-expert understanding is that it's not possible.

Also, from what I understand, a key reason financial institutions are so interested in blockchain is that it helps greatly with the trust and settlement problems that currently take so much work and expense.


A book well worth reading about this time is The Dream Machine [0] (all about the people leading up to ARPA and then PARC). Shannon's work had a massive influence on everyone in computers at the time.

[0]: https://www.amazon.com/Dream-Machine-Licklider-Revolution-Co...



