
I'm not exactly sure where I fall on this. Ben is a really smart guy (way smarter than me), but I feel like this could be a classic case of hindsight bias.

Now, looking back, it makes sense that the next logical step after PCs was the Internet. But from each era looking forward, it's not as easy to see the next "horizon".

So, if each next "horizon" is hard to see, and the paradigm it subsequently unlocks is also difficult to discern, why should we assume that there is no other horizon for us?

I also don't know if I agree that we are at a "logical endpoint of all of these changes". Is computing truly continuous?

However, I think Ben's main point here is about incumbents, and I agree that it seems it is getting harder and harder to disrupt the Big Four. But I don't know if disruption of those four is as important as he thinks: Netflix carved out a $150B business that none of the four cared about by leveraging continuous computing to disrupt cable & content companies. I sure wasn't able to call that back in 2002 when I was getting discs in the mail. I think there are still plenty of industries ripe for that kind of disruption.




Until I have something resembling Iron Man's Jarvis with at least a subvocal interface, I think there's still a long way to go for "continuous" computing. I currently still have to pull out a discrete device and remove myself from other interactions to deal with it. If I'm not on that device all the time, then I don't have continuous computing. Maybe continuously available computing is more accurate?


Right -- today you need to remember to charge your phone, you don't take it everywhere (and don't have signal everywhere, especially internationally), and you need to take it out of your pocket and type into it with your thumbs (though voice "assistants" are here, and some people get use out of them).

The end-goal is being able to talk to anyone at any time, remember anything you've seen before, and know the answer to any question you can phrase that someone has already answered.

(Now, you might say that parts of it sound less than ideal, but I think we'll get there by gradient descent, though maybe with some legal or counter-cultural hiccups.)


Everything you describe is a very minor tweak to what already exists today.

The bottom line is that everyone already has their phone charged and with them at all times, it's probably out of their pocket most of the time anyway, and they can get the answer to pretty much any question they can phrase that someone has already answered. The voice assistants will continue to improve, but some people actually prefer thumb-typing for various reasons.

And the "improvements" you suggest probably bring more privacy, security, and mental-health problems than any plausible benefits they might provide.


> Everything you describe is a very minor tweak to what already exists today.

Sure, and the iPhone was the same thing - I had a 3G Windows phone years before the iPhone that did essentially all the same things. But the iPhone was still a breakthrough.


This captures my sentiment.

It’s hard to see how the incumbents could be beaten, precisely because of how effective they are with data and at buying potential competitors (Instagram, YouTube)... but that is precisely because we don’t know what the next market shift will be, or whether there will be one at all.

What happens if AI takes off? What happens if 3D printing magically becomes 100x more efficient and you can print anything you want from home?

We don’t know. It doesn’t seem like the big incumbents could be defeated, but history repeats itself.


> it seems it is getting harder and harder to disrupt the Big Four

Microsoft, IBM, Oracle... What is the other one?

Oh, right, wrong decade.

(My point is, it's not at all obvious that it is getting harder to disrupt the incumbents.)


The conclusion of the article is that it is getting harder to disrupt the incumbents. I'm saying that regardless of whether it is or isn't, there are still lots of new companies to come that can take advantage of technology to disrupt other, old-guard incumbents.

That, I think, is where the metaphor Ben uses breaks down. The automobile is a single idea (move people around with an ICE). Tech is more like the ICE than the car. So, there might not be much disruption to consumer hardware (Apple) companies, or search (Google) companies, or cloud computing (Amazon, Microsoft) companies. But there will still be lots of disruption to come as tech (just like the ICE) gets applied to new fields.


> Microsoft, IBM, Oracle... What is the other one?

Cisco, of course.


It was actually Oracle, Sun, Cisco, and EMC who were the four "horsemen of the Internet" in the run-up to the dot-com bubble.


Isn’t that kind of his conclusion too, though? It matters insofar as we’re less likely to see new general-purpose public clouds come into play, but he didn’t seem to predict there was no more room for change in the industry, just that we’re unlikely to see those incumbents toppled from certain foundational positions in the ecosystem.


Much of Ben's writing recently has been on the topic of regulation and anti-trust, specifically in relation to tech companies. If I had to summarize his thesis, I'd say it's something along the lines of: "Previous antitrust regulation prioritized price. Tech allows for better products by nature of aggregation and network effects, and to promote competition, we need a new prioritization in our regulation".

So, I see this article as being part of that thread. The conclusion is that the Big Four are not going to get disrupted, which is bad, and so we need a new framework of antitrust to allow for it. I might be putting words in his mouth, but I don't think it's much of a jump if you read his body of work, especially the recent stuff.


I read Ben's writing a lot and listen to the podcast, and I think you did a pretty good job capturing his points.


Was it really that hard to predict the Internet? SF authors picked up on it almost immediately.


Which authors are you thinking of?

Up until the web existed, I think it was extremely hard to usefully predict the Internet's impact. TCP was invented in 1974, but it wasn't until 1993 that we started seeing things that really pointed to where we were going: https://en.wikipedia.org/wiki/List_of_websites_founded_befor...

Of course, everybody knew computers would be important. But that was true starting in the 1960s. E.g., Stand on Zanzibar has a supercomputer as a central plot element.


I mean, even if you look at popular sci-fi, nobody exactly predicted the internet as it is today. It wasn't until someone coined the "information superhighway" that gears started turning. Even then, the earliest commercial websites were basically just digital brochures and catalogs. It wasn't until SaaS, search, and social took off that we grasped what the specific use cases were that were going to be the dominant money makers. And the internet evolved quite a bit as a result.

Some people like me still lament the loss of the 90s internet in some ways, as it felt like a more "wild west" domain and not saturated and stale like it is today.


The concept of an information superhighway dates to at least 1964:

https://en.wikipedia.org/wiki/Information_superhighway#Earli...


It looks like those terms from the 60s and 70s used "superhighway" in regard to communication, but didn't prefix it with "information". And whether someone incidentally used the word or not is sort of irrelevant. The phrase started to become popular as a way of visualizing the possibilities of the internet in the late 80s and 90s, and that's when I think people first started to imagine what this might become in the abstract.


I'm leaning the other way -- that the usages were significant.

The Brotherton reference in particular interests me -- masers and light-masers (as lasers were initially called) were brand-spanking new at the time, and were themselves the original "solution in search of a problem". I've since come to realise that any time you can create a channel or medium with a very high level of uniformity and the capacity to be modulated in some way, and that can be either transmitted/received (a channel) or written/read (a medium), you've got the fundamental prerequisites for an informational system based on either signal transmission (for channels) or storage (for media).

So Brotherton beat me to the punch by at least 55 years, if I'm doing my maths correctly.

I made a quick search for the book -- it's not on LibGen (the Internet Archive has a copy for lending, though unfortunately the reading experience there is ... poor), and no library within reasonable bounds seems to have a copy. It looks like it might be interesting reading, however.

Point being: Brotherton (or a source of his) had the awareness to make that connection, and to see the potential as comparable to the other contemporary revolution in network technology, the ground-transit superhighway. That strikes me as a significant insight.

Whether or not he was aware of simultaneous developments in other areas such as packet switching (also 1964, see: https://www.rand.org/about/history/baran.html) would be very interesting to know.

There's not much information on him, but Manfred Brotherton retired from Bell Labs in 1964 and died in 1981:

https://www.nytimes.com/1981/01/25/obituaries/manfred-brothe...


That's a cool article on Baran; it looks like he predicted Amazon in 1968, and they were experimenting with early email-type systems at that time, too. I'm sure the bulletin board followed shortly after.

Brotherton wrote a book on masers and lasers in 1964; you might find more info in that: https://www.amazon.com/Masers-Lasers-They-Work-What/dp/B0000...

Is that the one you mean?


Yes, that book.

Baran's full set of monographs written for RAND is now freely available online. I'd asked a couple of years ago if they might include one specifically, and they published the whole lot. Asking nicely works, sometimes.

Yes, there's interesting material there.

https://www.rand.org/pubs/authors/b/baran_paul.html


I'd say networking itself was not incredibly difficult to predict, but the businesses and products it allowed for (and how we use them) were very difficult to foresee.


> but I feel like this could be a classic case of hindsight.

Well, it's 2020 after all.

> Now, looking back, it makes sense that the next logical step after PCs was the Internet.

But the internet existed before PCs.

> and I agree that it seems it is getting harder and harder to disrupt the Big Four.

I agree, but then again, people thought AOL was hard to disrupt so you never know. A company can look invincible one day and irrelevant a few years later.

> I think there are still plenty of industries ripe for that disruption.

Yes, but the low-hanging fruit has already been picked. I suspect the next round of disruptions will be more difficult and less profitable.


This has happened in every industry and in every time period. The mistake in the article is that the author doesn't appear to realise how important this effect is. It always amazes me that you have all these people writing about the same topics, always from the same angle; no one thinks to just open a book and check what happened last time (I am aware of exactly one book that has done this, just one). Again: every industry, every time period. It is permanent.

Definitely, you see new industries replacing old ones. Acknowledging the above isn't denying progress. But every industry consolidates down to a few large companies.



