
The End of the Beginning - nikbackm
https://stratechery.com/2020/the-end-of-the-beginning/
======
christiansakai
I've been thinking along the same lines, albeit with a more personal take, as
a software engineer.

Basically, starting around 15 years ago, there was a proliferation of
bootcamps teaching fullstack development, because software startups were the
new hot thing and they desperately needed generalist engineers capable of
spinning up web apps quickly. Rails was the hot thing in those days for the
same reason. Hence we saw many new grads, and even people changing careers,
move into fullstack development, with bootcamps churning out these workers at
an incredible pace (regardless of quality); the job market took them because
it was desperate for fullstack engineers.

During that time, the best career move you could make was to join the startup
movement as a fullstack engineer and get some equity as compensation. That
equity, if you were lucky, could really be life changing.

Fast forward to now: the low-hanging CRUD app (e.g., Facebook, Twitter,
Instagram) search space has been exhausted, and even new unicorns (e.g., Uber)
don't make that much money, if they make any at all. The companies that won
have become big; they are the winners in the winner-take-all field that is
cloud software. And these companies have little use for fullstack engineers
anymore; they want specialists who do a few things, but at a deeper level.

Today, even the startup equity math has changed a lot. Even with a good equity
package, much of the search space has been exhausted, so joining a startup as
a fullstack engineer doesn't pay as well anymore. Instead, a better move is to
try to get into one of these big companies, because their pay dwarfs that of
startups and even medium or big companies.

Just my 2c as someone who is very green (5 yrs) doing software engineering.
Happy to hear criticism.

~~~
alexashka
The tragedy of people who want to make money and believe it will be 'life
changing' is that no matter how many times they are told it won't be, they
think 'ha, you only say that because you got yours'.

What useless shit are you going to buy with 'life changing' money exactly that
a software developer's salary won't allow?

~~~
alasdair_
> What useless shit are you going to buy with 'life changing' money exactly
> that a software developer's salary won't allow?

I still have the childish dream of wanting to change the world. Specifically,
I want to be involved in certain kinds of political activism that will likely
piss off a number of people, and I want to have enough money that I can
support my family without needing to back off if my income source is
threatened.

I’m lucky enough to be almost at that point due to startup equity.

~~~
alexashka
Do you feel like you'll graduate to 'political problems are human problems,
and human problems boil down to idiots having children, idiots raising
children, and society being unwilling to rid itself of humans who have
convincingly shown themselves to be counter-productive to society', coupled
with 'for society to function long term, it needs to have a shared goal and
vision, not the selfish goals of one house, two cars, a fence, a big TV, and
plenty of derp entertainment for each family'?

I feel like anything that isn't addressing these two issues is rearranging
deck chairs on the Titanic as it heads for the iceberg.

~~~
untwerth
Do you feel like you'll graduate to 'Political problems are not emergent
individual problems, but an issue of rules. The rules of the political game
are rigged to filter out competent people and only allow shallow ideologues to
power'? ;)

~~~
mistermann
This is an interesting counterpoint. On the one hand, alexashka makes valid
observations, but to what degree is the behavior (s)he is observing a result
of the lack of high quality political leadership most anywhere on the planet?
And if that could somehow be changed, might people's behavior suddenly change
as well?

------
oflannabhra
I'm not exactly sure where I fall on this. Ben is a really smart guy (way
smarter than me), but I feel like this could be a classic case of hindsight.

Now, looking back, it makes sense that the next logical step after PCs was the
Internet. But from each era looking forward, it's not as easy to see the next
"horizon".

So, if each next "horizon" is hard to see, and the paradigm it subsequently
unlocks is also difficult to discern, why should we assume that there is no
other horizon for us?

I also don't know if I agree that we are at a "logical endpoint of all of
these changes". Is computing _truly_ continuous?

However, I think Ben's main point here is about incumbents, and I agree that
it seems to be getting harder and harder to disrupt the Big Four. But I don't
know if disruption for those four is as important as he thinks: Netflix carved
out a $150B business that none of the four cared about by leveraging
continuous computing to disrupt cable & content companies. I sure wasn't able
to call that back in 2002 when I was getting discs in the mail. I think there
are still plenty of industries ripe for that kind of disruption.

~~~
pthomas551
Was it really that hard to predict the Internet? SF authors picked up on it
almost immediately.

~~~
chrisco255
I mean, even if you look at popular sci-fi, nobody exactly predicted the
internet as it is today. It wasn't until someone coined the term "information
superhighway" that the gears started turning. Even then, the earliest
commercial websites were basically just digital brochures and catalogs. It
wasn't until SaaS, search, and social took off that we grasped which specific
use cases were going to be the dominant money makers. And the internet evolved
quite a bit as a result.

Some people like me still lament the loss of the 90s internet in some ways, as
it felt like a more "wild west" domain and not saturated and stale like it is
today.

~~~
dredmorbius
The concept of an information superhighway dates to at least 1964:

https://en.wikipedia.org/wiki/Information_superhighway#Earlier_similar_phrases

~~~
chrisco255
It looks like those terms from the 60s and 70s used "superhighway" in regard
to communication, but didn't prefix it with "information". And whether someone
incidentally used the word or not is sort of irrelevant. It started to become
popular as a way of visualizing the possibilities of the internet in the late
80s and 90s, and that's when I think people first started to imagine what this
might become in the abstract.

~~~
dredmorbius
I'm leaning the other way -- that the usages were significant.

The Brotherton reference in particular interests me -- masers and light-masers
(as lasers were initially called) were pretty brand-spanking new, and were
themselves the original "solution in search of a problem". I've since come to
realise that any time you can create either a channel or medium with a very
high level of uniformity _and_ the capacity to be modulated in some way, as
well as to be either transmitted/received (channel) or written/read (medium),
you've got the fundamental prerequisites for an informational system based on
either signal transmission (for channels) or storage (for media).

So Brotherton beat me to the punch by at least 55 years, if I'm doing my
maths correctly.

I've made a quick search for the book -- it's _not_ on LibGen (though the
Internet Archive has a copy for lending; unfortunately the reading experience
there is ... poor), and no library within reasonable bounds seems to have a
copy. Looks like it might be interesting reading, however.

Point being: Brotherton (or a source of his) had the awareness to make that
connection, and to see the potential as comparable to the other contemporary
revolution in network technology, the ground-transit superhighway. That
strikes me as a significant insight.

Whether or not he was aware of simultaneous developments in other areas such
as packet switching (also 1964, see
https://www.rand.org/about/history/baran.html) would be _very_ interesting to
know.

Not much information on him, but Manfred Brotherton retired from Bell Labs in
1964, and died in 1981:

https://www.nytimes.com/1981/01/25/obituaries/manfred-brotherton-81-a-physicist-and-writer.html

~~~
chrisco255
That's a cool article on Baran; it looks like he predicted Amazon in 1968,
and they were experimenting with early email-type systems at that time, too.
I'm sure the bulletin board followed shortly after.

Brotherton wrote a book, _Masers and Lasers_, in 1964; you might find more
info in that:

https://www.amazon.com/Masers-Lasers-They-Work-What/dp/B0000CM7S4/ref=sr_1_1?keywords=masers+and+lasers+what+they+do&qid=1578478815&sr=8-1

Is that the one you mean?

~~~
dredmorbius
Yes, that book.

Baran's full set of monographs written for RAND is now freely available
online. I'd asked a couple of years ago if they might include one
specifically, and they published the whole lot. Asking nicely works,
sometimes.

Yes, there's interesting material there.

https://www.rand.org/pubs/authors/b/baran_paul.html

------
solidasparagus
Bah. He took three datapoints, built a continuum out of them, and says that
since the third datapoint is at the end of his continuum, we must be at the
end.

But this doesn't fit any of the upcoming trends. The biggest current trend is
edge computing, where cloud-based services introduce issues around latency,
reliability, and privacy. These are big-money problems - see smart speakers
and self-driving cars. The cloud players are aware of this trend - see AWS
Outposts, which brings the cloud to the needed location, and AWS Wavelength,
where they partnered with Verizon to bring compute closer to people.

But privacy in a world full of data-driven technology is still very much an
unsolved problem. And most of the major technology players have public trust
issues of one sort or another that present openings for competitors in a world
where trust is increasingly important.

------
nostrademons
I've seen similar analogies to the airline industry, but IMHO this misses the
forest for the trees. Tech isn't _an_ industry, like automobiles or airlines.
Tech is _industry_, like machine tools and assembly lines. When industry was
first developed in the 1810s it meant specifically the textile industry, which
was the easiest existing manufacturing task that could benefit from power
tools and specialized workers on an assembly line. It was only a century later
that we could begin to dream of things like automobiles and airplanes.

Similarly, I bet that our great-grandchildren will look upon the Internet,
e-commerce, and mobile phones the same way we look upon railroads, paddle
steamers, and power looms. Great inventions for their time, and drivers of
huge fortunes, but also quaint anachronisms that have long since been replaced
by better alternatives.

Notice that the article focuses almost entirely on I/O and the physical
location of computation. This is a pretty good sign that we're still in the
infrastructure phase of the information revolution. When we get to the
deployment phase, the focus will be on applications, and our definition of an
industry focuses on what you can _do_ with the technology (like fly or drive)
rather than how the technology works. In between there's usually an epochal
war that remakes the structure of society itself using the new technologies.

FWIW, there was a similar "quiet period" between the First and Second
Industrial Revolutions, from the 1840s to the 1870s. It was very similar: the primary
markets of the original industrial revolution (textiles, railroads,
steamboats) matured, and new markets like telegraphs were not big enough to
sustain significant economic growth. But economic growth picked up
dramatically once a.) the tools of the industrial revolution could be applied
to speed up _science_ itself and b.) the social consequences of the industrial
revolution remade individual states into much larger nation-states, which
created larger markets. That's when we got steel, petroleum, electrification,
automobiles, airplanes, radio, and so on.

------
tudorw
I don't agree; comparing histories is not a reliable way to predict the
future. I think we'll see the growth of governance-level disruption, a
pushback that will encourage home-grown solutions for countries that are not
necessarily aligned with US interests. That field is wide open and growing!

~~~
camillomiller
Policy driven disruption is the only option I see to break the cycle. Let's
see.

------
legitster
I've been reading Zero to One, and one of the ideas the book pitches is that
monopoly and innovation are two sides of the same coin. Only monopoly-like
companies have time and money to dump into innovative products (Bell, GE, IBM,
Google). And people only invest in an idea if they think they can profit from
it (look at how crucial a patent system was for the industrial revolution).

Competition is important, but its role is to drive efficiency: weeding out
bad ideas and bringing down the costs of innovations already created. But the
thing that usually drives monoliths out of business is... new monoliths.

The somewhat contrarian takeaway is that some (keyword) amount of
consolidation is good.

~~~
hogFeast
That isn't right.

The truth is somewhere in the middle: certainly you see some large companies
invest heavily, but (more commonly) you see small firms nibble at the edges of
an existing product until it is too late for the larger companies.

Saying that monopoly produces innovation is like saying government produces
innovation. It happens, but given a long enough period all things happen. The
question is about incentives: the incentives to innovate within large
companies are terrible, which is why it doesn't happen most of the time.

Also, consolidation has happened in all industries at all times. It is a
function of things that repeat: knowledge curves, Lindy effects, etc.

Just generally: be wary of Thiel and his ilk. They have a predilection for
ahistorical nonsense. The history in this area, broadly business history, is
particularly difficult and not well known (the only tech person I have seen
get close is Patrick Collison... and then... not really).

~~~
cdfky
That's not Thiel's argument at all. His argument is that the most innovative
companies tend to become monopolies.

However, monopolies are not always due to innovation, nor are monopolies
always inefficient. As you mentioned, it's a function of things that repeat,
but also of stronger players that gobble up less efficient and/or less
innovative firms.

I would read between the lines. Business history is indeed difficult.

~~~
hogFeast
Yep, more of the usual basic errors.

First, I replied to a comment. The majority of your points should be directed
there. Second, your point about monopolies or why they happen is just
uninteresting (the question of "always" is not something that can be
answered). Third, your point about the most innovative companies tending to
become monopolies is wrong...I am not sure how little you have to know to
think this but it is certainly very minimal. The historical evidence is that
industries consolidate down to a few large companies, not that they become
monopolies. Fourth, again, I repeat what I said about ahistorical nonsense.
Neither in theory nor in reality is monopoly a natural consequence of
capitalism. Fifth, most monopolies that have existed in reality, by number,
are not privately owned, and they are not innovative. There is a fairly
obvious inverse correlation between monopoly and innovation (again, though,
the issue that is confusing you is thinking that innovation -> monopoly...
this isn't a thing).

~~~
andreilys
Thiel's point is that, at the extreme end, if you have hyper-competition,
firms will have no money left over to invest in moonshots (self-driving cars,
cloud computing, etc.); instead they focus on pure survival.

I don't see how a company that's in a life-or-death struggle could pour
hundreds of millions or billions of dollars into R&D, but perhaps I'm missing
something.

------
mirimir
> What is notable is that the current environment appears to be the logical
> endpoint of all of these changes: from batch-processing to continuous
> computing, from a terminal in a different room to a phone in your pocket,
> from a tape drive to data centers all over the globe. In this view the
> personal computer/on-premises server era was simply a stepping stone between
> two ends of a clearly defined range.

Sure, that's what happened.

But what jumps out for me is that, at both ends of that range, users are
relying on remote stuff for processing and data storage. Whether it's
mainframe terminals or smartphones, you're still using basically a dumb
terminal.

In the middle, there were _personal_ computers. As in under our control.
That's often not the case now. People's accounts get nuked, and they lose
years of work. And there's typically no recourse.

As I see it, the next step is P2P.

------
magwa101
The current computing paradigm is all about "data entry"; you are your own
"sysadmin", slowly enriching others by working for them. Yes, the time you
save is valuable to you, but you also create value for "them". We have moved
from mainframe to phone with very little design change. The current wave was
about convenience. There is a coming wave of redesign that is people-centric.
Interfaces will be vastly different. This article just shows a lack of
imagination.

------
tlarkworthy
That's a very bold claim, and it goes against Ray Kurzweil's hypothesis that
tech is accelerating. Maybe (though it's unlikely) cloud/mobile is the end
game for silicon. But what about quantum? What about biological? What about
nano? What about AI? There are literally a ton of potential generational
changes in the making that could turn everything on its head _again_.

~~~
the_af
Why is Ray Kurzweil's hypothesis particularly important to contrast other
hypotheses against? What sets it apart in relevance and/or authority?

~~~
throw_14JAS
Because its evidence is pretty straightforward: you can take Wikipedia's list
of important inventions and plot their frequency on a chart.
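
For what it's worth, a minimal sketch of that chart, assuming a hand-collected
inventions.csv with a "year" column (both the file and the choice of
inventions are hypothetical; Wikipedia doesn't hand you this directly):

    # Count "important inventions" per 50-year bucket and plot them.
    import csv
    from collections import Counter

    import matplotlib.pyplot as plt

    with open("inventions.csv", newline="") as f:
        years = [int(row["year"]) for row in csv.DictReader(f)]

    buckets = Counter(50 * (y // 50) for y in years)  # 50-year bins
    xs = sorted(buckets)
    plt.bar(xs, [buckets[x] for x in xs], width=40)
    plt.xlabel("Year (start of 50-year bucket)")
    plt.ylabel("Inventions per bucket")
    plt.title("Frequency of significant inventions")
    plt.show()

If the acceleration hypothesis holds, the bar heights should grow roughly
exponentially toward the present.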

Of course, there are debates around which inventions count as significant. And
there is recency bias.

Never underestimate the power in something easy to communicate.

~~~
the_af
> _" Never underestimate the power in something easy to communicate."_

Yes, I suppose that's the biggest thing with these "futurologists": their
predictions are both tantalizing _and_ easy to digest. I reserve the right to
remain skeptical about any of these Silicon Valley religions though.

~~~
throw_14JAS
This is not limited to Silicon Valley. Every predictor since before
Nostradamus has used this to their advantage. Those who haven't have lost
their audience, because predicting the future in concrete terms is really
hard: so hard that nobody can do it regularly, which causes people to stop
listening.

~~~
the_af
Agreed!

What strikes me as particularly interesting about Silicon Valley and _some_
techie circles -- as opposed to Nostradamus -- is that many of these people
self-identify as hyperrational, agnostic, atheist or wary of traditional
religions, yet here they are, building their own religions under a more
palatable technological guise (I could list ideas like the Singularity, Super
AI good or bad, immortality, "we're living in a simulation", "every problem in
the world can be fixed with the right app", etc., but if the list of
absurdities goes on long enough I'm sure to hit some raw nerve, so I'll stop
here).

These modern day Nostradamuses also tend to overinflate their own importance
in the wider world. Outside of techie circles Kurzweil is a nobody, and the
notion that his theories are some bar that other theories must somehow pass is
laughable.

~~~
tlarkworthy
Sure, some people are not critical thinkers. I am not a worshipper of Kurzweil
and not into Transhumanism, but I liked the historical charts he drew showing
exponential acceleration over a wide time interval and across tech domains. I
feel his evidence of historical exponential progress was compelling, but what
you do with that evidence is up to you. If you have something to say about
that point I am happy to hear it, but you mostly seem to be lashing out at
'silicon valley' types, which is a mischaracterization of who you are talking
to. I respect Kurzweil because he is a real engineer, an entrepreneur, and a
helper of those less fortunate (the blind in particular).

------
jiveturkey
As @oflannabhra said, I think this is a case of hindsight thinking, with
little predictive power. Privacy issues could very quickly change everything.
Security issues could as well (there was a story on NPR today about medical
devices being pretty much all vulnerable, ripe for the random killing of
people). Climate change is going to be a large driver of technology in the
near future. The tech situation is very, very dynamic right now, and it is way
too early to say we are going to settle down with the current tech giants.

Also, giants are giants. In manufacturing, there are absolutely _vast_
advantages to economies of scale. In tech, except for network effects, it's
very easy for a very broad array of upstart companies to dominate their
respective arenas at the $100B level.

> today’s cloud and mobile companies — Amazon, Microsoft, Apple, and Google —
> may very well be the GM, Ford, and Chrysler of the 21st century.

Well, except Google is not a cloud or mobile company. They are an advertising
company.

------
nl
_while new I/O devices like augmented reality, wearables, or voice are
natural extensions of the phone._

I don't agree with this at all. It's like saying "the internet is a natural
extension of the operating system, therefore Microsoft Windows will remain
all-powerful and the sole route to consumers".

Bill Gates realized in his famous memo that this wasn't the case, and Google
realized that mobile did to the internet what the internet did to Windows
(hence Android).

Wearables are radically different from phones. People want to use them
differently, and to interact with them in different ways than they do with
phones.

To be clear: We are in the very early days of wearables, and Apple is far and
away the dominant player (and maybe Garmin). But there is huge disruptive
potential here.

~~~
F_J_H
Interesting point about Garmin. I wonder if it would ever make sense for Apple
to just buy them...

------
dgudkov
I disagree. A long period of evolution starts when the revolution before it
has managed to find a more or less working solution. At this point, however,
there are at least two big problems that haven't been solved properly yet,
that get worse every day, and where a revolution is more probable than
evolution: social networks and payments.

I believe at least one more revolution is still possible before we get a long
period of evolution. It will be a shift from centralization to
decentralization (one more time), or rather to federation. Decentralized,
federated systems might be able to get social networking and payments to the
level where they finally work well and only need to be gradually improved.

------
mooreds
The essay reminds me of The Deployment Age:
http://reactionwheel.net/2015/10/the-deployment-age.html

------
gz5
> And, to the extent there are evolutions, it really does seem like the
> incumbents have insurmountable advantages...

By definition, doesn't it _always_ seem like this?

Jim Barksdale (Netscape) said there are two ways to make money: bundling and
unbundling. What can be unbundled from the incumbents' bundles in order to be
offered in a more fit-for-purpose way, or with a better experience?

How might that answer change if the world's political structure changes? How
might that answer change if processing, storage and networking continue their
march towards ubiquitous availability?

------
Animats
His graph conveniently stops in the 1980s. Since then, there have been many
new US car companies, mostly in the electric or self-driving spaces. Lots of
little city cars, and new imports from China, too.

~~~
the_watcher
He specifically mentions excluding imports. Outside of Tesla, what are the new
American car companies that made any kind of mark?

------
JohnFen
> there may not be a significant paradigm shift on the horizon, nor the
> associated generational change that goes with it.

That's possible, but I see things that lead me to think that we're not there.

Primarily, there are a number of rather serious problems with the cloud, some
of which are inherent to the paradigm and likely can't be resolved -- we'll
just have to live with them.

When a paradigm has such problems, the possibility always exists that a new
way of doing things can come about that sidesteps those problems.

------
ropiwqefjnpoa
The dealership model really helps manufacturers keep a tight rein on the
market; look at all the trouble Tesla had.

In a similar vein, Apple, Google and Microsoft control the medium and have
grown so powerful that I can't imagine there ever being a new "Google" that
comes about through the old grass-roots method.

Someday Apple will be bought though, probably by Facebook.

------
whatitdobooboo
I think if you abstract away the specific companies mentioned and stick to
the technology, the point about people building on top of already "accepted"
paradigms is a good one.

The rest doesn't really seem to have enough evidence for such a bold claim.

------
LMo
Frankly, I'm not sure this piece really said anything other than that the big
4 or 5 are so unbelievably strong that we're all left playing in the leftover
spaces, which are usually small.

------
graycat
For the OP, let me think ....

There is

> IBM’s mainframe monopoly was suddenly challenged by minicomputers from
> companies like DEC, Data General, Wang Laboratories, Apollo Computer, and
> Prime Computers.

So, to shed some more light on this statement, especially about "mainframe
monopoly", let me recount some of my history with IBM mainframes:

(1) Uh, to help work myself and my wife through grad school, I had a part time
job in applied math and computing: Our IBM Mainframe TSO (time-sharing option)
bill was about $80,000 a year, so we got a Prime, and soon with my other work
I was the system administrator. Soon I graduated and was a new B-school prof
where the school wanted more in computing. So, I led an effort to get a Prime
-- we did. IBM and their super-salesman Buck Rodgers tried hard but lost.

The Prime was easy to run, very useful, and popular but would not have
replaced IBM mainframe work running CICS, IMS, DB2, etc. Of course, in a
B-school, we wanted to run word processing, D. Knuth's TeX math word whacking,
SPSS statistics, some advanced spreadsheet software (with linear programming
optimization), etc. and not CICS, IMS, DB2.

(2) Later I was at IBM's Watson lab in an AI group. For our general purpose
use, our lab had six IBM mainframes, IIRC named U, V, W, X, Y, Z. As I recall,
they had one processor _core_ each, with a processor clock likely no faster
than 153 MHz.

Okay, in comparison, the processor in my first server in my startup is an AMD
FX-8350 with 8 cores and a standard clock speed of 4.0 GHz.

So, let's take a ratio:

(8 * 4.0 * 10^9) / (6 * 153 * 10^6) ≈ 34.9

so that, first cut, just on processor clock ticks, the one AMD processor is 35
times faster than all the general purpose mainframes at IBM's Watson lab when
I was there.
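
Or, as a quick sanity check in Python with the same numbers:

    # Aggregate clock ticks: one 8-core 4.0 GHz AMD FX-8350 versus
    # six single-core ~153 MHz IBM mainframes.
    amd_ticks = 8 * 4.0e9        # cores * clock (Hz)
    mainframe_ticks = 6 * 153e6  # machines * clock (Hz)
    print(amd_ticks / mainframe_ticks)  # -> ~34.9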

But, still, on IBM's "mainframe monopoly", if what you want is really an IBM
mainframe, e.g., to run old software, then about the only place to get one is
from IBM. So, IBM still has their "mainframe monopoly".

Or to be extreme, an Apple iPhone, no matter how fast it is, does not really
threaten the IBM "mainframe monopoly".

Continuing:

> ... like DEC, Data General, Wang Laboratories, Apollo Computer, and Prime
> Computers. And then, scarcely a decade later, minicomputers were disrupted
> by personal computers from companies like MITS, Apple, Commodore, and Tandy.

Not really: the DEC, DG, ..., Prime computers were _super-mini_ computers and
were not "disrupted" by the PCs of "MITS, Apple, Commodore, and Tandy."

The super-minis did lose out, but later, and to Intel 386, etc., chips running
Windows NT or Linux.

> ... Microsoft the most powerful company in the industry for two decades.

Hmm. So now Microsoft is not so "powerful"? Let's see: Google makes it easy to
get data on market capitalization:

Apple: $1,308.15 B

Microsoft: $1,202.15 B

Alphabet: $960.96 B

Amazon: $945.42 B

Facebook: $607.59 B

Exxon-Mobil: $297.40 B

Intel: $256.35 B

Cisco: $201.47 B

Oracle: $173.73 B

IBM: $118.84 B

GM: $50.22 B

Microsoft is still a very powerful company.

Uh, I'm no expert on Apple, but it appears that Apple's products need a lot
of access to servers, and so far those servers tend to run on processors from
Intel and AMD with operating system software from Microsoft or Linux -- that
is, Apple is just on the _client_ side and not the _server_ side.

It appears, then, that in computing Microsoft is the second most powerful
company and is the most powerful on the server side.

Sure, maybe some low power ARM chips with 3 nm line widths and Linux software
will dominate the server side, but that is in the future?

And personally, I can't do my work with a handheld device; I need a desktop,
and I am using AMD and Microsoft and nothing from Apple. A Macbook might
suffice for my work but seems to cost maybe $10,000 to get the power I plugged
together in a mid-tower case for less than $2000.

Broadly it appears that the OP is too eager to conclude that the older
companies are being disrupted, are shrinking and are fading, are being
replaced, etc.

Maybe the main point is just that in the US hamburgers were really popular and
then along came pizza. So, pizza is popular, but so are hamburgers!

I also go along with the point of zozbot234 at

https://news.ycombinator.com/item?id=21986141

> Software is still eating the world, and there will be plenty to eat for a
> long time.

------
streetcat1
This is wrong on the merits, and I am not sure why it is presented this way.

The difference between a car company and a software company is economy of
scale. I.e., economies of scale dominate the physical world but do not exist
in the software world, since I can replicate software at zero cost.

In addition, new tools and new processes for software have increased
productivity many times over, which means that you need fewer developers for
new software.

I predict two shifts in the tech world:

1) A move to the edge, especially for AI. There is really no need for a
central public cloud, due to latency, privacy, and dedicated hardware chips;
most AI traffic is inference traffic, which should be done on the edge.

2) Kubernetes operators replacing cloud services. The value add of the public
cloud is managing complexity.

~~~
mooted1
If you read more of Ben's writing, he talks extensively about how software
companies dominate market share through network effects and vertical
integration.

You don't hear him talk about economies of scale because marginal costs are
negligible for software companies. Besides, network effects and vertical
integration are sufficiently powerful to control the market.

> In addition, new tools and new processes for software has increased the
> productivity times fold, which means that you need fewer developers for new
> software.

There are other barriers to entry besides the cost of writing software, like
product, sales, operations, and most importantly, network.

~~~
streetcat1
However, the network effect in tech can be leapfrogged due to the zero
marginal cost (as shown in this post). I.e. what network effect do you get
from doing ML inference in the cloud?

The case for big tech today is still the economy of scale and not network
effects (maybe facebook have those, but it exists only if the interface to
facebook does not change).

The big tech players have economy of scale, due to their ability to use
automation and offload the risk of managing complexity (I.e. one AWS engineer
can manager 1000's of machines with AWS software).

No wonder the software that manages the public cloud is still closed source.

However, with Kubernetes operators, there is a way to move those capabilities
into any Kubernetes cluster.
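
To make that concrete, here is a minimal sketch of such an operator using the
kopf framework; the ManagedCache resource and its fields are hypothetical, but
the pattern (watch a custom resource, reconcile it into plain Kubernetes
objects) is the standard one:

    # Minimal operator sketch (kopf framework). Watches a hypothetical
    # "ManagedCache" custom resource and runs a Redis Deployment for it,
    # i.e., a self-hosted stand-in for a managed cloud cache service.
    import kopf
    import kubernetes

    # In-cluster config when deployed; local kube-config when developing.
    try:
        kubernetes.config.load_incluster_config()
    except kubernetes.config.ConfigException:
        kubernetes.config.load_kube_config()

    @kopf.on.create('example.com', 'v1', 'managedcaches')
    def create_fn(spec, name, namespace, **kwargs):
        # Translate the custom resource's spec into a plain Deployment.
        deployment = {
            'apiVersion': 'apps/v1',
            'kind': 'Deployment',
            'metadata': {'name': f'{name}-redis'},
            'spec': {
                'replicas': spec.get('replicas', 1),
                'selector': {'matchLabels': {'app': name}},
                'template': {
                    'metadata': {'labels': {'app': name}},
                    'spec': {'containers': [{'name': 'redis',
                                             'image': 'redis:7'}]},
                },
            },
        }
        kopf.adopt(deployment)  # owner reference: cleanup cascades with the CR
        kubernetes.client.AppsV1Api().create_namespaced_deployment(
            namespace=namespace, body=deployment)
        return {'deployment': f'{name}-redis'}

Run it with "kopf run operator.py" against any cluster where the CRD is
installed; that's the sense in which a managed-service capability moves into
the cluster.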

~~~
mooted1
Did you actually read the post?

> The case for big tech today is still the economy of scale and not network
> effects (maybe facebook have those, but it exists only if the interface to
> facebook does not change).

This is only true if you believe that the greatest cost of developing software
is running hardware. The greatest cost of developing software is developing
software. Not only are economies of scale in compute management negligible
except at massive scale, the cost of compute has declined dramatically as the
companies you've described have made their datacenters available for rent
through the cloud. Yet the tech giants persist.

Facebook, Google, Netflix, Amazon all have considerable network effects that
you're not considering. For each of these companies, having so many customers
provides benefits that accrue without diminishing returns, giving them a firm
hold on market share. See
https://stratechery.com/2015/aggregation-theory/

Ben is saying that the only way to topple the giants is by working around them
and leveraging new computing technologies better than they do. He makes the
(admittedly speculative) case that this is no longer possible because we can't
bring compute any closer to the user than mobile devices.

> However, with Kubernetes operators, there is a way to move those
> capabilities into any Kubernetes cluser.

Kubernetes, at the scale of technologies we're discussing, is a minor
optimization. Introducing k8s costs more than it helps until far into a
company's infra maturity. Even if most companies deployed k8s in a manner that
significantly reduced costs, it's not enough to overcome the massive
advantages existing tech companies have accrued. Not to mention that all of
the big tech companies have internal cluster managers of their own.

~~~
streetcat1
I don't think that the number of current customers is any indication of
network effects or any other kind of moat.

See: Walmart -> Amazon, Nokia -> Apple, MSFT -> Android.

I mean, what more of a network effect could MSFT have had in the 90s? It
dominated both the OS layer AND the app layer (Office). And yet it does not
have ANY share in mobile.

Kubernetes is not a minor optimization if you think about what it is. Yes, it
is if you see it as mere container orchestration. But it is the first time
that a widely deployed, permissionless, open-API platform has existed.

