
Intel and the Danger of Integration - ingve
https://stratechery.com/2018/intel-and-the-danger-of-integration/
======
Symmetry
The thing is, though, that tying design and manufacturing together has been
really good for Intel. Historically Intel's performance in terms of drive
current and so forth at a given process node has been better than its
competitors' and it has reached those nodes more quickly. Intel manufacturing
paid for that lead partially by accepting more restrictive design rules for
transistor layout than a merchant silicon shop could ever get away with
because they knew they only had one customer and one design they had to make
work. With Intel's numerous missteps at 10nm that lead is over and the
manufacturing debacle is taking down the design too, but there's no reason to
think Intel would be starting from such a dominant place if they had split up.
If I were to go back to 2013 I'd tell them "10nm is too early for cobalt
interconnects!" rather than "Split up your design and manufacturing".

And really you could also look at the problem as the design team being too
wedded to x86 and unwilling to consider other ISAs just as much as the
manufacturing problem. Intel could have done an ARM processor for the iPhone
but they wanted to make a go at an x86 mobile chip instead. And while the x86
tax is small potatoes when you're paying for a deep OoO pipeline it's pretty
noticeable in the world of in order processors or graphics processing units.

~~~
macintux
That's exactly what Ben concludes with.

> To demand that Intel apologize for its integrated model is satisfying in
> 2018, but all too dismissive of the 35 years of success and profits that
> preceded it.

Many business failure stories are like that. "It worked great until it
suddenly didn't."

~~~
pm90
Right. It's really hard to predict these kinds of shifts in processes, or to
see how processes you've relied on for a long time have become obsolete. When
that happens, big businesses have an especially hard time because they're
usually not agile enough to reorient themselves and adopt new processes
quickly enough.

A peer commenter mentioned IBM. As an ex-IBMer, I certainly think that is
true. The team I worked on adopted agile methodologies (which I think have
been largely adopted in the emerging tech teams). But you could still see
remnants of old processes and incentives, like a weird emphasis on writing
_patents_. It was kind of shocking to me as an SE to see patent writing as a
part of my job, but I realized it was a holdover from earlier times.
Ultimately, while I enjoyed my time there, broken processes for recognizing
contributions were among the most important reasons I jumped ship.

~~~
nostrademons
It's funny, though, that it's often very easy for _an outsider_ to predict
these kinds of shifts in processes, because they have no vested interest in
the status quo and are evaluating everything with fresh eyes.

As a kid growing up with computers in the 90's, I knew IBM was doomed. Why?
Because buying an IBM-brand PC when a clone cost 1/2 the price and did exactly
the same stuff was ridiculous, and Microsoft owned the software you actually
used (and had to make sure all your apps were compatible with). Older folks
would tell me "Well, IBM makes most of its money selling to enterprises, and
they can charge millions of dollars for a computer" and I'd be like "Why would
you do that when you can buy a PC for a couple thousand and hire a high-school
kid to program it for you?" and they'd mutter something about transaction
processing and customer support and how nobody got fired for buying IBM, and
I'd be like "That's stupid. Their customers are all going to go bankrupt, and
then IBM will too." I was a pretty insufferable teenager, but I was right
about a lot of that.

I'm 37 now, I've lived through four generations of technology (PC, web,
mobile, and cloud) and retrained for each, and I find I need to make a
conscious effort to try and see everything through college-student eyes. If I
were a new grad just coming out of school, how would I evaluate the world? And
I try to keep that in mind whenever I read e.g. a thread on crypto here. The
total volume of all data on the Ethereum blockchain is roughly what I would
process in 10 minutes with a small MapReduce when I was working at Google. But
it's very likely a college student today would be like "So? What's the point,
when nobody trusts the answers you give them because you're just a giant
corporation that keeps all your data secret?"

~~~
gascan
I used to ask my father why he bothered using a desktop email client. It
seemed like useless overhead, and webmail was the future. Well, webmail is
indeed everywhere - but I now use a desktop email client daily.

Outsiders don't tend to talk about the predictions they got wrong.

~~~
nostrademons
Heh, that's very true, but I think that the predictions I've gotten wrong
actually tend to err on the side of being too conservative about the future
rather than too radical, or of betting on the wrong horse between radical
alternatives. I was skeptical of the WWW at first ("Gopher is better
organized"), continued to use Eudora and then Outlook right up until GMail
came out ("you have to be online to read your e-mail?"), and still don't
really get what's so great about Facebook (LiveJournal was better in just
about every way, and basically every feature that Google+ forced Facebook into
implementing was actually cribbed from LJ a decade before).

I think there's a cognitive bias somewhere about people preferring the mental
model they already hold over some unknown future. Realistically, pretty much
the only thing you _can_ be certain of is that the future will not be the same
as today, it's just that picking between many different futures gives you
terrible odds of selecting the right one. It seems like your options are
assuming that the status quo will continue, which is guaranteed wrong, vs.
choosing between multiple competing alternate futures, where you are most
likely wrong.

~~~
perl4ever
If you predict the weather tomorrow will be the same as today, you will be
right most of the time. It's easy to do worse with a sophisticated model.
That's probably a significant reason why evolution leads to humans and other
organisms generally not using too much logical analysis, but simply doing what
works until it results in a major disaster.
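
The persistence point is easy to check with a toy simulation (illustrative only; the ~10% daily change rate is made up, not real weather data):

```python
import random

def persistence_accuracy(sequence):
    """Fraction of days where 'tomorrow will be like today' is correct."""
    hits = sum(today == tomorrow
               for today, tomorrow in zip(sequence, sequence[1:]))
    return hits / (len(sequence) - 1)

# Toy weather with long streaks: the state flips on roughly 10% of days.
random.seed(0)
weather, state = [], "sunny"
for _ in range(10_000):
    weather.append(state)
    if random.random() < 0.1:
        state = "rainy" if state == "sunny" else "sunny"

print(f"persistence accuracy: {persistence_accuracy(weather):.0%}")
```

With streaky data the naive rule lands near 90% accuracy, which is exactly why a "sophisticated" model has to clear a surprisingly high bar before it beats doing nothing clever at all.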

------
sumanthvepa
I understand the point that Ben Thompson is making: that a vertically
integrated Intel lost out in the mobile revolution because it wouldn't
unbundle its manufacturing from its processor design. But its current crisis
(not referring to the CEO's affair) has nothing to do with vertical
integration. It has everything to do with Moore's law scaling coming to an
end. This gives fabs like TSMC, and increasingly SMIC from China, the ability
to catch up. Unless there is a breakthrough in technology -- molecular
transistors or some such -- Intel's ability to dominate will end soon.

~~~
rwmj
From the quoted tweet in the article: _" I said that ICL should be taken to
14nm++, everybody looked at me like I was the craziest guy on the block, it
was just in case"._

Imagine if Intel were two companies, Intel Design and Intel Manufacturing.
This wouldn't have arisen - Intel Design would have produced the new ICL
design and could have worked with Intel Manufacturing and TSMC on how to build
it, ultimately using TSMC if IM couldn't make it.

~~~
sumanthvepa
I suspect you may have hit on Intel's eventual fate. The company may indeed
split up into a design unit and a manufacturing unit. That might unlock
significant shareholder value.

~~~
PedroBatista
It would take many years after the spinoff before significant players were
comfortable enough to manufacture their designs with "Intel Foundry", since
they know damn well there would be several hot lines between Intel Foundry
and Intel Design.

~~~
quanticle
Was that true of AMD/Global Foundries, when they split?

------
PaulHoule
Good article, but it misses plenty.

One is marketing that damages its own brand, particularly the "Atom"
phenomenon. Intel had such an obsession with phones that it just had to make
x86 processors that were too weak for phones, never mind anything else, full
of bugs, etc.

It might be a classic case of "boiled frog" because few have realized that
ever since Intel switched to PCIe, Intel chips have been starved for PCIe
lanes. There has been no point to SLI in a consumer configuration because
there just is not enough I/O bandwidth to support two graphics cards. In fact,
Intel hopes you will settle on "integrated graphics" (the same as on a phone)
so you won't have a reason to buy a computer instead of a phone.
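
A back-of-envelope sketch of the lane-starvation claim (assumed figures: ~16 CPU-attached PCIe 3.0 lanes on a mainstream part of that era, 8 GT/s per lane with 128b/130b encoding; all numbers approximate):

```python
# Rough PCIe 3.0 bandwidth arithmetic (approximate, illustrative figures).
RAW_GT_PER_S = 8.0            # PCIe 3.0 per-lane raw rate, gigatransfers/s
ENCODING = 128 / 130          # 128b/130b line-coding efficiency
GB_PER_LANE = RAW_GT_PER_S * ENCODING / 8   # ~0.98 GB/s usable per lane

CPU_LANES = 16                # typical mainstream (non-HEDT) CPU lane budget

for gpus in (1, 2):
    lanes = CPU_LANES // gpus
    print(f"{gpus} GPU(s): x{lanes} link, ~{lanes * GB_PER_LANE:.1f} GB/s each")
```

Two cards force an x8/x8 split, halving each card's link to roughly 8 GB/s, and that is before the chipset's shared uplink to the CPU is even considered.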

Optane is a story like 10 nm. Just add tone-deaf marketing that makes a
potentially revolutionary product seem like nothing.

Probably the thing that has hurt the PC industry the most is the slow
transition to SSDs. Given a choice between a brand-new computer with an i7
chip and an HDD, which will show you a spinning cursor most of the time, and
an eight-year-old computer with a new SSD that boots before you get old, the
choice is obvious.

If Intel had said something like "an i5 chip has to come with an SSD" or if
Microsoft had required SSDs for the Windows 8 launch people would be like "Wow
this is much better than what I had before." As it is, Intel has mandated a
Meh experience for a long time and the tech press has let them get away with
it. (It's murdered the tech press too since now all Tom's Hardware talks about
is RGB lighted fans and the really cool monitors that might get released
someday...)

~~~
grigjd3
It's pretty clear the PC market went into decline as the smartphone market
ate up its sales, and I find it highly doubtful a faster hard drive was going
to overcome the convenience factor of the smartphone.

~~~
PaulHoule
The smartphone has an SSD in it, and that is why the smartphone has many of
its convenience factors.

For instance, a netbook with an SSD can ride in your backpack and play music
to Bluetooth headphones, just like a phone. If either of those had an HDD you
would have trouble with the music skipping from the vibrations of your
footsteps. You'd probably trash the HDD before long.

The fast responsiveness you expect from a phone also comes from the use of an
SSD. Computers with HDDs frequently "go out to lunch", and that's a
convenience negative for a computer (if it doesn't have an SSD).

------
jl2718
The problem with Intel, as I see it, is an inability to get new designs to
production, and a low tolerance for market growth versus market sustainment.
They’ve been a disaster at acquisitions, being totally fine with wasting
billions on failed outside investments without any program for internal
ventures. The latest focus on “Intel is now a data company” is just complete
denial of the reality.

But the margins are awesome, and I’d reject any notion that they should erode
that with fab services. They need more of the high-volume, high-margin Intel-
designed chips that have been their hallmark. Everything else is a
distraction.

Some of that depends on identifying new forms of computing. Therefore, it
makes sense to do technology experiments, but lately Intel has focused on
market experiments, where they are not really developing new technology, but
building internal organizations with sales, marketing, and product management
that dwarf engineering.

This is a problem not just because it diverts resources, but because nobody
really knows what Intel is anymore. Engineers don’t feel like they are working
in an engineering company anymore.

Intel’s margins are great, and despite the negative prognostication, their
market position is still dominant. Even the engineering talent at Intel is
still some of the best, although the management is somewhat deliberately not
their best representation. They just need to focus on being an engineering
company again, ignore sales, ignore acquisitions, and invest in their own
people to do the experiments that will lead to the next ubiquitous high-margin
chip.

------
throwaway2048
I really think this article fails to make its point about what exactly Intel
stands to gain from, say, manufacturing processors for Apple. It would be
trading its major competitive advantage, advanced processes (despite the
propaganda, Intel is far from out of the game process-wise) coupled tightly
with its processor designs that ensure margins that would make most companies
blush, for low-margin, hypercompetitive contract work.

Intel is under absolutely no real threat of being unseated as the company
that powers virtually every server in existence; throwing that away to chase
contract manufacturing work doesn't make any sense, even if some other
companies do it well. The argument it makes would be more like Intel going
back into DRAM production and abandoning x86.

~~~
sho
> throwing that away to chase contract manufacturing work doesn't make any sense

It wouldn't be throwing anything away. Intel Fabs could still make Intel
Design's Xeons same as always. It could just make everyone else's stuff as
well as capacity and economics permitted. Why would you leave that on the
table?

The best example is perhaps Samsung. Not only do Samsung's fabs make all the
chips for Samsung phones, they make them for others as well - until recently
including even their deadly competitor Apple! If you've got fabs, why
wouldn't you do that? Even if Ryzen totally destroys Intel on the desktop,
it's heads I win, tails you lose.

~~~
throwaway2048
Because it gives their competitors a pretty substantial leg up.

~~~
JumpCrisscross
Intel’s advantage was built on scale. They could build more which meant they
could invest more which meant they could build better and build more. By
leaving scaling opportunity on the table, TSMC can now build more which means
they’re investing more and oh no they just passed Intel.

------
AstralStorm
It is not like Intel has not repeatedly tried.

Atom is their attempt. As are more advanced MCP51 micros.

Intel's problem is that their CPUs were performance-focused and as such didn't
scale down as well - plus vendors didn't have to care, as OSes were portable
enough and software had to be rewritten anyway, so going Intel brought no
compatibility advantage. This is the same reason Microsoft has trouble
penetrating the mobile market.

~~~
tsenkov
> Intel's problem is that their CPUs were performance focused and as such
> didn't scale down as well

Don't know the first thing about electronics, but I am interested to learn
more about what you meant here. Highly-optimized code (usually) isn't too
readable or extendable, are you saying Intel has the same problem with their
processor's design?

~~~
tlb
Fast processors work by “speculating” — calculating several possible next
steps in advance before knowing for sure what the next step will be. Modern
Intel chips do this several steps ahead, at multiple levels. This uses more
power, because some of that work is ultimately wasted.

Large amounts of speculation make sense for data center CPUs, but not for
battery-powered devices.
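
A toy model makes the wasted-work argument concrete (purely illustrative: the always-taken predictor, branch streams, and work units are all made up, and real cores are vastly more complex):

```python
def speculative_work(branch_outcomes, predict, work_per_branch=10):
    """Tally useful vs. discarded work units for a stream of branch outcomes.

    The core speculatively executes `work_per_branch` units of work down the
    predicted path; on a misprediction that work is thrown away.
    """
    useful = wasted = 0
    for taken in branch_outcomes:
        if predict(taken) == taken:
            useful += work_per_branch
        else:
            wasted += work_per_branch
    return useful, wasted

always_taken = lambda _actual: True     # a trivial static predictor

easy = [True] * 9 + [False]   # loop-like branch: taken 9 times out of 10
hard = [True, False] * 5      # data-dependent branch that alternates

for name, stream in (("easy", easy), ("hard", hard)):
    useful, wasted = speculative_work(stream, always_taken)
    print(f"{name}: {wasted / (useful + wasted):.0%} of speculative work wasted")
```

Every unit in the wasted column is energy spent for nothing, which is tolerable in a data center and painful on a battery; that's one reason mobile cores historically speculated less aggressively.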

~~~
std_throwaway
I'd say this is a myth unless you have actual numbers.

------
usefulcat
On one hand, this article seems to make a lot of sense. OTOH, INTC is up 40%
in the past year. It seems likely that there are quite a few parties out there
who probably don't agree with this analysis.

~~~
xenadu02
Yes, and Microsoft's profits were rising at the same time the web, iPhone,
Chrome, et al. were eating away at its foundation. Bell Labs invented VoIP;
AT&T or Verizon could have built Skype at any time. Xerox and PARC. RCA and
LCDs. This sad story repeats itself continuously.

That's the whole problem with the innovator's dilemma and why so many people
look up to Apple - one of the few companies able to cannibalize an existing
product line in pursuit of the future. Mac & (Lisa/Apple II), iPhone & iPod.

It is extremely rare to be able to see the future coming, time it correctly,
then undercut your current extremely successful product line while you prepare
for the change. Everything in business school 101 screams "NO!". Managers are
paid to optimize returns and the current business.

Ben is 100% correct about Intel's history. They clung to their identity as a
DRAM company and it almost killed them. This time they're clinging to their
identity as an integrated processor company when most of the world wants fully
or slightly customized SOCs. Intel directly passed on the most lucrative
mobile SOC business (iPhone). It is unfortunate that Intel missed this second
opportunity. Maybe new management can pivot, but whoever it is will only have
about 90 days to make the change.*

* You typically have about 90 days where you're still an "outsider" who can proffer opinions on problems with current practices. For management this is also the best time to make sweeping changes since most people will be willing to jump on the new direction. IMHO after that you have to spend a few years building credibility to get things done without people ignoring or sabotaging you.

~~~
ajross
> Yes, and Microsoft's profits were rising at the same time the web, iPhone,
> Chrome, et al. were eating away at its foundation.

So... the scenario you have in mind for Intel's coming collapse is... a not-
quite-as-dominant major industry player in the coming decades? Microsoft got
passed by Google and Facebook and Amazon, sure. They're still making, y'know,
crazy bank.

------
baybal2
Intel's biggest threat is the decline of WinTel, not its inability to make
stuff for consumer products.

Intel has legions of top-tier IC designers; they can make Apple A10 or
Qualcomm 845 class chips as weekend projects. What they don't have is the
determination to abandon the proven WinTel business model, which is still an
extreme cash cow.

~~~
AgentOrange1234
Weekend project? I thought the Apple chips were getting pretty competitive
with Intel chips despite the thermal/power constraints of a phone. I thought
AMD was really catching up with Ryzen and Threadripper. I thought Intel was
slowing from tick-tock to tick-tock-tweak. I thought Intel had a couple of
large layoffs in recent years which axed a lot of their older^w
underperforming workers. Are they truly still such a powerhouse?

~~~
baybal2
>Are they truly still such a powerhouse?

Even after the layoffs, they still have a humongous IC designer headcount.
Possibly still the largest in the industry.

------
wyldfire
Intel embracing RISC-V would be a super-bold move. Given the support we can
already see from the long list of RISC-V foundation members, it's bound to
succeed. In the next couple of years it will probably wrestle a few royalties
away from ARM.

Beyond that, I think it will end up a legit heavy-duty apps processor for
Android phones. It might start out on the mid/low-end phones, but
manufacturers will absolutely love the idea of cutting out ARM from their BOM.

~~~
phkahler
It would be very interesting to see an Intel- or AMD-designed RISC-V chip.
I'm curious how well the ISA could do given similar resources brought to it.

AMD should be in a better position to try because they were already planning
to put ARM cores in their SoCs a few years ago. While that effort was dropped,
I assume they still have a bunch of people around that know what the issues
are and are ready to tackle them. Problem is ARM doesn't really have _enough_
of those people to divert to alternatives.

~~~
wyldfire
AMD or Intel could beat others to the punch and make a mobile SoC with RISC-V
applications cores. They'd be the first to face the pain of porting Dalvik and
NDK apps, but in the end they'd be way out ahead of the competition.

~~~
phkahler
I've just been assuming Google would bring Dalvik to RISC-V. They're already
porting Go and some other things. If they were to release Android for RISC-V
it could be devastating to ARM. Not that I think Google cares about ARM;
they've been enjoying competition among SoC designers due to ARM's reasonable
licensing (a relative term, of course).

~~~
wyldfire
I meant the more abstract pain of introducing the new ISA and probably taking
blame for problems with functionality and performance. Yes, Google would
probably be the ones executing this task.

------
ksec
I don't think, right now, that integration is what's wrong with Intel. It is
how they have miscalculated everything, every time, for the past several
years, even before BK. Planning/vision and execution - I am not sure there is
anything other than these two attributes that matters to a company, and Intel
lacked both.

Intel used to win because they had an effective monopoly in the PC market.
The high-price, high-margin and high-volume CPU drove investment in tech and
their fabs. That was possibly the only product I know of, other than the
iPhone, with those three factors together. When the smartphone revolution
started, everything changed. The scale changed: TSMC now produces possibly
close to 5x the volume that Intel does at Intel fabs, and their leading 7nm
node will likely reach a quantity similar to Intel's 14nm within 12 months.

TSMC now has 2x+ the total addressable market on the leading node compared to
Intel. So while the investment cost of each new leading node continues to
increase, TSMC also has an increasing market size to help spread that cost.
If you look at Intel's mainstream CPU die sizes, the trend has been downwards
over recent years, i.e. they are optimising for cost.

Intel could also have expanded their scale. They could have kept their
integration and made more chips, expanding their fab advantage in both scale
and technology. They did, but it was either too late or took far too long.
Their GPU is only looking to launch in 2020; as a first-gen product I doubt
it will make much of an impact in volume. Imagine if they had had a dGPU
ready 4 years ago, when the crypto wave hit: they could have ridden the wave
and made money from crypto, or offered a dGPU at market prices cheaper than
Nvidia's or AMD's bumped-up prices. Their 14nm process would have been miles
ahead of the 28nm or 16nm those GPUs were on.

It took them 8 years - 8 years! - from acquiring the Infineon modem business
to actually producing a modem in their own fabs. Next time some M&A pitch
talks about synergy, one should ask how long it will take. Now that they have
the contract for the iPhone modem and could produce it themselves instead of
at TSMC, they face a new problem that no one has asked about so far: how is
Intel going to find the additional capacity to produce an extra 100M modems
for Apple over the next year? My guess is that Intel's original plan was for
the much-delayed 10nm to finally be fixed in 2018, leaving some room for the
modem. The modem is much smaller, so it is more like 40M units relative to
Intel's median chip size. But that is still roughly 20% additional units
Intel has to ship on their leading node.

And if you have read the rumours about Intel delaying, or renaming, the
chipsets originally moving from 22nm to 14nm, that is a possible reason why.
Maybe they shouldn't have deferred Fab 42's construction in the first place.

I am pretty sure BK's departure has more to do with his performance than
whatever they wrote in that PR, and it may well have something to do with
Apple. I wouldn't be surprised if Apple has been unhappy about Intel's lying
and execution. Remember, when Apple started to design their "thinner" MacBook
and MacBook Pro, it was obvious they had Intel's 10nm and 7nm CPUs in mind.
The first was supposed to happen last year and the latter was scheduled for
next year. This two-year delay, which Apple may or may not have known about,
put Apple in an awkward position. Intel's custom foundry was supposed to
attract Apple to build an integrated SoC with an Intel modem inside, which is
now even less likely to happen when Intel has not been open about its
problems. When Apple contracted Morris Chang at TSMC, TSMC managed to bring
up a new fab for Apple in less than 6 months. It was a big bet on both sides,
billions of dollars invested. Could you imagine Intel doing that? These
relationships take time to build, and TSMC right now has literally zero
faults.

I do agree Intel now faces another turning point. Last time they had Andy
Grove; now I am not sure whom they can hire to get this behemoth running. The
task is enormous, and many of Intel's best engineers and executives have left
the company. The only one who may have a slim chance of completely
transforming Intel is Pat Gelsinger: if Andy Grove saved Intel last time, it
will be his apprentice who saves Intel again. Unfortunately, given what Intel
did to Pat during his last tenure, I am not sure he is willing to take the
job, especially with Bryant as the board's chairman - I'm not sure how well
they would get along. But we know Pat still loves Intel, and I know a lot of
us miss Pat.

~~~
baybal2
>I am pretty sure BK's departure has more to do with his performance than
whatever they wrote in that PR, and it may well have something to do with
Apple.

Yes

>When Apple contracted Morris Chang at TSMC, TSMC managed to bring up a new
fab for Apple in less than 6 months. It was a big bet on both sides, billions
of dollars invested. Could you imagine Intel doing that? These relationships
take time to build, and TSMC right now has literally zero faults.

Apple contacted both TSMC and Intel for their own modem. They came with "We
are ready to put $n billion in cash bonds on the table today." And Intel
refused.

------
mozumder
Well, Microsoft is about to be fully integrated if they make their own CPUs
for their PCs like Apple is about to do (and sorta like how they designed
their own Xbox processors)... Let's see how that plays out for them and
whether this anti-vertical-integration theory holds.

------
reality_czech
Does this mean we can have more than 16GB in our laptops now?

Also, is "stratechery" named based on the old Celebrity Jeopardy skits on SNL?

------
HillaryBriss
I love the article's general quotation about disruption:

 _... what makes disruption so devastating is the fact that, absent a crisis,
it is almost impossible to avoid. Managers are paid to leverage their
advantages, not destroy them ..._

As other firms innovate, those advantages become obsolete practices, so
leveraging them means taking on more risk, not less.

------
jpeg_hero
Does anybody know a good technical history of Intel?

I read “Inside Intel” and there was some good stuff, but it seemed too
superficial.

------
blattimwind
I think the most common issue would be neglecting constant values.

------
dvfjsdhgfv
> it was already clear that arguably the most important company in Silicon
> Valley’s history was in trouble: PCs, long Intel’s chief money-maker, were
> in decline, leaving the company ever more reliant on the sale of high-end
> chips to data centers;

Where PCs were replaced, they were succeeded by laptops, and for the most part
they use Intel processors.

~~~
huebnerob
The term “PC” refers to both desktops and laptops. Both are in decline as
general purpose computing moves to the smartphone.

~~~
gsich
I wouldn't overestimate the impact of smartphones. It's just that PCs (both
desktops and laptops) no longer need upgrades from year to year. You can use
your PC from 2012 (an Intel i5-2xxx, if you remember), possibly even older,
and you'd still be fine for almost all tasks. Gaming and other high-
performance tasks are a niche. The biggest performance gain for these
machines is an SSD, not a new PC.

The same will happen to smartphones. You can already see it: manufacturers
trying so hard to invent features nobody needs, just to give you an excuse to
buy a new device.

~~~
dvfjsdhgfv
That's the point. The so-called "post-PC era" has never arrived. Personally I
don't know any adult who doesn't own a computer. We no longer upgrade so often
but we use them on a daily basis. Pretty much anyone doing any kind of white-
collar work will use a PC (that includes Macs...) Your needs outside of the
work environment will vary, but people are still not throwing their laptops
away just because they use their phones more often.

~~~
oblio
The question is: do new adults (think 18-24 year olds) buy desktops or
laptops?

~~~
apetresc
At least one, yes? There's no university student without one, and there's no
white-collar worker without one.

Literally nobody is typing up their 10-page essay on their phone.

~~~
oblio
Tablets + keyboards.

~~~
michaelmrose
Tablets remain truly mediocre devices, in the case of Android running a
mediocre OS.

\- garbage multitasking

\- slightly laggy typing

\- no real ability for multiple apps to work on the same data

\- 10" is still a tiny screen and almost as unwieldy as a small laptop; 7" is
too tiny yet still not pocket-sized

Since a tablet usually does not have a cellular modem, it can't replace your
phone; plus the cameras and microphone normally blow.

I find the idea that most people are using a tablet in place of a real
computer laughable.

------
dkrich
I’m not so sure that not choosing to produce ARM based chips for the iPhone
was such an obvious miss for Intel. Why would they want their business subject
to the whims of a single company over which they exert zero control?

Just as an example: a couple of months ago Apple, seemingly out of the blue,
announced they would no longer be using Intel chips in future laptop models,
and the markets barely noticed. Had Intel been more reliant on PC sales, that
would not have been the case.

Perhaps they are way behind AMD with respect to the latest chip designs, but
even that is far from proven.

I think this article is attempting to make a company with a very complex
history and product lineup sound simple.

~~~
hmottestad
\- 260,509,000 products with ARM (iPhone + iPad)

\- 19,251,000 products with Intel (Macs)

These are the numbers for 2017 from
[http://investor.apple.com/secfiling.cfm?filingid=320193-17-7...](http://investor.apple.com/secfiling.cfm?filingid=320193-17-70&cik=320193)

Apple sells twice as many iPads as they sell Macs, and more than 10x as many
iPhones as Macs.

Revenue-wise it's a little bit closer:

\- $160,541,000,000 for iPhones + iPads

\- $25,850,000,000 for Macs

~~~
endorphone
Intel makes tens to hundreds of dollars for each x86 processor they sell.
They would make pennies to dollars manufacturing chips for other companies.
If Intel made their own mobile chips they would similarly have to
dramatically reduce their enormous margins.

In any discussion about Intel this is the point that really reigns supreme --
Intel was in the enviable state where their primary concern was always putting
their own profits at risk. In the long term of course they should move to
new, growing markets (e.g. mobile), but from a nearer-term perspective they
knew that such products, if not seriously crippled, would start chipping into
their higher-end processors.

Hence why their mobile processor offerings were built on laughably dated
processes, using very obsolete designs. If Intel made them better they knew
that someone would figure out a way to drop 32 of them on a motherboard with a
shared memory controller, etc.

~~~
oblio
Well, according to the Innovator's Dilemma, if you're Intel now you throw 1-2
of those billions of dollars of yours into a totally separate startup that you
fully own.

That startup is free to make ARM chips or whatever the market wants. It's
better long term if you cannibalize yourself than if other companies do it.

~~~
perl4ever
If you do that, as a public company, don't investors come along and say "hey,
you really need to spin that off"? I mean, I see there are quite a few
companies that just seem to have a grab-bag of random stuff they make, as
though they were a mutual fund of different tangentially related businesses.
And it certainly occurs that people decide they are more valuable broken up.

From the larger perspective of society in general, why should a company be
eternal; why does it need to always have a way to pivot?

