Wow, and ouch. I give them credit for ceding the ground for now, but this is another sign of how much ARM has been encroaching on Intel's space.
Scott McNealy said early on that retreating to the Data Center was where companies went to die. At the time Sun was selling workstations on desks and crushing DEC; then, when that business got eaten away by Windows, the data center was where Sun was going to make its stand. And then it died.
I think a lot about what the fundamental principle is here. How do seemingly invincible forces in the market get killed by smaller foes? Christensen's Innovator's Dilemma doesn't really explain it; it describes it, but it doesn't tease out the first principles behind it.
At this point I think it is a sort of Enterprise "blindness", which was something Steve Kleiman at NetApp shared with me. A company can be so good at something that they focus all their energy on it, even when it is vanishing beneath them. Consider a fictional buggy whip company when automobiles came on the scene: right up until the last day they made buggy whips, they could be the best whip you could buy. All the secrets to making a great buggy whip were mastered by this company, all the "special sauce" that made them last longer and work in a wide range of conditions, and yet the reality was that the entire reason for buggy whips existing was evaporating with the loss of carriages. By focusing on what the company was the undisputed leader in doing, they rode the wave right into the rocks.
When the company is so stuck on what used to work for them, even after the technology has moved on, they become blind to the dangers. Challenging to watch, even harder if you feel like you can see the train wreck coming. And of course soul crushing if nobody driving the bus will listen to the warnings.
The next sign I'm waiting for is Apple to ship a MacBook laptop with their own 64-bit ARM processor in it. Then it gets really, really interesting if AMD can pull off an ARM server chip, going where Intel won't.
> Scott McNealy said early on that retreating to the Data Center was where companies went to die. At the time Sun was selling workstations on desks and crushing DEC....
Maybe. That was before cloud computing, based on a much cheaper, faster and more pervasive Internet, became such a big thing. Plus there were at least 3 things that caused them to ultimately fail even in the data center they "retreated" to:
A problem with insufficient error checking and correction, plus SRAM chips from IBM that Sun claimed had higher error rates than they anticipated, was handled dreadfully: they wouldn't help customers without forcing them to sign NDAs, they initially blamed customer environmental conditions, etc. In short, they showed they couldn't be trusted, which ought to be fatal in the enterprise market.
They consciously decreased the quality of their Intel servers. Two things I remember were penny-pinching by removing a tried and trusted Intel Ethernet chip and substituting a Marvell one, and repeatedly changing the lights-out management hardware without changing model numbers, which is what Joyent blamed for their stopping buying from them.
If you couldn't charge it to a credit card, or weren't buying enterprise quantities of stuff (more than a couple million dollars), you simply couldn't buy their stuff back when it was higher quality; they insisted you go through a VAR or reseller, but few of those were really interested in selling the hardware, and many of the few that might have been were squirrelly. So before AWS et al. became a thing, a lot of companies reluctantly bought Dell hardware simply because they could, and by the time they graduated to buying hardware at "enterprise" levels they already had the in-house expertise to manage cheaper but lower quality Dell systems.
And the latter problem, making it hard to buy their stuff (which damages or kills more companies than you'd think; it's also been a big problem for HP and one reason they didn't get this business), is one of the things that killed DEC.
So comparatively simple failures to execute can be a big part of a company failing at the same time it's facing the Innovator's Dilemma.
> it's also been a big problem for HP and one reason they didn't get this business
This is a little OT already, but I have a friend who does IT purchasing, and the horror stories she tells me about how difficult it is to buy HP products scare me. On top of that, their customer service is horrendous, and from time to time the local HP office even tries to "cheat" its direct clients (among other things by "helping" other, preferred clients to get lower prices and win tenders). I'm still wondering how they're still in business.
> I'm still wondering how they're still in business.
A "least worst" enterprise vendor?
Why is the dollar still strong? The theory I like is that all competing currencies are worse. And while I don't know the competing enterprise market as well, I don't get the impression that today HP has seriously better competitors in the full-service classic enterprise space (this of course ignores paradigm shifts like the cloud, where I'm sure AWS and company are eating the lunch of various enterprise categories, storage for one).
Who's noticeably better than them? Sun and DEC weren't, IBM sure doesn't sound like they have been any time recently (and for services they've joined the race to the bottom), so who else is there? I don't know, it's not a field I follow anymore.
> ... this of course ignores paradigm shifts like the cloud, where I'm sure AWS and company are eating the lunch of various enterprise categories, storage for one
HN has its own distortion bubble here. It takes a long time (decades) for new technology / best practices to penetrate into large organizations in stable, established markets.
But I think two things are changing that. One: upstream vendor migration to cloud / more modern solutions (e.g. Salesforce). Which decreases in-house footprint. Two: younger talent that's more familiar with modern ways of doing things (working with AWS, cattle services on commodity boxes, etc).
I think one of the greatest failures of enterprise companies is that they've historically done a terrible job at making their hardware / software available to students. Which means no reasonably-priced supply of knowledgeable potential employees.
And if the TCO of a mainframe / COBOL solution includes scouring the globe for the one remaining person who knows how to work on it and then paying them a premium to do so...? Well, that'll get enterprise (still slowly) moving.
Sun was killed by the confluence of a poorly timed loan and the GFC. Their revenue/cash reserves were insufficient to cover payments on loans taken at the peak before the crash. When GFC happened, their primary Wall Street customers suddenly turned cost conscious. I think that if they had negotiated slightly better terms on the loan they'd still be around today.
That sounds like a final straw; if they'd been well managed and healthy, would that have necessarily been a killing blow? (I don't know those details of their fall, just the decline that preceded it and made it inevitable.)
> How do seemingly invincible forces in the market get killed by smaller foes?
Back around 2009 I had a little visibility into Intel's mobile efforts. My information is way out of date, but unless Intel management changed drastically, my hunch is that this explanation is still valid.
At the time Intel had already introduced some mobile silicon, but there was little uptake. So they were iterating; they wanted to improve with each succeeding generation. But they had a kind of design-by-committee process. One person or group wanted a certain feature, another group wanted something else, a third group thought that yet another thing was important. And so on. Sorry if that sounds vague, I won't write anything more specific.
The end result was a chipset that had a lot of features. A LOT OF FEATURES. Gold plated features. But that meant higher power consumption than the competition, higher cost, larger form factor, longer time to market.
At the time, there was nobody like a Steve Jobs running Intel mobile. Nobody had the intuition and the gravitas to say: "we want A, B, C, but not D, E, F. Quit wasting time and build the fucking thing! I need something that's competitive!"
This is similar to something that an Intel chip design manager told me about 30 years ago. His view was that there were really only two ways to motivate engineers:
1) show them a path to lots and lots of money. This is the path that startups take. Focus, build something quickly, get rich or die trying.
or 2) engineers will "play in the sand": they will add in all sorts of neat features that they think are really cool, that they want to implement, that they are convinced are important. But all that crap isn't very useful to the average user; it just results in complex designs that miss their market window in so many ways.
Intel probably continued to choose path 2 for their mobile efforts.
Intel didn't fail because of any variation on The Innovator's Dilemma. They understood that mobile was the future, that it was very important. They expended tremendous amounts of capital trying to "win" in mobile. They just didn't have the organizational structure that let them execute.
The thing is, it's not gonna help to have a "Steve Jobs", nor will any degree of understanding the competition help, if you're used to shipping an expensive high-margin product and along comes a low-margin competitor. Neither ARM nor TSMC attempts to make the kind of profit that Intel is used to making, and this is the real problem for Intel: even if it wins 75% of the mobile market by shipping adequate chips at adequate prices, the profits won't be anywhere near what it gets in the PC market. (Of course this is just as true of Apple as it is of Intel, and I think their latest quarterly results testify to it just as much as Intel's do. Actually, the surprising thing about Intel (which is not true of Apple, Steve Jobs and all) is the degree of its dominance in the market for anything resembling a PC - a computer where you can get work done. One might expect both AMD and non-x86-based workstation products to do better; certainly Android is a bigger threat to the iPhone/iPad than any competitor ever was to Intel in the PC space.)
Something about margins I don't get: Intel's gross margin is about 80%. In the fabless ecosystem, the fab usually gets 50%, and Qualcomm is usually above 50%, sometimes even 65%.
So the total combined margin is as high as Intel's, or more. So why does Intel fail?
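Just to make the stacking concrete, here's a rough back-of-the-envelope sketch with made-up numbers (not anyone's real financials), assuming the fab and the fabless vendor each take roughly 50% gross margin:

    # Hypothetical numbers, purely illustrative -- not real financials.
    fab_cost = 10.0                              # fab's manufacturing cost per chip (assumed)
    fab_price = fab_cost / (1 - 0.50)            # fab sells at 50% gross margin -> 20.0
    chip_price = fab_price / (1 - 0.50)          # fabless vendor resells at 50% margin -> 40.0

    combined_margin = 1 - fab_cost / chip_price  # margin of the whole chain over silicon cost
    print(f"combined gross margin: {combined_margin:.0%}")  # -> 75%

So on paper the chain's combined margin can indeed rival Intel's, though this ignores the fabless vendor's own design, packaging and support costs, which eat into that figure.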
All of these companies sell many different products and I'd guess that their reported overall gross margins will be different from the margins of any given product. For instance, I rather doubt that either TSMC or Samsung get 50% margins when making Apple's chips.
In Apple's case it makes sense to lower margins to increase profits. Just common sense. But why not in general? Why does Wall Street hate that so much?
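A toy illustration of the volume argument, with made-up numbers (not Apple's or anyone else's):

    # Hypothetical numbers, purely illustrative.
    unit_cost = 100.0

    # High-margin scenario (assumed): 60% gross margin, 10M units sold
    price_high = unit_cost / (1 - 0.60)                  # 250.0
    profit_high = (price_high - unit_cost) * 10_000_000  # 1.5 billion

    # Lower-margin scenario (assumed): 40% margin, but the lower price moves 25M units
    price_low = unit_cost / (1 - 0.40)                    # ~166.7
    profit_low = (price_low - unit_cost) * 25_000_000     # ~1.67 billion

    print(profit_high, profit_low)  # the lower-margin line earns more in total

The catch, and presumably part of why Wall Street dislikes it, is that the extra volume has to actually show up; cutting the margin without the volume just shrinks profits.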
Benefits through competition? Intel is only one company, it's competing with a whole ecosystem of companies, who can all iterate and find efficiencies on chip design, and on manufacturing. And those improvements which are specific to the processor type will inevitably get passed around over time.
> Intel didn't fail because of any variation on The Innovator's Dilemma.
Depends how you look at it. My understanding of the Innovator's Dilemma is that when faced with disruption, which implies radical change within the organization, nobody in that organization really wants to do it, because it messes up too many of their processes, products, etc.
So what do they do instead? They try to make their existing technology/product fit the new paradigm. This is what Nokia tried to do with "touch-enabled Symbian", instead of writing a new touch-focused OS from scratch (they actually did do that with Maemo, but, also for Innovator's Dilemma reasons, they failed to focus on it and invest in it).
It's also what Intel did with Atom, and, by the way, why they killed the ARM-based XScale division a few years before the iPhone. Atom had too high a cost to compete in mobile. This is mainly what killed Intel's mobile division. Intel even tried to license it out to cheap Chinese chip makers, but it was still not competitive in performance/price.
And it's also why Microsoft failed for so long to conquer tablets, by retrofitting desktop Windows for them. Before you mention the Surface Pro: first off, I don't consider it a major success, it's too expensive for most people, and 90% of the reason people get it is to use it as a PC, not as a tablet (other than perhaps designers, but that's not a mainstream market, and it's more in line with the Wacom tablet professional market).
It's also why hybrid cars, or regular cars retrofitted as EVs, will fail against pure EVs, too. Etc., etc.
Is there a problem with Intel's technology? I think their technology is great, probably better than the competition's. It's the licensing and packaging. Intel doesn't sell a core, doesn't license a core for you to mask your own SoC, doesn't let you graft on other parts; they decide and sell a SoC. A SoC that they fab.
> At the time Intel had already introduced some mobile silicon, but there was little uptake. So they were iterating; they wanted to improve with each succeeding generation. But they had a kind of design-by-committee process. One person or group wanted a certain feature, another group wanted something else, a third group thought that yet another thing was important. And so on. Sorry if that sounds vague, I won't write anything more specific.
> The end result was a chipset that had a lot of features. A LOT OF FEATURES. Gold plated features. But that meant higher power consumption than the competition, higher cost, larger form factor, longer time to market.
I have some experience in this field, and this sounds utterly bizarre. Most of the customers are fairly selective in which bits they want, so providing everything (including the kitchen sink) in a product is useless.
Being able to comfortably ship any variant of your SoC without certain parts is important.
> How do seemingly invincible forces in the market get killed by smaller foes?
In terms of value ARM is a far smaller company than Intel. But in terms of cores manufactured, ARM is way larger. ARM (through their partners obviously) shipped 10 billion cores in 2013. Intel shipped about half a billion.
Of course I'm aware of how ARM operates. We have the sales figures from 2013 as well, and Intel is larger than the ARM partners combined. However I think that just indicates that (a) ARM processors are lower featured and hence cheaper and (b) there's real competition in the ARM processor market which is mostly absent from the x86 market.
You hit the nail on the head. Intel has built up amazing margins on the x86 business over the years by beating out competitors. ARM margins are razor thin by comparison, and Intel can't bring themselves to cannibalize their cash cow by becoming another ARM manufacturer.
Steve's point is certainly valid. In this case I'd add that Intel seemingly convinced themselves that a number of things that would be good for Intel, such as x86 across both mobile and desktop, represented market reality and customer desires, whether or not they actually did.
> Scott McNealy said early on that retreating to the Data Center was where companies went to die.
That is a striking thought. I wonder how (if at all) it applies to open source.
GNU/Linux, for example, always had strong aspirations for the desktop [1], yet this never materialized. The only place where Linux (but not GNU) succeeded in a big way outside the data center is in becoming the gold-standard hardware abstraction layer for CPEs, Android, and to some extent other embedded computing devices.
Now, with the Linux ABI being embraced first by Joyent in SmartOS [2] and now by Microsoft [3], GNU/Linux is pretty much all-in on the data center nowadays. Does that mean it, too, is going the way of DEC and Sun in due course?
[1] See "Year of the Linux Desktop" motto from 1998 is pretty much every year thereafter.
The buggy-whip analogy is interesting and applicable from a certain viewpoint ("big core" thinking in a low-power market) but I'm not sure it fully works: mobile chips are fundamentally converging toward desktop chips microarchitecturally (big out-of-order engines and wider pipelines, larger cache hierarchies, higher clock speeds, the move to 64-bit, etc), and Intel's desktop/server parts are also converging toward mobile in many ways (very power-efficient, more on-chip SoC-like integration).
In other words, they're different markets, but I'm not sure that they're so different that Intel's existing expertise couldn't allow them to build a kickass part if they played their cards right. They still employ some of the world's best microarchitects and design engineers. My feeling here is that this is more of a business/execution issue than a fundamental big-incumbent-doesn't-get-it failure.
> The next sign I'm waiting for is Apple to ship a MacBook laptop with their own 64-bit ARM processor.
They have all the pieces already. OS X already runs on ARM and can handle apps that support multiple architectures, and the Apple A9X is more than fine for 90% of their user base.
The only problem is around Carbon. I can't imagine them investing the effort into porting it to ARM, and it has been deprecated for a while. But still, some people would lose the ability to run some existing apps.
I think this is the most worrisome thing from Intel's perspective. Fighting off "good enough" (in both mobile and enterprise/data center).
It always seemed like the hyper-scale vendors' roadmap had to include "And when Intel reaches the end of process scaling, switch to a lower-cost commodity vendor's CPU" at some point in the future.
ARM x86 partnerships are one source. But resurrecting and improving other ISAs is another.
Of course, this reorg may be Intel putting all their chips into the data center basket to ensure they're always more competitive. Because, hats off to everyone there, they can work chip magic when they have to.
What is there for Apple to gain from switching macOS to ARM? Apple saves about $100 of BOM on most of its MacBooks, in exchange for being incompatible with the PC ecosystem? And what about the Mac Pro and iMac?
Tim Cook has already said the iPad is the future of the PC. We may as well move to the iPad ecosystem. For content creation, let it stay on x86 and macOS, and if Apple wanted to do some cost saving they now finally have an option: AMD. I bet a Zen CPU will be more than enough for Apple's needs.
Plus all the people that need to run Windows in VMs or Bootcamp. And ultimately, what would they gain by switching to ARM? ARM isn't yet better at the types of chips being integrated into MacBooks, let alone iMacs and Mac Pros.
I'm going out on a limb here, but nowadays 60% of MacBooks (mainly the Pro) are used for software development and 30% for creative work (video and editing). While something like half of those professional applications would be fine with the A9X, the other half would be left hanging for a better machine.
However, if they want to make the switch in the near future, I think that the new MacBook will first be released with an ARM processor. This way they won't alienate their core market and they will have a testing ground. Then, I'd predict two or three generations until the pros catch up.
> The only problem is around Carbon
They have the money, if they want to change to ARM it won't be Carbon stopping them.
Those numbers sound wildly off to me. I'd be shocked if even 10% of MacBooks are ever used for software development of any kind. They are mostly used for web browsing and Office.
> I'm going out on a limb here, but nowadays 60% of MacBooks (mainly the Pro) are used for software development and 30% for creative work (video and editing).
Your numbers are way out on a limb.
In a single year, Apple sells more Macs than there even are software developers in the world.
College students alone form a much bigger market than software developers.