Although not many people realize it, tech is a _mature_ sector of the U.S. economy now - it grows at about 3% a year, roughly in line with GDP. But the perception is different - just ask your friends - and the reason is that there are highly visible _pockets of growth_, like Uber/SpaceX/name-your-favorite-unicorn, and people extrapolate their growth to the rest of the industry.
What happens with many tech companies is that they have a product with exceptional margins and a double-digit growth rate, and it makes sense to pour all available resources into it - better from an ROIC perspective - while ignoring the alternatives, which lack either the volume or the margins to look attractive. This inevitably leads to problems once your exceptional product stops growing and you realize you have barely invested in anything else.
Much like Intel with x86 and ARM chips, or Qualcomm, or EMC, or RIM... - the list goes on and on.
Even when you look at Google, most of their resources go into the search/ad business, so when that stops growing - or rather, once they've taken all the share from TV and start growing at GDP rate - they will be in the same boat.
Is AirBnB tech, or hospitality? Is Uber tech, or transportation? Is the Apple Watch tech, or a watch? Is SpaceX tech, or aerospace? Is Tesla tech or automotive? Is Instacart tech or groceries? Are Twitter/Medium/Wordpress tech or media?
Not a rhetorical question - I'd actually like to know what's included in the 3% growth figure. I could believe it if tech is defined as "companies that sell computer hardware or software", but it seems very low if tech is defined as "companies that provide a service using software or hardware that wasn't possible a decade ago."
SpaceX is aerospace: ~$12B valuation vs ~$600B Aerospace & Defense industry size, and Tesla is automotive, $28B vs ~$750B Auto + components market.
Note that Uber, AirBnB, SpaceX and Tesla are probably the biggest unicorns out there.
But you are raising an important point - many industries will be redefined as new competitors use IT much more heavily. Still, that growth mostly comes from existing players, and I believe the growth of the IT sector itself is limited now.
edit: typos, minor clarifications.
Uber is seven years old, Airbnb is coming up on eight years old, SpaceX is 14 years old, Tesla is 13 years old. And they're all very substantial enterprises at this point.
I'd suggest that companies of that size and operational maturity should be considered to be well beyond the semi-early stage billion dollar valuation companies that have tended to get the unicorn tag. They were unicorns, now they're just like Netflix or any other established company.
I disagree. What I see is a boom/bust cycle as major technological shifts lead to growth and then consolidation:
Mainframes are invented, an ecosystem of small players flourishes, the market grows an order of magnitude or two, the technology finds product/market fit, the market consolidates around a single major player (IBM). Defeatism settles in amongst entrepreneurs as more and more opportunities are swallowed up by Big Blue.
The PC is invented, an ecosystem of small companies flourishes, the market grows an order of magnitude or two, and then consolidates around a single player (Microsoft). Defeatism settles in amongst entrepreneurs as more and more opportunities are swallowed up by Embrace and Extend.
The internet is invented, an ecosystem of small companies flourishes, the market grows an order of magnitude or two, and then consolidates around... well, now there are more players. Now there's Google and Facebook and Apple. Defeatism settles in among device inventors as the iPhone/Android hegemony seems unassailable.
Some people think VR - and AI - will be another land grab that fuels an order-of-magnitude jump in the tech market. I don't think that's crazy. Personally, I think there's a massive economic explosion waiting to happen in making programming tools more accessible to average people (Wordpress/Squarespace/Salesforce/etc. are beating on that door, but no one knows how to get through it yet).
So I guess what I'm saying is I agree with you, but I think this is a trough. There are more peaks ahead.
If you look back, that was always the hope when personal computers were introduced, and it has been ever since. It never seems to pan out: BASIC on micros didn't do it, Excel/FoxPro/Access didn't do it, HTML didn't do it (RIP Geocities).
Here is a keynote Gordon gave discussing some of his analyses: https://youtu.be/Z-4X6dkESEk
It's called Excel.
Intel has invested in plenty of things that haven't worked out, for instance, Itanic, Ultrabooks, Realsense, etc.
Also, companies like Intel and Microsoft have bought into Clayton Christensen's theory of "disruption" to the point where they really are throwing their existing customers under the bus. That is, Windows 8 was perceived as a tablet operating system, Intel only sells high-end chips under the Xeon brand, etc.
Had Intel stuck more pigheadedly to making better PCs, we might have mainstream water-cooled desktops running at 5GHz+ and more reasons for people to upgrade.
I'm not saying tech companies never invest in anything outside of their core products. It's just that those projects are often given low internal priority precisely because their (presumed) impact on the bottom line will be small.
Innovation isn't about ideas, it's about finding ways to execute against those ideas in a way that others haven't or ideally can't.
They probably got this right.
Amazon is another company that is innovating disruptively despite its size.
In a more general context, innovation takes on the other meaning.
They could've fooled me. Clayton's theory wouldn't recommend changing Windows to adapt to the mobile form factor. Nor would it recommend Intel investing what is now probably over $10 billion in Atom and in making x86 work for mobile.
Quite the opposite. It would tell Microsoft to create a new operating system dedicated to those form factors, and it would tell Intel to adopt ARM on their own. Both should also be done under an isolated division, so there are no "conflicts of interest" with the incumbent products, whether it's about product features (like, say, Intel refusing to make Atom too strong so it doesn't encroach on Core territory) or about setting up expensive (and unsustainable) infrastructure to compete against leaner competitors.
I would also say both Microsoft and Intel failed in these markets because they were late in trying to become players in them, which is exactly what disruption theory warns incumbents about. It may be difficult to remember now, but even Windows 8 was considered "late" for the mobile market. So was Windows Phone 7 compared to iOS and Android. The same goes for Atom "catching up" to ARM chips in performance/low power (though not in price, a typical failure of a disrupted incumbent). If the "Wintel" monopoly weren't so strong, you'd already see Celerons and Pentiums being replaced by ARM chips in notebooks and hybrids and whatnot, because those chips are the same ones that failed against ARM in mobile.
"The most important component of evolution is death
Or, said another way, it's easier to create a new organism than to change an existing one. Most organisms are highly resistant to change, but when they die it becomes possible for new and improved organisms to take their place. This rule applies to social structures such as corporations as well as biological organisms: very few companies are capable of making significant changes in their culture or business model, so it is good for companies eventually to go out of business, thereby opening space for better companies in the future.
Computer software is a counter-example to this rule, with ironic results. Software is incredibly malleable: it can be updated with new versions relatively easily to fix problems and add new features. It is easier to change existing software than to build new software, so software tends to live a long time. To a first approximation, software doesn't die (compare this to the hardware in a computer, which is completely replaced every few years). At the same time, it is difficult to make major structural improvements to software once it has been shipped, so mistakes in early versions of the program often live forever. As a result, software tends to live too long: just good enough to discourage replacement, but slowly rotting away with more and more problems that are hard to fix. I wonder if the overall quality of computer software would improve if there were a way of forcing all software to be replaced after some period of time."
Can you think of what those reasons might be? I can't. My office computer is an i5-3570 and it sits idle most of the time because it's just not that hard to have Outlook and Firefox open. A/V work and AAA gaming drive upgrades now like they always have, what other reasons would there be for Intel to push beyond their current efforts like you suggest?
Servers are probably Intel's single biggest market today, and the companies running these miles of racks of servers are demanding more cores and lower per-core power. That goes in line with mobile improvements too. Intel is a perfectly reasonable cpu option for tablets today, within 3-4 years they'll probably have something suitable for phones.
I've always liked the concept of a phone with everything on it that you dock into a workstation or laptop form factor, but which is still great as a phone. There's a lot of possibility there. More than that, with services that go beyond Echo, we may see the watch form factor take over from phones, relying on larger systems in your home/office/car to fill the void. Who knows where things will go.
> Can you think of what those reasons might be?
Some applications that I would want, which might need to wait for high-CPU (some high-memory and/or high-disk) configurations to become more mainstream, since I want them all to run locally until FTTH reaches my part of the boonies:
A scanner I can literally wave a piece of paper at, have it process all of the ultra-high speed video frame grabs into a high-dpi scan, then reconstitute the scan into an appropriate LibreOffice document or PDF, with better OCR and better vectorization of raster pictures. Then I auto-file all my invoices, bills, statements, receipts, etc.
A personal digital assistant that transcribes, annotates, indexes and categorizes all of my conversations across all mediums (voice, text, chat, social, etc.). This doesn't need to be ML-fancy at first (except for the voice recognition piece), just lots of memory+cpu+disk to drive intensive, continuous indexing.
A millimeter-accurate 3D positioning system, using differential GPS off of a geodetic mark on my property, with relays of signals to indoors. This would drive my robotic chicken tractor, curb edger, and vacuum cleaner. I could keep a detailed inventory of household items with this, then query the house computer over a voice link the next time I forgot where I set down $thing (some RFID and 3D positioning integration needed here). Outside, it keeps track of exactly where various pipes, outlets, irrigation heads, etc. are located.
A software-defined radio that scans for chatter on interesting bands like police and fire, performs voice recognition on them, and pings me when it hears keywords about my neighborhood, street, nearby streets, or region when combined with other keywords (like "tornado").
A house computer that can tell whether I'm in the house, if I'm asleep, getting ready to sleep, or still going through my morning routine, at my computer, at my tablet, on my phone, etc. And do this for all occupants and visitors. Lots of visual recognition from lots of camera input. For myself, I want this information to drive what information is presented to me, when, and in what context. Bloomberg Radio after I've been up for 10 minutes, after I've put on my headset, and before my first call (incoming or outgoing).
A process that continuously combs through my Emacs Org agenda and compares it against my current "state". If I'm driving away from the house, and I confirm a query from the computer that I'm going out to buy groceries and run errands for the day (the query is launched by continuously-computed probability cones narrowing possible agenda matches against my GPS position until a threshold is met), and no one else is in the house, then automatically put the house HVAC system into energy-conserving mode.
Dumb robotic mechanics driven by updateable software in my desktop computer to fold my laundry, turn over my compost bed, rotate stop valves monthly/quarterly (otherwise they freeze in place - the primary challenge automatic flood sensors face today), change HVAC filters, open and file incoming snail mail, etc.
Intensive AR to turn nearly all parts of my house into smart surfaces. "What's inside that drawer?" "What's behind that wall?" "What's inside that freezer?"
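The agenda-matching idea above could start far simpler than probability cones. A minimal sketch of the core loop - guess the errand from GPS distance to known destinations and flip the HVAC mode - where all destination names, coordinates, and the threshold are purely hypothetical placeholders:

```python
import math

# Hypothetical known destinations (lat, lon) learned from past trips.
DESTINATIONS = {
    "grocery store": (45.5231, -122.6765),
    "hardware store": (45.5120, -122.6587),
}

def distance_km(a, b):
    # Rough equirectangular approximation; fine at neighborhood scale.
    lat1, lon1 = map(math.radians, a)
    lat2, lon2 = map(math.radians, b)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6371 * math.hypot(x, y)

def likely_errand(position, threshold_km=0.5):
    """Return the destination we're within threshold of, if any."""
    name, dest = min(DESTINATIONS.items(),
                     key=lambda kv: distance_km(position, kv[1]))
    return name if distance_km(position, dest) <= threshold_km else None

def update_hvac(position, house_occupied):
    # Only conserve energy once the errand looks confirmed and nobody's home.
    if likely_errand(position) and not house_occupied:
        return "energy-conserving"
    return "normal"
```

The real version would replace the distance threshold with the probability-cone narrowing described above, but the trigger structure (match against agenda, confirm, check occupancy, act) stays the same.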
If we are talking only about pure desktop applications constrained to the physical desktop/laptop computer itself, then yes, you may have a point. However, when we include extending the reach of those systems to our environment, and there is an explosion of data to crunch, then today's typical consumer desktop computer (4-8 GB RAM, 2-3 GHz x86_64, 2-4 TB at the top end) doesn't have the capacity to manage all of those demands simultaneously.
I sometimes muse whether on-prem will get a second look as containerization, self-hosted AWS Lambda-like microservices, and similar advances become more mainstream on servers, and whether we'll see more hybrid solution ecosystems evolve. I strongly suspect we will look back decades hence and lament that governments are Why We Can't Have Nice Things In The Cloud; while the US case has been particularly public and egregious, we shouldn't be surprised at other nation-state actor reveals in the future, either. If I'm on point about that, then we'll probably see hybrid cloud+on-prem (COP?) instead of an overwhelming dominance of one or the other, for the near future at least.
So this will never happen in my household unless I control the data, and it is stored/analyzed locally.
An ASIC can perform some of the computational tasks you mentioned (such as image recognition) with much better thermal/performance levels; it might be hard to make it a commercial success.
It's like back in the '80s, when it was neat to imagine GB-scale disks with ubiquitous, continuously-indexed search across all files on the system. We could even implement it and tinker with it - on a toy scale at practical speeds, or system-wide at slow demo speeds. It took better hardware before that feature was really practical.
I'm wondering if ASIC-based products will come out for "settled" niches that large swaths of the workstation market agree upon - a kind of standardized co-processing set of sockets/slots in workstations - and whether workstation-embedded, software-reprogrammable FPGAs will become available for niches that are less settled but not quite amenable to GPU/CPU processing. I don't know enough about hardware at that level, though, to discern what field of application could penetrate mainstream computing and need those devices at such a low level, that couldn't be addressed with PCIe cards today, if not through CUDA/OpenCL for accelerators/co-processors and general-purpose CPUs.
Basically, Intel needs to sell huge numbers of chips in order to make their numbers - i.e., the giant investments in fabs - work. While Xeon may generate all the profits, they still need to sell millions of low-end PC chips to pool costs. If that stops happening, then even assuming they sell the same Xeons at the same prices, the R&D costs behind those fabs eat them.
It was intended to be the x86 successor.
All the computer press was pretty sure it would replace x86. Microsoft released a Windows port for Itanium, which was unheard of for them.
Also, not every tech company can compete in other realms of tech. How could Intel really break into mobile? A non-ARM chip means lots of testing and cross-compiling for 0.1% market share, which no one will do. Or worse, battery- and performance-draining hacks to get ARM-compiled binaries to run on x86. There's an ASUS (?) phone series that does this. It's terrible and panned by cell phone reviewers.
I think this is a lot more deterministic than we care to admit. I don't care how much better Intel could have done at "leadership" and "grit" and "listening to your customers" - it's clear that they couldn't break the ARM mobile monopoly, because no one can. Becoming a 3rd- or 4th-tier ARM producer among many would probably have gone to shit as well.
It's easy to armchair-quarterback companies, but the reality is that there aren't a lot of 'win-win' paths in business. Sometimes you just can't enter a new market or beat the hot new startup, regardless of what you do.
Lastly, no one questions mass hirings that don't seem sustainable, but we all freak out when mass layoffs are announced. It's incredible how we think jobs only make sense as permanent fixtures, when in reality they're subject to the same market forces as everything else. If anything, Intel's biggest screw-up was hiring too many people too quickly.
'Letting your company get chased upmarket,' is a pretty good synopsis: yielding ground rather than trying to get competitive.
Clayton talks about the steel industry as a common example: big "integrated steel" companies didn't adopt mini-mill tech, which allowed lower capital investment costs and a big price reduction - at first with some quality limitations, to protect the high-margin integrated business - but the low-end players kept taking more of the market until they "won".
I am curious, though - Intel is still charging ~$281 for your run-of-the-mill, very boring laptop chip. In some ways, it seems like their obstinacy is finally paying off, even as they abandon market segments. Which is both how disruptive innovation continues to happen - with the low end getting eaten out from under them by Rockchip and MediaTek chromebooks, perhaps gaining more ground - but it also shows to some degree how effective their protectionism has been, in that they have not had to get competitive on laptop pricing.
Making chips Apple would've wanted would have put Intel at a remarkably different price:performance scale than what they've stood on.
A related anecdote - at my previous job, I was at an annual sales conference where Mr. Christensen was an invited speaker. Both before and after his speech, management kept bragging about "how unique we are", "how fat our margins are", and "how none of our low-cost competitors could match our offering", despite real mid-to-long-term danger to their core business - completely ignoring the innovator's dilemma, or hoping it doesn't apply to them. It was almost painful to watch. My prediction is that my old company will get disrupted and cease to play such an important role within just 5-7 years.
Intentionally limiting Atom performance, selling off their ARM division, etc. - it was all done in order not to harm their main cash cow. By the time they woke up and really pushed for x86 on Android, it was too little, too late.
Just from an engineering perspective it was always going to be a monumental task. Because guess what, that small dev shop with the hit-of-the-month mobile game is not going to bother cross-compiling or testing on the 1% of non-ARM devices. And if "Ridiculous Fishing" or whatever doesn't work flawlessly, your device is broken from the consumer perspective.
But what should really have Intel pissing their pants is the recent AMD x86 license deal with Chinese manufacturers to pump out x86 server-class chips. I'd love to hear if they're taking it seriously at all, or dismissing it as usual.
BK came up through the fabs, and therefore is more open to fabbing others' designs (Intel Custom Foundry) because it drives volume. However, he has overseen one hell of an awkward reduction in force (I'm ex-Intel, but still have many friends there). Small offices were told they were closing, and a few weeks later they were told where they should relocate to keep their jobs, and a few weeks later they might learn about the relocation package. It's almost as if they drew a line on a spreadsheet filled with numbers and now they're struggling to figure out how it affects the stuff they still want to keep. Odd.
> @sebbrochet @fdevillamil Embarrassing mistake, don’t know what to do other than grovel and hope Medium can undo.
> Big Thanks to @Medium. My fat fingers killed 9 yrs of Monday Notes this am. Awful feeling. They fixed it quickly. Much appreciated.
QCT’s operating margin fell from 16.9% in fiscal 1Q16 to 5% in fiscal 2Q16. The margin was on the higher end of the low- to mid-single-digit guidance as the ASP (average selling price) of 3G and 4G handsets equipped with Qualcomm chipsets rose by 6% YoY to $205–$211. The price rose due to a favorable product mix and higher content per device.
The margins on cell phone chips are terrible. QCT made $2.5 billion on $17 billion in revenue.
Would it really make sense to invest in the cellphone business when every dollar you put in gets less ROI than what you have now? From a finance perspective, it would make more sense to return it to shareholders and let them invest in QCOM if they want to.
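To put numbers on that, a quick back-of-the-envelope comparison using the QCT figures above (the ~60% incumbent-product margin is an illustrative assumption for a high-margin business like Intel's, not an actual reported number):

```python
# Figures from the comment above: QCT earned ~$2.5B operating
# profit on ~$17B revenue.
qct_profit, qct_revenue = 2.5e9, 17e9
qct_margin = qct_profit / qct_revenue  # roughly 15%

# Illustrative assumption for the incumbent high-margin product line.
incumbent_margin = 0.60

# Every dollar of capital and capacity redirected from the high-margin
# business into the cellphone business gives up this much operating profit:
profit_gap_per_dollar = incumbent_margin - qct_margin

print(f"QCT margin: {qct_margin:.1%}")
print(f"Profit given up per redirected revenue dollar: ${profit_gap_per_dollar:.2f}")
```

Which is the ROIC logic in a nutshell: as long as the core product can absorb the capital, the lower-margin alternative loses the internal comparison every time.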
As it stands today you can't really say it was a mistake to not get into mobile. They don't have the competitive advantages they had with x86. They had a monopoly there, but in the ARM space they would be just one of many.
There is certainly that third path though, where Intel embraced fabrication, then regretted it.
Still, just the fact that anyone is in a race with Intel at all would have been unthinkable 10 years ago.
> The Margins on cell phone chips are terrible.
> QCT made 2.5 bil on 17bil in revenue.
Gasse's point, as I understood it, was that Intel couldn't see past its self-imposed margin requirements on chips to see that this business could be additive to its revenue and margin, and could keep it in the game when the world switched its primary IT gadget from the laptop to the phone.
And I think Gasse is dangerously ignorant of the basic fact that new Apple chips, even with less advanced fab tech, are quite competitive with chips Intel sells for $281. Core M is somewhere in the same field performance-wise, and M's are frequently priced the same as the vastly larger ultrabook Cores, which themselves only offer incremental gains. And the A9X's vast GPU likely applied much of the pressure behind Intel beefing up its Iris, just to stay competitive (with a hat tip to Apple's colossal driver/software advantage that no one else has).
I think the point was that phones/tablets are a differentiable enough space that you could go there with a lower margin SoC, especially if you went tit for tat and made it an ARM architecture.
ARM SoCs typically have a memory port that supports exactly one memory chip, so the most memory you will see on them is typically 2GB (32 bits x 512M), although from the AnandTech article on the A9X (http://www.anandtech.com/show/9824/more-on-apples-a9x-soc) it seems to support 4GB. Looking at the iFixit teardown (https://www.ifixit.com/Teardown/iPad+Pro+12.9-Inch+Teardown/...) of the iPad Pro, it has two LPDDR4 chips (SK Hynix H9HCNNNBTUMLNR-NLH).
So that makes it one of the first I've seen that can actually take two chips. The issues, generally, are that they burn up a lot of pins, and at the speeds they run, signal integrity is really hard (for example, all of the data and address PCB traces have to be the same length so the bits all arrive at the same time on both the CPU side and the memory side!)
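The capacity arithmetic above works out like this - a tiny sketch, using the bus width and depth from the parent comment:

```python
# Capacity of a memory channel = bus width (bits) x addressable depth (words),
# converted to bytes. A single 32-bit port with 512M addressable locations
# gives the 2GB ceiling mentioned above.
def channel_capacity_bytes(bus_width_bits, depth_words):
    return bus_width_bits * depth_words // 8

one_chip = channel_capacity_bytes(32, 512 * 2**20)  # 32 bits x 512M
print(one_chip // 2**30, "GB")                      # prints: 2 GB

# A second chip (as on the iPad Pro teardown) doubles capacity, at the cost
# of roughly twice the pins and stricter trace-length matching.
two_chips = 2 * one_chip
print(two_chips // 2**30, "GB")                     # prints: 4 GB
```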
The ARM ecosystem and economy run on different margins, and are put together in an entirely different way.
Ax makes sense because Apple is vertically integrated and they get more value out of outspending their rivals on ARM R&D to be in a unique position than they do from whatever effect vertical integration in mobile CPUs has on their margins.
That means Intel is squeezed between a low-margin ecosystem and strategic vertical integration. Doubly ugly. There is no good answer short of somehow coming up with a significantly better proprietary technology. That somehow appears to be elusive.
The issue for Intel is that they seem to have found an unintentional local optimization point where margins and volume are effectively in sync in a way that prevents them from effectively growing into new sectors. I used to joke that every ARM chip you buy includes the purchase of an Intel Xeon in a data center somewhere, but that is now something less than 10:1 and probably more like 100:1.
Now that IBM has divested from their fabrication capacity almost entirely, and they sold it to a company owned by Abu Dhabi, it will be interesting to see what happens with Intel's government revenue share. I could see them working together very intentionally with Amazon and the NSA on projects. I think the purchase and quasi-customer fabbing of Altera may be an attempt to go in that direction.
Shareholder to Kodak: Screw that. We need to make earnings next quarter or I'm selling. Don't miss your earnings by even $0.01/share or you're toast.
Consider: Google paid $1.65 billion for YouTube - relatively speaking, chump change. Microsoft could have just bought Nintendo instead of dumping tens of billions on Xbox. Remember, buying means you get revenue from day one to offset the initial investment, whereas R&D takes years and might not pay off. Of course, you need to time this while the competitors are still affordable.
Which has also been a razor-thin margin commodity business. Fujifilm does provide some insights into how Kodak could have moved forward but they struggled too and were a smaller company.
The high-end camera makers have done OK in digital photography (although there's been churn in that space as well) but Kodak had long ceded that market to the Japanese by the time digital photography was much more than a glimmer.
The print consumables market was also pretty good for a while but that ended up being relatively ephemeral.
Even with the advantage of 20-20 hindsight, Kodak was in a tough position to directly leverage either their existing tech or their channels.
The iPod and going from Apple-the-PC-company to Apple-the-gizmo-company could be considered one, but I think it's giving Jobs too much credit to claim he started development on an MP3 player because he knew in advance how wildly successful it and the products it spawned would be.
“I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”
…and, perhaps more importantly:
“The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut. My gut told me to say yes.”
I personally love data, facts, and hard science, but I find that too often people ignore gut feelings. I almost always follow my gut instinct; it has proven itself to me over and over again, even while taking what are often perceived as long shots - somehow my gut tells me "you got this".
In particular, I would say: gather your own data on how often your gut instinct is correct, and use that as a data point in addition to the hard science, facts, etc.
Instinct evolved to keep you alive, it is often wise to not ignore it.
> The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut. My gut told me to say yes.
vs. this famous quote from Jim Barksdale:
> If we have data, let’s look at data. If all we have are opinions, let’s go with mine.
Now that everyone is praying at the altar of big data, I find it important to keep both of these quotes in mind. To my mind, it's not that the data is wrong, but too often I've seen attempts to interpret data in ways that undercut long-term planning and vision. For example, Steve Jobs had a famous explanation of why Apple never put a ton of "xyz inside" stickers on their laptops. It would have been easy to look at the data and say, "Company pays us X million dollars to add this sticker, but only 0.001% of people decide not to buy from us because of the sticker, so we should add the sticker." The long-term damage to your brand from covering your machines with crap is harder to put numbers to.
The average consumer probably thinks something like, "I paid a lot of money for this phone; if Intel had joined this market, they could have made a lot of money" - when in reality Intel would have gotten only a few dollars for every iPhone sold. Even if they produced ARM chips themselves and completely replaced Qualcomm, they would only make $2.5 billion in profit on $18 billion in revenue. And that's the best-case scenario. In reality, they probably wouldn't even make $1 billion in profit, because of how competitive the market is.
Data is great, but it has this one problem. The best results normally occur when one uses data to keep the gut feeling honest.
In any case, believing that your [survival] instincts are equally applicable to making sound business decisions is a bit silly, as that's pretty far removed from what they've been optimized for.
That's why I encourage people to actually keep track of such things and treat it as a single data point.
I would also say there is a difference between listening to your experience/confidence and listening to your ego. A lot of good things can go wrong because of ego.
Organizations continue to evolve and change direction. I am not saying leadership has not failed spectacularly in mobile, but some of the 12,000 jobs are a natural progression of acquisitions.
They should just fire everyone from that acquisition and close down the whole division. It's nothing more than a huge stain on the rest of the company. It's really sad that the world's largest chipmaker reduced itself to being a malware purveyor.
Now add a change in strategy. Better to do it all at once and move on.
Things have been getting very efficient on both the client and the server side. With cloud, they will have some momentum behind them - but long term, I think the glory days are gone when they could just produce chips and someone would take them.
At one point at the height of the bubble, Intel was involved in mobile devices.
I worked for a company that developed secure, efficient wireless communication middleware for that space. Our client hardware was mainly Pocket PC and Windows CE at that time.
We partnered with Intel to port our stack to the Linux device they were developing (codenamed "PAWS"). This was around 2000-2001, if I recall.
Things were going very well when, practically overnight, Intel decided to pull out of this market entirely. They shut down the project and that was that.
It didn't end very well for that little company; we had gambled on the Intel partnership and put a lot of resources into that project in hopes that there would be revenue, of course. Oops!
Here is the Google cache link for quick access: http://webcache.googleusercontent.com/search?q=cache:https:/...
And here is the raw text (without any links): http://pastebin.com/e10Yw0zi
Back when the PPro launched, a low-double-digit percentage of the die was taken up by the x86 translation hardware (for turning x86 instructions into internal, RISC-like micro-ops). Now that number is some small fraction of a percent. It kills me that I still read this nonsense.
The reason that Intel didn't get into cell phone chips is because margins were (and are) crap, and Intel is addicted to margins. The reason everyone else went ARM is because a) licenses are dirt cheap, and b) everyone knows that Intel is addicted to margins, and they know about all of Intel's dirty tricks and the way Intel leveraged Windows to keep its margins up in the PC space and screw other component vendors, so everyone has seen what happens when you tie yourself to x86 and was like "no thanks".
Of course Intel didn't want to make ARM chips for Apple (or anyone else), because even if they had correctly forecast the volume they'd have no control over the margins because ARM gives them no leverage the way x86 does. If Intel decides it wants to make more money per chip and it starts trying to squeeze Apple, Apple can just bail and go to another ARM vendor. But if Apple went x86 in the iPhone, then Intel could start ratcheting up the unit cost per CPU and all of the other ICs in that phone would have to get cheaper (i.e. shrink their margins) for Apple to keep its own overall unit cost the same (either that or give up its own margin to Intel). Again, this is what Intel did with the PC and how they screwed Nvidia, ATI, etc. -- they just ask for a bigger share of the overall PC unit cost by jacking up their CPU price, and they get it because while you can always pit GPU vendors against each other to see who will take less money, you're stuck with Intel.
(What about AMD? Hahaha... Intel would actually share some of that money back with you via a little kickback scheme called the "Intel Inside" program. So Intel gives up a little margin back to the PC vendor, but they still get to starve their competitors anyway while keeping AMD locked out. So the game was, "jack up the cost of the CPU, all the other PC components take a hit, and then share some of the loot back with the PC vendor in exchange for exclusivity.")
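The squeeze described above can be sketched with a toy bill-of-materials model. All the dollar figures here are hypothetical, purely to illustrate the mechanism: the CPU vendor raises its price, and if the OEM holds total unit cost flat, every other component's share shrinks.

```python
# Hypothetical phone/PC bill of materials (numbers invented for illustration).
bom = {"cpu": 120.0, "gpu": 60.0, "memory": 40.0, "storage": 30.0, "other": 50.0}
total = sum(bom.values())          # OEM's target unit cost: 300.0

cpu_hike = 30.0                    # CPU vendor demands $30 more per unit
others = total - bom["cpu"]        # 180.0 spread across everyone else

# Everyone else shrinks proportionally so the total stays at 300.0.
scale = (others - cpu_hike) / others
squeezed = {k: (v + cpu_hike if k == "cpu" else v * scale)
            for k, v in bom.items()}

print(round(sum(squeezed.values()), 2))   # 300.0 - total unchanged
print(round(squeezed["gpu"], 2))          # 50.0  - GPU vendor's cut fell from 60.0
```

Same total unit cost, but the CPU's slice grew from 40% to 50% and the GPU vendor silently ate part of the difference - which is the leverage the comment above is describing.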
Anyway, the mobile guys had seen this whole movie before, and weren't eager to see the sequel play out in the phone space. So, ARM it was.
The only mystery to me was why Intel never just sucked it up and made a low-margin x86 chip to compete directly with ARM. I assure you there was no technical or ISA-related reason that this didn't happen, because as I said above that is flat-earth nonsense. More likely Intel just didn't want to get into a low-margin business at all (Wall St. would clobber them, and low-margin x86 would cannibalize high-margin x86), and by the time it was clear that they had missed the smartphone boat, ARM was so entrenched that there was no clear route to enough volume to make it worthwhile - again, especially given that any phone maker that Intel approaches with an x86 phone chip is going to run the other way because they don't want x86 lock-in.
Today it probably matters little. But 5-10 years ago it did matter. And x86 wasn't magically worse than ARM simply by not being ARM. Sure, ARM proper without Thumb would have done nothing, but ultimately the savings its simplicity brought to the table paid off. Today power is less of an issue and manufacturers (especially low- and mid-end ones) care about die space the most.
And sure, if Intel weren't addicted to margins they'd demolish ARM. And this, not architecture, causes their pains today. But as much as I love you, man (a signed copy of Inside... is one of my most treasured possessions), your "no ARM performance elves" line is just way too simplistic a view.
Culture. Some companies are willing to sacrifice today's revenue for tomorrow's growth. Intel isn't.
A low-end, low-power, low-margin x86 system would cut a huge swath thru Intel's margins. The trend of "upgrade to newer! better! faster!" CPUs would be hard-hit.
Even Apple is seeing that now with slowing iPhone sales. The phones are powerful enough that upgrading doesn't get a lot of benefit. So people aren't.
> Culture. Some companies are willing to sacrifice today's revenue for tomorrow's growth. Intel isn't.
Exactly. A tale as old as time; IBM is addicted to high-margin mainframes, ignores low-margin minicomputers, gets their lunch eaten by Digital. Digital is addicted to minicomputers, ignores PCs, lunch eaten by Apple, Commodore and, quelle ironie, IBM. Microsoft is addicted to the desktop platform, ignores web and mobile, left in the dust by Google and (irony again) Apple.
Sometimes it's not even so much a margin thing as simply being unable to visualize alternative ways of making revenue. Missing link in the previous paragraph is Microsoft eating IBM/Apple/Commodore's lunch when they failed to see that software was an even greater profit opportunity than hardware. Xerox famously ignored the steady stream of insanely lucrative guaranteed mondo profit cash cows produced at PARC because management didn't understand any business besides selling copiers. Excite turned down a chance to acquire Google for $700,000.
At least historically, Intel was. Remember what happened to Intel's SRAM and DRAM in 1981?
AMD has no impact on the overall market because it's not a serious option for the big box manufacturers.
The problem is that any low-margin x86 chip that would work in phones could also be used for cheap servers, low-end desktops, and laptops, and Intel wouldn't want that - there is only so much you can do with crazy contract restrictions on manufacturers. It is a dangerous gamble, but it seems to have worked for now.
Intel did feel the possibility of an attack on low end servers from ARM, and did release the Xeon-D which, at least for now, crushed any hope of ARM breaking into the datacenter.
High-margin addiction is typically deeply embedded in a company's culture. Every manager who is accountable for revenue is keenly aware of the incentives of toeing the high-margin-culture line. As a result, you have low-margin products and ideas killed off by mid-level managers. For big companies, there is no choice: you either get lucky with high-margin addiction, or you die.
So that's yet another economic reason that Intel wouldn't have wanted to sell chips to Apple in this space.
Last year Apple introduced distribution via LLVM bitcode, and Microsoft has been doing it for a while as well, with MSIL being compiled to native at the store servers.
So going forward, iOS, Android and WP can be processor agnostic. Of course there are always some app developers that would rather use Assembly or native code directly, but that is an app specific issue.
What Intel had for years was a process advantage. Apple would have paid a premium for that. Not desktop CPU premium, but a premium nonetheless. And Intel wouldn't be a distant also-ran in the mobile SoC space.
If you read the Oral History of x86 documents, you'll see the same arguments almost killed the first few Intel CPUs. The memory guys were worried they'd offend their memory customers, and CPUs were not a high-volume or high-margin business.
So even matching the low-power ARM chips early on wouldn't have helped.
I would like to see a scientific comparison between Apple, Qualcomm and Intel...
I bet that if Apple or Qualcomm wanted, they could create a desktop ARM that would be a generation ahead of Intel.
What happens when low-margin ARM cannibalizes high-margin x86?
Isn't there any financial engineering trick to enable companies to solve this dilemma that so often leads to their death?
(But to be fair to Intel they did, and do - they keep producing better and better CPUs. It's just that demand for that whole category has shifted)
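The dilemma can be made concrete with a toy gross-profit model. The per-chip profit figures below are made up for illustration, not Intel's actual numbers: entering the low-margin market only pays off if the new volume outweighs the high-margin sales it displaces.

```python
# Hypothetical per-unit gross profits (illustrative only).
hi_profit, lo_profit = 60.0, 8.0   # high-margin x86 chip vs. low-margin mobile chip

def profit_delta(lo_units_gained, hi_units_lost):
    """Change in gross profit from entering the low-margin market."""
    return lo_units_gained * lo_profit - hi_units_lost * hi_profit

# Gain 10M low-margin units while losing 1M high-margin sales: comes out ahead.
print(profit_delta(10_000_000, 1_000_000))   # 20000000.0

# Same new volume, but 2M high-margin sales cannibalized: a net loss.
print(profit_delta(10_000_000, 2_000_000))   # -40000000.0
```

With a 7-8x profit gap per unit, even modest cannibalization swamps a lot of new volume - which is why mid-level managers rationally kill these products, as the comment above about high-margin culture notes.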
They started out making DRAMs, then moved to making microprocessors... I wonder why they didn't realize they couldn't rely on the same business model forever.
But if Intel wanted to do that, it would need to sell a lot of fab services to said company at a lower margin than it's used to, so it's not a solution.
When I voiced the words, "mobile CPU" to anyone there, people were oblivious and silent. The company was just out of touch, thinking everyone was going to keep buying tower PCs and laptops. It seemed the only variable they thought customers cared about was performance. People would buy AMD if it was just a little faster. They didn't realize it was simply the wrong product/market. Performance wasn't nearly as important, sigh.
This isn't true. There's easily a single thread win, clock-for-clock, of 2-3x over this period.
Intel Core i3-4370 @ 3.80GHz 2,215 pts
Intel Pentium 4 @ 3.80GHz 822 pts
Intel Core i7-4760HQ @ 2.10GHz 1,922 pts
Intel Pentium M @ 2.10GHz 660 pts
The Atom 330 vs. the current-gen Celeron (haven't found the J3060, but here's its bigger brother, the J1900):
Intel Celeron J1900 @ 1.99GHz 530 pts (10 W)
Intel Celeron J3060 @ 1.60GHz ? pts ( 6 W)
Intel Atom 330 @ 1.60GHz 251 pts ( 8 W)
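A quick sanity check on those scores, pairing chips at (roughly) equal clocks, bears out the 2-3x clock-for-clock claim:

```python
# Benchmark scores from the comment above, paired new-vs-old at similar clocks.
pairs = {
    "Core i3-4370 vs Pentium 4 @ 3.80GHz":   (2215, 822),
    "Core i7-4760HQ vs Pentium M @ 2.10GHz": (1922, 660),
    "Celeron J1900 vs Atom 330 (~1.6-2.0GHz)": (530, 251),
}
for name, (new, old) in pairs.items():
    print(f"{name}: {new / old:.2f}x")
# Core i3-4370 vs Pentium 4 @ 3.80GHz: 2.69x
# Core i7-4760HQ vs Pentium M @ 2.10GHz: 2.91x
# Celeron J1900 vs Atom 330 (~1.6-2.0GHz): 2.11x
```

All three land in the 2-3x range - and the J1900 manages its 2.11x despite only a ~25% clock advantage and a similar power envelope.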
I do think that ChromeOS (or whatever it's called) offers a distinct difference to most users from what Windows is. That changes things up a lot.
I feel that within 4 years, Intel will have some competitive x86 offerings compared to ARM... On the flip side, by then, ARM will be much more competitive in the server/desktop space than it is today. It's kind of weird, but it will be another round of competition between different vendors all around. I'm not sure what other competitors will come around again.
That's not even mentioning AMD's work at their hybrid CPUs... also, some more competition in the GPU space would be nice.
It really reminds me of the mid-to-late 90's, when you had half a dozen choices of server/workstation architectures to target. These days, thanks to Linux's dominance on the server and its flexibility, and even MS's refactoring of Windows, it's easy to target different platforms in higher-level languages.
There will be some very interesting times in the next few years.
That should be 2005.
When the margins on x86 cross below that of ARM chips, Intel will come in and destroy all the ARM manufacturers.
And I'm not so sure they'd be able to get into ARM fabbing that easily. Margins on x86 are high, but they are selling fewer of them, so expanding into a new market would have been a wise move.
To their credit, they tried twice and couldn't make it work. Not sure if things could have been different considering ARM is a licensable architecture other people can use.
Unless the world has moved on. Profitability happens over time, so even declaring a market unprofitable may shut you out for ever.
Intel turned down a boat load of free money.
Intel's comment about IoT makes me wonder: do they think they just lost the first mover advantage for mobile & the industry got hooked on ARM ISAs? Do they still believe the story that their chips will become cheaper and more powerful than ARM if they change nothing?
Well, I'd guess fundamentally it was about tying yourself to the Microsoft sociopath mothership.
Intel could have taken Linux by the reins and made an OS X-equivalent, and certainly Windows-beating, OS decades ago.
But they didn't.
So they missed the boat.
I disagree. Hardware companies have a long history of trying and failing to make good software. It's not in their core business, so the company culture and talent pool isn't right. The same is true the other way around, just look at how rough Microsoft's entry into the console market was.
On top of that, the "last mile" of usability between the current state of Linux and something that can realistically compete with Windows requires experienced UX designers and an infusion of design knowledge into the software engineering team. This isn't realistic for most companies.
Just to be clear, not saying it would have been impossible, but I would bet against it.
Think about that: BeOS with real backing.
They had this handheld touch-screen device that resembled a modern smartphone and was driven by voice recognition. This was 1999, way before the iPhone came around.
In my opinion, it is the internal politics and the leadership that has caused them to lose out.
Chip design books in academia were already moving in the direction of low power designs during the late 90s. They just did not take any action.
Killer, for us fans of rational strategy.
The article contains the sentences "Jobs asked Intel to fabricate the processor" and "Intel styled itself as a designer of microprocessors, not mere fabricator".
Question: according to your own experience, is it common in English to use "fabricate" and "fabricator" as synonyms of "manufacture" and "maker"?
I am much more familiar with the negative meaning (e.g. "fabricated lies") but since I'm not a native speaker my vocabulary might be limited.
Intel didn't miss anything, they sell the hardware that powers the infrastructure behind these new, always-connected devices.
AFAIK, Paul Otellini also made the same argument about servers. However, the point is really about the processors that run inside consumer products, not server infrastructure - there, "Intel Inside" just isn't the case.