Intel CEO Bob Swan to step down, VMware CEO Pat Gelsinger to replace him (cnbc.com)
572 points by totalZero 4 days ago | 322 comments





Here's a fun fact (and I suspect it shows what a loser I am :-)): Pat Gelsinger and I were colleagues at Intel in the early '80s. We both worked in what was called "MIPO" (Microprocessor Operation), but I was in Systems Validation and he was on the customer-facing side with field engineering. My career went off through a variety of engineering roles; his went into management and up. When he joined EMC, I told him I had always thought he would have been Intel's CEO. He told me, "Hey, maybe I still will be :-)" That was like 10 years ago, and here he is.

There are a lot of things you can say about Silicon Valley, but one of the more interesting aspects of it for me has been how "small" it is, in the sense that a few people have an outsized impact. I have never been one of those people of course, just a part of the entourage. But it has been interesting to watch and learn from folks who are good (and bad) role models.


Pat went to my high school and computer science program. It's in a small rural Pennsylvania farming town (pop. 2,000). Somehow we got an amazing math teacher turned CompSci teacher, and he turned out world-class students - we always ranked high enough each year to compete internationally in ACSL.

A few years back when the teacher retired, we all gathered together to thank him, including Pat! World-class guy, and he makes us proud! It's so great to share a story with students who come from nothing: how you can leave our hometown and eventually become the CEO of Intel.

Wishing Pat the best of luck in his new role!


A single good teacher can make all the difference in the world.

From an economics perspective, it's kinda insane that teachers are paid so poorly, since they are such force multipliers. A single brilliant teacher can positively influence the future educational trajectory (and future earnings) of thousands of kids. Investing in making that happen more often seems like it would pay great dividends.

Which is kind of sad, isn't it?

Doesn't that mean that a lot of people will never be on the receiving end just because they did not win the teacher-lottery (arguably the chance of getting such a teacher is rather slim)?

Shouldn't our systems be more robust? (Not talking about normalizing these effects down, but creating a better environment for everyone, using these effects en masse)


We must pay teachers at least on par with senior software developers.

We need our very best to go into teaching. It's the ultimate multiplier role.


I hear my wife and her friends constantly talk about XYZ job that makes more than they do and it's very demoralizing to them. I can definitely see why there aren't many passionate teachers. My wife has been teaching 6 years and is just now at the $42,000 mark. The state health insurance is decent at least.

I have a few former colleagues who were equals (or close) with me early in my career, and are now senior executives at Fortune 500 companies, and it does always make me feel kind of like a loser!

But also I think about how much crap they put up with in those roles, and how relatively low-stress my life is, and then I feel better about it. They're cut out for that kind of job, and I most certainly am not.


If I may ask: how has the stress level in your role been, currently and over the years? And what is the cumulative difference between your compensation and your colleagues'?

Not the GP, but I'd guess the difference is an order of magnitude for both stress and comp.

This question makes me groan.

“TC, bro?”


I see a remarkable number of people who have had good personal experiences with Pat, but I've got this concern.

Intel's technical failure with 10nm has gone hand-in-hand with financial success with 14nm. That is, without 10nm chips on the market in a meaningful way, they've been able to raise prices for 14nm parts -- Intel put up better financial numbers than ever at a time when it was not investing in future success.

VMware was a big thing in 1998, but it was obsolete by the time Pat got involved -- a hypervisor is naturally part of the kernel, and there is no way cloud providers are going to spend their margin on VMware. Yes, many people in business are terrified of open source software and want a proprietary product so they have somebody to sue (VMware), or they need somebody to hold their hand (Pivotal). Either way, VMware and Pivotal are units that can be merged and spun off whenever a company based in Texas (Dell) or Massachusetts (EMC) wants to look like it has a presence in San Francisco -- you see the VMware logo on CNBC every morning and somebody thinks the king is on the throne and a pound is worth a pound, but that doesn't mean anything in the trenches.

Like Intel in the past 10 years, VMware is entirely based on a harvesting business model. In the short term Intel made profits by pandering to cloud providers, but in the long term cloud providers invested their profits in better chips. (What if Southwest Airlines had developed a 737 replacement designed from the ground up for a low-cost airline?)

Pat might be able to keep the game of soaking enterprise customers going for longer, but someday the enterprise customers will be running ARM and the clients will be running ARM and the coders will be thinking "did they add all of those AMX registers just to put dead space on the die to make it easier to cool?" and falling in love again with AVR8.


Speaking as an ex-Pivot (but only for myself), a couple of thoughts:

- VMware grew almost 2.5x under Pat. Obviously their core franchise is about harvesting, but new adjacent products like NSX and VSAN helped a lot with growth. They just took a long time to get traction given the customer base, who don’t like change. VMware Cloud on AWS has been surprisingly strong even to skeptics like myself.

On the other hand, companies would get rid of VMware if they could, and they tried and failed with OpenStack. The hypervisor is commodity, but the overall compute/network/storage private cloud system isn’t trivial. Turns out “holding hands” whether by software or services is pretty valuable?

Public cloud of course is a substitute, though private clouds and data centres have survived and thrived as well (for now).

- Pivotal had a boutique software and services model that would be difficult to scale as a public company given the current shifts in the enterprise fashion away from productivity-at-any-cost (PaaS and serverless) and towards perceived low cost building blocks (Kubernetes and its ecosystem).

But it would be a gross oversimplification to suggest this was merely a vanity project for EMC and Dell. It took in $500m+ annually on open source software (and another $300m in services), which is no small feat, though still minuscule given what surrounds it. But anything new has to start somewhere. Not enough to impact the mothership's balance sheet, but there was something special there that could be nurtured. Whether it can be, or whether the differences matter enough, is anyone's guess.

Pat could take Intel in a direction we don't anticipate. VMware was written off for dead when Pat came on, but he managed to buy it another 10-15 years and, at the peak, a doubling of the stock price. He's probably learned from that.


This is spot on. And I think the TL;DR of the hypervisor argument goes as follows:

The hypervisor is a commodity; however, management and support of hundreds or thousands of them is not. You can either pay people to support them and fix the software when it breaks, or you can pay <vendor name here>. Given that the former requires expertise and planning, it's often more cost-effective to go with the latter.

Disclaimer: I'm employed by VMware (less than 1 year) and chose to come here based on pivots I feel they are making.


Any thoughts on what made him special (either from your perspective or from those choosing to promote him)?

Keeping in mind that I've not worked with him in (gulp) 30 years: at Intel he was one of the folks who put problems into the whole picture. So a customer would say, "this chip doesn't work"; over at Systems Validation we would get, hopefully, enough information to re-create their problem on an Intel-built board, and Pat's job (at the time) was to coordinate between us, design engineering, and marketing to figure out how to tell the customer to proceed.

The "actual" problem could be anything from the customer misinterpreting the datasheet (Marketing/Comms problem), to insufficient testing (Factory/Production problem), to chip function (Design Engineering problem). As a "new college grad", or NCG in the lexicon, I admired how he dug out details from various folks to get to the real problem. He always had ideas for things Systems Validation (SV) could do to maybe trigger the problem that made sense to me. He really embodied the philosophy of fixing the problem not fixing blame on some group.


People that can ask the right question are much more impactful than people who can “only” find the answer. (NOT trivializing the smarts it takes to find the answer. )

My Pat Gelsinger story: I worked in a successor org to MIPO, in those days called MPG. One day during pre-silicon validation on the Pentium II, one of my NCGs came to me and said: "An old test from the historical test archive is failing in the simulator. I want to find the author and ask him about it. Do you know a Pat Gelsinger?"

Me: “Well, he is an Intel Fellow now, so he might not remember what that test was supposed to do. “


> There are a lot of things you can say about Silicon Valley, but one of the more interesting aspects of it for me has been how "small" it is, in the sense that a few people have an outsized impact. I have never been one of those people of course, just a part of the entourage. But it has been interesting to watch and learn from folks who are good (and bad) role models.

I believe this is what you feel when you come to the industry when it is just starting in a given locality, and then it explodes. Senior workers who join in the earliest days tend to drift from company to company, but they rarely leave the place entirely, or change industries.


A sociologist I met described it as the "generation effect": people live in a cohort of other people of about their same age, +/- 10 years. Throughout one's life the composition of the cohort changes as you begin to specialize, starting with college/first job and continuing if you focus on a particular area (for me it has been systems analysis).

This effect can drive people to over-pay to go to "good schools" or work for less than they think they should be paid at the "good places to work" because it lets them join a cohort with members who are statistically more likely to be "successful" (for some arbitrary definition of success). I personally never paid much attention to it but thought about it when talking with this person.

I know from experience that it happens in Silicon Valley that someone will say, "Oh I know someone at some previous company who is really good at that, let me see if they would be willing to change jobs." That is why companies pay people bonuses for "referrals."

I suspect it happens in LA in the entertainment industry as well. If you watch the old American television series "ER"[1], produced by Steven Spielberg's Amblin Entertainment production company, it is amazing to see people who were on that show first as guest actors and then show up later as series regulars.

[1] https://www.imdb.com/title/tt0108757/


"Notable absent from that list is he fired Pat Gelsinger. Please just bring him back as CEO." - [1] 2012 on HN, when Paul Otellini Retired.

"The only one who may have a slim chance to completely transform Intel is Pat Gelsinger, if Andy Grove saved Intel last time, it will be his apprentice to save Intel again. Unfortunately given what Intel has done to Pat during his last tenure, I am not sure if he is willing to pick up the job, especially the board's Chairman is Bryant, not sure how well they go together. But we know Pat still loves Intel, and I know a lot of us miss Pat." [2] - June, 2018

"This is the same as Intel pushing out Pat Gelsinger. The product people get pushed out by sales and marketing. Which are increasingly running the show at Apple." [3] 30 Days ago.

And numerous other references since 2009. Many more around various other forums and Twitter. I am getting quite emotional right now. I can't believe this is really happening. (I am writing this with tears in my eyes!) I guess Andy Bryant retiring makes the decision a little easier. And Pat has always loved Intel. I guess he is pissed those muppets drove it into the ground.

This is 12 years! 12 years to prove a point! Consider 4-5 years of work in lead time since he left in 2009. That is 2014. Guess what happened after 2014?

Maybe it is too little, too late? Or maybe this will be another Andy Grove "Only the paranoid survive" moment?

The King is back at Intel. Despite being a fan of Dr. Lisa Su, I am a little worried about AMD.

[1] https://news.ycombinator.com/item?id=4804875

[2] https://news.ycombinator.com/item?id=17391707

[3] https://news.ycombinator.com/item?id=25435150


Pat's a great guy and first class engineer, but I think it's just too late for him to turn Intel around at this point. The problems have become too entrenched. They needed someone like him at the helm 10 years ago. Apple's M1 shows that the world has passed Intel by.

I wish him luck, though.


"It's just too late to turn them around" is pretty much what people said about Apple when Steve Jobs returned, though (in fact, it's more or less what I thought as well).

As Adam Smith said, "there is a great deal of ruin in a nation", and likewise, big companies generally get more opportunities to reinvent themselves than one would expect.


Similarly some (many?) thought Microsoft was a lost cause due to Ballmer (sales/business development background), but they seem to be doing okay under Nadella (engineering background). That said, only time will tell.

I think the telling thing is whether we'll see a Nadella-like (and indeed, Jobs-like) pivot when he takes over, i.e., axing old designs/business models that don't work and fully embracing new ones.

Intel's problem isn't really that they're involved in too many businesses that don't make sense or they aren't good at. Their core businesses are still their core competencies. The problem has been a catastrophic set of execution misfires.

> As Adam Smith said, "there is a great deal of ruin in a nation"

Tired brain read this as "there's a great deal in a ruin of a nation." Close enough, I guess?


That would make a truly horrifying business book: "The Art of the Deal of Ruin in a Nation"

Evidently, 40% of people would buy it.

Must have been reading the news while typing it!

How often do big companies like Apple and Intel succeed in righting a sinking ship though? You can’t keep pointing to the one guy that succeeded.

Classifying Intel as a sinking ship may be too much of an exaggeration.

Competitors may have passed them in the last couple of years, but Intel is still in a really solid position to turn things around.

Apple's situation was a lot more dramatic. Intel still has a very good market share and can probably take a couple+ years to safely get back on track, in my opinion.


Yeah, absolutely. There's still a lot of excellent engineering being done at Intel that you can't find elsewhere. It's not like Apple, where their core product had eroded into irrelevance, they were making tons of undifferentiated products, and they literally had nothing of value except the dying embers of fan loyalty to sustain them.

Amongst other things:

- SGX is really far ahead of AMD SEV. The latter is probably easier to sell because it's marketed as "drop in" (encrypts whole VMs), but SEV has been repeatedly hacked or shipped with obvious gaping design holes that they patch later and call features. SGX is a lot more focused, a lot more flexible and frankly a lot more secure.

- Optane NVRAM is completely unique, as far as I know. It's only 10x slower than DRAM which is nothing, but it's persistent! It totally changes the whole IO hierarchy.

- AVX512 / DLBoost are able to hold their own against mid-range GPUs for some AI tasks, which is impressive and useful.

- Their core chips are still very fast. AMD chips are selling more for less, which is a good position for them to be in, and TSMC's process advantage is helping them out for now. But they don't have a truly massive edge in tech like Apple's competitors had when Jobs returned.

- Intel have a long tail of obscure features that AMD doesn't, although it's often hard to know this. SGX and AVX512 are high profile but there are others that are less well known.

I don't count side channels as an issue because it's also an issue for all their competitors, and frankly I found the near single-minded focus on Intel by the security community to be rather misleading. I even read a side channels paper that admitted they suspected AMD had the same problem but they didn't bother to check simply because they didn't have access to any AMD hardware in the first place, which was unusually honest.

Apple's M1 is very impressive in its space, but for high performance cases like servers it's only quad-core - you can't use the little low power cores for much. Apple have shown no interest in making server parts for a long time, and Apple's engineering is bespoke so it tells us nothing about what other ARM vendors can do. So in that space it's still just AMD vs Intel and Intel is a long way from being on its back yet.


> Apple's M1 is very impressive in its space, but for high performance cases like servers it's only quad-core

this is true, but it does not account for the fact that the Apple M1 is just the first (public) iteration of such a CPU.


...and that if they wanted to I'm sure Apple could spin a server version with many more cores in pretty short order.

It's not that easy. Adding cores is not just changing "constant CORE_COUNT : integer := 8" somewhere in a VHDL file. Scaling up cores without hitting communication or other bottlenecks is a difficult engineering problem that has taken a lot of effort from Intel and AMD.

What I'm curious about is how Nuvia and the departure of Williams et al. impacted their ability to do this.

> How often do big companies like Apple and Intel succeed in righting a sinking ship though? You can’t keep pointing to the one guy that succeeded.

I feel like the view of Intel as a "sinking ship" is an inside-baseball misread of the situation.

If you buy a PC or a laptop or a server today, you're most likely getting an Intel CPU. It's now at least possible to buy AMD in many market segments from major vendors - and of course many of us do - but to the broader consumer and hosting world Intel still dominates.

You can look ahead and extrapolate and say "they can't compete with TSMC right now, and maybe they're going to start falling behind further." Fair analysis. But they dominate the market to an extent that's hard to overstate.

Contrast this with between-Jobs-stints Apple, which was a tiny shrinking company with a niche product and no clear strategy.

Intel is almost in too-big-to-fail territory. They aren't sinking just because they now have real competition. Yes, they need to do "something" to maintain dominance - but why would you see their situation and conclude they won't? The new CEO here is a sign that they see the trouble ahead and are ready to steer around it.


Perpetually. These companies are always in competition, always under pressure, and always advancing their products. Not every tech company is IBM or Research In Motion. Intel and Apple are juggernauts.

If you have $233 billion (a LOT of money) and you want to own a world-class chipmaker, would you rather start from scratch or just buy Intel for cash?


As you say, Apple can be pretty unequivocally considered a success story. To the extent that it was ever troubled at all in the first place, Microsoft is also widely considered to have turned around its fortunes. Dell seems to be another turnaround story.

Then there are companies that have at least managed to stabilize themselves reasonably, e.g. IBM and HP (I wouldn't consider either of them a huge turnaround success, but I would not expect them to head for bankruptcy anytime soon either).

And even companies that did go under, e.g. Kodak, took an enormous amount of time and effort to do so, and not without launching a cryptocurrency first…


Competition is always an amazing driver for getting large companies to start doing the right thing. As long as they are not completely shut out, they have a chance of succeeding.

Intel still has some amazing technology compared to AMD; process-wise they're screwed right now, but I think they have a chance to succeed. But given the timeline on new CPUs and fab processes, it'll be 4 years before we see the fruits of anything he does.


IBM still exists. Microsoft still exists. Apple still exists. Notably, AMD also still exists and is flourishing again. All of these companies were at one time or another considered dead in the water. Did every big company survive? No. But I don't think there's enough data to predict which fate Intel will see.

survivor bias

Yea, I'm not sure what their plan is other than to keep trying to sell to institutions until everyone makes the switch to someone else.

If they actually want to survive long term, there are two paths as I see it:

A) Be legitimately better than AMD. This could include opening up the Management Engine, much higher-performance chips at lower price points, or some sort of space magic utilizing their Altera acquisition.

B) Embrace RISC-V and push it to laptops and desktops HARD, while not pulling the Microsoft Embrace Extend Extinguish™ play. If they go this route then their stock becomes an exceptionally strong buy IMO.


They could give up on design and just try to be an American TSMC.

There's a lot of interest in that from a national security perspective anyway.


With Intel's cash flow, they can do both without any difficulty. But yes, they should be the American TSMC.

I'm not sure they really can do both (because doing the TSMC model well intentionally excludes the other).

Part of TSMC's offer is that they won't do design, they won't compete with you on design etc.

They specialize in the best possible fab and service for your design and that's it.


> Part of TSMC's offer is that they won't do design

That's kind of a hoax. GUC (Global Uni Chip) is huge, and a nearly-wholly-owned subsidiary of TSMC.

https://en.wikipedia.org/wiki/Global_Unichip_Corporation

TSMC has a large design operation. It simply is part of a subsidiary rather than the parent company, that's all.

There are a lot of shady links between TSMC and GUC. I was once pressured to "collaborate" with GUC as an explicit condition of getting access to the then-bleeding-edge 16nm PDK. I turned it down. A competitor of mine (with whose chief engineer I am friends) had the same screws put to them.


Interesting - thanks, I didn’t know that.

Exactly right.

Someone else mentioned it, but that's really not true about TSMC.

Doesn’t Samsung make a whole bunch of components and use the components to create products laptops tv phones storage

Actually, I'd be curious to see how a Korean chaebol works in this regard. At various points, Apple was competing with Samsung (mobile phones) while getting fab space from Samsung (chip fab), and I think at various times different parts of Samsung were suing Samsung...

"Flywheel" - a term people used to describe Amazon.

Imagine 100x of that, and that is Samsung.


I would LOVE them to be an American TSMC. Would save so much pain.

That makes no sense. Intel's designs are fine, industry leading actually. Intel's biggest problem by far is the repeated inability to launch new process nodes. It makes no sense for Intel to give up on what it's good at to focus on the part where it's struggling.

I would love a RISC-V laptop, but only if the vector and matrix extensions can be used for training and inferring neural networks. Having separate vectorized processing for the CPU (SSE) and a neural engine doesn't make sense to me.

I want to be able to train NNs with a laptop that can normally last 20 hours.


Imagine what Apple does to x64 to make it work on ARM. Intel, having Altera, could do x86 translation to FPGA gates... It wouldn't take whole programs, but it could reconfigure itself to run the most commonly executed routines on the FPGA.

The same could have been said about AMD 7 years ago. Then Lisa Su came.

Don't forget Keller's contributions there. It was a much smaller operation so it was easier for a couple of people to change it. Intel, on the other hand, is a supertanker that's going to be very difficult to turn around in time.

Consider how quickly Nadella turned around Microsoft. It's not impossible.

I recently purchased a MacBook Pro with an M1 chip, and I'm taken aback at the value/dollar, compute/power, compute/heat, compute/noise. I love the M1 MBP.

I realize that Wintel machines aren't going anywhere because of entrenched business use -- but between NVIDIA GPUs for heavy workloads and M1s for day-to-day activities -- what is the feeling inside Intel right now? Is this like a Microsoft-Netscape moment in the 90s?


An M1 equipped Mac Mini with 16 GB of ram and some more storage is still more than £1000. The M1 has to get a lot cheaper to be anything more than a warning shot for Intel.

What it should definitely be, however, is a death sentence for x86. I bet Intel are proud of it, but ultimately they must be at least somewhat jealous of those who don't have to use it.


You can get a 6c/12t Intel NUC with 64GB RAM and a 1TB NVMe disk for that price.

Maybe more like Apple in the 90s?

They still have a lot of money in the bank (and more coming in). I see no reason why they can't turn things around if the right people are in charge.

Convince Keller to go back to Intel too and I'll think there could be a chance. (not likely to happen, of course)

No? It might. Keller's new startup doesn't sound particularly impactful or likely to satisfy a need for big career kills (assuming he has any such need).

We don't really know why Keller left. All we can say is that from the outside it looks like he might have just given up in disgust. Having an engineer at the helm again can make a big difference to an engineering organisation.


Interesting - you think he can pull this off?

It'll be interesting to see what happens, I had written them off as on the path of inevitable decline and irrelevance.

If they have someone as CEO who understands the existential threat they're facing from everywhere, maybe they'll survive.

The irony of the other top comment is that the AMD threat wasn't the competitive threat that mattered. AMD is also screwed.


I'm old enough to remember writing IBM off in the early 90's and Microsoft off in the late 2000's. These companies had one thing in common: giant war chests of cash and legacy businesses that were still pumping out money by the truckload. It's not like a startup/small business that's riding on a razor's edge, these companies have so many resources they can keep searching for a way "out" of their predicament for a very long time until they finally land on the right combination of people, vision and timing.

We see it now with Google being the "evil empire". They haven't had a real hit since Android and seem to be floundering, but online ad revenue is such a huge geyser of cash they're gonna be fine for a very, VERY long time.


I think you’re right that they’ll be around a while, but Microsoft (and Apple) were both saved by CEOs in fairly dramatic fashion.

I’d argue IBM largely is irrelevant today, but they technically still exist.

Google is lucky they have an ad monopoly because they don't really have a coherent company vision; I wouldn't buy their stock. They might get lucky with their DeepMind purchase.

I don’t disagree with you, but if I was choosing companies in a strong position today it’d be Apple, Amazon, and Microsoft. I wouldn’t make a long term bet on Google or Intel.


Why would AMD be screwed?

My long-term bet is on M1 and ARM/RISC-V for the future.

x86 is on the way out and can't compete on power or performance. In the end RISC won, it just took a while to get there.

As things trend that direction AMD and Intel don't really have much to offer, they're competing on legacy tech.

With TSMC providing fabs for designs from anyone, the serious players don't need AMD or Intel. Apple has a massive lead here, but others watching this will follow it.


> x86 is on the way out and can't compete on power or performance. In the end RISC won, it just took a while to get there.

What exactly do you think is the fundamental limitation of x86? Most chips do lots and lots of crazy logic to go from instruction set to microcode; it's hard to imagine that the variable-length nature of x86 instructions is the limiting factor.


At the very least, having a big fat decoder costs you power and die area; measuring the tradeoff in raw decoder performance probably depends on being an engineer working on it at Intel or AMD (or Apple). In any case, Zen 3 is a cutting-edge microarchitecture on a cutting-edge node with lots of transistors to play with, and it still decodes only half as many instructions per cycle as the M1 - an unbelievably wide CPU, which is probably the way forward.

The "x86 tax" (if it exists, I guess) is usually estimated at somewhere on the order of 5% to 10% - which is a lot of money at scale but probably not enough for a total rethink.


Judging from the M1, it seems the most important thing the big fat decoder costs you is not being able to manufacture billions of processors for small devices (where the decoder would be a much larger fraction of the CPU), develop experience and optimizations and capital based on that, and then have those benefits carry over to your high-end processors.

There's nothing that restricts AMD to competing solely with x86. I think they're sitting on an ARM core design from the first Zen arch that they never released.

You’re right, there’s nothing to restrict them except inertia and that they haven’t done it so far when the writing is on the wall.

There's also the coordination problem with Windows and third parties. Without vertical integration and tight OS support, they'll fall behind.

Something will have to give, but the future doesn’t look good.


The reason AMD didn't release their ARM design, I think, is that they know it is better to have to compete only with Intel than to be one of the many ARM designs.

That reasoning is a mistake.

It's like train companies thinking they shouldn't compete with cars because they're in the train business (without realizing they’re actually in the transportation business).

AMD is in the chip business, not the x86 business - and x86 just became a whole lot less relevant for the future.


I'm sure AMD could make a monster ARM chip, but without an OEM who would put it in a laptop or desktop, and without killer ARM apps for Windows, what's the point?

I think this is why they're in trouble.

Apple has the vertical integration to do it properly. Amazon and Microsoft have the server cloud computing to do it server side.

Without proper coordination from Microsoft, third party OEMs, and AMD they'll all fall behind on computing on end devices.

I think that's part of the reason AMD/Intel will be less relevant, the incentives and market moves don't look good for them.


I don't think x86 is on the way out. Apple won't dominate the desktop/server market anytime soon.

Not Apple - see the Graviton CPUs AWS is putting out.

I think the server market will move to RISC faster than people think.

Devil's advocate, from a throwaway for reasons: Pat foisted VMware on a small startup trying to find its engineering footing culture-wise, after being invited in to advise on the business side (to loan his name, mostly).

It cost a bunch of engineering time and forward motion, plus internal politicking. Eventually it got binned after months of not getting what we wanted out of it. There was no technical reason for it.

Maybe hardware really is his thing, but that quid pro quo hurt productivity.


I don't think I've seen a thread here before where the top 2 posts are such opposites of each other.

That's remarkable, can you share the reason for your emotional investment? Have you worked at Intel or with Pat?

Wow, I'm impressed by your persistence. Have you worked with him or know him personally?

I worked at VMware when Pat first became the CEO. Pat is very much an engineer. If you ever wanted an engineer as your CEO, that's Pat. That comes with some goods and some not-so-greats. Pat isn't very inspiring, at least he wasn't when he first became CEO. But I always got the feeling that he genuinely loves engineers and is more comfortable around them than anyone else. I once hosted a fun little engineering challenge (building bridges out of spaghetti). It wasn't a fancy event -- just a bunch of engineers having fun. Pat actually agreed to come by to hand out the awards at the end. I left VMware partly because I'd been there so long and partly because I wasn't excited about it anymore. I felt its best days were behind it. Well, Pat proved me wrong by a wide margin. If no-nonsense engineering is what you need to win, then Pat is the right person for the job. It's a good day for Intel, I think.

What does this mean for VMware? By all accounts, it seems like Pat was well liked at VMware.

$VMW down 6.5%, $INTC up 7.5%, sounds like the market liked him at least

What are your thoughts on VMware? It seems they too are in decline...

I work for VMware, via the Pivotal acquisition.

My purely personal view is that VMware's second act has begun and it'll do well. Pat deserves some of the credit for accepting that Kubernetes would be the future of the business and throwing his weight behind it.

There are aspects of Pat Gelsinger's leadership that I dislike, but they're orthogonal to his management style and foresight. He's been effective.


I will echo this. After we were acquired, I was incredibly hesitant about Pat. Over the past year, I’ve come to believe he is an excellent leader who has fantastic vision and insight even if we disagree on many things.

I’d be incredibly happy to have him if I were an Intel employee.


We will be fine. Pat is a great leader, and this is a net loss for us, though.

Intel's downfall in recent times has been "Only the paranoid survive": they strayed too far from customers and focused on the competition (their "customer feedback" was a redirect of whatever the competition was up to). I doubt there will be cultural changes.

Intel is at heart an R&D company; without a strong technical CEO, I don't see how they fend off the competition. Lisa Su is a good example - she is rocking the CPU monopoly boat!

It's always too late to change the past, and never too late to change the future.

https://semiwiki.com/semiconductor-services/294637-2020-was-...

>A few years ago, back in 2016, Intel did a “RIF” (reduction in force) of about 11%. Intel had previously done a significant reduction way back in 2006 of about 10%

>In an industry that runs on “tribal knowledge” and “copy exact” and experience of how to run a very, very complex multi billion dollar fab, much of the most experienced, best talent walked out the doors at Intel’s behest, with years of knowledge in their collective heads

Bottom line: Intel dug the hole itself and jumped into the deep end.


Plus, Intel for many years has paid only slightly above-average wages. If I recall correctly, they targeted paying at about the 55th-60th percentile. The problem with that is that the FAANG companies will literally pay their engineers twice that.

Many of the competent people I knew at Intel have left (not all), while many of the incompetent people I knew are still there.


Management rule of thumb: for every % you cut from the bottom, you lose a % from the top. Cut 10% of the workforce, and 10% of your best people say "yes" to the next headhunter trying to poach them.

It's actually more complex than that: when a big company wants to reduce its workforce, it will fire the bottom and offer retirement packages - sometimes as much as a full year of salary - to its oldest employees, who are at or near retirement age.

The intuition is that older employees cost more, so by cutting them you can reduce your payroll more significantly while doing what looks like smaller employee cuts from the outside. This is often viewed favourably by investors because on paper it doesn't seem as if the company is stalling (head count is still high, costs are down). The obvious issue is that these older employees are not easily replaceable, and you end up losing more velocity in the long run than originally anticipated.

The above is more applicable to traditional blue-chip businesses where workforce movements are more limited. For software engineering (which Intel is not, really), your assumption is correct: once cuts are announced, a lot of your great engineers will jump ship.


Those older employees better be replaceable! Many will be gone in a few more years because they retire anyway, so you should have a plan in place to save their knowledge.

The above applies to everyone. When I was an intern, the company folklore was full of horror stories because the last guy who knew anything about a very profitable product died suddenly. (The product was for mainframes: clearly near end of life, but it was still mission-critical for major customers and had to get minor updates.)

I've also known important people to find a better job. Even when an offer of more money gets them to stay, my experience is they always wonder if they made the right decision and so are never again as good as they were before.

Moral of the story: don't allow anyone in your company to become irreplaceable. This is good for you too: it means you won't stagnate doing the same thing over and over.


The vendor of one of the security scanning tools we had subscribed to for the past six years told us they were shutting down the product because the PM left and no one else in the company wanted to take it over. I don't know how many SEs and other resources they had on the product, but it was a major part of their offerings. They didn't even have an alternate upsell (or downsell, for that matter) to offer us. So strange that one person leaving could collapse an entire revenue stream like that.

I would venture that this is indeed how these execs rationalize the whole process. No one should be irreplaceable, and therefore the move makes sense. Even though management bashing is trendy these days, most managers/execs know these things and are not the idiots we satirize them to be.

Even Swan is probably not an idiot; he simply expected everyone to struggle as much as Intel on the 7/10nm node, and when TSMC just breezed past Intel and AMD came out with a much better product than anticipated, he found himself in very hot water.

(He could also be quite the idiot, I don't know him)


I work (but not for much longer) for a tech company that has done an ER (early retirement) package twice (and is well known for layoffs). The result is that the company has a sort of corporate Alzheimer's. It knows the inventory of all the things it used to know, but doesn't actually seem able to recall anything. The trajectory and outlook are not good.

I don't think tech companies can afford this practice. So much knowledge resides in their senior talent - hard-won, experience-based understanding, and things gleaned through opportunistic exposure - and they seem to voluntarily surrender it.


Pat was a "boy wonder" at Intel and could do no wrong — until Larrabee. I was working at Intel at the time and remember always assuming that Pat would someday be CEO. His departure came as such a shock to a lot of us, as does his return.

He might have what it takes to turn Intel around.


There was also Intel's whole pursuit of frequency--they demoed I think it was a 10GHz chip at IDF at one point (and Itanium was essentially an ILP-oriented design)--and resistance to multi-core. Some of it was doubtless Intel convincing themselves they could make it work. But they were also under a lot of pressure from Microsoft who didn't have confidence that they could do SMP effectively--at least that's what a certain Intel CTO told me. (Ironically, multi-core didn't end up being nearly the issue a lot of people were wringing hands over at the time thought it would be for various reasons.)

I'm not sure Itanium was a technical failure; to me it was always a business-model failure, as that CPU was co-developed with HP and essentially became a dedicated HP-Oracle box, and by the time the ecosystem was opened up it was too late.

The heavy reliance on the compiler for ILP was an "odd choice", but not something that was unsound in principle.

If the ecosystem was more open from the get go and more vendors were involved it had a much better chance of taking off.

And if nothing else at least it was something new.

The biggest disappointment I have with Itanium is that it, and later Larrabee/Xeon Phi, kinda pushed Intel even further into their own little x86 box when it came to processing units.

I think that failure is also why they haven’t really done anything interesting with Altera.


They do have Xe graphics now. It's the closest Intel has come to a competitive non-x86 part in recent memory. It feels kind of forced, though; everyone else has their own CPU+GPU now, including Apple/Nvidia in addition to AMD/QC, so why wouldn't Intel? They also have oneAPI.

It would be interesting to see an explicitly JIT-based approach to ILP.


From my personal memory at the time, early NUMA multicore on Windows wasn't the smoothest sailing.

It wasn't. A few years earlier, I was the product manager for a line of large NUMA systems, which admittedly had far larger near-far memory latency differences than multicore systems do. Commercial Unix systems could still have issues with write-intensive workloads, but Windows was pretty much unusable for configurations that had far memory. Things were likely better by the mid-2000s, but Windows was definitely still behind Unix in this regard. (Don't really know where Linux was at that point, but IBM at least had done work in OSDL on various scale-up optimizations.)

It seems it's not great now either, seeing how the first iteration of AMD's Threadripper performed badly on Windows.

In fairness, it's not like Microsoft are alone in that. Single-thread performance is still incredibly important. The Mill focuses on single-thread performance almost exclusively for that reason: nobody ever made the mythical auto-parallelising compilers we were all supposed to have by now, not even for Haskell. The big wins for exploiting parallelism in most ordinary software have been just scaling up lots of independent single-threaded transactional workloads via sharding, and massively concurrent runtimes like the JVM, where you can move all the memory-management workload onto other cores and out of the critical paths. In terms of ordinary programmers writing ordinary logic, single-threaded perf is still where it's at, which is why the M1 has 4 big super-wide cores and 4 small cores rather than 32 medium cores.

The trillion dollar question is:

Is the board leaning into the usual MBA-101 moves and turning Intel into a "services company" - gradually going fabless and milking those sweet patents - OR will they put the work boots on and start building an actual tech company, with the people who can actually save them on the payroll, cutting out the usual contractor meat grinder, and inviting the vast armies of middle-management and marketing drones to leave?


> Is the board leaning into the usual MBA-101 moves

Probably not, considering the guy they're throwing out the back door is a finance dude with an MBA and Gelsinger was/is an actual engineer.


BK was an engineer too but still managed to pull an Elop.

BK wasn't even remotely in the same class as Gelsinger engineering-wise.

Pat Gelsinger fits the second option better, he has an actual engineering background and worked on some of the most important Intel products early in his career.

Swan was utterly the first option.

Pat Gelsinger didn't do that at VMW, and it seems unlikely he'll do it at Intel. He's an engineer.

All of the other fabs are so busy they can't handle Intel's chip production; plus, Intel's technology is wound around their own labs. Switching wouldn't be easy and might end up being a failure and taking the company with it.

The argument I've seen for outsourcing is that everyone uses machines from ASML et al anyway, so retooling to run a different company's silicon may not be as impossible as it seems.

I think there's an issue with helping Intel temporarily, because if you're TSMC you'd rather use your capacity to serve long-term partners rather than helping Intel bridge the gap to 7nm only to get dropped a couple of years from now when they get their chips in order.


Or are they looking for that one acquisition that (this time!) will fix everything?

Arguably a re-acquisition. Gelsinger was at Intel for 30 years.

They didn't mean Gelsinger as the acquisition. The question is what direction Gelsinger will take them, with one option being to acquire some company or companies to bootstrap a new path forward.

Time to sign your startup in for the lottery. Your company might be the next Pure Digital or Autonomy Corporation!

Tangent. I first noticed the phrase "leaning into" a couple of years ago and I'm still not sure I understand how to use it.

It's like "flex". I don't hate them (I hate "relatable" and "addicting" but those are apparently acceptable as real words now) but it's odd how they seem to bubble up suddenly out of nowhere.

(British by the way - that might have a bearing)


Here I think of it as “committing to the direction,” the way you “lean into” a tight turn on a bike or motorcycle. It’s similar to “doubling down” or other euphemisms for committing harder to a course of action.

The metaphor works for me ¯\_(ツ)_/¯


> the way you “lean into” a tight turn on a bike or motorcycle.

Ah. It makes a lot more sense now. I never made that connection.


Hey don't expect help here, I barely speak my native language correctly let alone English :)

I play it by ear and there's a high chance the "leaning into" expression was used incorrectly. I meant to say, the board was more inclined to follow a roadmap than other options.


"Lean in" was popularised by Sheryl Sandberg. It's popular with management types and wildly unpopular with folks who feel that "just work harder" coming from an actual billionaire is a bit patronising.

It's like when someone tells you your singing is bad, so you purposely ham it up for laughs. That's leaning into it.

To me, that move makes a lot of sense. But judging from the other comments, I'm the only one with that assessment.

In my opinion, the secret sauce that makes Intel dominate certain industries is software. And it has been for some years already.

If you need really fast mathematical number crunching, e.g. high frequency trading or realtime audio filtering, then you need MKL, the Intel math kernel library.

If you want to further reduce latency with parallelism, you need TBB, the Intel Threading Building Blocks.

Raytracing? Intel Embree.

Once you are locked in that deeply, the raw Intel vs. AMD performance becomes meaningless. You only care about how fast the Intel libraries run.

So a CEO with experience building high performance low level software seems like an amazing fit.

Edit: And I almost forgot the Intel compiler, used in pretty much every PC game to speed up physics. Plus, some people have seen success replacing GPUs with icc+AVX for huge deployment cost savings in AI.
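
To make that lock-in concrete, here's a minimal sketch of what "just call MKL" looks like from C - a small matrix multiply through MKL's standard CBLAS interface. (This assumes MKL is installed and linked, e.g. with -lmkl_rt; it's an illustration, not anyone's production code.)

    #include <stdio.h>
    #include <mkl.h>  /* Intel MKL: provides the standard CBLAS interface */

    int main(void) {
        /* Row-major 2x2 matrices: C = 1.0 * A * B + 0.0 * C */
        double A[4] = {1, 2, 3, 4};
        double B[4] = {5, 6, 7, 8};
        double C[4] = {0, 0, 0, 0};

        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    2, 2, 2,     /* M, N, K */
                    1.0, A, 2,   /* alpha, A, leading dimension of A */
                    B, 2,        /* B, leading dimension of B */
                    0.0, C, 2);  /* beta, C, leading dimension of C */

        printf("%g %g\n%g %g\n", C[0], C[1], C[2], C[3]);  /* 19 22 / 43 50 */
        return 0;
    }

The irony is that this particular call is portable CBLAS and would build against OpenBLAS too; the lock-in is less the API itself than the years of tuning around MKL's specific performance characteristics.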


To me this description sounds like a specialist high performance computing company rather than a consumer technology company. That may be a perfectly reasonable market to be in, but is that type of company worth $200bn? I'm not sure.

Roughly 40% of their revenue is consumer chips where, apart from some games optimisation, they are no longer standing out from the crowd, and the leader is arguably Apple, with AMD doing well. The next ~30% of their business is servers, where there may be a significant number of HPC clients, but the bulk of this is again likely to be VMs running non-Intel specific software, and this market is starting to realise that Intel is nothing special here.

Looking at their revenue breakdown, I struggle to put more than 20% into the things that you mention they are great at. Should they focus on this? It would lose them much of their market cap if they did.


>Roughly 40% of their revenue is consumer chips where, apart from some games optimisation, they are no longer standing out from the crowd, and the leader is arguably Apple, with AMD doing well.

You lost me at Apple. Apple owns around 15% of the PC market, and almost the entirety of that is Intel-based systems. Outside of HN, nobody cares about the M1 chip; it isn't a selling point to my mom or her friends. If someone at the Apple Store recommends it, they might buy it instead of an Intel-based system, but it definitely isn't something they're seeking out.

The only threat Intel has right now in the consumer space is AMD, and it's a very real threat. AMD won both Sony and Microsoft console designs, and the mobile Ryzen 5000 chips released at CES look to have enough OEM design wins to put a serious hurt on Intel in 2021.

Even if Apple goes 100% M1, there's the other 85% of the market that Intel is likely far more concerned about.


I get your point, but I think the M1 is more significant as proof of what is possible than because I think everyone will buy a Mac.

I can absolutely see Qualcomm offering laptop chips off the back of the M1's success. They may not be as good, but they might be much cheaper. I can also see Microsoft pushing Windows on ARM harder, and rolling out their own chips at some point.

Also once the market gets "used to" multi-architecture software (again), I think we'll see a renaissance of chip design as many more players crop up, because of the lower barrier to entry.


Maybe. Apple has solved the chicken and egg problem regarding software compatibility by forcing everyone to move on to ARM in the near future. Microsoft will not abandon x64, though, so there are far fewer incentives to port things. Also, Microsoft cares far more about compatibility (e.g., 32-bit software). That means a lot of things will run under a Rosetta 2-like system (probably less efficient if you need to support 32-bit as well). If you add the fact that Qualcomm is unlikely to match Apple in performance, the resulting product might not be very appealing compared to a classic x64 system.

An ARM transition isn't a fait accompli just because Apple introduced M1 at the lower end of the Mac lineup. There's a huge lump of inertia there.

> An ARM transition isn't a fait accompli just because Apple introduced M1 at the lower end of the Mac lineup. There's a huge lump of inertia there.

If you're just referring to Apple's future, the endgame is already a done deal. They don't roll back something this fundamental once it's public.

Their last Intel computers will be manufactured and sold within 3 years; I'd wager a month's salary on that if I had it lying around.

If they can't make it perform at the top level like AMD, which seems unlikely but obviously it's too soon to tell, they'll make up for it in other ways or simply live with the performance hit until they can.

Apple doesn't believe in inertia. Their partners and customers can either make the transition or get left behind. Microsoft's business model depends on making enterprises happy; Apple doesn't care.


The comment above mine said,

> Apple has solved the chicken and egg problem regarding software compatibility by forcing everyone to move on to ARM in the near future.

I don't see "everyone" moving to ARM in the near future. I don't even see myself buying an Apple Silicon machine in the near future. I don't see corporate users going to ARM in the near future, and I certainly don't see government users going to ARM in the near future.

Even if the Mac lineup goes exclusively to ARM in two years as planned, I'd be shocked if Dell, Lenovo, and other manufacturers follow suit. Lenovo just announced that the fully specced Thinkpad X1 Carbon Gen 9 will run on an Intel 11th Gen CPU. They're not even going to AMD, let alone ARM.

Apple switched to Intel from PowerPC for a reason. Apart from raw performance at that time, Intel's draw was practical: dual boot, software compatibility, and so forth. Now, Apple is leaving Intel because the lithography has fallen behind, the chips run hot, Intel's share of the profit eats into Apple's profit margins, and x86 impedes their dream of iOS/MacOS app singularity. However, the masses are not clamoring for a move away from x86, so there's no pent-up demand to be had with a switch to ARM.

Let's assume that Apple follows through on its two-year plan to offer an exclusively ARM Mac lineup. What's the bear case?

If Apple is the only player to move to ARM, and the market doesn't follow, it could end up as the first existential setback that the Mac has faced in a long time. In my view, Apple has been out of touch with Mac users for a while now. When they eliminated MagSafe for Macbooks, users shrugged. When they introduced TouchBar, users stared blankly. When they removed all but one format of accessory port, users groaned. When they introduced the butterfly keyboard that put thinness above function, users complained. Eventually if you make enough weird moves, you start to alienate customers.

There are bound to be gaps in the respective software libraries of x86 and ARM, and that is going to matter to regular folks and application-specific power users alike. Apple itself is having problems getting Big Sur to run properly on all the machines it supports. "Supported" 2013 and 2014 machines are getting bricked, and even brand-new machines are experiencing bugs with basic functions like managing external displays. I can attest that Intel-based 2020 four-port MBP13s were still shipping with Catalina at the end of December (I don't know if that has changed in the past two weeks). If the OS itself is having problems, that isn't exactly a good sign for the applications that run atop it.

Apple has gotten used to saying "whatever, I do what I want," and getting its way. Remember 3.5mm audio ports on flagship mobile phones? Me neither. But what happens if complacency about that kind of market power carries Apple a bridge too far? They will have been so used to success that they will miss the early signs of market distaste, and may be unable to adapt mid-flight. There's a certain sense of bravado and self-fulfilling prophecy in saying that -- come what may -- the Mac is moving to Apple Silicon, but it'd be smarter to eschew the bravado and see how things go with the M1 before announcing plans to transition the entire lineup.

So to answer your question, I was talking about ARM vs x86 in general. A transition from x86 to ARM isn't something that one firm with minority market share can force unilaterally. However, I also think the Apple Silicon two-year timeline is overly ambitious and that Apple risks losing market share as a result. If that happens, the stock will suffer drastically, and there will be a reshaping of Apple's C-suite.


I agree Apple's in a tough spot, facing multiple risks with the transition, but if they waffle they become Microsoft, and they can't afford that.

If they express any interest in keeping Intel around, the transition is at great risk, because software developers can just trust that Rosetta will continue to make their software work, and customers will keep buying Intel machines that Apple will have to support for years to come.

And I believe that was what the parent was saying:

> Apple has solved the chicken and egg problem regarding software compatibility by forcing everyone to move on to ARM in the near future.

Clearly there's no way that every computer user is going to be moving to ARM, but every new macOS user will, so the sentence only makes sense in the narrow scope of Apple customers.


I certainly don't claim to know what the author meant when those words were written. What I know is that Apple serves and represents an influential minority of the personal computing market, other manufacturers are unveiling new machines that run on Intel, and even Apple hasn't put out a single high-end machine with ARM. I personally haven't been forced onto ARM; I bought an Intel Macbook Pro after the M1 release.

You're absolutely right that the challenge is to get developers to build software that doesn't just depend on Rosetta 2. But at present, a developer has to think about users on x86 Windows, x86 macOS, ARM macOS, and Linux (ignoring ARM64 Windows and a few other OSes). The ARM macOS devices at present are lower-end -- 2 TB3 ports, etc -- and less likely to be used by the customer cohorts that are the most profitable for software developers (except maybe for big-ticket names like Office, Photoshop, etc). I run an x86 Mac and I still can't run all the APIs and software that I want to use. The M1 would be totally untenable for me. For example, I use a brokerage whose software runs on Windows, but they have a cloud version that runs on Citrix Receiver. That used to work for me on Mojave, but I haven't been able to get it running on Catalina.

I don't think Apple has solved the chicken-and-egg problem, not even within the Apple landscape. If anything, Apple has made the problem more pressing and relevant by introducing a two-year cliff.

As Ludacris says, "tomorrow's not promised today." It's unwise to treat every large market segment as massless. Forced market shifts and state changes can be a neat trick when Apple pulls them off, but eventually they'll hit an immovable object. A two-year timeframe is an eternity for PCs and semiconductors; moreover, demand is astronomical at present, but the term structure of that demand is anything but predictable right now.

Apple is going to be supporting x86 Mac well past that two-year timeline, so even if ARM Mac takes over on schedule, developers are going to have to build for both setups for some time to come. I don't think that's particularly prohibitive if you set up your abstraction layers and work on one shared core before tailoring it to fit your supported platforms, but it's definitely added work and more planning.
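
To make that concrete, here's a minimal sketch of that shared-core idea in Rust (my own illustration; all the names are invented): the application logic stays architecture-neutral, and only a thin cfg-gated layer differs per target.

    // Hypothetical sketch: one architecture-neutral core, thin per-target layer.
    fn shared_core(input: &str) -> String {
        // All the real application logic lives here, identical on every target.
        input.to_uppercase()
    }

    #[cfg(target_arch = "aarch64")]
    fn platform_note() -> &'static str {
        "tailored path for ARM64 (e.g. Apple Silicon)"
    }

    #[cfg(target_arch = "x86_64")]
    fn platform_note() -> &'static str {
        "tailored path for x86_64 (Intel/AMD)"
    }

    #[cfg(not(any(target_arch = "aarch64", target_arch = "x86_64")))]
    fn platform_note() -> &'static str {
        "generic fallback path"
    }

    fn main() {
        println!("{}: {}", platform_note(), shared_core("hello"));
    }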


Yes, I meant everyone in the Apple ecosystem: developers and users alike. Some will be left behind, most will transition, and I'm sure Apple is hoping some premium Windows/Linux users will be converted by the incredible efficiency.

True, but Apple has been the harbinger for a LARGE number of advances in consumer computing adoption. Mouse/windows desktop interface? Apple. Font libraries? Apple. USB? Apple. Touch interfaces? Apple. Modern smartphone? Apple.

The list goes on and on. M1 could be such a harbinger. Who knows if anyone else can compete, and likely Apple will be in the pole position for some time, but Apple has shown everyone it can be done.


At WWDC last year they made it clear that every Mac is going to run on ARM eventually.

I'm not so sure, since a lot of software is somewhat cross-platform now anyway. You need to port the JVM, .NET, and Chromium (for electron), and you've effectively ported a large part of the desktop application space already.

Web applications are portable as long as you have a browser.

Many modern languages that compile to native, like Rust, trivially recompile for multiple target architectures.
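
For instance, a trivial Rust program recompiles for another target with no source changes, assuming the extra toolchain target is installed via rustup:

    // rustup target add aarch64-apple-darwin
    // cargo build --release --target aarch64-apple-darwin
    // cargo build --release --target x86_64-apple-darwin
    fn main() {
        // ARCH is baked in at compile time for whichever target was chosen.
        println!("built for: {}", std::env::consts::ARCH);
    }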

Others are dynamic and don't need recompilation.

Of course some major software is written in non-portable C and C++. But the question is whether some emulation isn't acceptable here.


Porting the JVM to a new CPU architecture is hard but it has already been done for AArch64.

I know, I'm just thinking in today's day and age an architecture change isn't that hard anymore. We're already paying for all these abstractions, why not use them.

I don't think that there is a high performance JVM implementation for RISC-V. A vendor would have to do a lot more work if they picked an architecture other than AArch64.

Large parts of Windows 10 are still Win32, not UWP.

They can't even remove decades-old legacy code from their own products, so good luck everyone else.


You misunderstand the point of the M1 from Apple and, for that matter, the Graviton2 instances from AWS. What was demonstrated in the marketplace is that the biggest tech companies are now able to develop in-house processors that are more cost-efficient and more performant. These processors are based on ARM and carry minimal licensing overhead, compared to buying Intel or AMD chips for their vast fleets/products.

If AWS and Apple can do it, other very large companies soon will, and in a few years even OEMs will be able to develop their own chips. The market for high-end gaming is unlikely to be touched, but the vast consumer market is going to be eaten by custom ARM-based chips.

So in a world where processor design becomes a commodity, what does that mean for Intel and AMD? And what does that mean for the overall datacenter, consumer markets?


> So in a world where processor design becomes a commodity, what does that mean for Intel and AMD? And what does that mean for the overall datacenter, consumer markets?

Processor design is already a commodity, and has been for many years. Any company with the cash can buy a license to the ARM64 instruction set and the reference core design, and have someone like TSMC or Samsung manufacture it.

These designs haven't taken over the desktop market from x86 yet because they simply weren't performance-competitive with what AMD and Intel are pumping out, and it's not clear that that'll change anytime in the foreseeable future.

Apple knocked it out of the park with the M1, but they've been kicking ass for years, including against their competition in the ARM processor space. Just because Apple's processors happen to use an ARM instruction set doesn't imply that an ARM revolution is upon us.


Apple isn't using a reference core, though. They are using entirely custom cores.

>it isn't a selling point to my mom or her friends

Gargantuan battery life isn't a selling point? For a laptop? In what universe?


For who? My mom uses her laptop at home 99% of the time, if the battery gets low she plugs it in. She needs a battery that will last 1-2 hours for the 3 times a year she flies.

You can find a place to plug in at basically any coffee shop or library you go to. My mom isn't spending 10 hours in a datacenter, so it doesn't really matter to her if the battery life is 3 hours or 12. For the average consumer, battery life has just been another stat on the spec sheet for years now.


> For the average consumer, battery life has just been another stat on the spec sheet for years now.

I'm not sure I agree with this. I think if you asked someone whether battery life was a priority, they might say no. And if you asked them to rank tech specs I'm not sure it would necessarily be that high either. But the experience of using a laptop with a noticeably better battery is, for me, quite likely to be one of those things that you didn't know you were missing, even if you just charge it every now and then.


This resonates with me too - anywhere it would be comfortable to work with my laptop is going to have somewhere to plug it in to charge.

Even if your mom doesn't care about battery life, she will probably buy a product that is also sold to buyers who do care about battery life, and if the product can meet those buyer's needs with a smaller battery, then your mom's laptop will be lighter.

So, does your mom also not care about the weight of her laptop?


The MacBook Air with an Intel CPU weighs 2.75 lbs; the MacBook Air with the M1 weighs 2.8 lbs. The MacBook Pro is 3.0 vs 3.1 lbs.

The weight is a non-factor. Quite frankly until you start cracking 5lbs nobody even cares in my experience. Apple's maniacal focus on making laptops skinnier and lighter has done a disservice to the entire product line, which they seemingly acknowledged with the 16" Pro.


Apple had the option of making the M1 Air lighter (by choosing a smaller battery) but decided instead to greatly increase battery life. The point remains that Apple has choices that vendors of laptops reliant on Intel CPUs do not have, which might end up eating into Intel's market share.

Strongly disagree. I'm writing this on a 12" MacBook with a partly broken screen - by far my favourite machine (I have a 2020 MacBook Pro too). And I'm not alone, either.

I expect we'll see an M series MacBook again but this time it won't be underpowered.


The M1 isn't competing on many of the same axes as an Intel or AMD CPU, because it's necessarily packaged inside an entire computer built around it. That computer is a Mac, which might be different from the purchaser's current OS, so they decide not to switch; or they already bought software for Windows and want to use it there; or they're married to the Microsoft ecosystem, etc.

Office runs on Macs too!

Seriously, I predict we will see Apple successfully attack the sub-$1000 laptop market within two years. They sell the iPhone SE with an A13 for $399, so they could easily do so now that they no longer have the 'Intel tax'. And the products will be a lot better than the Windows equivalents.

Most home users might use Office and that's about it. The allure of the Apple ecosystem will be strong especially for iPhone users.


I’m skeptical.

Apple’s bread and butter, as far as Macs go, is the MacBook Air. And by all accounts, they sell a lot of those, and will presumably sell even more, with better margins, now that they’ve gone ARM.

Do they really want to undercut that with a cheaper laptop? I suppose it's possible, if the volume/margins work out, but I'd bet they just keep plugging along with $999–$1500 13-inch laptops.


All fair points, but I think that with higher margins it tips the balance towards market share growth. Key issues are 1) can they make an acceptable margin on a good $800 laptop, and 2) can they genuinely and significantly grow market share rather than just lowering the average selling price - i.e., can they maintain a distinction between $800 and $1000 products. Given what we've seen them do on iPhone and iPad, I bet the answer is yes to both.

Bear in mind too that after a generation or two they can put the last gen M chips in cheaper products.


I think there's an economic principle here (and I don't know the sign), but this is all assuming a frictionless vacuum - in practice, Apple cannot sell 25% more M1 Macs if they lower their price to $800, or whatever, since their marginal costs rise in that case (because TSMC is totally booked!).

That's true today, but I'd expect next year's sub-$1000 Macs to use the previous year's M-series chips in due course. (Exactly the iPhone and iPad playbook.)

For consumers? It’s a race to the bottom. Mom wants to pay $200 if anything. My in-laws do their taxes on their phone.

I'm not really sure, to be honest. We've moved on from the laptops of a few years ago that couldn't get through a movie without charging. I don't really care whether my laptop runs for 10 hours or 15.

It's the same reason battery life is so critical in EVs. Smaller batteries need to be charged more often, which eats up more of their remaining lifespan. It's a downward spiral, meaning that 50% extra battery up front can be worth 100% extra lifespan in 3 years.

Laptop batteries are also expensive in terms of money, weight, and bulk, which puts Intel in a much larger bind.
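
A back-of-the-envelope sketch of that compounding effect, with every number invented for illustration (real battery chemistry is messier):

    // Hypothetical model: fixed daily energy draw; capacity fades a fixed
    // fraction per full charge cycle, with partial cycles counted pro rata.
    fn capacity_after_years(initial_wh: f64, daily_draw_wh: f64, years: u32) -> f64 {
        let fade_per_cycle = 0.0005; // 0.05% per cycle - an invented figure
        let mut capacity = initial_wh;
        for _ in 0..(years * 365) {
            let cycles_today = daily_draw_wh / capacity;
            capacity *= 1.0 - fade_per_cycle * cycles_today;
        }
        capacity
    }

    fn main() {
        // A 50% larger pack starts higher AND fades more slowly, so its
        // usable headroom over the daily draw widens over time.
        println!("50 Wh pack after 3 years: {:.1} Wh", capacity_after_years(50.0, 30.0, 3));
        println!("75 Wh pack after 3 years: {:.1} Wh", capacity_after_years(75.0, 30.0, 3));
    }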


My mom would love to be able to use the same apps on her phone and her computer. I've been thinking about suggesting her next computer be an Apple one ever since she got her first iPhone, and this makes it easier. Your Mom May Vary.

Not sure of the source of your 15%, but I'm willing to bet that by value it's more - no Celerons in Apple's lineup. Plus, Apple wouldn't be going down this route if it didn't expect to grow market share - and although people don't care whether it's an M1 or an i5, they do care whether the experience is better.

Then Apple's success with the M1 will spur others - I would not be surprised if Microsoft follows them down the same route.


> Apple wouldn't be going down this route if it didn't expect to grow market share

Market share is not what Apple is about. Apple is about profitability and control. Their move to their own silicon is driven by improvements in the reliability of their build pipeline (no more waiting for Intel's tick-tocks and whatnot) and tighter control/integration of their whole stack (same architecture on phones and PCs). That these chips happen to perform so well that they are potential market-growers is a welcome coincidence.


Growing market share profitably and without impairing the brand is what Apple is about. That's why we have the iPhone SE. The M series lets them do that with the Mac now. And more Macs implies more Apple services sales.

It's certainly partly defensive - they were frustrated with Intel - but Apple would only make a move of this scale if it thought it created business opportunities for them.


I think you are vastly underestimating the revolution of what the M1 represents to the PC industry.

I think you are vastly underestimating how hard IT departments will kick you if you request or bring a device that cannot properly execute the company-critical legacy Windows x64 software. Like SAP, for example.

(SAP is the largest non-American software company by revenue and does business management, workflow automation, and bookkeeping)

My prediction is that outside of hipster startups, M1 will have no effect on business laptop sales.


But then again, the current version of SAP is S/4HANA, and unless you are a developer or admin for it, you will be using their Fiori-based web clients, so a normal browser is enough. I am a developer on an S/4 rollout project in a Windows-only shop, but for our future system landscape I could see the normal users of the SAP systems using any kind of laptop or tablet. Even we are at least testing iPads and laptops.

>My prediction is that outside of hipster startups, M1 will have no effect on business laptop sales.

This is what everyone always says but the iPhone kicked off a whole BYOD trend that has ended up with many high-value employees caring a lot about what tools they have to use, and a lot of software engineers want Macs.


I’ve used (and developed software for, and helped administer) SAP from my Mac on and off for 20 years. Tends to be easier these days now that everything is mobile and web.

A lot of companies bigger than "hipster startups" use Macs; this tends to start with the C-suite, and people follow suit.

My point also wasn’t that everyone was going to switch to Mac. It was that M1 proves you can build a “better in every way” PC with an ARM architecture. Linux ARM is also being pushed by AWS heavily from the server side with impressive price/performance numbers.

Windows on ARM has been failing for many years, but I suspect this is going to change. Microsoft has a talented virtualization group, whose Hyper-V roots go back to the Connectix Virtual PC team that built PPC/x86 emulation for the Mac. I suspect they can pull off something like Rosetta - they just need a chipmaker to collaborate with. Might even be Intel! Pat Gelsinger is an outside-the-box thinker.

You remind me of folks that thought the iPhone / iPad would have no impact on Blackberry sales, as real businesses need keyboards.


M1 with Rosetta emulation is still faster or comparable to top Windows laptops, with room to run for M2.

Isn't SAP a database? Why would you want users to run a DB on a laptop, especially on Windows?

SAP is so many things that their Products page has search functionality and is broken down by first letter: https://www.sap.com/products-a-z.html

So no, SAP is not a database and has many client applications that would run on a laptop.


No, SAP is more like an operating system for your factories. It contains EVERYTHING, from payroll to inventory management. Think of it more like an Exchange server plus all the Microsoft Office apps combined. To connect to the Exchange server and get all the features, you need Outlook. It's the same with the SAP database and SAP client GUIs.

The official GUI is C++ and Windows only. They do have a Java port for other OSes, and some 3rd party GUIs, but none of that is feature-complete or even halfway there.


SAP is lots of enterprise software stuff, not just a DB.

> I think you are vastly underestimating how hard IT departments will kick you if you request or bring a device that cannot properly execute the company-critical legacy Windows x64 software. Like SAP, for example.

Many companies long ago set up some beefy Citrix servers for those applications.


I think you are vastly overestimating the "revolution" of what the M1 represents to the industry. Apple isn't selling it to any other PC makers, and corporations aren't pivoting away from Microsoft for a CPU. Every single ARM chip that's been targeted at the Windows world has produced yawn-inducing performance.

M1 is amazing. I think people are also underestimating how fast Intel can catch up.

> it isn't a selling point to my mom or her friends

Really? That's surprising to me. I'd imagine that for the demographic of her and her friends, quality of life increases for their phones are far more material than for their computers.


It sounds like you're agreeing with your parent comment.

> phones are far more material

Thus they don't really care about laptop battery life.


That's not what I'm getting at. What I'm getting at is that saying "even if Apple goes 100% M1, there's the other 85% of the market that Intel is likely far more concerned about" is somewhat far-fetched. Apple can make their own desktop chips because it's an easier problem than making a phone chip (performance and thermal efficiency), but Intel can't, because they've sacrificed thermal efficiency time and time again -- not just this time but a decade ago. Remember Prescott?

I think that is why Intel should be (and probably is) worried about Apple. They will make Intel redundant by having solved a harder problem, of which Intel's own problem is a subset.


I agree with your market breakdown, but surely not with your assessment.

In the consumer segment, you have regular people trying to make vacation videos with software like Adobe Premiere and Adobe Media Encoder, or Magix. NVENC quality is bad. AMD is horribly slow. The only fast, high-quality encode is with Intel's dedicated CPU instructions, which both apps heavily promote to their users.

And the 30% that you mention that run VMs... Wouldn't they be pretty happy if Intel added dedicated CPU instructions to make VMware better?

I agree that for the work that I do, AMD is as good as or better. But people doing highly parallelizable tasks like compiling are the minority.


I think you might overestimate the prevalence of video editing software like this. Adobe doesn't appear to sell consumer versions anymore; it's only pro subscriptions now. Magix is sold at a "vacation video friendly" price, but doesn't mention Intel in their marketing material.

I just don't think the market for home devices is thinking about their video encoding time when they buy a laptop, but I do think they'll use an M1 Mac and find it surprisingly fast, or hear from a friend or family member that they are really good.

Intel just haven't been optimising for the main user experience seen by these people, or those writing "normal" server software either. They've been pushing AVX512 instead, which looks good for video or things like that, but not for regular use-cases.

Another good example is how fast the M1 chips (and the A chips in iPhones) perform at Javascript benchmarks. Those benchmarks look a lot more like what most people are doing most of the time than video encoding benchmarks.


> but I do think they'll use an M1 Mac and find it surprisingly fast, or hear from a friend or family member that they are really good.

That only happens in California.


MAGIX heavily mentions Intel in their marketing.

"4K Ultra HD video editing with Intel and MAGIX"

"Enjoy HD Video editing with Magix Movie Editor Pro and Intel Iris Graphics"

"Edit in 4K Ultra HD" + Intel Logo

"Finish and Share videos quickly with Intel Iris Graphics"

https://www.youtube.com/watch?v=eT9KOtN7KFM

Plus, as a user of the software, I can tell you that if you tick the "Hardware Acceleration" checkbox on AMD, a popup will tell you to buy a supported Intel CPU and then turn the checkbox off again.

BTW I'm picking Magix here because in the local electronics store, that's the video software that you can buy as a box and that is featured in bundles with Intel laptops. So if someone clueless walks in there and says they need video editing, this is most likely what they will end up with.


It's not just marketing; the software also uses Intel QSV. AMD Radeon has a similar feature, VCE, but its quality is relatively poor and its adoption is weak compared to QSV/NVENC.

> AMD is horribly slow

Not sure where you're getting that these days? Absolutely in the days of Bulldozer, but AMD's Zen 3 architecture has taken even the single core lead from Intel, not to mention the multi core lead they've held for several years now.


The encoder?

AMD's GPU encoder still lags a long way behind Nvidia's, for example.


> AMD is horribly slow. The only fast high quality encode is with Intel's dedicated CPU instructions,

You might need a source for that.


The Intel Core i7 6700K gets double the FPS of the AMD Ryzen 5 1600X: https://www.magix.info/de/forum/ein-performancetest-zwischen...

Uh, those numbers are three to five years old at this point. Beyond the comments on the test bench not being properly set up, Ryzen has improved significantly since then.

AMD's latest consumer-level chips significantly outperform Intel's on both price and performance. For prosumer video editing performance, the Ryzen 9 5900X, the second most expensive "new" chip from AMD, is a 3.4% performance improvement over Intel's most expensive "new" chip, the 10980XE. Additionally, the 5900X retails for $549 USD while the 10980XE retails for about $1,000 USD.

https://www.pugetsystems.com/labs/articles/Adobe-Premiere-Pr...


On first-generation Ryzen, AVX2 instructions were executed as two AVX instructions because the AVX pipeline was 128 bits wide. This was fixed in Zen 2, and nowadays we are at Zen 3.

Seems like the clock, RAM, and GPU differences there may have had an effect? Comparing between systems that different seems unusual.

This is an article from 2017 comparing one five-generation-old CPU with another five-generation-old CPU.

>if Intel added dedicated CPU instructions to make VMware better

They (and AMD) did, years ago: Intel VT.
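
For what it's worth, software can check for the Intel flavor (VT-x/VMX) directly. A minimal sketch in Rust, using CPUID leaf 1, ECX bit 5, which is the documented VMX flag (AMD's AMD-V is reported via a different leaf):

    // Minimal sketch: detect Intel VT-x (VMX) support via CPUID on x86_64.
    #[cfg(target_arch = "x86_64")]
    fn has_vmx() -> bool {
        // CPUID leaf 1: bit 5 of ECX reports VMX support.
        let info = unsafe { std::arch::x86_64::__cpuid(1) };
        (info.ecx >> 5) & 1 == 1
    }

    #[cfg(target_arch = "x86_64")]
    fn main() {
        println!("VT-x (VMX) supported: {}", has_vmx());
    }

    #[cfg(not(target_arch = "x86_64"))]
    fn main() {
        println!("this check only applies to x86_64");
    }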


> And the 30% that you mention that run VMs... Wouldn't they be pretty happy if Intel added dedicated CPU instructions to make VMware better?

They have this today. What would make them happier is cutting power utilization by half or more, which is looking quite possible with non-Intel silicon.


The compiler tricks can only get you so far. I administered an HPC cluster, and we had a lot of software dependent on MKL and BLAS. However, with the compelling performance boost AMD seems to put out, open-source libraries like BLIS and OpenBLAS are attempting to fill the gaps. Trust me, no one likes the Intel lock-in if there is an alternative that is even close in performance.

> In my opinion, the secret sauce that makes Intel dominate certain industries is software. And it has been for some years already

Intel's secret sauce is inertia.

The belief that Intel is not challengeable, and that the world doesn't need a company to dethrone it either.

But that assumption is no longer true, and the countermovement is in full swing.

The future of computing is not on the CPU, if you ask me. It will move from general-purpose computing to heterogeneous computing, and possibly application-specific chips/FPGAs. MKL is fast, probably, but GPUs and ASICs would be even faster.


> If you want to further reduce latency with parallelism, you need TBB, the Intel thread building blocks.

That's not how latency works, and there is nothing too special about Intel's TBB library. It is a big, bloated group of libraries that doesn't actually contain anything irreplaceable. Don't be fooled by marketing or by people who haven't looked under the hood. It should also work on AMD CPUs.
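
To illustrate how replaceable that layer is, here's a minimal sketch of portable data parallelism using the third-party rayon crate, a rough analogue of TBB's parallel_for (my example, nothing Intel-specific):

    // Assumes rayon is declared as a dependency in Cargo.toml.
    use rayon::prelude::*;

    fn main() {
        let inputs: Vec<u64> = (0..1_000_000).collect();
        // Work is split across a thread pool; runs the same on Intel, AMD, or ARM.
        let total: u64 = inputs.par_iter().map(|x| x * x).sum();
        println!("sum of squares: {}", total);
    }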

> Raytracing? Intel embree.

Embree is a cool convenience, but it also doesn't marry anyone to Intel CPUs.


MKL runs fine on AMD now once you un-handicap it.

The CEO of a semiconductor company needs to have an engineering background, IMO. The tech is too complex and too important to the business to have a CEO who doesn't understand the nuances. Wish Pat all the success at Intel. We need Intel to do better.

Disclosure: I worked at AMD for about a decade, although that's a while back now. It is traditional in semiconductor companies (or was, anyway) to have a triumvirate:

1) the "outside" guy (sales, knows the customer)
2) the "inside" guy (operations, knows the employees)
3) the "tech" guy

Any of these three can run the company, but whichever one it is, they need to have the other two near at hand, and they need to listen closely to them. The problem comes when, as at Intel and perhaps also at Boeing, you have type (1) or (2) in charge and they're not listening to the person in position (3) in the triumvirate, or they don't have a triumvirate at all. If the person in position (3) is in charge (as at AMD currently), they will still need to have experts in (1) and (2), and they will need to listen to them.


I agree, that is why Lisa Su runs AMD so well

From [0]:

Gelsinger earned a master's degree from Stanford University in 1985, his bachelor's degree from Santa Clara University in 1983 (magna cum laude), and an associate degree from Lincoln Technical Institute in 1979, all in electrical engineering.

I'd call it an engineering background.

[0]: https://www.vmware.com/company/leadership/pat-gelsinger.html


I've understood it as a critique of a previous CEO, not the new one.

Wasn't he Andy Grove's protégé?

Pretty damning for Bob Swan when a $200B market cap company jumps 8% on the news that you are stepping down.

Pretty sure he gives zero fucks about this considering the juiciness of his golden parachute.

The only gig where you get rewarded handsomely even if you fail.


But we customers can watch Intel succeed again, rather than squander another 7 years.

Is there any security to short Bob Swan? Seems like a good bet.

Intel is in shambles. The whole 10nm thing is a fiasco at this point. Who could've pictured 10 years ago that Intel would've even considered outsourcing their fabrication? Intel's core strength was their seemingly unassailable lead in chip design and fabrication.

From various articles over the years it seems that what's happened to Intel internally is fairly typical: internal fiefdoms, empire-building, turf wars and the like. This is something you have to actively prevent from happening.

This is going to take someone with deep experience in fab engineering to figure out, not a bean counter. And it should probably involve a massive house cleaning of middle management.

And no, the answer isn't just another reorg. Unless you actively prevent it, reorgs become a semi-constant thing. Every 3-6 months you'll be told how some VP in your management chain you've never heard of, let alone met, now reports to some other VP you've never heard of or met. There'll be announcements about how the new structure is streamlined and better fits some new reality. And 6 months later you'll go through the same thing.

This is a way of essentially dodging responsibility. Nothing is in place long enough for anyone to be accountable for anything working or not working.


The world really needs a company like Intel to succeed, if only to take some of-- let's say-- the geographic concentration risk out of such a high percentage of the cutting-edge device manufacturers depending on TSMC's fabs.

No harm to VMware in this, I think. As it pushes into SaaS it really does need a big shake-up. Not that Pat wasn't good, but it is an old company with a lot of inertia; some fresh blood and fresh ideas could really be beneficial.

Hopefully the next CEO is as committed to all the other stuff, like treating the employees very well and the community work. The best part of the company isn't the tech at all (in my opinion).


They need to get out from under Dell’s thumb. Until then no new CEO will bring significant changes.

I'm not sure I agree with that. I've been there now over 8 years, and the only difference since Dell bought EMC is that they are cross-selling. In terms of product development, staffing, benefits, etc., they have no influence. This is in contrast to EMC, where a lot of benefits and nice things like free coffee were cut to match Dell.

>but it is an old company

That makes me feel old given that I first spoke with them as an analyst before ESX Server came out :-)


Intel is still making tons of money, so there's still time and Pat seems the right choice. Competition is good and Intel could still be competitive.

For sure. Considering from what depths AMD has managed to recover, people are much too eager to count out Intel.

He joined Intel at 18 with an associate’s degree from a trade school (Lincoln Tech). Amazing.

Only in America.

You're getting downvotes, but it's true. The rest of the world is a lot more obsessed with credentialism and pedigree.

Even Silicon Valley in 2020.

Is, what? Obsessed with credentials? It really isn't (but I can't tell if I'm just misreading you and don't want to belabor it).

I meant Silicon Valley 2020 is more focused on credentials than it was when Pat Gelsinger was starting out.

Maybe? I don't know what it was like when Gelsinger started. But it's the least credential-driven high-status professional market in the country, and maybe the world. It's hard to start your career with a first job at Google without a top 10 CS degree, but that might be the only example; it can be your second job without one.

It's possible to get into FAANG without a degree. There's a cottage industry dedicated to it. Not so much with consultants, lawyers, doctors and other fields of engineering.

However hard it might be to get an entry level position at a FAANG without a degree, I don't believe any significant obstacle exists at all at the next tier of companies down from them.

Really don’t see how this changes things inside Intel. Pat Gelsinger is not new to Intel at all. This is just playing musical chairs.

Pat was fairly effective at VMW. The fact that he himself is not new to Intel is probably beneficial for the company, but there's a pretty substantial difference between being CTO and actually steering the ship.

I'm not comparing Gelsinger to Steve Jobs in a general sense, but Jobs wasn't new to Apple when he returned -- and yet Jobs' return to leadership was transformative for the company.


I see a lot of parallels between what happened at Boeing and now at Intel. It’s a cultural phenomenon and I am not convinced Pat is going to shake things up.

When Steve Jobs returned to Apple, he literally fired most, if not all, of the management consulting types. He changed the culture overnight. I don't see that happening at Intel (but I hope I am wrong).


Only in the incestuous world of CEOs does being "fairly effective" at your last job grant you the keys to one of the biggest companies in the world.

You could argue that VMware was slow to move into containers and should probably have better leveraged Pivotal before they drew them in. The former is something of an innovator's dilemma thing. The lack of focus on developers and applications was something of a VMware blind spot going back to Diane Greene days. But "fairly effective" underplays how well VMware has done over the past 10 years or so overall. There are things they should have been more aggressive about, especially with the benefit of hindsight, but they've been a very successful software company/subsidiary.

If it puts a focus on Engineering leadership, it could work.

Definitely a win for Intel. Pat will be sorely missed at VMware, though.

Can confirm. On top of the many other amazing things he brought to the table, he was super excited about the things we were doing, no matter how small or big. He truly respected everyone. He will definitely be missed.

I always thought VMware was super lucky to have Pat as CEO. The mood is pretty grim today at VMware.

Great to see a technologist back in charge of Intel. This was a company that was founded and CEO’ed by top-tier engineering elite. AMD’s CEO is an MIT PhD with long track record of technical achievement earlier in her career. Intel needs the same.

The interesting question is what this means for VMware. Besides Pat's departure, Rajiv Ramaswami left to lead Nutanix. They have multiple holes to fill.

Poor guy. He stepped in way too late to catch up to AMD, and now he's falling on his sword. Intel needs a wartime CEO now.

(That said, I'm sure he's crying all the way to the bank with his millions, so I'm not feeling too sorry for him.)


Yes, they should call Stephen Elop

Gelsinger can definitely be that wartime CEO. If they let him do his job.
