There are a lot of things you can say about Silicon Valley, but one of the more interesting aspects for me has been how "small" it is, in the sense that a small number of people have an outsized impact. I have never been one of those people of course, just a part of the entourage. But it has been interesting to watch and learn from folks who are good (and bad) role models.
A few years back when the teacher retired, we all gathered together to thank him, including Pat! World-class guy, and he makes us proud! So great to be able to share with students who come from nothing the story of how you can leave our hometown and eventually become the CEO of Intel.
Wishing Pat the best of luck in his new role!
Doesn't that mean that a lot of people will never be on the receiving end just because they did not win the teacher-lottery (arguably the chance of getting such a teacher is rather slim)?
Shouldn't our systems be more robust? (Not talking about normalizing these effects down, but creating a better environment for everyone, using these effects en masse)
We need our very best to go into teaching. It's the ultimate multiplier role.
But also I think about how much crap they put up with in those roles, and how relatively low-stress my life is, and then I feel better about it. They're cut out for that kind of job, and I most certainly am not.
Intel's technical failure with 10nm has gone hand-in-hand with financial success with 14nm. That is, without 10nm chips on the market in a meaningful way they've been able to raise prices for 14nm parts -- Intel put up better financial numbers than ever in a time when it has not been investing in future success.
VMware was a big thing in 1998 but it was obsolete by the time Pat got involved -- a hypervisor is naturally part of the kernel and there is no way cloud providers are going to spend their margin on VMware. Yes, many people in business are terrified of open source software and want a proprietary product so they have somebody to sue (VMware), or they need somebody to hold their hand (Pivotal). Either way, VMware and Pivotal are units that can be merged and spun off whenever a company based in Texas (Dell) or Massachusetts (EMC) wants to look like it has a presence in San Francisco. You see the VMware logo on CNBC every morning and somebody thinks the king is on the throne and a pound is worth a pound, but that doesn't mean anything in the trenches.
Like Intel in the past 10 years, VMware is entirely based on a harvesting business model. In the short term Intel made profits by pandering to cloud providers; but in the long term cloud providers invested their profits in better chips. (What if Southwest Airlines had developed a 737 replacement designed from the ground up for a low-cost airline?)
Pat might be able to keep the game of soaking enterprise customers going for longer, but someday the enterprise customers will be running ARM and the clients will be running ARM and the coders will be thinking "did they add all of those AMX registers just to put dead space on the die to make it easier to cool?" and falling in love again with AVR8.
- VMware grew almost 2.5x under Pat. Obviously their core franchise is about harvesting, but new adjacent products like NSX and VSAN helped a lot with growth. They just took a long time to get traction given the customer base, who don’t like change. VMware Cloud on AWS has been surprisingly strong even to skeptics like myself.
On the other hand, companies would get rid of VMware if they could, and they tried and failed with OpenStack. The hypervisor is commodity, but the overall compute/network/storage private cloud system isn’t trivial. Turns out “holding hands” whether by software or services is pretty valuable?
Public cloud of course is a substitute, though private clouds and data centres have survived and thrived as well (for now).
- Pivotal had a boutique software and services model that would be difficult to scale as a public company given the current shifts in the enterprise fashion away from productivity-at-any-cost (PaaS and serverless) and towards perceived low cost building blocks (Kubernetes and its ecosystem).
But it would be a gross oversimplification to suggest this was merely a vanity project for EMC and Dell. It took in $500m+ annually on open source software (and another $300m in services), which is no small feat, though still minuscule given what surrounds it. But anything new has to start somewhere. Not enough to impact the mothership's balance sheets, but there was something special there that could be nurtured. Whether it can be, or whether the differences matter enough, is anyone's guess.
Pat could take Intel in a direction we don’t anticipate. VMware was written off for dead when Pat came on but he managed to buy it another 10-15 years and at peak a doubling of the stock price. He’s probably learned from that.
Hypervisor is a commodity, however management and support of hundreds or thousands of them is not. You can either pay people to support them and fix the software when it breaks or you can pay <vendor name here>. Given the former requires expertise and planning it's often more cost effective to go the latter.
Disclaimer: I'm employed by VMware (less than 1 year) and chose to come here based on pivots I feel they are making.
The "actual" problem could be anything from the customer misinterpreting the datasheet (Marketing/Comms problem), to insufficient testing (Factory/Production problem), to chip function (Design Engineering problem). As a "new college grad", or NCG in the lexicon, I admired how he dug out details from various folks to get to the real problem. He always had ideas for things Systems Validation (SV) could do to maybe trigger the problem that made sense to me. He really embodied the philosophy of fixing the problem not fixing blame on some group.
My Pat Gelsinger story: I worked in a successor org to MIPO, in those days called MPG. One day during pre-silicon validation on the Pentium II, one of my NCG’s came to me and said: “An old test from the historical test archive is failing in the simulator. I want to find the author and ask him about it. Do you know a Pat Gelsinger?”
Me: "Well, he is an Intel Fellow now, so he might not remember what that test was supposed to do."
I believe this is what you feel when you come to the industry when it is just starting in a given locality, and then it explodes. Senior workers who join in the earliest days tend to drift from company to company, but they rarely leave the place entirely, or change industries.
This effect can drive people to over-pay to go to "good schools" or work for less than they think they should be paid at the "good places to work" because it lets them join a cohort with members who are statistically more likely to be "successful" (for some arbitrary definition of success). I personally never paid much attention to it but thought about it when talking with this person.
I know from experience that it happens in Silicon Valley that someone will say, "Oh I know someone at some previous company who is really good at that, let me see if they would be willing to change jobs." That is why companies pay people bonuses for "referrals."
I suspect it happens in LA in the entertainment industry as well. If you watch the old American television series "ER" (produced by Steven Spielberg's Amblin Entertainment), it is amazing to see people who first appeared on that show as guest actors and then showed up later as series regulars.
"The only one who may have a slim chance to completely transform Intel is Pat Gelsinger. If Andy Grove saved Intel last time, it will be his apprentice to save Intel again. Unfortunately, given what Intel did to Pat during his last tenure, I am not sure if he is willing to pick up the job, especially since the board's Chairman is Bryant; not sure how well they go together. But we know Pat still loves Intel, and I know a lot of us miss Pat."  - June, 2018
"This is the same as Intel pushing out Pat Gelsinger. The product people get pushed out by sales and marketing. Which are increasingly running the show at Apple."  - 30 days ago
And numerous other references since 2009, and many more around various other forums and Twitter. I am getting quite emotional right now. I can't believe this is really happening. (I am writing this with tears in my eyes!) I guess Andy Bryant retiring makes the decision a little easier. And Pat has always loved Intel. I guess he is pissed those muppets drove it into the ground.
This is 12 years! 12 years to prove a point! Consider 4-5 years of work in lead time since he left in 2009. That is 2014. Guess what happened after 2014?
Maybe it is too little too late? Or maybe this will be another Andy Grove "Only the paranoid survive" moment?
The King is back at Intel. Despite being a fan of Dr. Lisa Su, I am a little worried about AMD.
I wish him luck, though.
As Adam Smith said, "there is a great deal of ruin in a nation", and likewise, big companies generally get more opportunities to reinvent themselves than one would expect.
Tired brain read this as "there's a great deal in a ruin of a nation." Close enough, I guess?
Competitors may have passed them in the last couple of years, but Intel is still in a really solid position to turn things around.
Apple's situation was a lot more dramatic. Intel still has a very good market share and can probably take a couple+ years to safely get back on track, in my opinion.
Amongst other things:
- SGX is really far ahead of AMD SEV. The latter is probably easier to sell because it's marketed as "drop in" (encrypts whole VMs), but SEV has been repeatedly hacked or shipped with obvious gaping design holes that they patch later and call features. SGX is a lot more focused, a lot more flexible and frankly a lot more secure.
- Optane NVRAM is completely unique, as far as I know. It's only ~10x slower than DRAM, which is nothing considering it's persistent! It totally changes the whole IO hierarchy.
- AVX512 / DLBoost are able to hold their own against mid-range GPUs for some AI tasks, which is impressive and useful.
- Their core chips are still very fast. AMD chips are selling more for less, which is a good position for them to be in, and TSMC's process advantage is helping them out for now. But they don't have a truly massive edge in tech like Apple's competitors had when Jobs returned.
- Intel have a long tail of obscure features that AMD doesn't, although it's often hard to know this. SGX and AVX512 are high profile but there are others that are less well known.
I don't count side channels as an issue because it's also an issue for all their competitors, and frankly I found the near single-minded focus on Intel by the security community to be rather misleading. I even read a side channels paper that admitted they suspected AMD had the same problem but they didn't bother to check simply because they didn't have access to any AMD hardware in the first place, which was unusually honest.
Apple's M1 is very impressive in its space, but for high performance cases like servers it's only quad-core - you can't use the little low power cores for much. Apple have shown no interest in making server parts for a long time, and Apple's engineering is bespoke so it tells us nothing about what other ARM vendors can do. So in that space it's still just AMD vs Intel and Intel is a long way from being on its back yet.
This is true, but it does not account for the fact that the Apple M1 is just the first (public) iteration of such a CPU.
I feel like the view of Intel as a "sinking ship" is an inside-baseball misread of the situation.
If you buy a PC or a laptop or a server today, you're most likely getting an Intel CPU. It's now at least possible to buy AMD in many market segments from major vendors - and of course many of us do - but to the broader consumer and hosting world Intel still dominates.
You can look ahead and extrapolate and say "they can't compete with TSMC right now, and maybe they're going to start falling behind further." Fair analysis. But they dominate the market to an extent that's hard to overstate.
Contrast this with between-Jobs-stints Apple, which was a tiny shrinking company with a niche product and no clear strategy.
Intel is almost in too-big-to-fail territory. They aren't sinking, just because they now have real competition. Yes, they need to do "something" to maintain dominance - but why would you see their situation and conclude they won't? The new CEO here is a sign that they see the trouble ahead, and they're ready to steer around it.
If you have $233 billion (a LOT of money) and you want to own a world-class chipmaker, would you rather start from scratch or just buy Intel for cash?
Then there are companies that have at least managed to stabilize themselves reasonably, e.g. IBM and HP (I wouldn't consider either of them a huge turnaround success, but I would not expect them to head for bankruptcy anytime soon either).
And even companies that did go under, e.g. Kodak, took an enormous amount of time and effort to do so, and not without launching a cryptocurrency first…
Intel still has some amazing technology compared to AMD. They're pretty screwed on process right now, but I think they have a chance to succeed. Given the timeline on new CPUs and fab processes, though, it'll be 4 years before we see the fruits of anything he does.
If they actually want to survive long term, there are two paths as I see it:
A) Be legitimately better than AMD. This could include opening up the Management Engine, much higher-performance chips at lower price points, or some sort of space magic utilizing their Altera acquisition.
B) Embrace RISC-V and push it to laptops and desktops HARD, while not pulling the Microsoft Embrace Extend Extinguish™ play. If they go this route then their stock becomes an exceptionally strong buy IMO.
There's a lot of interest in that from a national security perspective anyway.
Part of TSMC's offer is that they won't do design, they won't compete with you on design etc.
They specialize in the best possible fab and service for your design and that's it.
That's kind of a hoax. GUC (Global Uni Chip) is huge, and a nearly-wholly-owned subsidiary of TSMC.
TSMC has a large design operation. It simply is part of a subsidiary rather than the parent company, that's all.
There are a lot of shady links between TSMC and GUC. I was once pressured to "collaborate" with GUC as an explicit condition of getting access to the then-bleeding-edge 16nm PDK. I turned it down. A competitor of mine (with whose chief engineer I am friends) had the same screws put to them.
Imagine 100x of that, and that is Samsung.
I want to be able to train NNs with a laptop that can normally last 20 hours.
I realize that Wintel machines aren't going anywhere because of entrenched business use -- but between NVIDIA GPUs for heavy workloads and M1s for day-to-day activities -- what is the feeling inside Intel right now? Is this like a Microsoft-Netscape moment in the 90s?
What it should definitely be, however, is a death-sentence for x86. I bet Intel are proud of it, but ultimately they must be at least somewhat jealous of those who don't have to use it.
We don't really know why Keller left. All we can say is that from the outside it looks like he might have just given up in disgust. Having an engineer at the helm again can make a big difference to an engineering organisation.
It'll be interesting to see what happens, I had written them off as on the path of inevitable decline and irrelevance.
If they have someone as CEO who understand the existential threat they're facing from everywhere maybe they'll survive.
The irony of the other top comment is that the AMD threat wasn't the competitive threat that mattered. AMD is also screwed.
We see it now with Google being the "evil empire". They haven't had a real hit since Android and seem to be floundering, but online ad revenue is such a huge geyser of cash they're gonna be fine for a very, VERY long time.
I’d argue IBM largely is irrelevant today, but they technically still exist.
Google is lucky they have an ad monopoly because they don’t really have a coherent company vision, I wouldn’t buy their stock. They might get lucky with their deepmind purchase.
I don’t disagree with you, but if I was choosing companies in a strong position today it’d be Apple, Amazon, and Microsoft. I wouldn’t make a long term bet on Google or Intel.
x86 is on the way out and can't compete on power or performance. In the end RISC won, it just took a while to get there.
As things trend that direction AMD and Intel don't really have much to offer, they're competing on legacy tech.
With TSMC providing fabs for designs from anyone, the serious players don't need AMD or Intel. Apple has a massive lead here, but others watching this will follow it.
What exactly do you think is the fundamental limitation of x86? Most chips do lots and lots of crazy logic to go from instruction set to microcode; it's hard to imagine that the variable-length nature of x86 instructions is the limiting factor.
The "x86 tax" (if it exists, I guess) is usually estimated at somewhere on the order of 5% to 10% - which is a lot of money at scale but probably not enough for a total rethink.
There’s also the coordination problem with windows and third parties. Without vertical integration and tight OS support they’ll fall behind.
Something will have to give, but the future doesn’t look good.
It's like train companies thinking they shouldn't compete with cars because they're in the train business (without realizing they’re actually in the transportation business).
AMD is in the chip business, not the x86 business - and x86 just became a whole lot less relevant for the future.
Apple has the vertical integration to do it properly. Amazon and Microsoft have the server cloud computing to do it server side.
Without proper coordination from Microsoft, third party OEMs, and AMD they'll all fall behind on computing on end devices.
I think that's part of the reason AMD/Intel will be less relevant, the incentives and market moves don't look good for them.
Cost a bunch of engineering time and forward motion, internal politicking. Eventually it got binned after months of not getting what we wanted out of it. There was no technical reason for it.
Maybe hardware really is his thing, but that quid pro quo hurt productivity.
My purely personal view is that VMware's second act has begun and it'll do well. Pat deserves some of the credit for accepting that Kubernetes would be the future of the business and throwing his weight behind it.
There are aspects of Pat Gelsinger's leadership that I dislike, but they're orthogonal to his management style and foresight. He's been effective.
I’d be incredibly happy to have him if I were an Intel employee.
>A few years ago, back in 2016, Intel did a “RIF” (reduction in force) of about 11%. Intel had previously done a significant reduction way back in 2006 of about 10%
>In an industry that runs on “tribal knowledge” and “copy exact” and experience of how to run a very, very complex multi billion dollar fab, much of the most experienced, best talent walked out the doors at Intel’s behest, with years of knowledge in their collective heads
Bottom line: Intel dug the hole by itself and jumped into the deep end.
Many of the competent people I knew at Intel have left (not all), while many of the incompetent people I knew are still there.
The intuition is that older employees cost more and by cutting them you can reduce your payroll more significantly while doing what looks like smaller employee cuts from the outside. This is often viewed favourably by investors because on paper it doesn't seem as the company is stalling (head count is still high, costs are down). The obvious issue is that these older employees are not easily replaceable and you end up losing more velocity in the long run than originally anticipated.
The above is more applicable to traditional blue-chip businesses where workforce movements are more limited. For software engineering (which Intel is not really) your assumption is correct and once cuts are announced a lot of your great engineers will jump ship.
The above applies to everyone. When I was an intern the company folklore was full of horror stories because the last guy who knew anything about a very profitable product died suddenly. (The product was for mainframes: clearly near end of life, but it was still mission-critical for major customers and had to get minor updates.)
I've also known important people to find a better job. Even when an offer of more money gets them to stay, my experience is they always wonder if they made the right decision and so are never again as good as they were before.
Moral of the story: don't allow anyone in your company to become irreplaceable. This is good for you too: it means you won't stagnate doing the same thing over and over.
Even Swan is probably not an idiot; he simply expected everyone to struggle as much as Intel did on the 7/10nm node, and when TSMC just breezed past Intel and AMD came out with a much better product than anticipated, he found himself in very hot water.
(He could also be quite the idiot, I don't know him)
I don't think tech companies can afford this practice. So much knowledge resides in their senior talent: the hard-won, experience-based understanding and the things gleaned through opportunistic exposure that they seem to voluntarily surrender.
He might have what it takes to turn Intel around.
The heavy reliance on the compiler for ILP was an “odd-choice” but not something that was unsound in principle.
If the ecosystem was more open from the get go and more vendors were involved it had a much better chance of taking off.
And if nothing else at least it was something new.
The biggest disappointment I have with Itanium is that it and later Larrabee/Xeon Phi kinda pushed Intel even further into their own little x86 box when it came to processing units.
I think that failure is also why they haven’t really done anything interesting with Altera.
It would be interesting to see an explicitly JIT-based approach to ILP.
Is the board leaning into the usual MBA 101 moves and turning Intel into a "services company", gradually going fabless and milking those sweet patents? Or will they put the work boots on and start building an actual tech company, with the people who can actually save them on the payroll, cutting the usual contractor meat grinder and inviting the vast armies of middle-management and marketing drones to leave?
Probably not, considering the guy they're throwing out the back door is a finance dude with an MBA and Gelsinger was/is an actual engineer.
I think there's an issue with helping Intel temporarily, because if you're TSMC you'd rather use your capacity to serve long-term partners rather than helping Intel bridge the gap to 7nm only to get dropped a couple of years from now when they get their chips in order.
It's like "flex". I don't hate them (I hate "relatable" and "addicting" but those are apparently acceptable as real words now) but it's odd how they seem to bubble up suddenly out of nowhere.
(British by the way - that might have a bearing)
The metaphor works for me ¯\_(ツ)_/¯
Ah. It makes a lot more sense now. I never made that connection.
I play it by ear and there's a high chance the "leaning into" expression was used incorrectly. I meant to say, the board was more inclined to follow a roadmap than other options.
In my opinion, the secret sauce that makes Intel dominate certain industries is software. And it has been for some years already.
If you need really fast mathematical number crunching, e.g. high frequency trading or realtime audio filtering, then you need MKL, the Intel math kernel library.
If you want to further reduce latency with parallelism, you need TBB, the Intel Threading Building Blocks.
Raytracing? Intel embree.
Once you are locked in that deeply, the raw Intel vs AMD performance becomes meaningless. You only care about how fast the Intel libraries run.
So a CEO with experience building high performance low level software seems like an amazing fit.
Edit: And I almost forgot the Intel compiler, used in pretty much every PC game to speed up physics. Plus some people have seen success replacing GPUs with icc+avx for huge deployment cost savings in AI.
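One way to see this kind of library lock-in in practice: check which BLAS/LAPACK backend your NumPy build links against. A minimal sketch, assuming NumPy is installed (MKL-accelerated builds, such as Anaconda's default, report "mkl" here; others report OpenBLAS or a reference implementation):

```python
import io
from contextlib import redirect_stdout

import numpy as np

# numpy.show_config() prints the BLAS/LAPACK backend this build was
# compiled against; capture the printed report into a string.
buf = io.StringIO()
with redirect_stdout(buf):
    np.show_config()
config = buf.getvalue().lower()

# If "mkl" appears, linear algebra calls are dispatching into Intel MKL.
uses_mkl = "mkl" in config
print("NumPy linked against MKL:", uses_mkl)
```

The same check generalizes: once an application's hot loops run through a vendor library like this, benchmarking the raw CPUs against each other stops telling you much.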
Roughly 40% of their revenue is consumer chips where, apart from some games optimisation, they are no longer standing out from the crowd, and the leader is arguably Apple, with AMD doing well. The next ~30% of their business is servers, where there may be a significant number of HPC clients, but the bulk of this is again likely to be VMs running non-Intel specific software, and this market is starting to realise that Intel is nothing special here.
Looking at their revenue breakdown, I struggle to put more than 20% into the things that you mention they are great at. Should they focus on this? It would lose them much of their market cap if they did.
You lost me at Apple. Apple owns around 15% of the PC market space and almost the entirety of that is Intel-based systems. Outside of HN, nobody cares about the M1 chip, it isn't a selling point to my mom or her friends. If someone at the Apple store recommends it they might buy it instead of an intel-based system but it definitely isn't something they're seeking out.
The only threat Intel has right now in the consumer space is AMD, and it's a very real threat. AMD won both Sony and Microsoft console designs, and the mobile Ryzen 5000 chips released at CES look to have enough OEM design wins to put a serious hurt on Intel in 2021.
Even if Apple goes 100% M1, there's the other 85% of the market that Intel is likely far more concerned about.
I can absolutely see Qualcomm offering laptop chips off the back of the M1's success. They may not be as good, but they might be much cheaper. I can also see Microsoft pushing Windows on ARM harder, and rolling out their own chips at some point.
Also once the market gets "used to" multi-architecture software (again), I think we'll see a renaissance of chip design as many more players crop up, because of the lower barrier to entry.
If you're just referring to Apple's future, the endgame is already a done deal. They don't roll back something this fundamental once it's public.
Their last Intel computers will be manufactured and sold within 3 years; I'd wager a month's salary on that if I had it lying around.
If they can't make it perform at the top level like AMD, which seems unlikely but obviously it's too soon to tell, they'll make up for it in other ways or simply live with the performance hit until they can.
Apple doesn't believe in inertia. Their partners and customers can either make the transition or get left behind. Microsoft's business model depends on making enterprises happy; Apple doesn't care.
> Apple has solved the chicken and egg problem regarding software compatibility by forcing everyone to move on to ARM in the near future.
I don't see "everyone" moving to ARM in the near future. I don't even see myself buying an Apple Silicon machine in the near future. I don't see corporate users going to ARM in the near future, and I certainly don't see government users going to ARM in the near future.
Even if the Mac lineup goes exclusively to ARM in two years as planned, I'd be shocked if Dell, Lenovo, and other manufacturers follow suit. Lenovo just announced that the fully specced Thinkpad X1 Carbon Gen 9 will run on an Intel 11th Gen CPU. They're not even going to AMD, let alone ARM.
Apple switched to Intel from PowerPC for a reason. Apart from raw performance at that time, Intel's draw was practical: dual boot, software compatibility, and so forth. Now, Apple is leaving Intel because the lithography has fallen behind, the chips run hot, Intel's share of the profit eats into Apple's profit margins, and x86 impedes their dream of iOS/MacOS app singularity. However, the masses are not clamoring for a move away from x86, so there's no pent-up demand to be had with a switch to ARM.
Let's assume that Apple follows through on its two-year plan to offer an exclusively ARM Mac lineup. What's the bear case?
If Apple is the only player to move to ARM, and the market doesn't follow, it could end up as the first existential setback that the Mac has faced in a long time. In my view, Apple has been out of touch with Mac users for a while now. When they eliminated MagSafe for Macbooks, users shrugged. When they introduced TouchBar, users stared blankly. When they removed all but one format of accessory port, users groaned. When they introduced the butterfly keyboard that put thinness above function, users complained. Eventually if you make enough weird moves, you start to alienate customers.
There are bound to be gaps in the respective software libraries of x86 and ARM, and that is going to matter to regular folks and application-specific power users alike. Apple itself is having problems getting Big Sur to run properly on all the machines it supports. "Supported" 2013 and 2014 machines are getting bricked, and even brand-new machines are experiencing bugs with basic functions like managing external displays. I can attest that Intel-based 2020 four-port MBP13s were still shipping with Catalina at the end of December (I don't know if that has changed in the past two weeks). If the OS itself is having problems, that isn't exactly a good sign for the applications that run atop it.
Apple has gotten used to saying "whatever, I do what I want," and getting its way. Remember 3.5mm audio ports on flagship mobile phones? Me neither. But what happens if complacency about that kind of market power carries Apple a bridge too far? They will have been so used to success that they will miss the early signs of market distaste, and may be unable to adapt mid-flight. There's a certain sense of bravado and self-fulfilling prophecy in saying that -- come what may -- the Mac is moving to Apple Silicon, but it'd be smarter to eschew the bravado and see how things go with the M1 before announcing plans to transition the entire lineup.
So to answer your question, I was talking about ARM vs x86 in general. A transition from x86 to ARM isn't something that one firm with minority market share can force unilaterally. However, I also think the Apple Silicon two-year timeline is overly ambitious and that Apple risks losing market share as a result. If that happens, the stock will suffer drastically, and there will be a reshaping of Apple's C-suite.
If they express any interest in keeping Intel around, the transition is at great risk, because software developers can just trust that Rosetta will continue to make their software work, and customers will keep buying Intel machines that Apple will have to support for years to come.
And I believe that was what the parent was saying:
Clearly there's no way that every computer user is going to be moving to ARM, but every new macOS user will, so the sentence only makes sense in the narrow scope of Apple customers.
You're absolutely right that the challenge is to get developers to build software that doesn't just depend on Rosetta 2. But at present, a developer has to think about users on x86 Windows, x86 MacOS, ARM MacOS, and Linux (ignoring ARM64 Windows and a few other OSes). The ARM MacOS devices at present are lower-end -- 2 TB3 ports, etc -- and less likely to be used by the customer cohorts that are the most profitable for software developers (except maybe the big-ticket names like Office, Photoshop, etc). I run x86 Mac and I still can't run all the APIs and software that I want to use. M1 would be totally untenable for me. For example, I use a brokerage whose software runs on Windows, but they have a cloud version that runs on Citrix Receiver. That used to work for me on Mojave, but I haven't been able to get it running on Catalina.
I don't think Apple has solved the chicken-and-egg problem, not even within the Apple landscape. If anything, Apple has made the problem more pressing and relevant by introducing a two-year cliff.
As Ludacris says, "tomorrow's not promised today." It's unwise to treat every large market segment as massless. Forced market shifts and state changes can be a neat trick when Apple pulls them off, but eventually they'll hit an immovable object. A two-year timeframe is an eternity for PCs and semiconductors; moreover, demand is astronomical at present, but the term structure of that demand is anything but predictable right now.
Apple is going to be supporting x86 Mac well past that two-year timeline, so even if ARM Mac takes over on schedule, developers are going to have to build for both setups for some time to come. I don't think that's particularly prohibitive if you set up your abstraction layers and work on one shared core before tailoring it to fit your supported platforms, but it's definitely added work and more planning.
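As a rough illustration of the "shared core, thin platform layer" structure described above, here is a minimal sketch in Python. All names here are hypothetical and purely illustrative, not from any real project; the point is simply that the platform-dependent surface stays small while the real logic is written once:

```python
import sys

# Shared, platform-independent core: all the real logic lives here,
# and is identical whether the app ships for x86 Windows, x86 macOS,
# ARM macOS, or Linux.
def render_report(data: dict) -> str:
    lines = [f"{key}: {value}" for key, value in sorted(data.items())]
    return "\n".join(lines)

# Thin platform-tailoring layer: the only place OS-specific details
# (here, just line endings as a stand-in example) are allowed to leak in.
def platform_line_ending() -> str:
    return "\r\n" if sys.platform == "win32" else "\n"

def export_report(data: dict) -> str:
    return render_report(data).replace("\n", platform_line_ending())

print(export_report({"cpu": "arm64", "os": "macOS"}))
```

The added work the commenter mentions is mostly in keeping that boundary honest: the core never imports anything platform-specific, and each supported platform only reimplements the thin outer layer.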
The list goes on and on. M1 could be such a harbinger. Who knows if anyone else can compete, and likely Apple will be in the pole position for some time, but Apple has shown everyone it can be done.
Web applications are portable anywhere you have a browser.
Many modern languages that compile to native, like Rust, trivially recompile for multiple target architectures.
Others are dynamic and don't need recompilation.
Of course some major software is written in non-portable C and C++. But the question is whether some emulation isn't acceptable here.
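To illustrate the "dynamic languages don't need recompilation" point: the same Python source runs unchanged on x86_64 and arm64, because only the interpreter is architecture-specific. A minimal sketch using the standard `platform` module (output will vary by machine, so none is shown):

```python
import platform

def describe_runtime() -> str:
    """Report the CPU architecture and OS this interpreter runs on.

    The same source file works unchanged across architectures -- the
    interpreter, not the script, is what gets ported to each platform.
    """
    return f"{platform.machine()} / {platform.system()}"

print(describe_runtime())
```

On an M1 Mac this would report an ARM architecture string, and on a typical Intel box an x86_64 one, with no change to the script itself.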
They can't even remove decades-old legacy code from their own products, so good luck to everyone else.
If AWS and Apple can do it, other very large companies soon will, and in a few years even OEMs will be able to develop their own chips. The market for high-end gaming is unlikely to be touched, but the vast consumer market is going to be eaten by custom ARM-based chips.
So in a world where processor design becomes a commodity, what does that mean for Intel and AMD? And what does that mean for the overall datacenter, consumer markets?
Processor design is already a commodity, and has been for many years. Any company with the cash can buy a license to the ARM64 instruction set and the reference core design, and have someone like TSMC or Samsung manufacture it.
These designs haven't taken over the desktop market from x86 because they simply weren't performance-competitive with what AMD and Intel are pumping out, and it's not clear that will change anytime in the foreseeable future.
Apple knocked it out of the park with the M1, but they've been kicking ass for years, including against their competition in the ARM processor space. Just because Apple's processors happen to use an ARM instruction set doesn't imply that an ARM revolution is upon us.
Gargantuan battery life isn't a selling point? For a laptop? In what universe?
You can find a place to plug in at basically any coffee shop or library you go to. My mom isn't spending 10 hours in a datacenter, so it doesn't really matter to her if the battery life is 3 hours or 12. For the average consumer, battery life has just been another stat on the spec sheet for years now.
I'm not sure I agree with this. I think if you asked someone whether battery life was a priority, they might say no. And if you asked them to rank tech specs I'm not sure it would necessarily be that high either. But the experience of using a laptop with a noticeably better battery is, for me, quite likely to be one of those things that you didn't know you were missing, even if you just charge it every now and then.
So, does your mom also not care about the weight of her laptop?
The weight is a non-factor. Quite frankly until you start cracking 5lbs nobody even cares in my experience. Apple's maniacal focus on making laptops skinnier and lighter has done a disservice to the entire product line, which they seemingly acknowledged with the 16" Pro.
I expect we'll see an M series MacBook again but this time it won't be underpowered.
Seriously, I predict we will see Apple successfully attack the sub-$1000 laptop market within two years. They sell the iPhone SE with an A13 for $399, so they could easily do so now that they no longer have the 'Intel tax'. And the products will be a lot better than the Windows equivalents.
Most home users might use Office and that's about it. The allure of the Apple ecosystem will be strong especially for iPhone users.
Apple’s bread and butter, as far as Macs go, is the MacBook Air. And by all accounts, they sell a lot of those, and will presumably sell even more, with better margins, now that they’ve gone ARM.
Do they really want to undercut that with a cheaper laptop? I suppose it’s possible, if the volume/margins works out, but I’d bet they just keep plugging along with $999–$1500 13-inch laptops.
Bear in mind too that after a generation or two they can put the last gen M chips in cheaper products.
Laptop batteries are also expensive in terms of money, weight, and bulk, which puts Intel in an even tighter bind.
Apple's success with the M1 will also spur others -- I would not be surprised if Microsoft followed them down the same route.
Market share is not what Apple is about. Apple is about profitability and control. Their move to their own silicon is driven by improvements in the reliability of their build pipeline (no more waiting for tick-tocks and whatnot) and tighter control / integration of their whole stack (same architecture on phones and PCs). That these chips happen to perform so well that they are potential market-growers is a welcome coincidence.
It's certainly partly defensive - they were frustrated with Intel - but Apple would only make a move of this scale if it thought it created business opportunities for them.
(SAP is the largest non-American software company by revenue and does business management, workflow automation, and bookkeeping)
My prediction is that outside of hipster startups, M1 will have no effect on business laptop sales.
This is what everyone always says but the iPhone kicked off a whole BYOD trend that has ended up with many high-value employees caring a lot about what tools they have to use, and a lot of software engineers want Macs.
A lot of bigger companies than "hipster startups" use Macs; this tends to start with the C-suite, and people follow suit.
My point also wasn’t that everyone was going to switch to Mac. It was that M1 proves you can build a “better in every way” PC with an ARM architecture. Linux ARM is also being pushed by AWS heavily from the server side with impressive price/performance numbers.
Windows on ARM has been failing for many years, but I suspect that is going to change. Microsoft has a talented virtualization group, whose Hyper-V roots go back to the Connectix Virtual PC team that built PPC/x86 emulation for the Mac. I suspect they can pull off something like Rosetta -- they just need a chipmaker to collaborate with. Might even be Intel! Pat Gelsinger is an outside-the-box thinker.
You remind me of folks that thought the iPhone / iPad would have no impact on Blackberry sales, as real businesses need keyboards.
So no, SAP is not a database and has many client applications that would run on a laptop.
The official GUI is C++ and Windows only. They do have a Java port for other OSes, and some 3rd party GUIs, but none of that is feature-complete or even halfway there.
Many companies long ago set up some beefy Citrix servers for those applications.
Really? That's surprising to me. I'd imagine that for the demographic of her and her friends, quality of life increases for their phones are far more material than for their computers.
> phones are far more material
Thus they don't really care about laptop battery life.
I think that is why Intel should be (and probably is) worried about Apple. Apple will make Intel redundant by solving a harder problem of which Intel's own problem is a subset.
In the consumer segment, you have regular people trying to make vacation videos with software like Adobe Premiere and Adobe Media Encoder, or Magix. NVENC quality is bad. AMD is horribly slow. The only fast, high-quality encode is with Intel's dedicated CPU instructions, which both apps heavily promote to their users.
And the 30% that you mention that run VMs... Wouldn't they be pretty happy if Intel added dedicated CPU instructions to make VMware better?
I agree that for the work that I do, AMD is as good as or better. But people doing highly parallelizable tasks like compiling are the minority.
I just don't think the market for home devices is thinking about their video encoding time when they buy a laptop, but I do think they'll use an M1 Mac and find it surprisingly fast, or hear from a friend or family member that they are really good.
Intel just hasn't been optimizing for the main user experience seen by these people, or by those writing "normal" server software either. They've been pushing AVX-512 instead, which looks good for video and things like that, but not for regular use cases.
That only happens in California.
"4K Ultra HD video editing with Intel and MAGIX"
"Enjoy HD Video editing with Magix Movie Editor Pro and Intel Iris Graphics"
"Edit in 4K Ultra HD" + Intel Logo
"Finish and Share videos quickly with Intel Iris Graphics"
Plus, as a user of the software, I can tell you that if you tick the "Hardware Acceleration" checkbox on AMD, a popup will tell you to buy a supported Intel CPU and then turn the checkbox off again.
BTW I'm picking Magix here because in the local electronics store, that's the video software that you can buy as a box and that is featured in bundles with Intel laptops. So if someone clueless walks in there and says they need video editing, this is most likely what they will end up with.
Not sure where you're getting that these days? Absolutely in the days of Bulldozer, but AMD's Zen 3 architecture has taken even the single core lead from Intel, not to mention the multi core lead they've held for several years now.
AMD's GPU encoder still lags well behind Nvidia's, for example.
You might need a source for that.
AMD's latest consumer-level chips significantly beat Intel's on both price and performance. When talking about prosumer video editing performance, the Ryzen 9 5900X, the second most expensive "new" chip from AMD, is a 3.4% performance improvement over Intel's most expensive "new" chip, the 10980XE. Additionally, the 5900X retails for $549 USD while the 10980XE retails for about $1,000 USD.
They (and AMD) did years ago. Intel VT.
They have this today. What would make them happier is cutting power utilization by half or more, which is looking quite possible with non-Intel Silicon.
Intel's secret sauce is inertia.
The assumption has been that Intel is not challengeable, and that the world doesn't need a company to dethrone it either.
But that assumption is no longer true, and the counter-movement is in full swing.
The future of computing is not on the CPU, if you ask me. It will move from general computing to heterogeneous computing, and possibly application-specific chips/FPGAs. MKL is fast, probably, but a GPU or ASIC would be even faster.
That's not how latency works, and there is nothing special about Intel's TBB library. It is a big, bloated group of libraries that doesn't actually contain anything irreplaceable. Don't be fooled by marketing or by people who haven't looked under the hood. It should also work on AMD CPUs.
> Raytracing? Intel embree.
Embree is a cool convenience, but it also doesn't marry anyone to Intel CPUs.
1) the "outside" guy (sales, know the customer)
2) the "inside" guy (operations, know the employees)
3) the "tech" guy
Any of these three can run the company, but whichever one it is, they need to have the other two near at hand, and they need to listen closely to them. The problem comes when, as at Intel and perhaps also at Boeing, you have options (1) or (2) in charge, and they're not listening to the person who is position (3) in the triumvirate, or they don't have a triumvirate at all. If the person in position (3) is in charge (as at AMD currently), they will still need to have experts in (1) and (2), and they will need to listen to them.
Gelsinger earned a master's degree from Stanford University in 1985, his bachelor's degree from Santa Clara University in 1983 (magna cum laude), and an associate degree from Lincoln Technical Institute in 1979, all in electrical engineering.
I'd call it an engineering background.
The only gig where you get rewarded handsomely even if you fail.
From various articles over the years it seems that what's happened to Intel internally is fairly typical: internal fiefdoms, empire-building, turf wars and the like. This is something you have to actively prevent from happening.
This is going to take someone with deep experience in fab engineering to figure out, not a bean counter. And it should probably involve a massive house cleaning of middle management.
And no, the answer isn't just another reorg. Unless you actively prevent it, reorgs become a semi-constant thing. Every 3-6 months you'll be told how some VP in your management chain whom you've never heard of, let alone met, now reports to some other VP you've never heard of or met. There'll be announcements about how the new structure is streamlined and better fits some new reality. And 6 months later you'll go through the same thing.
This is a way of essentially dodging responsibility. Nothing is in place long enough for anyone to be accountable for anything working or not working.
Hopefully the next CEO is as committed to all the other stuff, like treating the employees well and the community work. The best part of the company isn't the tech at all (in my opinion).
That makes me feel old given that I first spoke with them as an analyst before ESX Server came out :-)
I'm not comparing Gelsinger to Steve Jobs in a general sense, but Jobs wasn't new to Apple when he returned -- and yet Jobs' return to leadership was transformative for the company.
When Steve Jobs returned to Apple, he fired most, if not all, of the management-consulting types. He changed the culture overnight. I don’t see that happening at Intel (but I hope I am wrong).
(That said, I'm sure he's crying all the way to the bank with his millions, so I'm not feeling too sorry for him.)