
Apple Silicon: The Passing of Wintel - robin_reala
https://mondaynote.com/apple-silicon-the-passing-of-wintel-79a5ef66ad2b
======
Animats
This is very bad for the US semiconductor industry. Intel is the only US
company with state of the art fabs in the US.[1] (Global Foundries 14nm fab in
East Fishkill, NY, formerly an IBM fab, maybe. It's owned by the Emirate of
Abu Dhabi and is being sold to ON Semiconductor.) Intel is profitable because
they have high margins on x86 family parts. That margin will now start to drop
as Intel faces competition from commodity ARM processors.

The US already lost the DRAM industry, the disk industry, the display panel
industry, most of the small component industry, and the consumer electronics
industry. In ten years, the US won't be able to make electronics in volume.

[1]
[https://en.wikipedia.org/wiki/List_of_semiconductor_fabricat...](https://en.wikipedia.org/wiki/List_of_semiconductor_fabrication_plants)

~~~
giantrobot
I'm no cheerleader for Intel but if Intel _really_ wanted to they could
release competitive ARM offerings. They've got enough money they could just
buy an existing ARM developer and then produce those chips in their fabs.

They've managed to change direction in the past. They were all-in on Itanium
and NetBurst until the Opteron showed up. They then kicked both to the curb
and went all-in on the Pentium-M/Core microarchitecture.

Intel would be foolish to ignore the ARM market if for no other reason than
that the server market has started to shift its focus to ops/watt rather than
just ops/dollar. A 64-core ARM chip uses less power than a 64 _thread_ Xeon
with comparable performance.

That means the lifetime cost of that ARM chip will be much lower than the Xeon
since it'll cost less to power and cool it (and the datacenter). Lower power
requirements can also change the calculus of datacenter siting.
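The lifetime-cost claim is easy to sketch with back-of-the-envelope arithmetic. All the wattages, the electricity rate, and the PUE below are illustrative assumptions, not vendor figures:

```python
# Rough server TCO sketch: electricity cost of a chip over its service life.
# All numbers are illustrative assumptions, not measured vendor figures.

def lifetime_power_cost(watts, years=4, usd_per_kwh=0.10, pue=1.5):
    """Electricity cost of running a chip continuously for `years`.

    PUE (power usage effectiveness) folds in cooling and overhead:
    a PUE of 1.5 means every watt at the socket costs 1.5 W at the meter.
    """
    hours = years * 365 * 24
    kwh = watts * hours / 1000 * pue
    return kwh * usd_per_kwh

# Hypothetical 64-core ARM part at 120 W vs. a Xeon setup at 250 W:
arm_cost = lifetime_power_cost(120)
xeon_cost = lifetime_power_cost(250)
print(f"ARM:  ${arm_cost:,.0f}")
print(f"Xeon: ${xeon_cost:,.0f}")
print(f"Savings per socket: ${xeon_cost - arm_cost:,.0f}")
```

Multiplied by tens of thousands of sockets, even a few hundred dollars per chip adds up, which is the siting-calculus point above.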

I don't see Intel getting to ARM ops/watt levels with their x86 designs. The
Lakefield designs are promising but the "big.LITTLE" core configuration is
more useful in mobile applications than servers.

Intel losing out to ARM designs is really Intel's choice at this point.

~~~
pavlov
It’s crazy that Intel owned the best ARM design 15-20 years ago (XScale) yet
sold it a year before the iPhone shipped. Intel actually turned Apple away
when they wanted an ARM SoC.

Both Intel and Microsoft made some kind of categorical error in 1995-2005 in
assuming that their dominance was tied to a particular programming interface.
Intel clung to the x86 instruction set while Microsoft operated on a “Windows
everywhere” mindset, first pushing Win32 and then .Net.

Wintel seems finally dead, and good riddance.

~~~
yjftsjthsd-h
> Intel clung to the x86 instruction set

Itanium launched in 2001 (per Wikipedia); if anything, I'd say they
overestimated how readily they could ditch x86. Of course, it might be that
replacing x86 was a good idea and it was just that Itanium sucked (I'd accept
that possibility, given the whole "needs a smarter compiler than exists"
thing). But I _do_ agree with the general feeling that Intel and MS both
overestimated how solid/entrenched they were.

~~~
RcouF1uZ4gsC
I think AMD killed Itanium. Intel's plan was to leave x86 forever stuck at
32-bit and have Itanium be the natural upgrade path as computers started
running into the 4 GB limitations. It would have been a nice transition
story.

However, when AMD released AMD64 it killed this strategy. You could now buy a
chip that natively ran x86 32 bit at full speed, plus be able to run code that
could address more than 4 GB of RAM.

Once AMD64 came out, Itanium was a dead architecture.

~~~
vardump
"...as computers started running into the 4 GB limitations..."

You mean 64 GB limitations? 32-bit x86 could handle that much, and I'm sure it
would have continued to manage without AMD64.

~~~
saagarjha
Didn’t that require PAE and all sorts of workarounds, all for a not-that-great
experience?

~~~
vardump
No workarounds required if it's enough to have many processes of up to 3 GB
each (an arbitrary split; the upper 1 GB was typically reserved for the kernel
and memory-mapped peripherals/IO, like the GPU, USB, SATA). Of course, if a
single process wanted to access more, that would have required remapping or a
multi-process approach. Or special hacks like Windows AWE [0].

All in all, not ideal, but I'm sure we would have managed with it a bit
longer.

[0]:
[https://en.wikipedia.org/wiki/Address_Windowing_Extensions](https://en.wikipedia.org/wiki/Address_Windowing_Extensions)
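For reference, the limits under discussion fall straight out of the address widths; a quick arithmetic sketch:

```python
# The 32-bit x86 memory limits being discussed, as arithmetic.

GB = 2**30

virtual_space = 2**32          # each process sees a 32-bit address space
user_portion  = 3 * GB         # typical 3 GB / 1 GB user/kernel split
pae_physical  = 2**36          # PAE widens *physical* addressing to 36 bits

print(f"Per-process virtual space: {virtual_space // GB} GB")
print(f"Usable by one process:     {user_portion // GB} GB")
print(f"PAE physical ceiling:      {pae_physical // GB} GB")

# PAE lets the *machine* hold 64 GB, but any single process still only
# sees 4 GB of virtual addresses -- hence remapping tricks like AWE.
```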

------
temac
_Apple_ is competitive with Intel, not ARM in general. So I don't really see
how Intel could magically produce a vastly better microarchitecture by
re-taking an ARM license... Especially since the microarchitecture is _not_
completely independent from the ISA, and Intel specializes in x86, not ARM. It
would take a long time to reach the same level.

And that would not fix their process either...

Plus, to be competitive on performance Apple is using some tricks (larger page
size => huge impact on L1) that may be difficult to apply to Windows, given
that the Windows-on-ARM ecosystem already exists (even if it's not very
thriving for now).

Also, MS is attempting a somewhat closed ARM hardware ecosystem, so that's not
a full replacement for x86.

Finally, the right comparison is currently likely to be AMD, not Intel.

So I think x86 is here to stay, or at least that's not the Apple switch that
is going to initiate a revolution.

~~~
jorvi
> So I think x86 is here to stay, or at least that's not the Apple switch that
> is going to initiate a revolution.

It will, indirectly. When MacBooks offer 1.5x the performance at 2x the
battery life, people using Windows or Linux will start to demand similar
devices from Dell, Lenovo, HP, Acer, etc;

It makes me the saddest for AMD. They’ve just clawed back a few victories from
a decade of loss, and starting within 5 years that will hardly matter.

As for the customer: this will be an amazing time, especially if we finally
get some standardization around ARM SoCs. Apparently every single one of them
(at least in phones) has its own little boot quirks, instead of there being a
unified BIOS or EFI or whatever to guide it all.

~~~
temac
> It will, indirectly. When MacBooks offer 1.5x the performance at 2x the
> battery life, people using Windows or Linux will start to demand similar
> devices from Dell, Lenovo, HP, Acer, etc;

Well, people can demand that, but if only Apple knows how to design the chips,
and given they will not sell them to competitors...

~~~
baconandeggs
Users don't want chips, they want systems, and the way people will "demand"
them will be by buying Apple. Dell and Lenovo will have to come up with
something or cede market share.

~~~
pjmlp
Where I come from, people certainly won't be rushing to buy Apple with 300
euros as the minimum wage and 1000 euros on average for a regular IT job.

~~~
baconandeggs
Sure, but neither are they now so it makes no difference to Apple.

~~~
pjmlp
Which makes the whole point of the article meaningless.

------
BooneJS
It's true that Intel (note, I'm a former employee) has had a mixed track
record in creating new businesses outside of desktop and server chips.
However, I'd like to point out a few things.

1\. When Intel released HT as a way to recover roughly 15% of idle capacity in
CPUs, the Linux scheduler took a while to learn that it should schedule work
on all _full_ cores before assigning anything to a hyperthread.

2\. Intel has been adding specialized SIMD vector operations, and wide vector
operations, yet popular compilers like GCC were really slow to take advantage
of them. One or two blog posts a month will hit the HN front page about
inlining assembly or using specialized math APIs like the Intel MKL to extract
performance from the cores you're running on.
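As a rough illustration of the point about vector units going unused: the same reduction can be written as a scalar loop or handed to a vectorized library (NumPy here, which is often built against MKL or another BLAS and dispatches to SIMD code internally). A minimal sketch, not a proper benchmark:

```python
# Same dot product two ways: a scalar loop vs. a vectorized library call.
# NumPy (often linked against MKL or another BLAS) uses SIMD instructions;
# the hand-written loop leaves the wide vector units idle.
import time

import numpy as np

def dot_scalar(a, b):
    # One multiply-add per iteration, interpreted: no vectorization.
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

n = 200_000
a = np.ones(n)          # all 1.0
b = np.full(n, 2.0)     # all 2.0, so the exact dot product is 2.0 * n

t0 = time.perf_counter(); slow = dot_scalar(a, b); t1 = time.perf_counter()
t2 = time.perf_counter(); fast = float(np.dot(a, b)); t3 = time.perf_counter()

assert slow == fast == 2.0 * n
print(f"scalar loop: {t1 - t0:.4f}s, np.dot: {t3 - t2:.6f}s")
```

The gap between the two timings is exactly what those HN front-page posts about intrinsics and MKL are chasing.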

Apple can add neat blocks like audio, low power video playback, always-on
processor, and neural engine because they also release the software APIs for
it. Maybe that's Intel's sin. They never realized how hands-off the white box
industry needed to be to stay alive and relied too much on OSS developers to
fuel general-purpose Linux performance outside of large HPC computers. Intel
made add-on things, people didn't use them, and now Intel is perceived as the
company that can't innovate.
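The scheduling policy described in point 1 can be sketched as a toy model. This mimics the "fill idle physical cores before doubling up on a sibling hyperthread" idea, not the actual Linux scheduler:

```python
# Toy model of hyperthread-aware placement: fill idle physical cores
# first, and only then double up on a busy core's sibling thread.

def place_tasks(n_tasks, n_cores, threads_per_core=2):
    load = [0] * n_cores              # tasks currently on each physical core
    placement = []
    for _ in range(n_tasks):
        # Pick the least-loaded core; an idle core always beats a core
        # whose other hyperthread is already occupied.
        core = min(range(n_cores), key=lambda c: load[c])
        if load[core] >= threads_per_core:
            raise RuntimeError("all hardware threads busy")
        load[core] += 1
        placement.append(core)
    return placement

# 4 physical cores, 6 tasks: the first 4 tasks each get their own core,
# and only tasks 5 and 6 share cores with earlier ones.
print(place_tasks(6, 4))
```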

~~~
skissane
> the Linux scheduler took a while before it realized it should schedule work
> on all full cores before assigning something to a hyperthread.

According to statistics maintained by LWN [1], Intel was the number #1
employer of Linux kernel developers in the latest kernel, and has been for a
while. If you go back into previous years, Intel was not always as high up,
but even back in 2007 (when LWN started tracking this) they were in the top
5-10 (depending on how you count it.)

Intel released HT way back in 2002. I don't know how active Intel was in Linux
kernel development back then. But what Intel was or wasn't doing 18 years ago
probably isn't very relevant to the question of Intel's current performance.

[1]
[https://kernelnewbies.org/DevelopmentStatistics](https://kernelnewbies.org/DevelopmentStatistics)

~~~
bluedino
I’ll borrow the old BSD/Linux saying and apply it here: Intel is a group of
CPU makers writing an OS, where Apple is an OS maker that designed a CPU.

------
tuatoru
Gassée is focusing on the wrong things. "Thermal envelope"? Bah.

In the block diagram of the SoC for the new Macs, Apple very clearly signaled
what they are doing. They are working towards Steve Jobs's long-term vision.

Onboard "advanced audio processing", "enhanced camera", "neural engine", "AI
processing": this is about making Macs that support development of the next
generation of software for iPhones.

Mobile phones right now are stuck. Apple wants to make them more useful to
their users, and the way to do that is to make them context-aware,
continuously listening and occasionally looking around, so the phone knows
what's going on.

Context-aware phones is what this is about. "Siri, write minutes of the
meeting we just had, and email them to everyone."

Users will never accept a phone that's continuously uploading everything in
their lives to Apple's servers (or anyone else's), so all of this capability
_has to be on board the device_.

Intel is incapable of making a SoC that will support this, and no-one else has
the capacity, so Apple has to do it themselves. (Also, Apple wants to keep the
IP to themselves.)

~~~
detaro
If they wanted those extra things available on x86 macs, they could just stuff
them in an accelerator chip.

~~~
simonh
That’s an incredibly costly and inefficient approach. You used to be able to
get x86 add on boards for Macs back in the late 80s and early 90s, but they
cost a fortune and were incredibly clunky.

~~~
detaro
That's not a comparable thing. Putting specialized functions in their own
chips is completely normal, and only a problem if they need really deep
integration with the host CPU, or efficiency or size are extremely important
(iPhone: sure. laptop, not so much), so I don't think those are a primary
concern for the move to ARM laptops.

------
leonroy
Why Intel did not buy ARM is something that puzzles me...

In 2016 ARM sold to SoftBank for $31B.

In 2016 Intel had a market cap of ~$170B and about $17B cash on hand. I'm no
M&A analyst but surely given their fundamentals at the time Intel could have
bought ARM without financial difficulty, no?

If even Facebook can look two steps ahead and pay an eye-watering $19B for
WhatsApp, couldn't someone in the C-suite at Intel have seen that owning ARM
would have been a jolly good strategic purchase? Or am I missing something?

~~~
jhurliman
The conversation was a non-starter due to antitrust laws. Intel has wanted to
purchase both ARM and NVIDIA for at least a decade.

This is according to the internal rumor mill at Intel Labs circa 2010, so take
it with a grain of salt.

~~~
ebalit
I can see why buying ARM would be a problem under antitrust laws, but I don't
see why it would be the case for NVIDIA. It seems that ATI/AMD were a stronger
competitor to NVIDIA on the GPU side than they were to Intel on the CPU one.

~~~
tpetry
Intel had (or still has?) the biggest share of the graphics market, since
every Intel CPU had (has?) an integrated GPU. So by the definition of market
share, the largest supplier wanting to buy the second-largest is an antitrust
problem. It doesn't matter that Intel's graphics cores are much worse.

------
om2
A big problem with the thesis of this story: no one else has ARM SoCs as fast
as Apple's. The highest end Android phones available get smoked on performance
by the 3-year-old iPhone 8:

[https://www.anandtech.com/show/15603/the-samsung-
galaxy-s20-...](https://www.anandtech.com/show/15603/the-samsung-
galaxy-s20-s20-ultra-exynos-snapdragon-review-megalomania-devices/7)

The tablet situation doesn't look much different.

Who's going to design equally capable SoCs for Windows portables? It doesn't
look like it's going to be Qualcomm or Samsung.

~~~
cromwellian
Does single-thread CPU performance even matter? It's been facing diminishing
returns for years. Most of the growth in computing power is in chiplets, GPUs,
and TPUs.

The real question is mobile GPUs, and whether NVidia or AMD want to step into
that market. On the desktop, Ampere and RDNA2 absolutely crush Apple's
offerings; Apple is multiple generations behind. However, this comes at the
expense of not caring too much about TDP. The real question is whether NV or
AMD can squeeze a cut-down version of their architectures into a SoC that's
power-competitive.

That is, I might be willing to buy an AMD "Fusion" that has a Ryzen and RDNA2
architecture in it, even if it is a severely cut down version, over a MacBook
if the battery life is reasonable.

~~~
ebalit
AMD has a deal with Samsung to integrate their GPU technology with the Exynos
line.

------
yyyk
Not so long ago, we had this at HN:

[https://news.ycombinator.com/item?id=5587283](https://news.ycombinator.com/item?id=5587283)
("The rise and fall of AMD...")

With comments like these:

[https://news.ycombinator.com/item?id=5588069](https://news.ycombinator.com/item?id=5588069)
("I could go on and on about all the reasons I believe AMD went down the
tubes... but I think a lot of it reduces to the disfunctional corporate
culture alluded to in this piece.")

Guess what happened with AMD? So I suggest waiting with the obituaries, at
least until Apple actually has a working system a normal person can buy, and
until Intel/AMD shows us their response.

The entire idea behind the obituaries is that Apple's whatever will be so
performant that ordinary PCs will switch en masse. This misses a few things:

A) Where will the PCs get their ARM chips from? Not Apple, and non-Apple ARM
is still obviously inferior to x86. Intel/AMD will rather keep their x86
monopoly, so it's in their interests to keep producing x86 as long as it's
competitive even just against non-Apple ARMs.

B) The typical Desktop/Laptop user doesn't care that much about performance
anymore - not as much as they care about running their favourite apps. x86 has
a huge lead here.

All it takes is for Intel/AMD to stay/become slightly competitive with
whatever Apple put out, and Apple will stay in their silo - which actually
wouldn't bother Apple Inc. at all; They always ran their own ecosystem.

C) Really, CISC/RISC doesn't matter that much. Apple's advantage (if there's
one) would be a result of other architectural decisions, we should ask
ourselves if there's anything preventing Intel/AMD from copying those
decisions.

~~~
tonyedgecombe
_The entire idea behind the obituaries is that Apple's whatever will be so
performant that ordinary PCs will switch en masse_

It doesn't need to and is unlikely to happen en masse. Apple creaming off the
most valuable consumers is a problem for everyone else. You can see that in
the smartphone market where Apple dominates the profits.

~~~
yyyk
PC makers switching to ARM was the thesis of the linked post.

Regarding your thesis:

'Creaming off most valuable consumers' somehow didn't stop Huawei, Samsung,
etc. from being huge profitable enterprises. Partly because it's not quite
true (Apple has a position in some countries, in others it might as well not
exist), partly because there's more to things than a benchmark comparison
(Apple survived for a long time when _Mac_ processors were inferior), and
partly because we'll see a response from Intel and AMD. It's going to be an
interesting few years in this sector.

------
dcow
Is it really the _x86 architecture_ that's driving costs up, and not core
microarchitecture and process? From where I sit, what we're actually seeing is
a 7nm (and impending 5nm) TSMC _process_ destroying the old 14nm process Intel
has been hung up on for too long. That's roughly a quadratic increase in
density right there. If Intel could fab a 7nm chip right now, I'm pretty sure
it would compete in the power department with any ARM chip on the market. Just
look at the latest Ryzen 4800U: it's a 10W 7nm x86_64 chip that outperforms
the A12Z. I don't want to downplay the proliferation of ARM we've seen lately
due to the smartphone market, and having another architecture is a good thing.
But I'm not convinced the author isn't scapegoating x86.

[https://gizmodo.com/so-just-how-powerful-are-apples-new-
lapt...](https://gizmodo.com/so-just-how-powerful-are-apples-new-laptop-chips-
gonna-1844134011)

~~~
Xixi
I think that Gizmodo article is getting it very wrong. They keep talking about
"the A12Z intended for the Mac", but AFAIK there is no "A12Z intended for the
Mac". When Apple switched to Intel their TDK used a Pentium 4, yet no Mac ever
shipped with a Pentium 4 (besides the TDK itself, if you consider it a Mac).

The ARM TDK is basically an iPad Pro running macOS, without a screen and with
extra ram. A12Z themselves are just re-binned A12X (with just one extra GPU
core activated), processors that are nearly two years old.

No way Apple is shipping that in their first MacBook ARM...

~~~
dcow
>No way Apple is shipping that in their first MacBook ARM...

No doubt. However, I don't think the article is so much "getting it very
wrong" as it is working with what they've got. We've seen what a 10-18 watt
SoC from Apple can do in the iPad, and it's pretty cool. But the point is it's
not jaw-dropping, revolutionary stuff. We can look at what AMD is selling
right now. It's also pretty cool and starting to turn some eyeballs. It's very
unlikely Apple is going to release something that outright spanks the status
quo. So my expectation is that whatever Apple releases will be competitive
with what we currently see from AMD and, maybe, with what we'll see later this
year. That's all. I'm skeptical that the shift to ARM is the crux of the
matter, is all.

------
nathantotten
Longer term this will also likely accelerate servers running on ARM. Writing
software on ARM laptops that is then deployed to x86 production servers will
start to cause a host of new challenges. The switch to running ARM in
production will have many advantages for developers and will likely be very
attractive to cloud providers (AWS, Azure), as the electricity costs for these
servers may be significantly lower.

~~~
jeffbee
If there was an actual energy efficiency advantage -- i.e. less power consumed
for the same amount of work -- Google would already be 100% on ARM. Why do you
think they would leave that on the table?

I realize the situation changes every time a new CPU comes out, but I have
never personally seen a _real_ workload where ARM won on energy efficiency and
had reasonable performance. Tests like [1] and [2] showing x86 with an
orders-of-magnitude lead on database performance vs. AWS Graviton2 should give
you serious pause.

1: [https://openbenchmarking.org/embed.php?i=2005220-NI-
GRAVITON...](https://openbenchmarking.org/embed.php?i=2005220-NI-
GRAVITON200&sha=c954e23&p=2)

2: [https://openbenchmarking.org/embed.php?i=2005220-NI-
GRAVITON...](https://openbenchmarking.org/embed.php?i=2005220-NI-
GRAVITON200&sha=0c9b11b&p=2)

If you're wondering why ARM needs to have both competitive performance _and_
energy efficiency, see Urs Holzle's comments on wimpy vs. brawny[3].

3:
[https://static.googleusercontent.com/media/research.google.c...](https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/36448.pdf)

~~~
nojito
>Google would already be 100% on ARM

Google is a laggard when it comes to server tech. Amazon and others have
leapfrogged them.

~~~
jeffbee
That's pretty silly. What makes you say this? There are large groups at Google
responsible for buying every computer there is and evaluating the TCO thereof.
With operating expenses exceeding two billion dollars per _week_ , they have a
larger incentive to optimize their first-party power efficiency than anyone
else in the business. I'm fairly certain their first-party workloads (search
etc) are the largest workloads in the world.

~~~
baybal2
I don't think they would switch even if they could in the near future.

The poster above says that Amazon "leapfrogs." The question is "leapfrogs
where?" The fact that ARM cores cost 100+ times less than Intel's, and are n
times more power efficient, has been well known for an eternity.

What people don't get is that you get the whole platform with x86, whereas ARM
is a very, very DIY thing, even if you are a multinational with lots of money
for R&D.

~~~
jeffbee
> I don't think they would switch even if they could in the near future.

They've already brandished their ability to port their whole product from x86
to POWER[1], and to deploy POWER at scale if they need to[2]. My personal
interpretation of these announcements is that they are made to keep their
Intel sales representatives in line, but the fact that you don't see them or
anyone else brandishing an AArch64 port should tell you something.

1: [https://www.cnet.com/news/google-acquires-a-taste-for-
ibms-p...](https://www.cnet.com/news/google-acquires-a-taste-for-ibms-
power8-processors/)

2:
[https://www.forbes.com/sites/patrickmoorhead/2018/03/19/head...](https://www.forbes.com/sites/patrickmoorhead/2018/03/19/headed-
into-its-fifth-year-openpower-has-momentum-into-the-
power9-generation/#5c2a7e0978a8)

------
ksec
> Furthermore, Apple doesn’t buy the expensive Xeon chips, used in millions of
> Cloud servers, that represent a growing proportion of Intel’s revenue.

While Apple is not traditionally considered a hyperscaler (Amazon, Microsoft,
Google, Facebook, Alibaba, etc.), their own datacenter footprint is still
pretty big. You are using it whenever you use Siri or iCloud. And they use
Xeons.

> _Margins_ will inevitably suffer as the ARM-based SoC field is filled with
> sharp competitors such as Qualcomm and Nvidia, sure to be joined by arch-
> enemy AMD and others, all ushering in a new era of PCs.

That _is_ the problem. It isn't ARM vs x86, nor Intel's fabs vs TSMC (although
they are part of the equation). It is their _margin_.

Intel is not willing to risk their current healthy margin, and there is
nothing inherently cheaper about an ARM design. Intel could lower the price of
their x86 chips and accept less margin; in that case the compatibility of x86,
at a slightly higher premium, would fence off any ARM attack on the PC market.

In the short term everything will still be fine for Intel. The market moves
very slowly unless you are Apple, which controls everything in its stack. But
in the long term the unit economics of x86 chips at Intel's current margin
stand zero chance of competing. Unless Intel's fabs pull off some miracle and
suddenly leap ahead of TSMC by a generation, which as far as I can see has
zero chance of happening in the next 5 years. Further out we will have to wait
and see.

That is of course assuming the cost of running those Fab and their Tech being
equal. Which we know that isn't the case.

And I haven't even mentioned AWS Graviton 2. I would not be surprised if
Graviton 3 is already sampling.

~~~
scarface74
The PC market is minuscule compared to mobile, stagnating, slowly declining,
and has a long refresh cycle. Not exactly a market you want to be stuck in
while being excluded from the larger one.

Even worse for Intel, most computers over $1000 are Macs leaving Intel with
the low end.

~~~
ksec
>Not exactly a market you want to be in and be excluded from the larger
market.

I am not sure what you are trying to suggest.

You either go into the bigger market and compete with lower margins, or you
stay within the market that has margin and milk it for as long as you can.
~~~
scarface74
AKA “The Innovators Dilemma”.

[https://www.amazon.com/Innovators-Dilemma-Technologies-
Cause...](https://www.amazon.com/Innovators-Dilemma-Technologies-Cause-
Great/dp/1565114159)

The solution?

[https://www.amazon.com/Innovators-Solution-Creating-
Sustaini...](https://www.amazon.com/Innovators-Solution-Creating-Sustaining-
Successful/dp/1422196577)

------
JohnBooty
Apple Silicon-powered Macs will be better than Intel-based laptops, at least
after a few product cycles. I say "better" in terms of performance/watt.

But will their margin of superiority be large enough to cause the seismic
shifts in Windows/x86 land that this article suggests?

I'm not sure.

If Apple's laptops are 25% or perhaps 50% better, I'm _really_ not sure that's
enough to cause a major upheaval.

I think we will wind up in a middle ground different from the upheaval JLG
suggests.

We are already at a place in which your average laptop is "good enough for
most people"; being 25%-50% better than "good enough" is a nice _competitive
advantage_ but that's not going to necessarily turn the world upside down.

\- "Power users" will appreciate the CPU improvement. At this point, that
essentially means video editors. It would mean gamers, but Macs still won't be
the first choice for gaming.

\- "Extremely mobile users" will appreciate battery improvements. However, I
suspect most laptop users use them chained to a desk most of the time anyway.
I would _love_ 50% more battery life, but it's not really actually that much
of a factor for me personally.

\- "Cheap laptop buyers" won't be buying Macs anyway as Apple (wisely) doesn't
seem like it wants to play in the sub-$1000 space. Though... maybe it will
now.

~~~
dangus
Your last paragraph and your fourth paragraph are the big upheaval.

If you ask me, Apple is going to make a ~$700 passively cooled (sealed like an
iPad) MacBook Air with similar/better performance to the current model.

Most importantly, it will have the same build quality and “expensive” feeling.

Sure, there are plenty of laptops now with similar build quality to Macs, but
none of them are actually cheaper than a Mac. Load up a ThinkPad X1 Carbon or
XPS 13 with a high DPI display and comparable specs and you’ll see what I
mean.

But now you’re going to have Apple saving $200 compared to _every other
computer_ by not giving Intel a dime. The A13 chip is a $10 line item in the
component cost of an iPhone.

This barely even costs Apple R&D money since it was already a part of the much
larger iPhone and iPad business.

Literally nobody will be able to touch Apple on the value end of the spectrum
unless they go into bargain basement territory. Are customers going to buy a
$600 plastic 1080p laptop with big fans, keyboard deck flex, and 5 hour
battery life or will they buy a $700, thin, fanless, retina aluminum Mac with
the same performance, 10 hour battery, and a much better fit and finish?

If this all sounds a little unrealistic, I’ve got an iPad Air to sell you!

~~~
JohnBooty
I hope and _pray_ (to the extent that an atheist can pray) that this is what
will happen.

There is zero question in my mind that Apple can do what you say, for the
reasons you say.

The only potential obstacle is institutional will. You would now have a larger
overlap between low-end Macs and the higher-end iPads. In _my_ opinion, this
is not a problem because I don't view those product lines as conflicting. But
does Apple?

------
sradman
> This leaves Intel with one path: if you can’t beat them, join them. Intel
> will re-take an ARM license (it sold its ARM-based XScale business to
> Marvell in 2006) and come up with a competitive ARM SoC offering for PC
> OEMs.

I find this path unlikely, and it depends on Intel's foundries not falling
further behind TSMC. The more likely path is to compete on
price-per-performance (i.e. slash prices and internal costs) rather than
performance-per-watt.
~~~
baybal2
What this discussion lacks is the acknowledgement of just how titanic Intel
is.

It is pretty much the biggest semiconductor company out there, focusing solely
on 3-4 main product lines.

The most surprising thing is them underperforming so much while having
resources comparable to the next few competitors combined.

It is not only their fabs that got an arrow to the knee with 10nm.

Their newest microarchitectures deliver tiny IPC gains compared to AMD's
products even on comparable nodes, while using significantly more transistors
per core.

Tiger Lake throws in 40-45% more transistors and gets at most a 10% IPC gain;
the rest of the performance gains come from better thermals.

~~~
starfallg
I think you have some salient points there.

A few years down the line, Intel will look back and thank AMD for their
decision to spin off GF and go with TSMC for Zen 2.

EPYC and Ryzen are keeping the x86 arch relevant until Intel is able to catch
up on their next-gen process nodes.

AMD may be eating Intel's lunch in a few markets, but in the long term they
keep ARM far enough away for it to remain a niche player, in both the server
space and the laptop space.

------
no_wizard
It’s not apple that matters here. At all. Ever since they took ARM in house
for their devices it’s clear they don’t want to be dependent on another
corporations chip design.

What this misses, gravely I think, is that from the armchair CEO angle, I
would want to go after _Qualcomm_

They’re the biggest vendor in this space for mobile as far as I can tell. All
the major Android vendors use them, including in high-end devices, and they’re
apparently incredibly hard to work with from what I understand. This would be
the way for Intel to muscle into the ARM market: by supplanting Qualcomm's
dominance.

That would make more sense to me if I were running Intel. Then again, they
ditched their cellular unit, so this all seems questionable altogether.

~~~
adamnemecek
Apple doesn't want other vendors using their CPUs.

~~~
no_wizard
> Apple doesn't want other vendors using their CPUs.

Exactly my point, insofar as what I said: Apple’s never going to be Intel's
customer for CPUs once the last Mac requiring Intel CPUs is no longer
supported. That’s a dead market.

Qualcomm on the other hand, is ripe for some real competition. If Intel wants
to swing big into the ARM CPU game, they should look to dethrone Qualcomm.
Apple is pretty much irrelevant here. That’s been clear since they brought all
their CPU designs in house

------
valuearb
To add to JLG's points: a top-of-the-line 2020 MacBook Air that sells for
$1,450

[https://browser.geekbench.com/macs/macbook-air-
early-2020-in...](https://browser.geekbench.com/macs/macbook-air-
early-2020-intel-core-i7-1060ng7-1-2-ghz-4-cores)

Is significantly slower than a $400 iPhone SE in single core, and only
slightly faster in Multi-core.

[https://browser.geekbench.com/ios_devices/iphone-se-2nd-
gene...](https://browser.geekbench.com/ios_devices/iphone-se-2nd-generation)

The entry level $900 MacBook Air is 30% slower in single and multi-core than
an SE:

[https://browser.geekbench.com/macs/macbook-air-
early-2020-in...](https://browser.geekbench.com/macs/macbook-air-
early-2020-intel-core-i3-1000ng4-1-1-ghz-2-cores)

Now imagine next year's MacBook Airs running the next-generation A14
processors on a 5 nm process instead of 7 nm, with much more thermal headroom
than a phone, with far faster integrated GPUs than Intel's, and with
functionality like the T2 integrated into the SoC. And using processors that
are at least $100 cheaper.

It’s mind boggling to think of how much faster they will be, how much longer
their battery life will be, and that they likely will be even cheaper.

~~~
AnthonyMouse
> Is significantly slower than a $400 iPhone SE in single core, and only
> slightly faster in Multi-core.

Geekbench is super misleading because they do things like include benchmarks
which are hardware-accelerated on the CPU in the iPhone but not the Intel
ones. The weak little low power CPU in the Air is also hardly the fastest
single thread Intel processor, much less multi-core, so I don't know what
they're planning to do on the desktop.

> Now imagine next years MacBook Airs running the next generation A14
> processors on a 5 nm process instead of 7 nm

But then it's competing with not only whatever Intel manages to come up with
in the meantime but also AMD processors on the same 5nm process.

> with far faster integrated GPUs than Intels

But, again, faster than AMD's? Or what Intel will have at the time, given that
they're now developing discrete GPUs and will have that in house to improve
their iGPUs?

> And using processors that are at least $100 cheaper.

Which Apple has no incentive to give to you instead of putting in their own
pocket, especially if what they offer is actually better.

> how much longer their battery life will be

"Much more thermal headroom than a phone" and "much longer battery life"
don't go together; they're a trade-off against one another.

The amount of hype around this is getting to a level that people are going to
be very disappointed if it doesn't live up to it.

I'm waiting for some independent third party benchmarks before getting so
excited. And not the first ones which were given early access in exchange for
favorable skewing.

~~~
GeekyBear
>Geekbench is super misleading

SPEC is the industry standard for comparing performance across platforms.

>Last year I’ve noted that the A12 was margins off the best desktop CPU cores.
This year, the A13 has essentially matched best that AMD and Intel have to
offer – in SPECint2006 at least. In SPECfp2006 the A13 is still roughly 15%
behind.

[https://www.anandtech.com/show/14892/the-apple-
iphone-11-pro...](https://www.anandtech.com/show/14892/the-apple-
iphone-11-pro-and-max-review/4)

~~~
AnthonyMouse
So now we've gone from being "significantly faster" to being as fast on
integer, slower on floating point and slower on threaded applications. Which
seems a lot more plausible, but now where's the argument that this is going to
be a marked improvement rather than a lateral shift with a huge transition
cost?

~~~
GeekyBear
This is a comparison of an iPhone part to a laptop/desktop part. The iPhone
part doesn't have anywhere near the same power budget, nor does it have active
cooling. The SPEC benchmarks take a couple of hours to run, so cooling is
definitely a factor.

Apple's laptop part will be the first time they have a level playing field on
both.

The Apple laptop part is also going to have a process node advantage. It will
be built on TSMC's 5nm node while AMD is sticking with an enhanced version of
TSMC's 7nm node and Intel will still be on 10nm.

The word was that Apple's entry level laptop part will have 8 next generation
Big cores and 4 next generation little cores, which is quite aggressive for an
entry level laptop part. The A13, in comparison, only has two big cores.

We'll see how Apple's laptop chip fares when it ships.

~~~
AnthonyMouse
> The iPhone part doesn't have anywhere near the same power budget, nor does
> it have active cooling.

This is much less of an issue for single thread performance because a single
thread isn't going to exhaust an 8-core CPU's full power budget anyway, or if
it does it's by boosting to very high (i.e. less power efficient) clocks which
only give a few extra percent more performance. The main thing the extra power
budget gives you these days is the ability to have more cores, which x64
processors already have.

> It will be built on TSMC's 5nm node while AMD is sticking with an enhanced
> version of TSMC's 7nm node and Intel will still be on 10nm.

AMD already has 5nm processors on their roadmap. Apple might release theirs
first, but having a modest advantage for a few months until your competitors
release their next design isn't especially groundbreaking. It's basically the
expected state of affairs when companies compete with each other and neither
has a clear advantage -- the one released most recently is a bit faster and
then soon the competitor has a release and it goes back the other way.

~~~
GeekyBear
What is on your future roadmap doesn't matter at all, if it's not going to
ship in time to compete with your rivals in the current generation.

In the next generation, Apple is going to gain a process node advantage, and
level the playing field for the first time in respect to power and thermal
envelope.

Given that they were already in the same ballpark as Intel and AMD without any
of those advantages, we're in for a sea change.

~~~
AnthonyMouse
> What is on your future roadmap doesn't matter at all, if it's not going to
> ship in time to compete with your rivals in the current generation.

The point is that "current generation" doesn't work like that. If AMD releases
Zen3 on 7nm and Zen4 on 5nm and the ARM Macs arrive half way between one
release date and the other, which one do you want to compare them to? Neither
would be strictly "fair" because one or the other would have an advantage of
several months of advancement one way or the other. But doing all of this just
because they couldn't wait a few more months for Zen4 doesn't make much sense.

We're not going to really know until ordinary people can get it in their hands
and benchmark it.

The whole thing strikes me as Apple seeing Intel stagnating and starting a
long-term process in motion several years ago which was too far along to stop
by the time AMD brought competition back to the market again.

~~~
GeekyBear
>If AMD releases Zen3 on 7nm and Zen4 on 5nm and the ARM Macs arrive half way
between one release date and the other

Zen 3 and Apple's laptop chips are both due in the fourth quarter of this
year.

Apple is going to have a process node advantage at least until Zen 4 ships,
which definitely will not happen only a couple of months after Zen 3 ships.

However, despite Apple having a better hand to play than AMD or Intel for the
first time, I agree that the proof is in the pudding.

------
phendrenad2
Some serious mental leaps taking place here; well done. I take issue with the
assumption that Apple's behavior in the (increasingly niche) macOS
ecosystem will somehow change Microsoft's strategy in any way. This may be the
passing of Mactel, but I doubt Wintel is going anywhere anytime soon.

~~~
scarface74
Microsoft changed strategy with Windows 8, trying to make it tablet-friendly
because of the iPad (even though it shouldn’t have), and the entire reason
behind the Surface line was that other OEMs could only make cheap commodity
computers that didn’t compete with Apple on the high end.

------
jeffbee
From the guy who thought the AT&T Hobbit was going to take over the industry.

~~~
epx
Keep prophesying; one day you'll nail it.

------
Zenst
Interesting read, and the prospect that we may see ARM desktops become more
common, based upon Microsoft making more of an effort with their ARM support,
is also credible - I only recently saw Windows 10 running natively upon a
Raspberry Pi 4 -
[https://www.youtube.com/watch?v=BTT8GlsqXs8](https://www.youtube.com/watch?v=BTT8GlsqXs8)

Equally, Linux desktop offerings have advanced and, by that, become more
palatable for the average desktop user.

Certainly when it comes to most desktops in business, something that can open
Word and email and the odd spreadsheet and PowerPoint covers the bulk of usage
right there. A small, lower-power alternative like the RPi4 would do those jobs
fine for most users. So Apple may actually shift that cultural hurdle that
prevails in business, maybe.

~~~
ThatPlayer
There's already the Microsoft Surface Pro X that uses ARM. It also has a
translation layer of some sort like Rosetta that allows you to run x86 (but
not x86_64) Windows programs on it.

------
pjmlp
As much as I like Apple products, this is wishful thinking. Outside North
America and a couple of tier-one countries, Apple is nowhere to be found
unless one belongs to a rich family that happens to bring an Apple device from
abroad during business travel.

------
skavi
Gassée actually significantly understates the power consumption difference
between the Intel chip in his MBP and an iPad Pro. MBP 13s sustain around 35
watts of power indefinitely. An A12X sustains only around 10 watts.

~~~
dr_zoidberg
Yes, but I read this:

> think faster, svelter laptops actually lasting 10 hours on a battery charge.

...and I instantly think of HP's Envy x360 w/Ryzen 4500U pulling almost 14
hours of battery life in productivity tests.

If Apple wants to stand out, it will have to push closer to 20 hours on a
charge. Not saying they can't, just that it'll be a stretch. And great if they
pull it off.

~~~
icedchai
I'm mainly a Mac user, but recently bought one of these Ryzen laptops. The
performance is amazing, especially for a low end / "budget" laptop.

~~~
dr_zoidberg
I'm not sure Apple will be able to exceed Ryzen's[0] performance (as perceived
by the user). They could get to match it, and of course having tighter
integration between hardware and OS will give them opportunities to optimize
and tune beyond what Windows and Linux can do having to support a wider range
of hardware.

[0] I'm considering Zen 2 "the best performing x86" (with a small grain of
salt). If Intel wants to take that crown with Tiger Lake I'm all for it.

------
bilal4hmed
Is there a possibility that Apple starts gaining such a lead that none of the
others are able to catch up? Could we all be running Apple hardware in, say, 7
years because the rest never could match the performance of Apple silicon, due
to the late start?

Like, what are the odds of Qualcomm today matching the A12 even in 5 years?

~~~
kyriakos
Highly doubt it - even though Apple is a giant, you can't underestimate
everyone else; they are also massive giants. Until now, Qualcomm may have had
no real reason to step out of its incremental performance increases with every
generation, since almost everyone in the Android camp uses its SoCs for their
flagships anyway. Similar to how Intel had been slowly pushing small gradual
upgrades to their CPUs until AMD managed to do something better, and now we
see Intel pushing more cores than before to catch up.

~~~
bilal4hmed
I'm truly ignorant on this. Who is Qualcomm's competition? Given that all
Android phones are running just their chips, what's their incentive to catch up
to Apple when they can just waddle along their current path?

~~~
kyriakos
Apple taking over the ARM laptop market is now their competition. Qualcomm was
to gain big if Windows on ARM took off and chewed off a part of the x86
market share, but as the author of the parent comment suggested, there's a
chance Apple could leapfrog everyone and they won't be able to catch up,
meaning fewer Windows devices. Qualcomm would be at a loss.

~~~
bilal4hmed
Thank you for explaining. I'm also the original commenter, lol.

I still worry about us entering in to a monoculture.

~~~
kyriakos
Sorry missed that :)

------
whalesalad
In the meantime, can anyone recommend a desktop workstation-like device to
start seriously playing around with daily-driving ARM? Something more powerful
than an RPi4 but with good open source support.

~~~
pedrocr
Pine64 is working on a great set of devices:

[https://www.pine64.org/](https://www.pine64.org/)

For a workstation they don't have anything much above an RPi4, but they have
an interesting Mini-ITX form factor for a 7-way cluster and are even working on
an IP camera. Really cool set of devices.

------
m0zg
I don't see how Windows would be able to compete on ARM TBH. Nobody is even
close to the performance of Apple chips, and that gap is likely to grow over
time as Apple piles on more and more specialized compute and takes full
advantage of its vertical integration - something Microsoft cannot do. It's
really nice to have a fairly restricted set of hardware to develop for. It's
really nice to be able to just know that e.g. all your devices have a TPU, and
a certain amount of GPU compute bandwidth, and certain hardware codecs, etc.
On the Windows side this will never be the case unless Microsoft magically
commits itself to chip design the same way Apple did, but even under the best
of circumstances this will take a decade or so. Android is similarly
handicapped.

Intel will be fine for at least the next decade. Businesses will still run on
Intel, as will the cloud. They'll figure out their process trouble and retake
the performance crown from AMD, although by a narrower margin. All of this
has happened in the past. This is due to one simple truth: businesses couldn't
give less of a shit what hardware their software runs on, and moving
everything to ARM is so expensive, it's not going to happen for existing apps.

Most of us here would have a bout of OCD knowing we're not using the latest
and greatest thing, but businesses don't have such a problem.

------
m0ther
x86 and its descendants eating the markets of [iAPX 432, Itanium, i860, i960]
has taught a lesson in the value of an install base in the server and desktop
markets. Let's see if history repeats itself.

Apple has never seemed to really be into the concept of backward
compatibility; but historically the rest of the industry has been.

Whatever happens, I'll bet it won't be boring.

------
LaSombra
So, because Apple is moving their general computing devices to ARM-based
CPU/GPU, the whole world is going to do the same, Intel will die and world
peace will be achieved?

Hold your horses. No one even knows how it will perform, how Rosetta 2 will
perform and if Intel is not going to hit back as it did with Qualcomm and
Microsoft.

Seeing is believing, after all.

~~~
valuearb
It’s already clear they will be significantly faster, mobile battery life will
be significantly longer, and Apple Silicon will be significantly cheaper than
x86 processors. The A12 and A13 already demonstrate this in much tighter
thermal packaging than MacBooks or iMacs, and they use a last-generation
process size.

And Rosetta 2 has already been benchmarked, and the results were amazing even
on a two year old iPad processor.

Intel has nothing to hit back with.

------
m12k
Does anyone know if Microsoft has (or is working on) something similar to
Apple's Rosetta emulation system? It seems to me this is the critical
component that allows a transition to ARM to actually go forward without
having to wait for every last developer to port their applications over.

~~~
lunixbochs
The ARM Surface Pro X took a sort of opposite approach to Apple in several
ways.

Rosetta: usually recompiles once ahead of time, only supports 64-bit

Microsoft: seems to recompile on the fly, only supports 32-bit
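A toy sketch of that difference (purely illustrative — this is not how either translator is actually implemented, and the function and class names here are invented): ahead-of-time translation converts every block before the program starts, while dynamic translation converts blocks lazily the first time they execute and caches the result:

```python
def translate(block: str) -> str:
    """Stand-in for binary translation (x86 -> ARM); here just a tag."""
    return f"arm({block})"

# Ahead-of-time (Rosetta 2's usual mode): translate everything up front,
# so the translation cost is paid once, before execution begins.
def aot_translate(program: list[str]) -> list[str]:
    return [translate(b) for b in program]

# Dynamic (the Windows-on-ARM style): translate on first execution and cache,
# so translation cost is paid at runtime, spread across the program's life.
class DynamicTranslator:
    def __init__(self) -> None:
        self.cache: dict[str, str] = {}

    def run_block(self, block: str) -> str:
        if block not in self.cache:
            self.cache[block] = translate(block)  # translated here, lazily
        return self.cache[block]

program = ["mov", "add", "mov"]
print(aot_translate(program))               # every block translated up front
jit = DynamicTranslator()
print([jit.run_block(b) for b in program])  # "mov" translated once, reused
```

The trade-off the comment describes falls out of this shape: the ahead-of-time approach front-loads the work, while the on-the-fly approach interleaves translation with execution.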

Also notable is the fact that the leaked benchmarks show the A12Z is generally
faster when emulating an Intel CPU with Rosetta 2, than the Surface Pro X is
when running natively, and the A12Z isn’t even Apple’s real desktop CPU; it’s
just what they had on hand from the iPad (!)

~~~
my123
And the A12Z is a two-year-old A12X rebranded (they literally just enabled a
GPU core previously turned off for yields), using the exact same die.

------
varbhat
One Question: Why are they using the word "Apple Silicon" instead of "ARM" ?

Is this marketing move?

~~~
fnordsensei
Marketing, probably, but also accuracy. Apple integrates a bunch of
specialized co-processors as well. They want you to think of this, and the Arm
bit, as one unit.

------
sys_64738
This is wishful thinking. ARM was invented during the 1980s, so it is 'old'
too. Intel tried to move from x86 with Itanium, but that failed, and AMD
released the x64 extensions, eventually causing Intel to put full effort into
those extensions too. Intel is moving to hybrid CPUs with Lakefield, so the
mixture of energy-efficient and high-powered cores isn't restricted to ARM
anymore. Most people in business simply won't mix x86 and ARM binaries on
Windows, so they're going to be in one field or the other. Those same people
want to run those Windows tools at home too. ARM Macs will be a niche product
for those looking to spend excessively.

~~~
nordsieck
> This is wishful thinking. ... Most people in business simply won't mix x86
> and ARM binaries on Windows so they're going to be one field or the other.
> Those same people want to run those Windows tools at home too. ARM Macs will
> be a niche product for those looking to spend excessively.

I think this ignores the historical reasons for x86 success. Intel made their
bones by crushing Unix boxes and mainframes with high volume x86 parts. PC
CPUs were the highest-volume microprocessors for decades, and they leveraged
that into making high-performance yet low-cost server and workstation parts
to buff up their margins.

Well, the tables have turned - ARM is far, far higher volume than x86 due to
its ubiquitous use in smart phones. And ARM is going to do to Intel what Intel
did to IBM, Sun, SGI, HP, DEC, etc. The problem is the same as it always has
been: NRE (fixed costs) for new chips climbs ever higher and needs to be
offset by even more volume to be acceptable to consumers.

ARM won that game in 2007. It's just taken this long for market forces and
product maturity to force Intel's hand. To be fair, Intel may have staved
off the inevitable for a few years without their well publicized 10nm
production stall.

~~~
bgorman
> The problem is the same as it always has been: NRE (fixed costs) for new
> chips climbs ever higher and needs to be offset by even more volume to be
> acceptable to consumers.

While this is certainly true for bleeding edge nodes, hasn't the NRE become
significantly lower for mature process chips? I would guess that EDA tools and
a greater availability of IP blocks would drive down the real dollar cost of
designing your own ASIC on a node 1-2 generations back.

~~~
nordsieck
> While this is certainly true for bleeding edge nodes, hasn't the NRE become
> significantly lower for mature process chips? I would guess that EDA tools
> and a greater availability of IP blocks would drive down the real dollar
> cost of designing your own ASIC on a node 1-2 generations back.

Honestly, I'm not familiar enough to have a nuanced opinion, but from what I
understand, that's probably true.

However, in the battle for cloud VPS and PC CPU sockets, using a 1-2
generation old process doesn't appear to be a winning strategy.

------
partiallypro
Intel has been behind before, and eventually surged ahead. I honestly
think that is going to happen yet again. For the next 5 years Apple's silicon
might be on top, but in 10 years? I'm not so sure.

~~~
formerly_proven
Apple has yet to show any silicon that scales significantly above laptop CPUs,
which is not trivial. Their achievement as a from-scratch, independent vendor
is surely impressive, but let's keep in mind that _all_ performance
comparisons between the architectures so far are based on a single benchmark.

~~~
eyelidlessness
I may be mistaken, I'm not nearly knowledgeable enough about hardware to say
this with full confidence, but wouldn't the very low power and heat
requirements of their laptop-CPU-competitive chips suggest that they have a
lot of room to increase performance simply by increasing clock speed?

~~~
formerly_proven
No, because voltage-frequency scaling is very non-linear (and process and
design dependent) and designs often just have a "brick wall" they can't
progress over.

For example, it was practically impossible to get Zen 1 CPUs over 4 GHz on
air, not because they got too hot, but because they'd just crash almost
immediately, regardless of voltage. Similarly, you can't push the current
Intel generation beyond ~5.3 GHz, regardless of voltage. Current Zen 2 CPUs
are highly binned, some bins cannot go beyond ~4.2 GHz, regardless of voltage
and cooling, some can do 4.7 GHz.
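A rough way to see why extra thermal headroom buys less clock speed than people expect: dynamic CPU power is commonly approximated as P ≈ C·V²·f, and higher frequencies generally require higher voltage, so power grows much faster than the clock. A minimal sketch with made-up, illustrative numbers (the capacitance and voltage values are not from any real chip):

```python
def dynamic_power(capacitance: float, voltage: float, freq_ghz: float) -> float:
    """Classic approximation for dynamic CPU power: P = C * V^2 * f."""
    return capacitance * voltage**2 * freq_ghz

# Hypothetical chip: 1.0 V at 3.0 GHz as the efficient baseline.
base = dynamic_power(1.0, 1.0, 3.0)

# Pushing to 4.5 GHz typically needs a voltage bump, say to 1.3 V.
boosted = dynamic_power(1.0, 1.3, 4.5)

print(boosted / base)  # ~2.5x the power for only 1.5x the clock
```

The super-linear growth is why designs hit a practical wall well before heat alone would stop them.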

~~~
imtringued
The slowest operation on the entire CPU has to finish within a single clock
cycle. This is basically a design issue. If you design your CPU for 6 GHz,
then the slowest operation isn't allowed to take more than 0.17 nanoseconds.
This gets harder the tighter the tolerances are, but we can go way beyond what
current CPUs run at. The reality is that power consumption/heat prevents us
from running CPUs at 6 GHz, so we don't build CPUs that can run at these
frequencies. The manufacturing process can also play a big role (some
processes are more efficient at ~2.x GHz than 4.x GHz), but its biggest impact
by far is reducing power consumption to allow higher frequencies.
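The timing budget described above is just the reciprocal of the clock frequency:

```python
def cycle_time_ns(freq_ghz: float) -> float:
    """Longest propagation delay (ns) any single-cycle path may take."""
    return 1.0 / freq_ghz

print(cycle_time_ns(6.0))  # ~0.167 ns, the 0.17 ns figure for a 6 GHz design
print(cycle_time_ns(5.3))  # ~0.189 ns at the ~5.3 GHz wall mentioned earlier
```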

------
nottorp
Just take a look at this 2013 HN comment:

[https://news.ycombinator.com/item?id=5589410](https://news.ycombinator.com/item?id=5589410)

Kinda prophetic right?

------
victor106
> Intel execs know they missed the Smartphone 2.0 revolution because of
> culture blindness. They couldn’t bear to part with the high margins
> generated by the x86 cash cow; they couldn’t see that lower margins could be
> supported by unimaginable volume

Clayton Christensen’s Innovator’s Dilemma at work.

It’s amazing how Apple, with just 7% market share, can change an
entire industry. They did the same with Flash (for the better) and this time
with the PC. Interesting times ahead.

------
frank2
(1) Many comment writers on this site have been predicting the replacement of
AMD64 with ARM in personal computers for about 12 years.

(2) Apple's announcement had no discernible effect on Intel's market cap,
which, even after adjusting for inflation, is higher today than it has ever
been except for an interval of about 2 or 3 years at the end of the dot-com
boom.

(I used a CPI calculator, a market-cap chart and one of my eyeballs to
estimate the duration of this interval.)

------
halo37253
I love how people like to talk about Apple vs Intel.

But Intel has already been beaten by AMD, and for the foreseeable future as
well. Even on the mobile side, AMD has more performance with less power.

How Apple will stack up against AMD is the real question. This move is great
for their low-powered devices. But when it comes to the Pro devices, I'm sure
they would have more performance for the next half decade if they just moved
to AMD.

------
andy_ppp
I’ve always found it amazing Intel didn’t make their own OS; eventually,
software companies will try to eat you.

------
nkingsy
Earlier discussions assumed some amount of cloud processing to cover for less
powerful chips. Is that not the case?

~~~
m0xte
Not needed now. My A12-equipped iPad has considerably more grunt than my
Intel i7 laptop.

~~~
gruez
>My A12 equipped iPad has considerably more grunt than my i7 intel laptop.

I know this meme gets brought up a lot, but has this claim been backed up by
benchmarks that's _not_ geekbench?

~~~
rayiner
Yes, SPEC, which is a real world suite of applications, like GCC:
[https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-
re...](https://www.anandtech.com/show/13392/the-iphone-xs-xs-max-review-
unveiling-the-silicon-secrets/4)

~~~
gruez
I was too lazy to collate the numbers together, so I found this comparison
chart
[https://old.reddit.com/r/iphone/comments/9lup3m/iphone_xs_sp...](https://old.reddit.com/r/iphone/comments/9lup3m/iphone_xs_spec_2006_integer_performance_compared/e79kikr/).
The A12 looks faster in some cases, but not by much overall. It definitely
doesn't support the claim that it has "considerably more grunt" than an i7.

~~~
m0xte
Some comparison points. It's not all about hard benchmarks but what workloads
you throw at the devices.

1\. The A12Z in my iPad has 2 more Vortex cores than the base A12 and 4 more
GPU cores.

2\. I'm comparing to _my_ laptop, a T470 with i7-7600U in it.

3\. _My_ (I did explicitly say that) laptop's modern equivalent, an i7-8565u,
is dog shit compared to the iPad still and costs 100% more.

Some benchmarks if we need them:

[https://browser.geekbench.com/processors/intel-
core-i7-8565u](https://browser.geekbench.com/processors/intel-core-i7-8565u)

[https://browser.geekbench.com/v5/cpu/2898525](https://browser.geekbench.com/v5/cpu/2898525)

Now if you do any video processing on the iPad, it'll chunk through 4K like my
Ryzen 3700X does if it's HEVC, and barely even get warm, but that's the T2
ASIC in it.

The point being, if you compare a modernish "professional laptop" with an i7
in it, to an iPad, the iPad will rip it a new orifice.

ARM + ASICs for other functions (ML/security/codec) run rings around any
general x86 CPU on performance per watt for sure.

Comparing as you did to a Xeon with 165W TDP is comedy but illustrative yes.

~~~
gruez
>Some benchmarks if we need them:

>[https://browser.geekbench.com/processors/intel-
core-i7-8565u](https://browser.geekbench.com/processors/intel-core-i7-8565u)

>[https://browser.geekbench.com/v5/cpu/2898525](https://browser.geekbench.com/v5/cpu/2898525)

My original question: " _but has this claim been backed up by benchmarks
that's not geekbench?_"

>Now if you do any video processing on the ipad, it'll chunk through 4k like
my Ryzen 3700X does if it's HEVC and barely even get warm, but that's the T2
ASIC in it.

>ARM + ASICs for other functions (ML/security/codec) run rings around any
general x86 CPU on performance per watt for sure.

Comparing hardware-accelerated HEVC performance to software CPU performance
isn't fair and is shifting the goalposts. By that logic you could also say
that a Snapdragon 865 (hardware rendering) has "considerably more grunt" than
a 64-core Threadripper (software rendering). While we're at it, we can also
compare the A12 to Intel/AMD in other aspects, such as memory support,
floating point performance, and sustained performance, in all of which I'm
confident the A12 wouldn't do well.

>Comparing as you did to a Xeon with 165W TDP is comedy but illustrative yes.

So? That Xeon CPU also has 28 cores and 56 threads, whereas the A12 only has 2
"big" cores (the ones being benchmarked). Dividing 165 by 28 gets you a
more reasonable TDP of 5.89 W per core.

~~~
skavi
This review[0] has SPEC2006 scores for the A12X in the 2018 iPad Pro. The same
review estimates per-core power at around 4 W.

This review[1] compares the SPEC2006 score of the A13 to high end modern
desktop chips. The A13 is very close despite its power consumption being much
lower.

[0]: [https://www.anandtech.com/show/13661/the-2018-apple-ipad-
pro...](https://www.anandtech.com/show/13661/the-2018-apple-ipad-pro-11-inch-
review/4)

[1]: [https://www.anandtech.com/show/14892/the-apple-
iphone-11-pro...](https://www.anandtech.com/show/14892/the-apple-
iphone-11-pro-and-max-review/4)

------
tinus_hn
Strange how Microsoft apparently has no way of providing a Rosetta-like
solution for Windows on ARM.

------
rawoke083600
Interesting... Java was the promise of "write once, run anywhere", and I
guess that is somewhat true, but what got there faster and better is the
browser as a platform. For most things, the browser doesn't need Intel or
Microsoft, neither to run it nor to host it.

------
dmitshur
Hmm, this was posted 1 hour ago with 49 points and 84 comments, and it has
already moved down to the 31st slot, off the front page (from somewhere in the
top 15 just 5-10 minutes ago). That seems surprisingly fast.

I noticed because I wanted to come back to read more comments and had a hard
time finding it.

~~~
oarsinsync
Comments outnumbering votes is one of the heuristics used to determine whether
a thread is likely to be toxic rather than promoting good discourse, so that
may be what pushed this off the front page.

~~~
dmitshur
I see, thanks for the explanation.

------
enos_feedler
What I can't reconcile is NVDA's stock price, given that the vast majority of
their revenue and dominance comes from supplying GPUs to the Wintel industry.
What is their position in a world where Wintel starts to wind down?

------
jonplackett
How much of the improved efficiency is really ARM vs Intel, and how much is it
that the new Apple silicon will now be TWO cycles ahead of Intel in
miniaturisation? That must count for something too.

~~~
jcheng
You can look at AMD CPUs if you want an apples-to-apples comparison, as theirs
are on the same TSMC nodes as Apple.

~~~
GeekyBear
AMD is going to go with an enhanced version of TSMC's 7nm process for their
next round of CPUs.

Apple will be using TSMC's new 5nm process for the first version of its
laptop-class chips, so they will have a process node advantage in the upcoming
generation.

~~~
jonplackett
I guess we could compare AMD's new chips with Apple's old ones and see how
that looks.

~~~
GeekyBear
The A13 in Apple's iPhone is on the same TSMC process node as AMD's current
chiplets and compares quite favorably despite a greatly reduced power budget
and a lack of active cooling.

>Last year I’ve noted that the A12 was margins off the best desktop CPU cores.
This year, the A13 has essentially matched best that AMD and Intel have to
offer – in SPECint2006 at least. In SPECfp2006 the A13 is still roughly 15%
behind.

[https://www.anandtech.com/show/14892/the-apple-
iphone-11-pro...](https://www.anandtech.com/show/14892/the-apple-
iphone-11-pro-and-max-review/4)

~~~
imtringued
The Apple core runs at 5 W. Consider that AMD's chips run at significantly
higher clock rates. The power budget isn't greatly reduced; it's basically the
same. Apple has little room for improvement: they have to stay below 2.4 GHz
to keep the same power budget.

Just read the article you linked.

>Apple’s marketing materials describe the A13 as being 20% faster along with
also stating that it uses 30% less power than the A12, which unfortunately is
phrased in a deceiving (or at least unclear) manner. While we suspect that a
lot of people will interpret it to mean that A13 is 20% faster while
simultaneously using 30% less power, it’s actually either one or the other.

~~~
GeekyBear
I did read it. That 5 watt figure is with one of the two big cores fully
loaded. AMD idles with a bigger power draw than that.

>In the face-off against a Cortex-A55 implementation such as on the Snapdragon
855, the new Thunder cores represent a 2.5-3x performance lead while at the
same time using less than half the energy.

[https://www.anandtech.com/show/14892/the-apple-
iphone-11-pro...](https://www.anandtech.com/show/14892/the-apple-
iphone-11-pro-and-max-review/2)

When compared to other designs that operate under the same cooling and power
constraints, there is absolutely no competition.

------
BunsanSpace
Until ARM can deliver processing power comparable to x86-64, power limits be
damned, x86-64 will continue to rule the desktop and large parts of the server
market.

------
gwbas1c
The last time I played around with Visual Studio, I vaguely remember some
options for compiling for ARM.

No surprises if Windows eventually moves to ARM. (And if servers do too.)

~~~
kyriakos
Windows is already running on ARM, in an actual production model that's
been on the market for a while. It's just that Apple does a lot better with
marketing and makes a much bigger deal of these things. Microsoft, on the
other hand, has always been really bad at marketing.

~~~
wlesieutre
Windows for ARM can't emulate Intel x64 software. Nor can it run games using
OpenGL versions above 1.1, among other things.

I disagree that Apple is only better at marketing things like this; they're
also much better at doing it. And their processors are presumably going to be
ahead of anyone else's ARM-based computers, just like they've been with
phones.

[https://docs.microsoft.com/en-us/surface/surface-pro-arm-app...](https://docs.microsoft.com/en-us/surface/surface-pro-arm-app-performance)

~~~
kyriakos
I wasn't comparing the two, just saying that Apple makes a much bigger splash
with its announcements than Microsoft does. A lot of people here haven't
noticed that there's a Windows version running on ARM, but everyone knows
about Apple switching to ARM.

~~~
valuearb
Because no one uses Windows for ARM, so there is no interest in talking about
it. Apple Silicon will be used by millions within months of release.

~~~
kyriakos
That is also known as good marketing.

------
lokedhs
I'm a bit surprised at the use of the term "Apple Silicon". Everybody seems to
have started using it. As best I can tell, it's a term invented by Apple's
marketing department, meant to imply that Apple is producing some magic that
goes beyond the CPU.

Readers of this site will of course understand exactly what is being
discussed, but I'm concerned that the overuse of this marketing term confuses
things for non-technical readers. There are already more common, and more
accurate terms that can be used instead.

~~~
tim--
For CPUID, it makes sense.

       GenuineIntel
       AuthenticAMD
       AppleSilicon

~~~
lokedhs
The downvotes sure came quickly on my original post.

My point was that we don't refer to AMD CPUs as "Authentic AMD" in general
speech. However, in Apple's case they have succeeded in promoting their own
narrative.

I understand that this happens, but I was just intrigued by the fact that they
managed to do it. The difference in terms does have a subtle effect on
conversation, however.

------
riffic
Cool, I really hope this means people will stop saying "Wintel" now,
especially in places like job descriptions (where it just looks stupid.)

~~~
amelius
Yes, and maybe we can start calling Apple computers PCs (Personal Computers),
which is what they are. At least they seem more personal than my multiuser
Linux desktop system.

~~~
macintux
A few commentators have noted that it’s unfortunate we renamed microcomputers
“personal computers” because clearly smartphones are much more personal.

~~~
Gibbon1
Personal except for the fact that you really own your smartphone in name
only, whereas you owned a 1980s personal computer lock, stock, and barrel.

------
mensetmanusman
I have been using an iPad for over 10 years, am a huge fan.

But it’s disingenuous to compare benchmarks for single-threaded performance
between the iPad and MacBook.

I can’t do any ‘very’ serious productivity work on the iPad that requires
multiple apps, period. (Try moving a photo between multiple photo-editing
apps, ha.)

I hope that is just a ‘feature’ of iOS, or else macOS will suffer greatly as
people leave it for more productivity-focused OSes.

~~~
matthewmacleod
This is not at all disingenuous. The fact that the environment in an iPad is
restrictive versus a general-purpose computer is not particularly related to
the performance of the processor powering each. There is no reason to think
that (when the environment allows it) the performance of more “serious” apps
will not be equivalent.

------
vsskanth
On a related note, why aren't Qualcomm ARM chips performing as well on
benchmarks?

~~~
skavi
They just aren't as good[0].

[0]: [https://www.anandtech.com/show/14892/the-apple-iphone-11-pro...](https://www.anandtech.com/show/14892/the-apple-iphone-11-pro-and-max-review/4)

------
shmerl
I'd stick with AMD and Linux.

------
MangoCoffee
i feel like i've seen this movie before. ah yes, PowerPC.

Steve Jobs reveals the transition to Intel:
[https://www.youtube.com/watch?v=ghdTqnYnFyg](https://www.youtube.com/watch?v=ghdTqnYnFyg)

i don't know if history will repeat itself, but it will be interesting to
watch.

~~~
rvanlaar
Indeed, it will be interesting to watch. Last time they had Steve to run the
show.

If the last few years are any guide, it will crash and burn. Apple hasn't
been developer- and power-user-friendly for some time now: thinner laptops at
the expense of the keyboard, and the Touch Bar, to name two examples. I've
been hearing from skeptical developers who are planning to buy before the
switch to ARM, which is a different vibe from when Apple switched to Intel.

It could be a massive marketing stunt from MS if they announce a more
macOS-like version of Windows which actually runs on Intel.

~~~
scarface74
Steve wasn’t there for the 68K to PPC transition.

------
shantara
One thing people prophesying about the Apple ARM transition changing the
industry tend to forget: Apple is not a leader, but actually a latecomer in
the ARM space.

macOS is the last major OS to get ARM support. iOS and Android have been
built with ARM support from the very beginning. The first commits adding ARM
support to Linux were made more than a decade ago. The first publicly
released Windows on ARM hardware came out in 2012, and MS had undoubtedly
been working on it for years before the commercial release.

If anything triggers an ARM transition on a massive scale, it will likely be
the work invested in high-performance ARM hardware by companies such as
Amazon and Fujitsu.

~~~
eyelidlessness
Besides the various mentions of iOS being on ARM, it's also worth mentioning
iOS is a fork of macOS (née Mac OS X), and was explicitly branded as _the same
OS_ when the iPhone launched. In other words, macOS has been capable of
running on ARM—at least internally at Apple—for 13 years.

It's highly likely that Apple has maintained internal macOS ARM builds for
every successive release since, if not since before the Intel transition. They
had similarly maintained x86 builds of macOS from the time they began its
development after the NeXT acquisition.

