
Why Apple ditched PowerPC, and what it says about Apple ditching Intel - kristianp
https://tedium.co/2020/06/16/apple-powerpc-intel-transition-history/
======
mdasen
I think this article paints a rosy picture of the PowerPC. I was a Mac user
and owned G3 and G4 Macs (and a PowerPC 603 Mac). It wasn't a happy time that
suddenly came to an end with the G5 and a decaying relationship. IBM and
Motorola had been struggling to keep up with Intel for a long time. Apple kept
trying to spin it and the next-great-thing was always just around the
corner...the problem is that Intel kept getting there faster and cheaper.

Apple would talk about the "MHz-myth" a lot. While it's true that MHz doesn't
equal performance, Intel was doubling the PowerPC's performance most of the
time. The G3 saw Apple do OK, but then Intel went back to dominating in short
order. The PowerPC never matched Intel again.

It was really bad. People with Windows computers just had processors that were
so much more powerful and so much cheaper.

You can say that Apple always charges a premium, but it's not too large today
on their main lines. Apple simply doesn't sell low-end stuff. Yes, a MacBook Pro
2GHz costs $1,800 which is a lot. However, you can't compare it to laptops
with crappy 250-nit, 1080p screens or laptops made of plastic, or laptops with
15W processors. A ThinkPad X1 Carbon starts at $1,553 and that's with a 1080p
display rather than 1600p, 400-nits rather than 500-nits, 8GB of RAM rather
than 16GB (both soldered), and a 15-watt 1.6GHz processor rather than the
28-watt 2GHz part. Heck, for $1,299 you can get something very similar to the
ThinkPad X1 Carbon from Apple (though with a 1.4GHz processor rather than
1.6GHz) - $250 cheaper!

The point of this isn't to say that you can't get good deals on Windows
computers or that there's no Apple premium or even that there's any value in
Apple's fit-and-finish that you're paying for. This is to say that I remember
things like the original iMac with CRT display, 233MHz G3 processor, 13"
screen (when people wanted 15-17" screens), and an atrocious mouse going
against Intel machines for half the price with nearly double the speed and
better specs on everything other than aesthetics. Things were really bad
trying to argue that someone should spend $1,300 for an iMac when they could
get a Gateway, eMachine, Acer, etc. for $600 with a 400MHz processor rather
than 233MHz. A year later, Apple's at 266MHz while Intel has released the
Pentium III and is cranking it up from 400MHz to 600MHz that year.

Yea, you can point to $700 laptops today and say, "why buy an Apple for
$1,300?" Sure, but at least I can say that the display is so much better (500
nits vs 250 nits and retina), it's lighter than those bargain laptops, fit-
and-finish is so much better, etc. At least I'm not saying, "um, no...all
those benchmarks showing the Windows machine twice as fast...um...and the
mouse is so cool because it's translucent...you get used to it being
terrible." It's very, very different from the dark days of 2000.

Plus, today, a price premium doesn't seem as bad. Back in 2000 when you
thought you'd be upgrading every 2-3 years, you'd be shelling out a lot more
frequently. If performance doubled every 18 months, 3 years later you'd be
stuck with a computer running at 1/4th the speed of something new. With the
slowdown in processor upgrades, paying for premium hardware doesn't seem like
throwing money away in the same way.
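To put a number on that, here's the doubling arithmetic above as a quick sketch (just the claim in the paragraph, nothing measured):

```python
# If performance doubles every 18 months, a 3-year-old machine's speed
# relative to a brand-new one is:
doublings = 36 / 18                 # 3 years = 2 doubling periods
relative_speed = 1 / 2 ** doublings # 0.25, i.e. 1/4th the speed of something new
```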

The article also paints the RISC architecture as superior. I'm not a chip
expert, but most people seem to say that while RISC and CISC architectures
have a different history, modern CPUs are hybrids of the approaches without
huge advantages inherent in their ideology. Frankly, if Intel were able to get
down to 7nm and 5nm, Apple might not be looking at ARM as strongly.

I think it also paints Apple as some sort of more demanding customer. In some
ways, sure. Apple likes to move things forward. However, it's not like
MacBooks are that different from PC notebooks. The difference is that Apple
has options. They can move to another architecture. Windows manufacturers
don't really have that. Sure, Windows on ARM has been a thing, but Microsoft
isn't really committed to it. Plus, Windows devs aren't as compliant when it
comes to moving architectures so a lot of programs would be running slowly
under CPU emulation.

The big issue is that Intel has been stuck for so long. Yes, they've shipped
some 10nm 15-watt parts and even made a bespoke 28-watt part for Apple. It's
not enough. I'd argue that PC sales are slow because Intel hasn't compellingly
upgraded their processors in a long time. It used to be that every 18 months,
we'd see a processor that was a huge upgrade. Now it's 5 years to get that
upgrade.

There's a trade-off between custom products and economies of scale. With the
iPhone using so many processors and TSMC doing so well with its fab, Apple now
kinda doesn't have to choose. Intel has been charging a huge premium for its
processors because people were locked into the x86 and it takes a while for
new competition to happen. Their fabs have fallen behind. It looked like they
might be able to do 10nm and move forward from that, but that doesn't seem to
be working out too well for them.

The transition from PowerPC to Intel was about IBM and Motorola not being able
to deliver parts. They were falling behind on fabs, they weren't making the
parts needed for Apple's product line, and it was leaving Apple in a position
where they simply had inferior machines. The transition from Intel to ARM is
about Intel not being able to deliver parts. It wasn't simply a short time
when they couldn't deliver enhancements, but a decently long trend on both
accounts. Apple knows it can deliver the parts it wants with its own
processors at this point. The iPhone business is large enough to ensure that
and they can make laptop parts that really fit what they're trying to market.
Intel got Apple's business because they produced superior parts at a lower
price. They're losing Apple's business for the same reason.

~~~
tartoran
> I'd argue that PC sales are slow because Intel hasn't compellingly upgraded
> their processors in a long time. It used to be that every 18 months, we'd
> see a processor that was a huge upgrade. Now it's 5 years to get that
> upgrade.

I think it’s Moore’s law approaching its limits. The direction of chip
improvement isn’t core speed but the number of cores, power usage, etc., which
make a difference but for most folks don’t feel like an improvement the way,
say, doubling the frequency every 18 months did. I have an old laptop and it
keeps up quite nicely after 8 years...

~~~
hmottestad
This is me today. I'm typing this on an 8 year old macbook pro. First gen
retina. 4-cores and 16GB of ram. I want to upgrade, I really really do. I have
a 16" with 8-cores and 64GB of ram at work, but I can't bring myself to
purchase one for myself since I've been telling myself I would wait for 10nm.

The first 14nm processors started shipping in the 15" in 2015 - 5 years ago.

My current 8 year old macbook has a 22nm processor. I would never have thought
8 years ago that Intel would only have managed a single node shrink since
then.

~~~
ChuckNorris89
_> I have a 16" with 8-cores and 64GB of ram at work_

I'm jealous and genuinely curious where you guys work that your employers
can afford to get everyone such expensive machines.

I've been a dev in the EU for 8 years now and at most places I've worked or
interviewed (not FAANG) the machines you get are some cheapo HP/Lenovo/Dell
with only the executives having Apple hardware.

I never understood why companies in the west cheap out on hardware so much
since compared to the cost of office rent and employee salaries that's a drop
in the ocean, they could buy everyone MacBooks or Ryzen towers and it wouldn't
even dent their bottom line.

~~~
geerlingguy
There are a few places that kind of go 'to the nines' for employees and give
adequate and even overpowered workstations. I think the crowd that gets that
treatment is slightly over-represented on HN.

But most businesses here are the same: you're lucky to get any nice feature
over the 'same laptop that sales gets' which is barely more than a Chromebook.
And getting an external monitor that's not the cheapest bulk-buy model was
also pretty hard to do (I had a friend in marketing who helped me get a larger
display with better colors at that place).

~~~
emsy
Or you go self employed and get your own fancy workstation since you know it’s
easily worth your money in the long run. Don’t work for people that don’t
understand this.

------
jarjoura
Internally Apple always had an x86 build of OS X running. Just like I’m sure
they have an ARM build running today.

Intel chips ran cooler, had better power efficiency and were way faster at a
lot of things than PowerPC.

Leadership was actually super reluctant to switch and it took a demo from an
engineer showing the massive improvement to convince them.

If Apple is ready to switch to ARM they must have some impressive CPU. Apple’s
not usually one to dabble here and there, so if they switch it’s going to be a
wholesale ordeal. What will this look like for the Mac Pro?

~~~
fluffything
> If Apple is ready to switch to ARM they must have some impressive CPU.
> Apple’s not usually one to dabble here and there, so if they switch it’s
> going to be a wholesale ordeal.

Apple has an ARM-based product on sale today that has a better screen, better
battery life, and better performance than the 2020 Macbook air for many
workloads, including compiling and doing development work (it crushed the air
in the benchmark sets that can be run there).

That product is the 2020 ipad pro, and that is why there are so many posts
about people actually trying to turn them into development machines.

This product is one step away from being a real laptop: it is missing real
MacOSX.

The performance of the Intel-based Macbook Air has increased by 1.8-1.9x from
2012 to 2020 (that's 8 years). Apple already has internal A13 prototypes, and
A14 is probably going to the prototype phase right now.

I can't imagine the numbers working in favor of Intel. By 2021 Intel can
probably deliver a 1.1-1.2x speed up tops for Macbook airs, while Apple can
probably deliver a 2x speed up for 2021 with the A13 and another 2x one for
2023 with the A14. Being able to scale the same chip for iphones, ipads, and
macbooks, reusing internal resources, and avoiding the "hassle" of having to
deal with Intel.
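Those figures imply very different annualized rates. A back-of-the-envelope sketch (the inputs are the claims above, not measurements):

```python
intel_total, intel_years = 1.85, 8    # ~1.8-1.9x from 2012 to 2020
apple_total, apple_years = 2.0, 1.5   # ~2x per ~18-month A-series generation

# Compound annual improvement rate: total_gain ** (1 / years)
intel_annual = intel_total ** (1 / intel_years)   # ~1.08, about 8% per year
apple_annual = apple_total ** (1 / apple_years)   # ~1.59, about 59% per year
```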

The only thing that's IMO up in the air is what the discrete graphics story is
going to be for macbook pros and Mac pros.

It's unclear whether it would make sense for AMD to deliver discrete gfx
products that interface well with ARM, and the Apple-nvidia bridge burned a
long time ago. So I wonder whether Apple has openings for electrical engineers to
work on discrete graphics verification & design, and driver developers. It
would strongly hint that apple would design their own discrete gfx in house.

~~~
ajross
The A12 is clearly at parity now vs. Intel's existing mobile offerings,
probably somewhat ahead given the long delays with 10nm.

The question upthread is whether or not switching architectures makes
financial sense, not whether it's a (mild) technical win.

Switching to their own chips cuts Intel out of the loop, but as far as
business risk that simply replaces one single source manufacturer with another
(TSMC).

It probably saves money per-part, which is good. But then Apple is still
drowning in cash and immediate term savings really aren't much of a motivator.

> By 2021 Intel can probably deliver a 1.1-1.2x speed up tops for Macbook
> airs, while Apple can probably deliver a 2x speed up for 2021 with the A15
> and another one for 2023 with the A16.

That's going to need some citation. Moore's law is ending for everyone, not
just Intel. TSMC has pulled ahead of Intel (and Samsung has caught up) for
sure, but progress is slowing. That kind of scaling just isn't going to happen
for anyone.

~~~
fluffything
> That's going to need some citation.

So this is Geekbench 5 showing the difference between a 2012 macbook air i5
and a 2020 macbook air i5 - both base models and similar price:
[https://browser.geekbench.com/v5/cpu/compare/2613713?baselin...](https://browser.geekbench.com/v5/cpu/compare/2613713?baseline=2613509)

And this is a thread in the rust subreddit about compilation speed on macbooks
where some users report the performance increase for different generations of
macbook pros and macbook airs, if you want a more "realistic benchmark" to
calibrate geekbench results:
[https://www.reddit.com/r/rust/comments/gypajc/macbook_pro_20...](https://www.reddit.com/r/rust/comments/gypajc/macbook_pro_2020_compile_performance/)

This is the ipad pro 2020 crushing the macbook air 2020:
[https://browser.geekbench.com/v5/cpu/compare/2612714?baselin...](https://browser.geekbench.com/v5/cpu/compare/2612714?baseline=2613509)

And this is the improvement from the previous generation ipad A10 to the ipad
pro's A12Z - a 2x speed up in a single generation:
[https://browser.geekbench.com/v5/cpu/compare/2613991?baselin...](https://browser.geekbench.com/v5/cpu/compare/2613991?baseline=2613301)

You are definitely right that Moore's law is hitting Intel hard. But AMD is
still doing quite well, nvidia and "ati" are doing incredibly well, and Apple
chips have been doing extremely well over the last couple generations.

Maybe you are right, and Apple won't be able to deliver 2x speed ups in the
next 2 generations. I'd expect that, just like for Intel, things won't
abruptly change from one gen to another, but for this to happen over a longer
period of time. Right now, only apple knows what perf their next 2 gens of
chips are expected to deliver.

The only thing we know is that Apple ARM chips are crushing their previous
generation both for ipads and iphones year after year, and now they are
betting on them for macbooks, and potentially mac pros, probably for at least
the next 10-15 years.

~~~
ajross
> This is the ipad pro 2020 crushing the macbook air 2020

You keep coming back to that citation. It's more than a little spun. The parts
have comparable semiconductor process (Intel 10nm vs. TSMC 7nm) and die size
(146.1 vs. 127.3 mm2). But the A12Z in the iPad is running as fast as
Apple can make it (it's basically an overclocked/high-binned A12X), where the
Intel part is a low power, low-binned variant running at about half the base
clock of the high end CPUs, with half the CPUs and half the L3 cache fused
off.

A more appropriate comparison would be with something like the Core i7 1065G7,
which is exactly the same die and can run in the same 12W TDP range but with
roughly double the silicon resources vs. Apple's turbocharged racehorse.

~~~
klelatti
But the A12Z (based on the A12) is not using the latest TSMC process as used
for the A13 (which is almost certainly in higher volume production than Intel
10nm).

Plus, if Apple can afford to put an overclocked / high-binned TSMC chip in the
lower-cost iPad but has to put a low-binned i5 in the Air, doesn't that say
something about the relative economics / yields?

For what it's worth I have a Core i7 1065G7 and it's decently fast but gets
very hot and definitely needs a fan (which the iPad doesn't) and has good
battery life (but not as good as the iPad's).

The advantage still seems to me to be very strongly with the Apple parts.

~~~
ajross
> if Apple can afford to put an overclocked / high-binned TSMC chip in the
> lower-cost iPad but has to put a low binned i5 in the Air doesn't that say
> something about the relative economics / yields.

Potentially. It probably also says more about the relative product positioning
of the iPad Pro (high end, max performance) vs. MacBook Air (slim, light, and
by requirement slower than the MBP so that the products are correctly
differentiated).

The point is you're reaching. The A12 is a great part. TSMC is a great fab.
Neither are as far ahead of the competition as Apple's marketing has led you
to believe.

------
_ph_
The key element in the CPU market space is volume. Volume lowers the cost of
manufacturing and allows you to spend much more money on R&D as it is
amortised over more devices. While all the big RISC manufacturers had in
principle better architectures than Intel, in the 90s Intel killed them
one by one due to the insane volumes of the PC market. Only in the server
space did PowerPC and SPARC survive. This is what forced the PowerPC to Intel
transition; Apple had little choice. I always keep wondering what the outcome
would have been if one of the large RISC platforms had been made
available in more consumer-level products, e.g. offering an ATX motherboard
for running Linux. Volumes would have been much larger.

Another big factor for Intel was, that, financed by their huge cash flow, they
had the most advanced fabs, so competitors often were 1-2 generations behind
in the available processes.

But now a few things have changed. First of all, Intel got stuck with their
10nm process, so they are no longer the manufacturing leader. But most
importantly, TSMC pulled ahead of Intel and offered their services to
everyone in the market. For the first time, AMD had a manufacturing advantage
over Intel.

And the iPhone happened, giving Apple an almost endless supply of money and a
huge volume. Over many years, Apple built up a leading chip-design team. This
already paid off big with the iPhone, having by far the most powerful CPUs in
the mobile space. This also gave Apple a clear insight into the advantages of
really owning the whole platform: designing the software and the CPUs together.

Offering desktop-class CPUs is of course a large additional investment - so it
is not a trivial step. But if Apple is willing to do it, it should be very
interesting and would give hope that they have ambitious plans with the Mac,
as it only makes sense if they really push the platform.

~~~
nextos
The big problem with switching CPUs is the instruction set.

For most people, it doesn't matter. But if you are in some niche domains, it
really has an impact. I don't expect a smooth transition of libraries such as
BLAS or VMs such as the JVM. You can't simply recompile these. You typically
need a human to rewrite SSE, AVX and other tricky low level code so that
performance stays competitive.
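To make that concrete, here's a minimal sketch (the function is illustrative, not from BLAS or any shipping library) of why a recompile alone doesn't carry this kind of code across architectures: the fast path below is written against x86 SSE intrinsics, which simply don't exist on ARM, so someone has to rewrite it (e.g. with NEON) for performance to stay competitive.

```c
#include <stddef.h>
#if defined(__SSE__)
#include <xmmintrin.h>  /* x86-only header: SSE intrinsics */
#endif

/* Element-wise float addition. On x86 the SSE path handles 4 floats per
   instruction; on ARM that path doesn't compile at all, so only the slow
   scalar loop remains until someone writes a NEON equivalent by hand. */
void add_arrays(const float *a, const float *b, float *out, size_t n) {
    size_t i = 0;
#if defined(__SSE__)
    for (; i + 4 <= n; i += 4)
        _mm_storeu_ps(out + i,
                      _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
#endif
    for (; i < n; i++)  /* portable scalar tail / fallback */
        out[i] = a[i] + b[i];
}
```

The results are identical either way; only the speed differs, which is exactly why this kind of port needs a human rather than just a recompile.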

~~~
spacedcowboy
Apple has the Accelerate.framework already, which is hand-tuned per chip-type,
and is what most of the libraries call into. I’d imagine a lot of work will
have been done to make that as seamless as possible on the new chips.

It’s also kind of useful for a framework team to be able to call up the guy
designing the next CPU and say “this bit here is a bottleneck, what can
you do for that?”...

~~~
Someone
Moreover, BLAS is part of the Accelerate framework
([https://developer.apple.com/documentation/accelerate](https://developer.apple.com/documentation/accelerate)),
so if your code targets that, no porting is needed.

~~~
physicsguy
You still need to recompile and relink. And it's not that simple; Apple's
implementation of LAPACK is well out of date, for example - it dates back to 2009.

------
protomyth
The book "The Race For A New Game Machine: Creating the Chips Inside the XBox
and the Playstation 3" by David Shippy has some commentary on Apple with their
relationship with IBM and the Cell. It's an interesting book and gives some
reasons for Apple to ditch PowerPC.

------
mister_hn
There was less software running on PowerPC than x86 back then, and there's
still less software running on Arm than x86, especially the key professional
packages (above all, the Adobe suite, AutoCAD, Blender, Ableton, etc.)

Arm chips might even be faster for Apple, but macOS isn't as locked down as
iOS, and power users will refuse to give up that (small) liberty available on
the desktop system that's absent on the mobile one.

An OS where there's only one way to install software (the App Store) is a huge
limitation, and even Microsoft itself learned that mistake with Windows 10 S,
offering customers a way to switch to the standard version.

~~~
robertoandred
macOS running on ARM doesn't mean the App Store would be the only way to
install software.

~~~
mister_hn
Sure?

------
klelatti
Whilst most of the discussion has been on Intel vs ARM performance and power
consumption as a rationale, it's probably worth mentioning two others:

\- Complete control of the Silicon. Apple will be able to place its own
silicon IP on the new ARM chips. Does this mean the integration of the T2 onto
the main SoC? Adding Neural Engine hardware? None of this would be possible
with Intel and this would seem to provide interesting opportunities for Apple
to differentiate the Mac from the PC market.

\- Economics. It seems likely that the ARM chips will be materially cheaper
for Apple to buy than comparable Intel chips, although Apple will have fixed
design costs to meet that it wouldn't have if it stuck with Intel. Any advantage
would grow if Mac volumes increase which would make it advantageous to try to
grow market share. Is this the start of a push to grow Mac volumes
significantly?

~~~
pier25
I think the lower heat density of ARM, and the increasing heat of AMD and
Intel is another interesting point.

AMD and Intel are racing to smaller manufacturing processes that inevitably
will increase heat density.

Today the most powerful laptops are those huge PC gaming bricks which of
course are much more powerful than any MBP. This is only going to get worse as
heat density increases, at least for demanding applications (gaming, 8k video
editing, vfx, etc).

By moving to ARM, Apple will be able to offer much more performant laptops in
a much smaller form factor which will only differentiate Macs even more from
the PC world. At least in theory.

If this works I wouldn't be surprised if PC laptops moved to ARM too a couple
of years later.

~~~
harpratap
> If this works I wouldn't be surprised if PC laptops moved to ARM too a
> couple of years later

Hasn't PC world ALREADY started the transition to ARM? Snapdragon based
laptops already started shipping with SD835, Microsoft already has Windows S
for such ARM laptops and many OEMs are already making experimental foldable
ARM based devices that can take advantage of these small chips. Apple would be
just retro-fitting their ARM chips in the shell of Macbooks 2 years too late.

~~~
pier25
> Hasn't PC world ALREADY started the transition to ARM?

Sure there are some ARM Windows devices but AFAIK this is not really a trend
to moving away from Intel/AMD.

We'll see tomorrow, but it seems Apple intends to move all their laptops to
ARM, not just the smaller/cheaper ones.

~~~
int_19h
Surface Pro X is not a smaller/cheaper device, either.

------
nojito
>In the early days of the Apple/Intel partnership, their use represented
something of a “pressure valve” on processor limitations that the Power Mac G5
created for Apple’s processor line. It helped solve a plateau in Apple’s
laptops, which weren’t able to take advantage of the 64-bit architecture that
the PowerPC G5 had promised to consumers.

I doubt Apple will ever forgive Intel for missing the Merom release
date...forcing Apple to support 32bit for an extra decade.

------
fortran77
I don't think they're exactly comparable. Ditching PPC (and I was working at
Apple at this time) was a bold move. The existing Apple loyal user base --
which Jobs wisely knew was irrelevant -- loved having a "different"
"Supercomputer" CPU at the heart of their computer instead of the "slow-as-a-
snail" Intel. Jobs knew it was better to appeal to the rest of the world than
be true to the "true believers" -- who would have been happy with OS9, too.
But it was taking a risk:

To the True Believers it didn't matter that by this time ~2003, Intel was fast
and much more power efficient. You'd be lucky to get 40 minutes of battery
life from a PowerPC based Mac laptop at the time when Intel laptops could run
for a few hours.

Today, Apple doesn't have a core group of users who are "proud" of their
unique CPUs, and isn't fighting an uphill battle as they were in 2003-2005 or
so. However, sometimes they choose a tech for "stubborn" reasons rather than
technical ones and it's not clear if the ARM decision is made for the right
reasons. For example: I think not choosing NVidia, especially for the Mac Pro,
was a big mistake and costs them customers.

~~~
dylan604
>I think not choosing NVidia, especially for the Mac Pro, was a big mistake
and costs them customers.

Apple's reluctance to use Nvidia has been a total head scratcher. I owned a
2011 with Nvidia dedicated GPU, but this was the line with known manufacturing
defects. I had the mainboard replaced twice because of this issue, but
eventually replaced the laptop when the GPU failed again. It's like Apple is
holding a grudge.

~~~
slantyyz
I owned an early 2011 15" MBP with a discrete AMD Radeon GPU which had very
well known serious manufacturing defects.

I didn't realize Apple had any 2011 model year Macs that used nVidia GPUs.

~~~
dylan604
I think the 2011 was the last year a discrete Nvidia option was available.

------
tromp
The article title "Power Outage" is a very cute description of Apple's
migration away from PowerPC.

------
MistahKoala
Perhaps a naive question, but what will this mean for Bootcamp users?

~~~
jchook
Apparently Windows supports ARM, so in theory Apple will continue to support
Bootcamp.

> (notably, one thing Apple does not need to give up is Windows support:
> Windows has run on ARM for the last decade, and I expect Boot Camp to
> continue, and for virtualization offerings to be available as well; whether
> this will be as useful as Intel-based virtualization remains to be seen).

[https://stratechery.com/2020/apple-arm-and-
intel/](https://stratechery.com/2020/apple-arm-and-intel/)

~~~
paulryanrogers
Windows may support ARM. Yet the reason most people want Windows is
compatibility. And if Windows ARM doesn't run most of their software then it's
a step backward for them.

~~~
dtech
It _might_ give a small boost to Windows-on-ARM that Microsoft has been trying
for over a decade. Porting a typical windows app might be easier than porting
to Mac/Linux because you still have DirectX and all the Windows libraries.

There's also x86 emulation on ARM. It's slow, but it might be enough to run
that 20 year old business app.

~~~
slantyyz
Wasn't Intel saber-rattling about patent lawsuits when Microsoft announced
that Windows for ARM would run some x86 apps via emulation? [1]

While I don't know what became of that, I can see Microsoft working out a deal
with Intel because Windows is still huge on x86 and wasn't going anywhere.

On the other hand, if Apple is planning to completely drop Intel in favor of
ARM and wants to implement x86 emulation, I can't see Intel letting OSX ARM
emulate x86 without some form of resistance.

[1]
[https://www.forbes.com/sites/tiriasresearch/2017/06/16/intel...](https://www.forbes.com/sites/tiriasresearch/2017/06/16/intel-
threatens-microsoft-and-qualcomm-over-x86-emulation/#2ed6c54354f4) (sorry
about the link being Forbes, it was the first search result for the keywords I
used)

~~~
whereistimbo
But what IP would be violated by x86 emulation? It seems Intel only
threatened chipmakers who tried to add x86-emulation-acceleration ISA to the
silicon. The x86 ISA was essentially complete 17 years ago, which means a lot
of its patents have expired by now.

------
Sloppy
I ditched my Windows PC when Apple moved to Intel. I will ditch my MacBook
when it uses ARM.

Not dogma, just practical. I'm a `nix SW Dev and require that the 95% of code
that works on AMD64 runs on my machine. Odd to see MS now supporting more
`nix. What we (many SW Devs) need is `nix + AMD64 for the foreseeable future.

Furthermore I do not trust Apple. The advent of the iPhone software lock-in
ecosystem shows where they want to take Macs and that is just a non-starter
for me.

Bye-bye Apple.

~~~
pfranz
I don't think that's quite a given. I don't think Apple has changed as much as
you're implying. When OSX was first released, even though it was Unix-based,
most people were pretty confident it wouldn't ship with a Terminal. Apple has
always been very opinionated and strongly biased towards a user-facing
experience. "Techies" have always been a bit skeptical of what Apple may do
and should continue to be.

Linux already has a very healthy ecosystem on ARM with Raspberry Pi and
others. Heck, it had a healthy ecosystem back in the day on PPC with things
like Yellowdog Linux. I don't think an ARM transition will change this part as
much as people think. You've always needed to recompile for macOS.

------
annoyingnoob
With all of the security issues surrounding Intel CPUs in the last couple of
years, not having an Intel processor will be a real advantage for Apple. And
competition is a good thing.

~~~
AnthonyMouse
If your impression was that modern ARM CPUs weren't affected, I have some bad
news for you:

[https://wiki.netbsd.org/security/meltdown_spectre/](https://wiki.netbsd.org/security/meltdown_spectre/)

~~~
gsnedders
That chart doesn't include Apple's implementations, but they were some of the
relatively few non-Intel CPUs to be affected by Meltdown as well as Spectre.

------
atarian
Never thought that I would see the day that Intel would lose its dominance.
Every time they came out with a new processor back in the 90s, it was like a
new iPhone.

------
github-cat
More about Apple's CPU architecture story can be found at
[https://www.pixelstech.net/article/1592625055-Why-Apple-
Mac-...](https://www.pixelstech.net/article/1592625055-Why-Apple-Mac-switches-
its-CPU-The-war-between-Intel-and-ARM). An interesting read.

------
sprash
P.A. Semi built a powerful and power-efficient Power ISA processor which
solved all the problems Apple had with the Power architecture at the time.

What did Jobs do? He bought the company and closed it immediately so that
nobody noticed that the switch to Intel was not only completely unnecessary
but also a big mistake.

~~~
selectodude
He didn't close it, he bought it and set them on ARM chips. Which have been
very successful.

~~~
runjake
And who knows, maybe Jobs didn’t set them on anything. Maybe he bought PA
intending to use their existing bus technology but someone at PA pulled him
aside and said “Hey, we don’t think you should waste your time with this. How
about we design [what is now their ARM line]?”

------
NotSammyHagar
I think amd may be on the verge of fixing a lot of the problems that intel has
run into in their inability to move forward quickly - the 10nm switch is
potentially devastating for intel (intel's slow pace of advancement is kind of
what the article eventually gets to). If apple said they were going to focus
on amd chips at this point, the market would be excited. Are those apple arms
going to be able to really handle the cpu load and scale over time like large
market intel and amd design teams are scaling investments over billions of
devices? I'd just be afraid apple isn't quite big enough. It's very exciting
when a change like this comes along in any case. Will they run x86 'legacy'
mac programs at a reasonable speed?

------
mavhc
At the same time in the mid 80s Acorn was an even smaller company that wanted
to make its own CPU, and managed it, and by 1992 had a SoC version with GPU
and MMU built in (ARM250)

------
DonHopkins
>Gassée is certainly correct that Aquarius likely played a historical
precursor to Apple’s current processor ambitions, but it likely also played an
indirect role in its first major processor shift—that from Motorola’s 68000
series of processors used in the Apple Macintosh of the time, to the PowerPC,
which eventually took off in a big way in the 1990s.

Hey, Apple's first major processor shift was from 6502 to 68000!

------
liquidify
There's no way to easily maintain both an ARM ecosystem and an x86 one, but it
sure would be nice if we got to choose our hardware more in our Macs. They
could solve a lot of people's issues by letting people customize basics like
their laptop ports, MagSafe or not, keyboard / touchpad or not, and T2 chip or
not.

They would have to open the OS up a lot to allow choice of CPU architecture,
which they will never do.

------
doggydogs94
For the end user, the migration to the new platform will be technically
simple, but confusing for the average Mac user. For developers, the migration
will require much more work (testing and coding for both platforms) for the
foreseeable future. Hopefully, the development tools will ease the pain
somewhat.

------
urda
What does this mean for current x86 needs? Will Apple just "bridge" it for a
while like their transition to Intel? Do they have a shim or some other way to
handle it?

------
person_of_color
IBM should have open-sourced POWER.

------
zackmorris
Lack of backwards compatibility every time Apple changed their OS or processor
pretty much ruined my life back when I was trying to write games on the Mac,
because they coincided with downturns like the dot bomb and housing bubble
popping (in that case just after iOS arrived). I was so beat down trying to
survive at life that rewriting everything I had just written the last year for
the new hotness became too much of a burden. I worked a bunch of dead end jobs
instead and wasted whatever potential I might have had. Now midlife has hit
and I've generally let all that go, but it still bothers me thinking about
what might have been.

That said, Intel has sat on their hands for 15 (I would argue 20) years, and
so it's unsurprising that Apple is ditching them. I remember seeing 3 GHz
processors sometime around 2003-2004. Very little has changed since then. We
have faster memory busses now but we've generally lost almost two decades of
Moore's law. Before that, processors got twice as fast every 1.5 years, so 100
times faster every decade, which would be performance equivalent to a 30 THz
processor today.
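The doubling arithmetic above can be sanity-checked with a short script (the
3 GHz / ~2004 baseline and the 1.5-year doubling period are the comment's own
assumptions, not exact figures):

```python
# Moore's-law-style doubling: 2x performance every 1.5 years.
# Over one decade that compounds to 2**(10/1.5), roughly 100x; over the
# two "lost decades" since ~2004, a 3 GHz part would scale to ~30 THz.
base_clock_ghz = 3.0      # assumed 2004 baseline from the comment
years = 20
doubling_period = 1.5     # years per doubling

per_decade = 2 ** (10 / doubling_period)
print(f"speedup per decade: {per_decade:.0f}x")             # ~102x

total = 2 ** (years / doubling_period)
equivalent_thz = base_clock_ghz * total / 1000
print(f"equivalent clock today: {equivalent_thz:.0f} THz")  # ~31 THz
```

So "100 times faster every decade" and "30 THz today" both follow directly
from the 1.5-year doubling assumption.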

Note that where progress HAS happened is video cards (GPUs). So I'm somewhat
optimistic that if Apple disrupts the CPU industry, we might see true general-
purpose computation speed up rather quickly and break the 4 core, 1 memory bus
barrier. I think low hanging fruit here would be 16 to 256 cores arranged in a
grid, with the square root of that number of memory busses on an edge. With
today's tech, we could have 1024 DEC Alpha cores with 32 memory busses for not
much more than we're paying for an Intel i9 with 2 billion transistors (the
Alpha had 2 million). Yes, I know it's not an exact comparison, but I have a
computer engineering degree so this isn't the time to be pedantic.

General purpose computing could also disrupt the GPU and AI industries,
because we could jump ship from the ever-narrowing niches of rasterization and
neural nets, and move on to broader experiments in things like ray tracing and
genetic algorithms. I had originally wanted to do that with FPGAs, but I've
been burned out so long trying to keep up with the shortening attention span
of tech that I had to let it go.

Hard to say if any of this will happen, but I just wanted to shed light on the
kind of innovations we've missed out on under two lost decades in tech. This
is the tip of the iceberg. An explanation for this is that customers want
cheap eye candy, and prices have certainly fallen on track with Moore's law.
But I'd vote to finally see better performance again. Also I'd like to see
the emergent effects of better processors, such as more use of parallelized
higher-order functions and data driven/functional/declarative programming
using something like the Actor model, piping data around with simple tools
that do one thing well, borrowing techniques from UNIX and
Clojure/Erlang/Go/MATLAB/etc.

~~~
perl4ever
>I remember seeing 3 GHz processors sometime around 2003-2004. Very little has
changed since then

I think computers _are_ faster, but I'd like to see an in-depth survey of how
today's 2 GHz chip differs from a 2001 model.

Probably I should do it myself.

~~~
Synaesthesia
More cores, way more cache, way better IPC, less power draw/heat, way more
instruction sets & capabilities, onboard graphics ...

Anandtech has lots of details on these things.

~~~
zackmorris
All of that stuff can be good, but has tradeoffs. Longer pipelines result in
worse branching performance, caching interferes with write-heavy code that's
mainly about moving data (like for games), and so on. I feel that putting
extra transistors towards large numbers of cores with short 4 stage pipelines
(like in early PowerPC) would have been better.

This is one of the more concise benchmark comparisons, in this case pitting a
3.6 GHz i9 against a 1.4 GHz Pentium III (released starting in 1999):

[https://www.cpubenchmark.net/cpu.php?cpu=Intel+Pentium+III+1...](https://www.cpubenchmark.net/cpu.php?cpu=Intel+Pentium+III+1400S+%40+1400MHz&id=1146)

So this is 8 cores vs 1, at 2.57 times the clock speed. So per-core
performance has increased:

(18892/299) * (1/8) * (1.4/3.6) = 3.07

A 3-fold increase in 20 years is admirable, but about 1/3000 of what would
have been predicted if performance had followed Moore's Law. To me, this
indicates that per-core performance stopped really increasing sometime around
2005 at the latest. That's why fabs moved towards lower-cost mobile and
embedded chips.

~~~
Synaesthesia
Per-core performance has increased steadily for Intel, at approximately 5-10%
per generation.

------
e2le
It's a shame they didn't switch back to PowerPC; more momentum behind
OpenPOWER would have been nice to see, and it would have allowed more freedom
for the end user.

~~~
detaro
Doesn't really make sense given their large existing investment in ARM and
there being (afaik) no current work on POWER for mobile applications.

~~~
e2le
That's true, and it'll likely prove to be a good business decision, although
not necessarily one that benefits the end user in the long run. I'm also not
suggesting they use POWER for mobile applications, nor did I suggest they use
Intel x86 for that.

~~~
detaro
So POWER only for desktop systems, keep macbooks on something else?

------
rewoi
Does Apple actually invest in laptop/desktop development? They are usually a
couple of years behind the competition (DDR3, Skylake chips), while their ARM
chips are heavily developed.

I think they decided a long time ago that ARM is good enough, and have been
waiting for the train to stop to change the engine. It's not about Intel
quality, but about saving money and independence. AMD is not even considered
as an alternative...

~~~
kyriakos
People seem to be buying them anyway. Form over function could also be playing
a role here: given Apple's obsession with making their notebooks thinner,
ditching Intel will help a lot with that, since they can at least keep the
same battery life with a smaller battery.

~~~
tonyedgecombe
Apple’s latest MacBook Pro is thicker than the previous model.

~~~
kyriakos
That could also be part of the reason they are switching platforms. Likely not
the primary one.

~~~
tonyedgecombe
Or it could be an indication that they don't have an _"obsession with making
their notebooks thinner"_

