
Apple’s Mac Chip Switch Is Double Trouble for Intel - ksec
https://www.bloomberg.com/opinion/articles/2020-06-09/apple-s-mac-chip-switch-is-double-trouble-for-intel
======
wonderlg
Doesn’t Apple already include an ARM chip in MacBooks with the Touch Bar?

I have three feelings:

1\. Apple will just enable the existing chips “today” and developers can ship
ARM code to a sizable audience, with instant battery benefits.

2\. Cheaper Macs will start dropping x64 next year and continue running only
ARM code, _maybe_ emulating x64 like Rosetta.

3\. Pros will keep having both processors for the foreseeable future,
similarly to how they have “dual graphics”, except that the OS will always run
on ARM.

We should keep in mind that Apple just dropped a swath of 32-bit software,
which shows they aren’t afraid to do it again.

~~~
G4E
There is a legal obstacle to emulating x86 (and x86_64, I believe) because of
Intel patents. That's why Microsoft's first attempt at ARM was such a
disaster, with no compatibility with Windows' huge ecosystem, and why there
is no commercial toolsuite that offers x86 compatibility on other
architectures.

But maybe the situation has evolved? Expired patents, or a deal between Apple
and Intel?

------
albntomat0
(both of these are genuine questions, not rhetorical ones)

1\. If x64 wasn't already dominant and had the same market share as ARM, would
you choose x64? Why?

2\. With x64's current dominance, would you buy an ARM system as your primary
one, knowing that you'll have significant difficulty developing for the
majority of the current market share?

~~~
scarface74
Seeing that most popular languages - Java, C#, Python, JavaScript, etc. - run
on top of a VM, and you can develop on one architecture and deploy to another,
why does it matter?

I develop on a Windows laptop and deploy to Linux all of the time. I have some
Python programs that have native dependencies. I still develop on Windows,
push, and the CI/CD pipeline runs on Linux and packages a Linux build.

But even with my first job out of college back in the 90s, I was writing C
code that I developed on Windows and was cross compiled for DEC VAX and
Stratus VOS mainframes.
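
The workflow described above can be sketched in C: the same standard source builds unchanged for any target, and only the toolchain invocation differs (the cross-compiler command below is illustrative, not a specific toolchain recommendation):

```c
/* portable.c: the same standard C source cross-compiles for any target;
 * only the toolchain invocation changes, e.g. (illustrative commands):
 *   cc portable.c                     (native host build)
 *   aarch64-linux-gnu-gcc portable.c  (hypothetical ARM cross build)
 */
#include <stddef.h>
#include <stdint.h>

/* Fixed-width types behave identically on every architecture. */
uint32_t checksum(const uint8_t *data, size_t len) {
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum = sum * 31u + data[i];
    return sum;
}

/* The target is only visible through predefined macros at compile time. */
const char *target_arch(void) {
#if defined(__x86_64__) || defined(_M_X64)
    return "x86_64";
#elif defined(__aarch64__) || defined(_M_ARM64)
    return "arm64";
#else
    return "other";
#endif
}
```

`checksum` returns the same value for the same input whichever toolchain built it; only `target_arch` can tell the binaries apart.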

~~~
pjmlp
Even with languages like C, targeting bytecode as a distribution format is
also an option.

Mainframes do it as a means to integrate C and C++ into their language
environments.

Back in the early mobile OS wars, there was a company selling a J2ME-like
stack, but using C and C++ instead.

Then there is the LLVM bitcode used by Apple on iOS and watchOS (which happens
to be more platform neutral than regular LLVM bitcode).

Oh and WebAssembly and MSIL as well.

~~~
scarface74
LLVM won’t allow that type of portability. That was a myth that was dispelled
by none other than Chris Lattner during an interview on ATP. The transcript of
the interview is now returning a 404 though.

[https://atp.fm/205](https://atp.fm/205)

~~~
pjmlp
For the open-source version of LLVM, yes, you are correct.

Apple's own proprietary internal fork, as used on iOS and watchOS, is
another matter.

As a matter of fact, there is a WWDC talk about how it allowed the seamless
32-bit to 64-bit migration on watchOS.

~~~
scarface74
Lattner also said later on Twitter that he was purposefully being vague during
the interview. The 64-bit chip built for the watch and the LLVM bitcode were
designed in concert.

[https://mobile.twitter.com/clattner_llvm/status/104696072464...](https://mobile.twitter.com/clattner_llvm/status/1046960724646465541)

~~~
pjmlp
Yeah, so we get into these kinds of discussions because everyone just talks
about the open-source variant of LLVM.

Apple does whatever they feel like with their proprietary fork, to the point
that Apple's clang even gets its own column on cppreference.

Just like Sony and Nintendo haven't contributed anything back to LLVM that
would disclose any capability from their consoles.

~~~
scarface74
I think Chris Lattner has just a little insight into the inner workings of
LLVM and its capabilities, even the proprietary portions that Apple hasn’t
released in the open.

I tweeted @atp and let them know that the transcript was returning a 404. They
have since fixed it.

[https://atp.fm/205-chris-lattner-interview-transcript](https://atp.fm/205-chris-lattner-interview-transcript)

 _John Siracusa: The same thing I would assume for architecture changes,
especially if there was an endian difference, because endianness is visible
from the C world, so you can’t target different endianness?

Chris Lattner: Yep. It’s not something that magically solves all portability
problems, but it is very useful for specific problems that Apple’s faced in
the past._
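
Siracusa's endianness point is easy to demonstrate in C. A minimal sketch (the helper names are mine, for illustration): reinterpreting memory exposes the host's byte order, while explicit shifts produce the same bytes on every host:

```c
#include <stdint.h>
#include <string.h>

/* Reinterpreting memory exposes the host byte order; this is how
 * endianness becomes "visible from the C world". */
int host_is_little_endian(void) {
    uint32_t x = 1;
    uint8_t first;
    memcpy(&first, &x, 1);   /* result depends on the host CPU */
    return first == 1;
}

/* Explicit shifts, by contrast, yield the same bytes on every host;
 * portable serialization code is written this way. */
void u32_to_le_bytes(uint32_t v, uint8_t out[4]) {
    out[0] = (uint8_t)(v);
    out[1] = (uint8_t)(v >> 8);
    out[2] = (uint8_t)(v >> 16);
    out[3] = (uint8_t)(v >> 24);
}
```

Code written in the first style cannot be retargeted across an endian boundary by any bitcode format, which is the limitation Lattner concedes.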

------
looping__lui
What I don’t quite understand in the discussions here is the hardware focus.

No offense, but the libraries Intel is putting out there (IPP, MKL) for
efficient parallel computing are really outstanding. It has been a few years
since I was in the end-consumer high-performance market, but AMD chips would
easily run 20% slower than code optimized with the Intel libraries, and
OpenCL would not get anywhere near them.

A similar thing happened with Nvidia and Apple many years back, i.e. Apple
dropping Nvidia; a fair amount of rumors went around that Apple did it so
Adobe would be less competitive if they had to rewrite their rendering engine
(which they had just ported to CUDA), and Apple could sell their video-editing
solutions with an edge (those have been hand-tuned for a while, of course...).

Apple dropping Nvidia was such a “non-customer-focused” bullshit decision.
Not because the AMD hardware is bad, but because OpenCL has simply been
nowhere near CUDA. Same with Intel IPP versus OpenCL. But maybe that has
changed.

The differences in performance, ease of use, etc. were mind-blowing back in
the day for CUDA and IPP. Hardware alone isn’t gonna cut it. The machines are
built to run software, after all...

Maybe that has changed, or maybe nobody needs any parallelised
compute-intensive applications on the Mac (research, anyone? image and video
editing, anyone?).

That’s leaving aside the fact that major software companies like Adobe et al.
will probably have to staff entire departments for rewrites...

~~~
sfifs
In my data science org, virtually NO ONE joining new prefers MacBook Pros,
because of the lack of CUDA. Yes, production workflows run on the cloud and
you can spin up notebooks there, but the value of having CUDA on your laptop
for dev is just really high. People who joined with Macs are trading them in
for more powerful Windows machines.

------
bit_logic
Maybe this rumor is only half right and it isn't what everyone thinks it will
be. Instead of replacing x86, what if it's just Apple adding their ARM chip to
the laptop? Basically just take the current MacBook and stick an iPad ARM chip
in there. And maybe make the screen a touchscreen too. It would make iOS
development much closer to the real thing. And for non-developers they can run
iOS apps natively. So you get a Mac that can run both x86 and ARM natively.
But if you try to use more iOS (ARM) apps you get really long laptop battery
life since it's optimized for that. And x86 is there when you need to run more
power hungry applications. Seems like the best of both worlds in a single
laptop if they do this.

~~~
oroup
Adding significant BOM cost, growing physical volume, reducing battery life,
and creating lots of complexity to get certain apps to run on one CPU or the
other, with the associated cache-coherency issues? All so that iOS developers
can test their apps on a native CPU (but with plenty of differences remaining:
screen size, no cell radio, no GPS, no accelerometers, etc.)? Seems dubious.

~~~
ChuckNorris89
All Macs already have an ARM chip inside them in the form of the T2 chip.

~~~
jedieaston
But that’s specifically for Secure Enclave work (disk encryption, biometric
data). I don’t think they’d want to risk someone running arbitrary code there
and breaking the sandbox. (The SEP is also separate from the A-series chip in
the iPhone for a similar reason, IIRC).

~~~
monocasa
In addition to the secure enclave, they already have an A series core on
there. That's what runs the touch bar.

There's a full XNU based OS on there called "BridgeOS" that's in the
iOS/tvOS/watchOS family.

------
mromanuk
And what about this other rumor (which is much more plausible)? They could put
everything (T2, GPU, and x86) on the same platform, bringing the cost down.

[https://www.engadget.com/2020/02/07/apple-may-testing-amd-processors-internally/](https://www.engadget.com/2020/02/07/apple-may-testing-amd-processors-internally/)

~~~
monocasa
I've been saying this for a while: a semi-custom x86 (probably from AMD)
makes way more sense. Those 8 Zen 2 cores with 16GB and a very passable GPU
in the next-gen consoles show that it can be done for a remarkably low cost,
all things considered. Yes, Sony/Microsoft are probably taking a hit on those,
and yes, Apple would have to spend even more to get it running in a laptop
form factor/TDP, but they also have MSRP headroom to make that happen with a
decent margin.

An even more out-there idea (with literally zero proof; it's just a good idea,
IMO): Apple buys Centaur and gets an x86 licence.

* Apple gets to have custom power efficient cores augmented by all the fabless firms they've acquired over the years.

* Mac stays x86, so Intel wins a minor victory when the alternative was a major customer switching to ARM.

* Intel wins a bigger victory because an x86 licence is effectively removed from products being on the open market. The great equalizer that is the end of Moore's law makes that licence sitting out there a long term existential risk for Intel.

* Apple doesn't have a costly transition with the tail end of Moore's law meaning they don't have the same perf gains expected from the other transitions.

* Apple also puts their hands on Centaur's newer inference accelerator IP.

* Centaur's parent keeping them on life support gets a payout.

Everyone wins except AMD (which is another win for Intel).

~~~
philistine
Tim Cook has said it: We believe that we need to own and control the primary
technologies behind the products we make.

So it would make no sense to trade Intel's poor schedule and performance for
AMD's uncertain future performance.

And aren't all x86 licensees prevented from keeping that license if they are
bought?

~~~
monocasa
> Tim Cook has said it: We believe that we need to own and control the primary
> technologies behind the products we make.

Tim Cook was speaking in a context nearly a decade old.

> So it would make no sense to trade Intel's poor schedule and performance for
> AMD's uncertain future performance.

x86 is just going to become more of a commodity as time goes on. And right
now, Zen 2 is hands down the best perf/watt combo.

> And aren't all x86 licensee prevented from keeping that license if they are
> bought?

That was the rumor, but Centaur has already been bought and kept its license,
so at a minimum there's some fine print to that clause. And my experience with
B2B is that clauses like that are ultimately a product of the circumstances
under which they were written. If circumstances change, those clauses can
change. The most indelible ink is the most likely to have new semantics later.

------
yusyusyus
For my work, macs have never been really useful to develop on without
additions. The whole "I can run virtualbox" and having a good out-of-the-box
unixy setup is what has kept me coming back. Linux/others are not a realistic
option as a daily driver for what I do.

No x86, no VirtualBox. No VirtualBox, and I'd rather just get (with much
sighing, complaining, and general ill will) a Windows laptop. I mean, the
terminal is becoming slightly more usable in Windows, right? And sometimes
you just need to run a Windows VM, so you need x86 virtualization.

There are downstream effects to losing the "nerd" base and I suspect this move
is a bad idea, but apple has a track record of pulling rabbits out of hats and
knowing what really matters. Maybe this customer segment just doesn't matter.

~~~
pottertheotter
The thing I like about macOS over Windows when it comes to the terminal is
that it's a native part of macOS and I don't have to monkey around with as
many things. On my Macs, things aren't that much different compared to when
I'm working on my Ubuntu server.

I don't feel the same using WSL. It's not a seamless experience. Also, I've
never found a terminal emulator I enjoy using on Windows. Maybe it's still way
too early, but even the new Windows Terminal left me less than impressed.

~~~
harrygeez
what do you find lacking in the new Terminal?

------
ngcc_hk
Just read the other piece about replacing the Mac with an R Pi. The potential
is there. And with Apple controlling the chip... it would be brutal.

~~~
agustif
My MacBook has 2000 battery cycles, and I'm thinking about acquiring a Pi.

------
neonate
[https://archive.vn/qbMuX](https://archive.vn/qbMuX)

------
smabie
How is the developer ecosystem going to fare? I would imagine a lot of tools
don't work on ARM (or aren't regularly tested on ARM). Moreover, I feel like
it's going to be a big problem that the architecture you're deploying your
code to is different from the one you're developing on. Who knows what kind
of crazy performance differences/bugs there are between ARM and x86_64. Also,
what about audio/video/modeling software? I can't imagine there's much
support for ARM at the moment in that space.
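
One concrete example of such a cross-architecture bug, for illustration: whether plain `char` is signed is ABI-dependent (signed on x86-64, unsigned on most ARM Linux ABIs, though Apple's arm64 ABI keeps it signed). A hedged sketch, with hypothetical helper names:

```c
/* Whether plain char is signed differs by ABI: signed on x86-64,
 * unsigned on most ARM Linux ABIs (Apple's arm64 ABI keeps it signed). */
int plain_char_is_signed(void) {
    return (char)-1 < 0;   /* value depends on the target ABI */
}

/* The portable fix: spell out the signedness you actually need. */
int byte_as_int(unsigned char b) {
    return b;              /* always 0..255 on every architecture */
}
```

Code that compares a plain `char` against a negative value, or uses it to index a table, can silently change behavior when recompiled for a different ABI.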

Also, is the Mac Pro going to switch to ARM? I'm not aware of an ARM chip
that can compete with the super-high-end Xeons in the Pro. Having laptops run
ARM and Pros run x86_64 doesn't seem like the best idea (it also sounds like
a lot of work on Apple's part).

Of course, maybe this switch is going to create a high-end ARM space, allowing
ARM to make inroads into HEDT and the server market.

A lot seems unclear at the moment, but one thing is clear (to me at least):
there's going to be a huge fight over the next 10 years, x86_64 vs ARM. No
one can possibly know who will win, but it's exciting, to say the least. I
think we've all been a little tired of the x86_64 monoculture since the end
of PPC.

~~~
cirno
> Who knows what kind of crazy performance differences/bugs there are between
> ARM and x86_64.

A lot of code for demanding applications is enhanced with SSE/AVX and
JIT-recompilation techniques. Those are inherently unportable, and I'm not
sure how many developers will be willing or able to port that code over to
NEON and AArch64, especially for a small 2-4% of the market.

Even if they do, it's quite a cognitive burden to master and maintain two
separate SIMD and recompilation implementations of the same application.
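
A minimal sketch of that dual-implementation burden (the function is illustrative, not from any shipping product): the same kernel written once per ISA, plus a scalar fallback:

```c
#include <stddef.h>

#if defined(__SSE2__)
#include <emmintrin.h>      /* x86 SSE intrinsics */
#elif defined(__ARM_NEON)
#include <arm_neon.h>       /* AArch64/ARM NEON intrinsics */
#endif

/* Element-wise float addition, vectorized per ISA, scalar otherwise. */
void add_f32(const float *a, const float *b, float *out, size_t n) {
    size_t i = 0;
#if defined(__SSE2__)
    for (; i + 4 <= n; i += 4)
        _mm_storeu_ps(out + i,
                      _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
#elif defined(__ARM_NEON)
    for (; i + 4 <= n; i += 4)
        vst1q_f32(out + i, vaddq_f32(vld1q_f32(a + i), vld1q_f32(b + i)));
#endif
    for (; i < n; i++)      /* scalar tail, and the generic fallback */
        out[i] = a[i] + b[i];
}
```

Every change to the kernel now has to be made and tested three times, which is exactly the maintenance cost being described; real SSE/AVX and JIT code paths are far larger than this toy example.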

This will probably further drive high-end gaming away from Macs, on the heels
of the OpenGL deprecation and the Mac-only Metal API. Combined with Cocoa and
Swift, I imagine we'll end up seeing fewer and fewer applications that run
natively on both Macs and Windows/Linux after the move.

~~~
saagarjha
Android is much more than 2-4% of the market, though.

~~~
monocasa
Not in the desktop application space.

------
m0zg
I have to say, I'm not sure how Apple can pull this off. Mac commercial
software is pretty sparse and crappy as it is. I mean, if I can't run the
apps I use on a daily basis there (Adobe stuff, mostly), it's a complete
non-starter. Introducing another arch is not going to help matters. If they
try to pull in software from e.g. the iPad, that's also pretty crappy. How
many note-taking apps does anyone really need? And besides, what few good
apps the iPad has won't work all that well in a desktop context.

I guess we'll see soon enough. I doubt they're so naive as to believe they
can pull off another massive arch switch without Jobs' reality-distortion
field, and without their current arch severely lagging (as it was in the
PPC->x86 transition).

~~~
Closi
Well Adobe will definitely support the new architecture, it would be stupid
for them not to.

And additionally, with the latest iPad becoming much closer to a ‘Mac’ in
terms of the browsing experience (e.g. with a touchpad), it makes total sense
for Apple to start making some of their apps truly cross-platform. Does the
Spotify desktop app really need to be substantially different from the iPad
app?

~~~
m0zg
It took them quite some time to start supporting Intel Macs last time.

------
_bxg1
I just bought a new 16" MBP earlier this year and I can't decide if it was the
very best or worst time to do that.

------
MintelIE
Apple is "preparing to announce" the switch, but haven't we heard these rumors
for, well, years now? How much more reliable are these rumors now compared to
a couple years ago?

I'm a little surprised to see Bloomberg basically cribbing MacRumors.

EDIT: Instantly downvoted? LOL

~~~
nicoburns
I always take these rumours with a pinch of salt, but I don't think that
there having been rumours for years discredits them. There were Apple tablet
rumours for years too, which turned out to be because they _were_ working on
a tablet for about 10 years before they released it (they ended up releasing
the iPhone first as part of the same project, and I believe that was seven
years in).

~~~
valuearb
When I worked at Apple I ported FileMaker to a Macintosh Tablet computer
almost 20 years before the iPad.

