
ARM Mac Impact on Intel - robin_reala
https://mondaynote.com/arm-mac-impact-on-intel-9641a8e73dca
======
quadhome
_Intel’s real money is in high-end CPUs sold to prosperous Cloud operators,
not in supplying lower-end chips to cost-cutting laptop makers._

I keep searching for "Graviton" in these thinkpieces. I keep getting "no
results found."

Mac ARM laptops mean cloud ARM VMs.

And Amazon's Graviton2 VMs are best in class for price-performance. As
Anandtech said:

 _If you’re an EC2 customer today, and unless you’re tied to x86 for whatever
reason, you’d be stupid not to switch over to Graviton2 instances once they
become available, as the cost savings will be significant._

[https://www.anandtech.com/show/15578/cloud-clash-amazon-graviton2-arm-against-intel-and-amd/10](https://www.anandtech.com/show/15578/cloud-clash-amazon-graviton2-arm-against-intel-and-amd/10)

~~~
bubblethink
>Mac ARM laptops mean cloud ARM VMs.

What is the connection here? ARM servers would be fine in a separate
discussion. What do they have to do with Macs? Macs aren't harbingers of
anything. They have set literally no trend in the last couple of decades,
other than thinness at all costs. If you mean that developers will use
Gravitons to develop Mac apps, why/how would that be?

~~~
Torkel
To quote Linus Torvalds:

"Some people think that "the cloud" means that the instruction set doesn't
matter. Develop at home, deploy in the cloud.

That's bullshit. If you develop on x86, then you're going to want to deploy
on x86, because you'll be able to run what you test "at home" (and by "at
home" I don't mean literally in your home, but in your work environment)."

So I would argue there is a strong connection.

~~~
bubblethink
But that's the other way round. If you have an x86 PC, you can develop x86
cloud software easily. You don't develop cloud software on a Mac anyway (i.e.,
that's not Apple's focus). You develop Mac software on Macs for other Macs. If
you have to develop cloud software, you'll do so on Linux (or WSL or
whatever). What is the grand plan here? You'll run an ARM Linux VM on your
Mac to develop general cloud software which will be deployed on Graviton?

~~~
chrisseaton
> You don't develop cloud software on a mac anyway

You must be living in a different universe. What do you think the tens of
thousands of developers at Google, Facebook, Amazon, etc etc etc are doing on
their Macintoshes?

~~~
danans
> What do you think the tens of thousands of developers at Google ... are
> doing on their Macintoshes?

I can only speak of my experience at Google, but the Macs used by engineers
here are glorified terminals, since the cloud based software is built using
tools running on Google's internal Linux workstations and compute clusters.
Downloading code directly to a laptop is a security violation (with an
exception for those working on iOS, Mac, or Windows software).

If we need Linux on a laptop, there is either the laptop version of the
internal Linux distro or Chromebooks with Crostini.

------
KMag
Intel has a couple of issues.

1\. Back in the 1990s, the big UNIX workstation vendors were sitting where
Intel is now at the high end, being eaten from the bottom by derivatives of
what was essentially a little embedded processor for dumb terminals and
scientific calculators. Taken in isolation, Apple's chips aren't an example of
low-margin high-volume product eating its way up the food chain, but the whole
ARM ecosystem is.

2\. For a lot of the datacenter roles being played by Intel Xeons, FLOPS/watt
or IOPS/watt isn't the important metric. For many important workloads, the
processor is mostly there to orchestrate DMA from the SSD/HDD controller to
main memory and DMA from main memory to the network controller. The purchaser
of the systems is looking to maximize the number of bytes per second divided
by the amortized cost of the system plus the cost of the electricity. My
understanding is that even now, some of the ARM designs are better than the
Atoms in terms of TDP, even forgetting the cost advantages.
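A minimal sketch of that figure of merit in C, with invented numbers purely
for illustration (the throughputs, prices, and wattages below are assumptions,
not measurements of any real system):

```c
#include <stdio.h>

/* Illustrative only: throughput, prices, and wattages are invented. */
struct server {
    const char *name;
    double gb_per_sec;    /* sustained bytes served to the network */
    double amortized_usd; /* hardware cost amortized over one year */
    double watts;         /* average power draw */
};

/* The metric described above: bytes per second divided by
   (amortized system cost + electricity cost). */
static double throughput_per_dollar(struct server s, double usd_per_kwh) {
    double energy_usd = s.watts / 1000.0 * 24.0 * 365.0 * usd_per_kwh;
    return s.gb_per_sec / (s.amortized_usd + energy_usd);
}

int main(void) {
    struct server xeon = {"Xeon box", 10.0, 2000.0, 400.0};
    struct server arm  = {"ARM box",   9.0, 1500.0, 250.0};
    printf("%s: %.5f GB/s per dollar-year\n", xeon.name,
           throughput_per_dollar(xeon, 0.10));
    printf("%s: %.5f GB/s per dollar-year\n", arm.name,
           throughput_per_dollar(arm, 0.10));
    return 0;
}
```

With numbers like these, a box that is slightly slower but cheaper and cooler
wins the metric, which is the whole point: raw per-core performance barely
enters into it.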

~~~
TuringNYC
Curious -- with that being the case, why haven't non-Intel systems taken more
market share in these use cases?

~~~
KMag
Momentum. I think one component of momentum is just the time it takes to
develop mature tooling for the ecosystem and port existing software over.

I invested in ARM (ARMH) back in 2006, partly because I realized that whether
Apple or Android won more marketshare, nearly everyone was going to have an
ARM-based smartphone in a few years. Part of it was also realizing the above
and hoping ARM would take a good share of the server market. SoftBank took ARM
private before it made many inroads into the server market, but it was still
one of the best investments I've made.

Of course, maybe I'm just lucky and my analysis was way off.

~~~
panabee
clearly this analysis was accurate! prescient and well done. was ARM running
most non-iphone smartphones in 2006? the iphone launched in 2007, so this
analysis must have been based on other smartphones (unless you had inside
intel on the iphone).

1) which 2006 analyses proved wrong? it would be interesting to see which
assumptions made sense in 2006 but were ultimately proven wrong.

2) which companies are you evaluating in 2020, and why?

thanks for sharing.

~~~
KMag
Oops. Must have been 2007 that I bought ARMH. I was working at Google at the
time, and I had an iPhone, and Google had announced they were working on a
smartphone. Some of my colleagues were internally beta-testing the G1 at the
time, but it was before we all got G1s for end-of-year bonuses. I think
December 2007 was the first year we got Android phones instead of (non-
performance-based) cash for the holidays.

1) I thought the case for high-density/low-power ARM in the datacenter was
pretty clear-cut, but it was an obvious enough market that within a few multi-
year design cycles, Intel would close the gap with Atoms and close that window
of opportunity forever, especially considering Intel's long-running
fabrication process competitive advantage. In late 2012, one of my friends in
the finance industry wanted to be "a fly on the wall" in an email conversation
between me and a hedge fund manager he really respected who had posted on
SeekingAlpha that his fund was massively short ARMH. A few email back-and-
forths shook my confidence enough to convince me that the server market window
had closed, and that it was possible (though unlikely) that Atom was about to
take over the smartphone market. I reduced my position at that time by just
enough to guarantee I'd break even if ARMH dropped to zero. In hindsight, I'm
still not sure that was the wrong thing to do given what I knew at the time. I
had two initial theses: the smartphone thesis had played out and was more-or-
less believed by everyone in the market, and the server thesis was reaching
the end of that design window and I was worried that Intel was not far from
releasing an ARM-killer Atom, backed up by the arguments of this hedge fund
manager. I'm really glad the hedge fund manager didn't spook me enough to
close out my position entirely.

2) I'm not a great stock picker. I've had some great calls and had about as
many pretty bad calls. I'm now doing software development, but my degree is in
Mechanical Engineering, and I took a CPU design course (MIT's 6.004) back in
college. I think my edge over the market is realizing when the market is
under-appreciating a really good piece of engineering, though I punch well
below my weight in the actual business side analysis.

Jim Keller is a superstar engineer, who attracted other superstars, and I
don't think the market ever really figured that out. Jim Keller has retired
now, so there goes about half my investment edge. Though, maybe I'm just
deluding myself about the impact really good engineering has on the business
side.

~~~
panabee
thanks for the thoughtful reply. mistakes are expected in any field; you are
too humble and should give yourself a little more credit. :)

could you share what arguments the hedge fund manager made?

do you find any companies interesting now?

------
jake_morrison
I am a programmer. All the software that I run is cross platform, so I expect
a smooth transition.

Elixir, my main programming language, will use all the cores in the machine,
e.g. parallel compilation. Even if an ARM-based mac has worse per-core
performance than Intel, I am ahead.

Apple can easily give me more cores to smooth the transition. Whatever the
profit margin was for Intel on the CPU, they can give it to me instead. And
they can optimize the total system for better performance, battery life and
security.

~~~
delfinom
>Whatever the profit margin was for Intel on the CPU, they can give it to me
instead.

HAHAHA you sweet summer child.

Next thing you know, you'll be demanding Apple put back the heatpipe in the
MacBook Air for cooling the CPU instead of hoping there's enough airflow over
the improperly seated heatsink (which just has to work until the warranty
expires ;) )

~~~
spideymans
Not an unreasonable expectation, if you’ve been paying attention to Apple’s
pricing lately

They now sell an iPad, arguably better than any of the non-iPadOS competition,
for $350.

The iPhone SE, one of the fastest smartphones on the market (only other
iPhones are faster) starts at just $399.

They’ve aggressively been cutting prices, I believe, so that they can expand
the reach of the services business. They’re cutting costs to expand their
userbase.

I wouldn’t be surprised to see the return of the 12-inch MacBook at the $799
to $899 price point, now that they no longer have to pay Intel a premium for
their chips.

~~~
steve_adams_86
> The iPhone SE, one of the fastest smartphones on the market (only other
> iPhones are faster)

I just replaced my Google Pixel (4 years old) with an SE and it's like...
they're not even comparable. I'm sure there's a clear explanation for how so
much progress could be made in 4 years, and how it could cost so much less (I
paid $649 USD for the Pixel), but it feels a bit magical as a consumer and
infrequent phone user. It's a fantastic little device.

I'm still 900% pissed about the MacBook Air they sold me in 2018 and I resent
the awful support they gave me for the keyboard (It's virtually unusable
already), but as a phone business, they seem hard to beat right now.

~~~
charwalker
I can't do iOS still. Too many restrictions, compromises and missing
functionality.

Like, give me a custom launcher option (custom everything options), emulation
of classic consoles and handhelds, and easy plug and play access via a PC. If
I can't even get one of those, I'm on Android regardless of how shiny I think
Apple hardware is.

But if you don't want any of that, it probably works great. Just not for me.

~~~
steve_adams_86
I hear you, I used to feel the same. I was a heavy phone user once and that's
why I got the Pixel. A lot of things mattered to me then that just don't now.
I could almost get by with a flip phone, but there are still a few things I
like about smart phones:

\- I like to use my phone to check bathy charts while I'm out free diving. In
a waterproof case, a phone is a huge asset for finding interesting spots to
dive. I don't really want to buy a dedicated device for this. I can just hook
it to my float and it's there for exploration/emergencies/location beacon for
family/etc.

\- When I forget to charge my watch, it's nice to have GPS handy for tracking
a run or ride

\- It's really nice to be able to do a decent code review from a phone if I'm
out and about. I wouldn't do this with critical code or large changes, but
it's nice to give someone some extra eyes without committing to sitting at the
desk or bringing my computer places

\- I have ADHD and having a full-fledged reminder/task box is a godsend. I'd
be lost without Todoist

I could do this all with any modern smart phone, but I went with the best
'bang for the buck' model I could find. I don't think I'll miss anything from
Android.

------
thatwasunusual
Slightly off-topic, but does anyone have any details on how the new ARM-
powered MacBooks will perform compared to the Intel-powered MacBooks?
According to this[1] article, "the new ARM machines (is expected) to
outperform their Intel predecessors by 50 to 100 percent". Can anyone shed
some light on how this is possible?

[1] [https://www.theverge.com/2020/6/21/21298607/first-arm-mac-macbook-pro-imac-ming-chi-kuo-wwdc-2020](https://www.theverge.com/2020/6/21/21298607/first-arm-mac-macbook-pro-imac-ming-chi-kuo-wwdc-2020)

~~~
ipsum2
The MacBooks will have a CPU that's comparable to or better than the iPad
Pro's. The iPad Pro already beats the MacBook Pro in some well-known
benchmarks, such as Geekbench.

[https://www.macrumors.com/2020/05/12/ipad-pro-vs-macbook-air-vs-macbook-pro/](https://www.macrumors.com/2020/05/12/ipad-pro-vs-macbook-air-vs-macbook-pro/)

~~~
whywhywhywhy
> The iPad Pro already beats the Macbook Pro in some well-known benchmarks,
> such as Geekbench.

We keep hearing this, especially from the Apple-bubble blogs. Yet if these
iPads are supposedly so powerful, why doesn't that translate into higher-end
work being done on them? All we ever see is digital painting work that isn't
pushing them at all, and extremely basic video editing.

~~~
sheeshkebab
it’s bc of iOS and AppStore. No serious dev will port their desktop apps to
that.

Also good luck to Apple if they lock down macOS the same way.

~~~
delfinom
>Also good luck to Apple if they lock down macOS the same way.

They are locking down macOS gradually. Their hardware revenue is in long-term
decline and they know it. Their push is to lock down the software and take as
much of a cut as they can.

------
me551ah
I wonder how this would impact developers who primarily use a MacBook for
development. A lot of the compiler toolchain is optimized for Intel-based x86
CPUs. If buying a Mac means I end up with a slower build every time I compile,
I would buy a Windows machine instead.

~~~
glogla
Even with ARM, a Mac would still be the only way to get a laptop with a HiDPI
screen and an OS that supports HiDPI well, a good touchpad, and a unix
environment as a first-class citizen.

Linux on laptop doesn't work that well, especially if you want a nice (HiDPI)
screen. Microsoft is trying with the Linux emulation, but they are creepy with
their snooping and telemetry, and will autoinstall Candy Crush and other shit.

~~~
whywhywhywhy
\- HiDPI screen - Surface has this

\- OS that supports HiDPI well - Windows 10 has good support for this now

\- good touchpad - Surface touchpads are very close to Apple ones these days,
and I'd argue the keyboards are superior, as is the touch and pen support.

\- unix environment as a first class citizen - WSL2 (while Linux not Unix) is
a first class citizen.

How is Microsoft's telemetry any more "snooping" than Apple's telemetry? I
mean, they both do it...

~~~
jen20
The surface screens are nice, but mixed DPI screens on Windows are a complete
mess - even Linux handles it better.

Surface keyboards are mushy and imprecise - frankly not even as good as the
one on my iPad Pro, let alone the Macbook Pro which is to date the only
keyboard I can type on for a whole day without significant RSI flaring.

WSL2 is absolutely not a first class citizen - right now I have to opt into a
build with _required_ telemetry to use it. In fact the Microsoft telemetry and
update regime is a disgrace to the company (apparently outside of enterprise
SKUs which also do not support WSL2) - if you think they’re comparable to what
Apple collect I don’t know what to tell you.

And Candy Crush with advertising in the start menu... JFC someone needs to be
fired for that.

~~~
eknkc
\- I haven't had any issues with my multi monitor setup and multiple DPI
values recently.

\- Subjective. I love old Macbook keyboards. I hate new ones.

\- WSL2 is available on mainline Win10

\- Yeah

------
sradman
Apple has a tactical advantage over Microsoft and even server-side ARM like
Graviton: its LLVM toolchain, integrated into Xcode for both Objective-C and
Swift. Apple has a precedent for a seamless CPU architecture transition: the
iPhone 64-bit transition. It is much harder to cross-compile seamlessly with
traditional GNU C or Visual Studio toolchains.

The App Store requirements make it easier to control this transition on iOS
than macOS. I wonder how many Brew apps will make the transition seamlessly?
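For a rough picture of why an LLVM-based toolchain makes this comparatively
painless, here's a sketch: the same C source builds once per slice of a
universal binary, with the compiler's predefined target macros covering the
rare architecture-specific branch (the multi-`-arch` invocation in the comment
assumes Apple's clang driver):

```c
#include <stdio.h>

/* With Apple's clang driver, one command can build both slices of a
   universal binary:  clang -arch x86_64 -arch arm64 -o demo demo.c
   Each slice is compiled with its own predefined target macros. */
int main(void) {
#if defined(__aarch64__) || defined(__arm64__)
    puts("running the arm64 slice");
#elif defined(__x86_64__)
    puts("running the x86_64 slice");
#else
    puts("running on some other architecture");
#endif
    return 0;
}
```

For most portable code neither branch is even needed; the point is that the
toolchain, not the developer, decides which instructions come out.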

~~~
woadwarrior01
Also Bitcode[1], which was introduced 5 years ago for iOS, watchOS and tvOS. I
have a feeling they'll be making app binaries built with `-fembed-bitcode`
optional on the Mac App Store and eventually mandatory.

[1]:
[https://help.apple.com/xcode/mac/current/#/devbbdc5ce4f](https://help.apple.com/xcode/mac/current/#/devbbdc5ce4f)

~~~
scarface74
The idea that Bitcode will allow porting from x86 to ARM was dispelled by none
other than Chris Lattner himself during an interview on ATP.

[https://atp.fm/205-chris-lattner-interview-transcript#bitcode](https://atp.fm/205-chris-lattner-interview-transcript#bitcode)

 _John Siracusa: The same thing I would assume for architecture changes,
especially if there was an endian difference, because endianness is visible
from the C world, so you can’t target different endianness?

Chris Lattner: Yep. It’s not something that magically solves all portability
problems, but it is very useful for specific problems that Apple’s faced in
the past._
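To make concrete what "endianness is visible from the C world" means:
perfectly legal C observes the target's byte order as soon as it reinterprets
memory, which is exactly the kind of thing no Bitcode-level retargeting can
hide. A minimal sketch:

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void) {
    uint32_t word = 0x01020304;
    uint8_t bytes[sizeof word];

    /* Portable-looking code, but the output depends on the target's
       byte order - the very thing Bitcode cannot abstract away. */
    memcpy(bytes, &word, sizeof word);
    for (size_t i = 0; i < sizeof word; i++)
        printf("%02x ", bytes[i]);  /* little-endian: 04 03 02 01
                                       big-endian:    01 02 03 04 */
    putchar('\n');
    return 0;
}
```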

~~~
woadwarrior01
You seem to be assuming that the CPUs in iOS devices run in big-endian mode.
ARM CPUs support both big-endian and little-endian modes, and the ARM chips in
Apple i-devices have run in little-endian mode for as long as they've been
around. There is no endianness gap to bridge between the ARM chips in iPhones
and x86 CPUs.

Given this, it's quite unlikely that Apple will go big-endian in their
desktop/laptop ARM CPUs.

Of course, Bitcode won't solve all portability problems, but it does solve
many of them in this specific case.

~~~
scarface74
Read the transcript around the quote. It’s not just the endian differences. He
said that it basically only helps with things like minor optimizations and
added instructions.

------
neximo64
This is a typical corporate-type view of it, i.e. that it's the least
profitable segment so it doesn't matter.

It misses that developers use Macs to build stuff => it's easy to make ARM-
compatible applications => the server (most profitable) domino falls.

I can imagine moving straight to ARM processors if it's easy enough to work on
and AWS/Google has a deployment option.

The dominoes can cascade really fast, particularly in where the new demand for
chips goes, versus the existing demand, which will just keep running as it is
now.

~~~
pjmlp
Indeed, as developer I use Macs to build iOS and macOS stuff.

~~~
sheeshkebab
Most developers using Macs don't touch iOS and macOS dev with a 10-foot pole.

~~~
pjmlp
Those "developers" shouldn't complain that Apple doesn't care about them, as
they should have given their money to a Linux/BSD OEM.

Apparently only UNIX has developers, the rest of us are something else.

------
ed25519FUUU
I'm looking forward to an ARM laptop, especially if it includes their latest
GPU. That's excellent power usage and graphics performance.

I expect good compute performance, good graphics performance, better battery
life, and (hopefully) better price range. We'll see what's in store for us
tomorrow.

~~~
p1necone
Whose latest GPU?

~~~
dannyw
Apple's.

It's as powerful as a Radeon RX 5500M, which is incredible when you consider
that the A13 is passively cooled and the RX 5500M is actively cooled.

It's also 40% as powerful as the RTX 2080 Ti, the best-performing consumer GPU
on the market, which draws hundreds of watts and is very well cooled.

By some quick maths, performance/watt is an _order of magnitude_ better on the
A13 than GPUs from AMD and NVIDIA.
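To spell the quick maths out (with assumed wattages, since Apple doesn't
publish a GPU TDP): if the A13's GPU draws on the order of 5 W against roughly
250 W for a 2080 Ti, then 40% of the performance at 2% of the power works out
to 0.40 × (250 / 5) ≈ 20× the performance per watt - an order of magnitude,
under those assumptions.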

~~~
strictnein
> 40% as powerful as the RTX 2080 Ti

Yeah... going to need to see the work on that one.

~~~
dannyw
3DMark - Ice Storm Unlimited Graphics Score 1280x720 offscreen:

Apple A13 Bionic: 208697

RTX 2080 Ti: 521458

[https://www.notebookcheck.net/Apple-A13-Bionic-GPU.434833.0.html](https://www.notebookcheck.net/Apple-A13-Bionic-GPU.434833.0.html)

[https://www.notebookcheck.net/NVIDIA-GeForce-RTX-2080-Ti-Desktop-Graphics-Card.386296.0.html](https://www.notebookcheck.net/NVIDIA-GeForce-RTX-2080-Ti-Desktop-Graphics-Card.386296.0.html)

Keep in mind the A12Z drives a 2732×2048 display at 120Hz (iPad Pro). That is
a beast for the GPU to drive.

A 2080 Ti can do 4K at 80fps (Rise of the Tomb Raider).

For another source, AnandTech says the A12Z is an Xbox One S-class GPU:
[https://www.anandtech.com/show/13661/the-2018-apple-ipad-pro-11-inch-review/6](https://www.anandtech.com/show/13661/the-2018-apple-ipad-pro-11-inch-review/6)

~~~
strictnein
You may want to check out those other benchmarks.

Ex: GFXBench 3.0 - Manhattan Onscreen OGL: the GTX 1070 beats it by 508%.

~~~
phaus
Yeah, it's not going to be 40% as powerful as a 2080 Ti when it comes to real-
world performance. It might do OK in special cases.

Going by the links he posted, it runs PUBG at 40 FPS and a mobile racing game
at 30 FPS. Those numbers don't agree with the assertion that it's as powerful
as a mid-range desktop GPU.

------
klelatti
Using ARM processors combined with their own custom silicon will give Apple
the potential to add features that can't readily be replicated by Intel-based
laptops (as we already see in the phone market).

Does the same logic then hold in the datacentre? Does the ability to add their
own IP mean that AWS, Google etc. can start to add new features (e.g.
specialised accelerators) that would not be possible with Intel CPUs?

~~~
Traster
I think the big difference between what Apple does and what happens in the
data centre, is that Apple builds custom silicon along with a full technology
stack on top of it. They aren't throwing some ML specific stuff on their chip
and then just hoping someone uses it - they've got specific applications
(voice recognition, face recognition etc) that they're targeting.

It's much more difficult in the data centre where you have customers who are
going to be writing their own code. You have to build something very generic,
you have to build an API, you have to create a software eco-system of
libraries and examples and a community of users, and even then lots of people
will be worried about lock-in. It's absolutely possible, and Google are
already doing custom silicon with the TPUs, but it's a very different
challenge.

~~~
klelatti
Fair point, but there are examples where customers could be using AWS / Google
APIs that could hide the new hardware (voice recognition would be a good
example).

I guess too that for Apple there is a huge advantage to having the custom
silicon on the same SoC as the CPU whereas there is much less penalty for
having it on a separate chip (as per TPUs) in the datacentre.

------
albeva
Won't it be funny if, after all this hype the past few weeks... Apple doesn't
say a word about ARM or a transition...

------
ryan-allen
What about the ARM impact on developers?

Will the Pro line of laptops continue on Intel or is it the whole line?

If it's the whole line, it's either going to be very good for developer
tooling on ARM or it's going to be a nightmare.

I've been developing on Windows 10 ARM with WSL and it's pretty great, but
it's not 100% there. I've had to switch back to x64 due to some tools not
having ARM builds.

~~~
dannyw
Ming-Chi Kuo (who has probably the best track record on Apple rumors) reports
that this will actually come to the Pro line first.

~~~
ryan-allen
That is really interesting. Most of the developers I know use macOS for
Node.js-based web development, and they use Visual Studio Code. VS Code hasn't
got an official ARM build yet.

~~~
onion2k
_VS Code hasn't got an official ARM build yet_

Yes it does. [https://code.visualstudio.com/updates/v1_46#_windows-arm64-insiders](https://code.visualstudio.com/updates/v1_46#_windows-arm64-insiders)

~~~
ryan-allen
Whoops, I missed this! Can't wait to try it on the Galaxy Book S when I get
home! :)

EDIT: The VS Code ARM build works GREAT, and it has WSL and SSH remote
extensions too! This is fantastic.

I can move back from my x64 machine for development again! I'm using Edge,
Windows Terminal, WSL and now VS Code, all with native ARM on Windows 10 -
amazing!

Now... we just need Windows 10 ARM builds that run on Raspberry Pi and how
much fun is that going to be!

------
soapdog
What no one appears to be talking about is that the move to ARM might lead to
computers that are less power-hungry, thus freeing enough power for Apple to
ship some of their nice LTE tech in MacBooks without compromising battery life
too much. I bet many people would welcome that.

I use a Surface Pro X with LTE, having an always-connected, always-on machine
is really nice.

~~~
agloeregrets
To do that, though, you're acting like they would need a source other than
Qualcomm for LTE modems! Where would they get those? Intel used to have a
cellular business, but then they sold it... oh wait... to whom? Apple?!

Lol, I'm kidding, but exactly - I agree, plus they can make them in-house.

------
funkaster
It’s true that Apple has done two moves to different chips with success, but
both times it was to a more capable and probably powerful processor. When
moving to PPC they allowed to run 68k apps/code without too much penalty. When
moving to x86, there was Rosetta[0], that made the entire existing library
/almost/ work seamlessly. moving to ARM... this might not necessarily be true.
Sure, you have Catalina but iOS apps on macOS says nothing about all existing
apps for macOS. I have no idea of the amount of already existing apps and what
would be the amount of work needed to move them to ARM. Very likely Apple did
the math and it will be easier/there will be incentives for devs to move to
that platform. They do have experience though.

[0]:
[https://en.m.wikipedia.org/wiki/Rosetta_(software)](https://en.m.wikipedia.org/wiki/Rosetta_\(software\))

~~~
empthought
The G5 and Core Duo were not that far apart, performance-wise. Certainly
nothing like 68k vs PPC. Rosetta worked because most software isn't CPU-bound;
software that is CPU-bound got native versions very quickly.

------
ngcc_hk
The article mentions Windows-on-ARM's two failures. Since Apple is actually a
Windows OEM, I wonder whether they have tried that and learned from the
experience as well.

Further, would an ARM Mac run Boot Camp? What happens to Thunderbolt, given
that Intel is now OK with others using it (but on ARM)? Is Office
compatibility still important to Apple, and if so, are Microsoft's and Apple's
product teams already testing ARM versions of Office on both Windows-on-ARM
and ARM Macs? (Given that some small subset already runs on the iPad Pro, or
even the iPhone.)

But unlike Microsoft, Apple has succeeded twice, and given that it can ignore
our calls for CUDA compatibility and worries about the future of
Premiere/Adobe... they will move. And we will move as well. Just not sure
where we move to.

Sorry, lots of questions. And the impact on Intel... it is a side show and a
side issue. Game on.

------
bsaul
Something I don't get about Microsoft not porting its suite to ARM when
releasing the Surface Pro:

What's so hard about it? Your code is supposed to use something like the C
stdlib, which has obviously been ported to ARM. So what makes it so much
harder than just recompiling everything?

Once the OS is ported, the system libraries are available, and the programming
language has a compiler for the target architecture, I don't understand what's
blocking.

~~~
mav3rick
You truly underestimate how many things still use raw assembly here and there.
Intel provides a library called libhoudini for this very problem. Also, what
would you have end users do? Somehow recompile arcane sourceless binaries for
ARM?
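A minimal sketch of the problem (a hypothetical function, but the pattern is
everywhere in performance-sensitive code): the hand-written x86 assembly below
won't even compile for an ARM target, so every such spot needs a portable
fallback or a hand-written NEON equivalent before a port can build at all:

```c
#include <stdio.h>
#include <stdint.h>

/* Count set bits. The x86 path is raw inline assembly; an ARM build
   must take the portable branch (or get hand-written NEON instead). */
static uint64_t popcount64(uint64_t x) {
#if defined(__x86_64__)
    uint64_t result;
    __asm__("popcnt %1, %0" : "=r"(result) : "r"(x));
    return result;
#else
    return (uint64_t)__builtin_popcountll(x);  /* portable (GCC/Clang) */
#endif
}

int main(void) {
    printf("%llu\n", (unsigned long long)popcount64(0xF0F0F0F0F0F0F0F0ull));
    return 0;
}
```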

~~~
kyberias
The Office source code has been ported to ARM multiple times. We're running
that code in the iOS and Android ports of Office.

------
arexxbifs
On a side note, I wonder what this will actually mean for Macs. If the move to
ARM is part of a strategy for unifying iOS and macOS, we can expect
productivity to drop as macOS veers more towards the touch-and-fullscreen UI
of iOS.

As far as I understand it, the OS is the main selling point of Macs and a
substantial change in the user experience will probably alienate their core
user base.

~~~
vbezhenar
I think that they'll unify internals between macOS and iOS, but interface will
be different. Something similar to iOS on iPhone and iPadOS: OS is mostly the
same, but UI is different.

~~~
arexxbifs
It'll be interesting to see if they succeed. It seems developers aren't too
keen on differentiating their UIs, but perhaps Apple's draconian App Store
rules can counteract that.

------
dangus
> PS: For today, we’ll leave the impact of ARM-based servers and their greater
> thermal efficiency alone.

They shouldn’t have left this topic alone!

In the US, 1/4 of developers surveyed on Stack Overflow use macOS.

If they all switch to ARM and AWS will sell them Graviton instances at a 40%
discount, in what world are Intel data centers going to be necessary in the
long-term future?

Intel should be absolutely terrified.

------
jillesvangurp
Everybody is focusing on the CPU architecture and the impact on CPU
manufacturers. IMHO the risk for Apple is actually alienating people in the
software ecosystem. It's also where the opportunities are.

If, as rumored, they are switching their pro line to ARM, that will impact two
groups of big-spending customers (i.e. people actually spending many thousands
of $ on hardware regularly):

1) Developers buying maxed out pro laptops for running IDEs, Docker, etc. I'm
one of those.

2) Creatives using Adobe and other third party tool providers for 3D graphics,
movies, photography, etc. This stuff is critical to their workflow and any
hint of compatibility or performance issues will cause people in this segment
to start considering other platforms or delaying purchasing decisions. I know
people that bought the Mac Pro just before it was renewed because they needed
it and there wasn't really anything else to buy for them that met their
requirements even though it was 3 years out of date by then.

These segments are the ones where switching CPU architecture will hurt the
most until such time that the tool ecosystem catches up. E.g. Adobe would have
a lot of tools that probably will need quite a bit of work to run smoothly on
ARM. It will be interesting to see how long that takes. The last few times
Apple switched CPU architecture, it took Adobe a bit of time to switch and it
provided an opportunity to MS, which was able to run Adobe's latest and
greatest throughout the transition. And emulation is probably not going to be
good enough here.

I'm a backend developer and the sole reason I'm still on a Mac is convenience.
At this point it's neither the fastest nor the cheapest option. And, I can
trivially get everything I use running on Linux or Windows (with the linux
subsystem). Most of the stuff I use is OSS, cross platform (IDEs, command line
tooling) & dockerized (databases, web servers, search engines, middleware,
etc.).

All of that is x86 currently. Theoretically, ARM variants of the stuff I use
could be created but in practice, this stuff does not yet exist or is kind of
poorly supported/an afterthought at best.

Maybe, emulation of this stuff will be good enough. But still, I'm deploying
on x86 and will be likely to want to test on that for the foreseeable future
and not run different containers locally than in production. So, my workflow
slowing down because of emulation is kind of a big deal for me.

So, (not so) hypothetically if I were to buy a new laptop right now, I'd be
looking for something that supports my workflow going forward and that
increasingly looks like either using Windows with the Linux subsystem or Linux
(Ubuntu is pretty nice these days). Intel Macs are still fine of course, but
not with the prospect of Apple dropping support in a hurry looming over them.
I buy laptops with a 4-5 year useful life and Apple losing interest in
anything Intel worries me when I'm going to be spending 3-4K on hardware.

The opportunities are also obvious: gaming & VR have so far not happened in
the Apple ecosystem and I suspect a big part of the reason is Apple wanting to
have their own hardware when they launch this stuff without dependencies on
the likes of Intel, AMD, Nvidia, etc.

Also data centers eventually switching to ARM is something that is technically
already a bit overdue. At this point most linux software should just compile
and run on ARM. Mostly it's just market inertia. Data center supply lines just
tend to be dominated by AMD & Intel and developers just happen to run x86
hardware.

So, long term this is definitely a smart move for Apple and I suspect they
want to get this over with sooner rather than later. However, they do have
their highend users to protect. A mac pro without Intel architecture would be
a hard sell in the current market.

Unless of course they really nail high-performance x86 emulation. I could see
them dedicating a few extra cores to that.

~~~
matwood
> Creatives using Adobe

Adobe has been pushing hard to get 'real' versions of all their apps on iOS.
LR on my iPad Pro often runs better than LR on my 2017 MBP, for example. Adobe
has also been pushing the transition to cloud subscriptions, so users are less
likely to be stuck on old versions.

My guess is that Adobe is better positioned for a transition than they ever
have been in the past.

~~~
whywhywhywhy
Are they "real" though, iPad apps often feel better and smoother than desktop
apps because you're not actually working with the real data they often
downsample to fit within the small memory requirements.

If I drag a video file from my desktop into Premier then I'm actually working
with that video file. If I open a video file in an iPad video editing app then
most of the time that video file is then transcoded to H264 so the hardware
acceleration can make it actually usable. Therefore adding a layer of
compression.

It's not the real file anymore.

------
ChrisRR
I feel like I've missed something. Why are there all these rumours of Apple
moving away from Intel anyway? Do they have some sort of issue with Intel?

Otherwise I can't see the benefit in lower power CPUs for macs. If they really
cared so much about reducing power usage and heat, then they wouldn't have
shoehorned an i9 into their laptops

~~~
snazz
Because their iPhone chips are so obscenely powerful, MBPs are famous for
running super hot, and they can make much better margins this way.
Consolidating onto a single CPU and GPU platform is nice for developers as
well.

We’ll have to wait until 10PDT today and see the points that are brought up in
the keynote.

------
FridgeSeal
If they do make the jump to arm, who else is excited to play games like
"adventures in cross-compiling" and "which of my core dependencies suddenly
doesn't work anymore?"

AMD chips are better than ever - throw a curveball and adopt them. Please
don't pick the weird mutant mobile processor to seriously put in desktops,
Apple.

~~~
klelatti
Don't think Apple will be putting mobile processors into desktops - expect
they will be processors designed for desktops with comparable or better
performance than the Intel processors they replace.

Not sure what is weird about 64 bit ARM architecture - it certainly doesn't
have the legacy (dating back to 8086 in 1978) that x86 carries with it.

~~~
joefourier
Hell, you can find x86 instructions and concepts dating to the 8080 from 1974,
and probably to the 8008 from 1972 (although I'm not familiar enough with it
to confirm).

It's funny how no-one has been able to dethrone x86 despite decades of effort
from industry rivals and even Intel itself with iAPX and then Itanium.

------
skissane
I wonder, is it going to be possible to run Windows 10 ARM on ARM Macs?

~~~
TazeTSchnitzel
This is a big question for me, I hope Apple doesn't throw away the advantage
that Boot Camp is (and that Microsoft are willing to co-operate). There would
be some hurdles to it like needing Windows/Direct3D drivers for Apple's GPU.

------
MangoCoffee
Can TSMC handle it? Apple probably won't use Samsung, because Samsung competes
with Apple on smartphones and there is bad blood between them.

AMD, Nvidia, Qualcomm, NXP...etc. all use TSMC.

~~~
nojito
The Mac line is extremely low volume.

The issue with A series has always been yield because they are so massive.

~~~
matwood
Apple sells 16-20+ million macs per year. Not Windows computer level, but I
wouldn't call that extremely low volume.

~~~
nojito
Compared to their phone and tablet line of course it is.

Volume is always relative.

------
aphroz
Are these numbers wrong?

[https://gs.statcounter.com/os-market-share](https://gs.statcounter.com/os-market-share)

In the comments I read some really big numbers regarding the Mac's dominance;
can this be caused by a bias (country + income)? In this data macOS seems to
be under 9%.

Please don't downvote me for questioning Mac dominance.

------
waltpad
What's the provided GPU solution for ARM-based machines? Do Apple SoCs contain
a GPU?

~~~
tngranados
It does in the iPad, so we can probably expect that.

------
geogra4
So this is what jlg is up to these days. Loved his work on BeOS.

------
a_imho
In my news bubble Intel seems to be on a very bad roll. What are some good
ways to gamble on INTC going down significantly in 12 months time?

------
thesquib
Would you buy a MacBook Pro that you cannot install Linux or Windows on?

~~~
tokamak-teapot
Yes. I used Windows via BootCamp back in 2008. It was painful, with the fans
running full speed all the time and the trackpad not responding in the way it
should.

After a few more tries, I gave up. I've occasionally used Windows since then
through a VM in order to access something specific, but these days there's
nothing left on Windows that I need to use. Office on macOS is good enough for
the light use I put it to. VS Code / JetBrains tools have replaced Visual
Studio.

There's no reason for me to use Linux as a desktop vs MacOS. I build Docker
containers that run Linux, but that works fine from / in MacOS.

~~~
RMPR
>from / in MacOS.

Via a Linux VM

------
quyleanh
As AMD user, I do hope Apple consider making Mac with AMD chip. The p/p of AMD
chip is far beyond Intel chip now.

~~~
jmnicolas
Why would they give control to AMD or Intel instead of going their own way?
It's not an emotional decision.

AMD is fast now but if history repeats we'll wait another 15 years before
anything good happens.

And frankly, why would you care what brand your processor is, as long as it's
fast enough?

~~~
davewritescode
I've made a lot of money over the last few years buying AMD stock, but I might
be exiting this week ahead of WWDC.

I think getting developers on ARM might just be what eats X86 on the server.

------
turblety
It will be interesting if Apple does announce a move to ARM for (at least some
of) their MacBook range. Not surprising though: Intel has refused to remove
the vulnerable backdoor [1] (the Intel Management Engine) from all their new
chips, and companies like Google and Apple want more security and privacy for
their platforms.

While ARM is not perfect, it does allow companies like Apple more control over
the secretive firmware that boots these chips.

1\.
[https://libreboot.org/faq.html#intelme](https://libreboot.org/faq.html#intelme)

~~~
mbreese
I would be very surprised if the presence of the Intel ME had an impact on any
CPU move Apple makes.

~~~
leeoniya
Remember that time Jobs flat-out said no to Flash on iOS due to security? That
was when Flash was the dominant tech for delivering multimedia.

~~~
harpratap
I believe he said no to flash because it was a battery hogging tech and would
have rendered early smartphones basically useless as portable devices.

------
jokoon
Since RISC has a much smaller instruction set than CISC, doesn't this mean any
RISC binary is at least about twice the size? Isn't there any study about
average executable size comparing CISC vs RISC?

If you look at smartphones, you always notice how much memory apps require.
It's not a secret. It's odd that nobody seems to mention this.

Anyways, Wirth's law is relevant again. Nobody wants to hear it, but I really
believe a new era of lightweight software will soon begin. I put a lot of hope
in WASM, and I hope it will work well on smartphones.

I'm using tinder on a 3 year old android, and everyday it's slower and slower.

It's almost like software companies and hardware vendors have opposite
interests. Software wants to be faster, but hardware vendors want to increase
their margins, so they want software to be slower or more feature-rich.

~~~
cesarb
> Isn't there any study about average executable size when comparing CISC vs
> RISC?

The RISC-V people have done a few while developing their RVC (compressed
instructions) extension; see for instance slide 16 of [https://riscv.org/wp-content/uploads/2015/06/riscv-compressed-workshop-june2015.pdf](https://riscv.org/wp-content/uploads/2015/06/riscv-compressed-workshop-june2015.pdf) which shows,
on 64-bit, x86-64 (CISC) being slightly _bigger_ than ARMv8 (RISC), and on
32-bit, x86 (CISC) being _much bigger_ than ARM Thumb2 (RISC). The main reason
is that the encoding of x86 instructions is not very efficient, with rare
instructions having shorter encodings than common instructions; an infamous
example being the "ASCII adjust" single-byte instructions which nobody uses.
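For anyone who wants to reproduce this kind of comparison rather than take the
slides on faith, a sketch: compile the same function with `-Os` for each
target and compare the `.text` sizes (the cross-compiler names in the comments
are the usual Debian/Ubuntu triples - an assumption about your setup):

```c
/* size_test.c - compile for several ISAs and compare instruction bytes:
 *   gcc -Os -c size_test.c && size size_test.o                    # x86-64
 *   aarch64-linux-gnu-gcc -Os -c size_test.c && size size_test.o  # ARMv8
 *   arm-linux-gnueabihf-gcc -Os -mthumb -c size_test.c \
 *       && size size_test.o                                       # Thumb-2
 * The "text" column is the code size; data is unaffected by the ISA. */
long dot(const long *a, const long *b, int n) {
    long acc = 0;
    for (int i = 0; i < n; i++)
        acc += a[i] * b[i];
    return acc;
}
```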

~~~
jokoon
Interesting, thanks! Are compressed instructions always enabled by default?

------
siia
Here's my (uninformed) guess about how this could go.

There's already a fairly powerful ARM chip in all Apple computers - the T2
chip. Assuming it's a similar spec to the iPad Pro's A12, Apple could start by
moving the OS to the T2 chip, which should improve the battery life of all
recent Macs.

They'll come up with a fancy marketing term for apps that have been compiled
for ARM and advertise them as having improved performance and battery life,
thereby putting pressure on developers to update. The X86's will initially be
removed from all non-pro devices and replaced with more powerful ARM chips,
and once there's enough momentum and support they'll also be removed from the
pro devices in a year or two.

