
Apple aims to sell Macs with its own chips starting in 2021 - blopeur
https://www.bloomberg.com/news/articles/2020-04-23/apple-aims-to-sell-macs-with-its-own-chips-starting-in-2021
======
DCKing
It's worth pointing out how extremely far ahead Apple seems to be in terms of
CPU power and efficiency. It tends to fly under the radar a bit because it's
not easy to run your own software on an iPhone, and most apps on these devices
are not serious workhorse loads (a few specific use cases are, most are not).

The Apple A13 - even its implementation in the iPhone SE - achieves
single-core microbenchmark performance on par [1] with the Core i7-8086K [2]
and Ryzen 9 3950X [3]. That's the highest single core performance you can buy
in PCs _in principle_.

I don't have to explain how insane it is that a 5-ish-watt smartphone CPU
delivers that kind of performance, even if only in bursts. There is ample
evidence that by sticking with Intel, or even x86 in general, Apple is leaving
a lot of performance on the table. Not just in MacBooks - but for the _Mac
Pro_ too.

[1]:
[https://browser.geekbench.com/v5/cpu/search?utf8=%E2%9C%93&q...](https://browser.geekbench.com/v5/cpu/search?utf8=%E2%9C%93&q=iphone+11)
\- the iPhone 11 with the A13 has to serve as a surrogate while the SE is just
out; they benchmark the same.

[2]:
[https://browser.geekbench.com/v5/cpu/search?utf8=%E2%9C%93&q...](https://browser.geekbench.com/v5/cpu/search?utf8=%E2%9C%93&q=8086k)

[3]:
[https://browser.geekbench.com/v5/cpu/search?utf8=%E2%9C%93&q...](https://browser.geekbench.com/v5/cpu/search?utf8=%E2%9C%93&q=3950X)

It's worth noting that Geekbench is a pure microbenchmark. The iPhone will not
sustain performance as long as the others. The point is that Apple could solve
this when moving to bigger devices.

~~~
rsynnott
> It's worth pointing out how extremely far ahead Apple seems to be in terms
> of CPU power and efficiency.

Honestly, I'm not convinced they're THAT far ahead. They don't have a lot of
the legacy baggage Intel has to contend with, and they're the only company
making high-end ARM chips (besides Amazon and a few other weird server
implementations). But matching big Core i7s single-threaded in some benchmarks
is, to a large extent, something that Intel's own low-power chips can also do,
at least burstily.

There are a lot of challenges to big many-cored chips beyond single-core
performance, and we really don't know where they are with that yet, as there
are no publicly-available examples of Apple desktop chips.

~~~
soapdog
> and they're the only company making high end ARM chips

I'm typing this on a Surface Pro X running an ARM64 CPU called the SQ1, which
is a customization of the Snapdragon 8cx. It is quite high end and is not made
by Apple. It might not be the amazing custom CPU they have in iOS devices, but
it is still a pretty good CPU.

~~~
rsynnott
I mean, the Surface Pro X chip is fine, but it would be hard to call it high
end; it significantly lags the usual Intel chips used in tablets and small
laptops on performance, especially single core. The newest Apple ones are
competitive with those or beat them.

------
yingw787
I remember watching this video on the Intel 8086 by Harvard Business School,
and they mentioned how revolutionary it was from a business perspective
because it took a vertically integrated market and made it a horizontally
integrated one.

(I can't find the video after a cursory search...)

Is this the reversal of that revolution? Are we going back to a vertically
integrated market due to consolidation among market players, or because of
performance / power concerns? Everyone seems to be making their own chips and
boards these days. Google / TPU, AWS / Graviton, Microsoft / SQ1...

Will we ever see a fragmentation in ISAs a la EEE? IMHO that would be a
catastrophic regression in the software space, easily a black swan event, if
you, say, needed to compile software differently for each major cloud vendor
just to deploy.

~~~
mdasen
In a certain way, this move is horizontally integrated. Intel vertically
integrated its chip design and fab. This moves design and fab apart. One of
the reasons that Intel is falling so far behind is that they can't keep up
with TSMC (and maybe others as well) on the fab side.

Intel's vertical integration worked well for them for so many years. However,
the crack has been around long enough for others to start muscling in. AWS can
push Graviton because Intel has been stuck at 14nm for so long (yes, they have
some 10nm parts now, but it's been limited). Apple can push a move to ARM
desktop/laptops because Intel has stagnated on the fab side.

I wouldn't say this is a reversal of that revolution as much as a
demonstration of the power and fragility of vertical integration. Intel's
vertical integration of design and fab gave them a lot of power. Money from
chip sales drove fab improvements for a long time and kept them well ahead of
competitors. However, enough stumbling left them in a fragile place.

I think part of it is that ARM also has reference implementations that people
can use as starting blocks. I don't know a lot about chip design myself, but
it seems like it would be a lot easier to start off with a working processor
and improve it than starting from scratch.

I think we're just seeing a dominant player that no one really likes stumble
for long enough combined with people willing to target ARM. Whether I run my
Python or C# or Java on ARM or Intel doesn't matter too much to me and if AWS
can offer me ARM servers at a discount, I might as well take advantage of
that. Intel pressed its fab advantage and the importance of the x86
instruction set against everyone. Now Intel has a worse fab and their
instruction set isn't as important anymore. They've basically lost their two
competitive advantages. I'm not arguing that Intel is dying, but they're
certainly in a weaker position than they used to be.

~~~
koolba
NodeJS is in that category as well. If you avoid native modules it’s easy to
cross-deploy on ARM. The workloads seem to translate well to the process model
of scaling out, too.

~~~
TheSoftwareGuy
Really though, unless you're writing x86 assembly, any language should be just
fine on ARM. The only potential holdup is if you rely on precompiled binaries
at some point. Otherwise it should just be a matter of hitting the compile
button again.

~~~
sharpneli
In C/C++ land, the "just try it and see if it works" kind of development is
super common in proprietary software, leading to issues such as:

Using volatile for thread-safe stuff. ARM has a weaker memory model than x86,
so it requires barriers. The C++ standard threading library handles this for
you, but not everyone uses it.

Memory alignment. ARM tends to be more critical of that. While it's impossible
for a well-formed C++ program to mess it up, it's quite common for people to
just go "hey, it's just a number" and go full YOLO with it. Because hey, it
works on their machine.

~~~
pjscott
Apple's recent 64-bit ARM processors now support unaligned memory access, so
there's one porting problem out of the way:

[https://developer.arm.com/docs/den0024/latest/an-
introductio...](https://developer.arm.com/docs/den0024/latest/an-introduction-
to-the-armv8-instruction-sets/the-armv8-instruction-sets/addressing)

------
morganvachon
I know it won't be exactly the same thing as the 68k and PPC eras, but I'm
excited nonetheless. I'm a sucker for stories of going against the mainstream
and pulling it off. I remember being wholly underwhelmed when Apple switched
to Intel in the mid 2000s; my PPC Mac mini was more performant than the first
Intel mini by far, especially the GPU. Given how powerful the A series mobile
processors are (the new iPhone SE makes my Galaxy Note 10+ look like a slouch
in benchmarks), I have a feeling the new Macs will be worth a look for anyone
without hard requirements for Windows 10.

~~~
nvarsj
> I have a feeling the new Macs will be worth a look for anyone without hard
> requirements for Windows 10

Not just that - but the many, many legacy apps professionals rely on. The
64-bit move already decimated the professional audio space. The cynic in me
can't help but see this as a move towards a pure consumer device. As a
development machine it will be all but useless as you find things that won't
compile or work correctly on the new CPU arch.

~~~
morganvachon
> _As a development machine it will be all but useless as you find things that
> won't compile or work correctly on the new CPU arch._

I'm not a developer so forgive my ignorance, but isn't this what cross-
compiling is for? I get that compiling natively can increase performance and
find obscure hardware issues, but it's my understanding that, for example, ARM
builds of GNU/Linux binaries are just cross-compiled by server farms that are
also natively compiling the AMD64 builds.

Also, fat binaries and JIT emulation have been a thing forever, especially for
Apple who has dealt with these changes twice now (68k -> PPC -> x86-64).

I just don't see this being any different than current multi-platform efforts
like Debian, NetBSD, etc., except it's a for-profit company with billions of
dollars and thousands of expert employees behind it.

~~~
drewg123
There can be subtle bugs, especially if you have code which has to adhere to a
strict on-disk or on-network format.

I recently committed a change to FreeBSD's kernel (written in C) which I'd
tested on amd64 and x86. Much to my surprise, when I did the cross-build for
all platforms, a compile-time assert failed on 32-bit ARM because the size of
a struct was too large. It turns out that 32-bit ARM (at least on FreeBSD)
naturally aligns 8-byte data types to 8-byte boundaries, whereas x86 packs
them and allows them to straddle the 8-byte boundary. This left a 4-byte hole
in my struct and made it 4 bytes too large.

These are the sorts of things that bite you when moving from x86/x86_64 to a
RISC platform, even when it's the same endianness.

~~~
saagarjha
If you care about the packing of your struct, then you should probably be
using compiler-specific packing attributes.

------
ben7799
This might be OK for consumers, and it might be OK for small-business and
home/lone-wolf developers targeting B2C and small business, but it seems like
it will be the death knell for Macs/MacBooks being used for a lot of big
commercial & enterprise software.

A lot of the Mac's popularity got a boost from developers loving Macs in the
early/mid-2000s, because you could target Unix so easily compared to a Windows
machine and you didn't have to deal with the pain of playing systems
integrator to run Linux.

Today it's a lot worse because most of us are targeting container technologies
on Linux. So even if we can compile on the Mac fine, we're running a VM
(HyperKit) and taking a performance & memory hit from that compared to a
native Linux system.

If they make the whole lineup ARM and we're now stuck running an x86 VM on ARM
plus containers, it's going to be even more painful. Given the premium
pricing, this would probably force a lot of companies to get serious about
native Linux machines. Painful at first, but it'd save a lot of money in the
long run. Where I work, we already tried this a little because performance on
the Mac was starting to suck; you can get something like a System76 laptop and
get more hardware for less money, but there was still quite a bit of
integration pain the last time we tried those laptops.

Yah, you can deploy everything from Mac -> cloud when you actually want to run
your software, but that does make the develop -> build -> deploy -> test cycle
take even longer. That aspect is no different whether the Mac is x86 or ARM.

I'd argue if Apple had stayed PPC all this time and was just transitioning to
ARM practically no enterprise shops or startups targeting linux in the cloud
would ever be using MBPs, everyone would have found a different solution over
the last 20 years. Macs were always just cool curiosities for a lot of
software dev until they got to x86.

~~~
Aperocky
As an enterprise and private user of Macs, I think there's one thing that you
didn't touch on.

The Mac no longer serves as a 'development machine'. For instance, in my line
of work I have a 64 core cloud dev box that I use; my Mac is used to connect
to that box and serves as a frontend. Granted, there are still workflows that
happen on the Mac (i.e. IDEs for languages that can't really work without
one), but more and more things happen directly on the dev box, somewhere far
away from home.

And I think that will be the future - that's why I think the Air is a great
machine. Top out the user experience and design at the cost of performance,
which even enterprise customers are starting to need less of.

~~~
DeathArrow
>For instance, in my line of work I have a 64 core cloud dev box that I use

I prefer to use a local box for faster feedback. But even if someone would
prefer using a remote box, they can also use a Windows laptop to do the same
thing.

~~~
Aperocky
> use a Windows laptop to do the same thing.

That's where all the bad things come in. I can use all the normal commands in
mac or at remote, with almost no change. The aliases work on both machines (I
use .. 2 to go up 2 parent dirs, etc). I usually don't use UI with the
exception of the browser, and I can't see how that can work in Windows.

------
frou_dh
At this point, it's not interesting unless it's officially announced.

We've seen endless wishful thinking and stories on this subject come and go
over the years.

------
izacus
The big question here is - are users of these Macs still going to be allowed
to decide what software they can run or will that decision be made by Apple
like on iOS?

~~~
pampa
Is there any software not built by Apple that is worth getting a Mac for?

Logic Pro, Final Cut. And you can't even buy a decent Mac to run them without
paying an arm and a leg.

Everything else seems to be cross-platform and runs OK on Windows, or even
Linux.

~~~
gimboland
> Everything else seems to be cross-platform and runs OK on Windows, or even
> Linux.

Yes, but then you have to run Windows or Linux.

I think that people don't (on the whole) get Macs in order to run particular
software, but many (myself included) do get Macs in order to run MacOS. (Silky
smooth trackpads don't hurt, either.)

~~~
danlugo92
Ubuntu et al aren't that bad at all honestly. UIs have gotten better over the
years..

But the trackpad and screens (and the handling of the screen by MacOS which is
leagues beyond Windows / Linux) alone are worth the price tags in Macs.

~~~
_ph_
I really like Linux and would use it happily, but beyond all the software you
can't get for Linux, things like trackpad and screen support are just great on
the Mac, especially multiple screens with HiDPI.

~~~
sudosysgen
An underestimated benefit of Linux vs macOS, especially if they go for ARM, is
that Wine is getting really, really good. On more than one occasion I
downloaded an exe by accident and ran it flawlessly.

~~~
freedomben
Wine is indeed very good these days. It's amazing how much Windows software it
will run.

And if Wine doesn't cut it, KVM with Gnome Boxes works amazingly well and is
super easy to use. I had to do that for my tax software last year.

------
bluedino
I don't know if I buy that Apple would do this for battery life. In certain
use cases, we already get 8-10 hours out of their laptops. Of course, running
certain apps or heavy loads will shorten that to 2-3 hours, but I can do the
same thing on my iPhone; certain apps will chew the battery up in no time.
Won't the same thing happen with an ARM MacBook Pro?

~~~
Zenst
Part of me wishes it is for battery life, yet sadly, in many areas like this,
when we gain battery life we find it offset by a smaller battery, and the
marketing of how it is now lighter prevails. Which is fine if it can do a
whole day, which seems to be the target most aim for. I'd love a few days;
heck, I recall the days of mobile phones that ran for a week of usage.

Alas, over the decades we have seen faster and faster CPUs, and equally, in
many areas, sloppier and sloppier code. After all, if it runs fine on the
latest kit, there's no apparent need to optimise something that could run much
better and, by that, use less of the available CPU and less battery. Hence
many applications that could run on fewer resources just don't, and that's the
real shame: whatever gains we make in one area are eaten up in another when
they don't need to be.

~~~
jp555
"I recall the days of mobile phones that ran for a week of usage"

do you mean talk time or standby? Because I do not remember this.

Seems much more likely we just used our "single-app" mobiles WAY less. Maybe
you were on cellphone calls 4-6 hours a day? But I do not remember any cell
phone in the 90s and early 2000s that had a 30+ hours talk-time (plus 150
hours of standby).

~~~
iamben
Standby.

I'd get a week (at a push?) out of my phone at university in the late
90s/early 00s. I don't think 3-5 days was at all unreasonable back then - for
me that would be multiple text messages, a few games of snake, and a handful
of calls a day. I get a day (at very best) out of my phone now if I don't
touch it at all.

~~~
Hamuko
I charge my iPhone 11 Pro every other day now because not leaving the house
means that I'm using it way less now. I think I got about 3 days of standby
when I really pushed not charging it.

I don't think getting 3 days standby out of a current smartphone would be that
impossible. It's just that no one uses their smartphone so that it just sits
there, not doing anything.

~~~
nicoburns
Indeed. Samsung phones (I have the S7) have an "Ultra power saving mode" which
enables a bunch of power saving features including throttling, reducing
display resolution / brightness. But the thing that makes the biggest
difference is disabling background data.

I once got 4 days of usage at a festival in this mode. And I was using it to
keep in touch with people and as a camera (I probably took ~150 photos). The
trick was to use it as little as possible, and keep in airplane mode unless I
was specifically expecting a message (I would periodically take it out of
airplane mode to check for messages).

------
exabrial
[http://archive.is/lXlMO](http://archive.is/lXlMO)

~~~
czottmann
Thanks!

------
villgax
RIP hackintosh, it was a fun ride to show apple what was possible for the buck

~~~
krstffr
I have been wondering about this as well - what would happen to AMD/Intel
Hackintoshes if Apple were to switch to their own processors? However,
wouldn't the assumption that they would stop working also assume that all
current Macs (which run Intel processors) would also stop working? Would there
not have to be at least like 5 years of continued support of at least Intel
processors, which would also mean that Hackintoshes could keep working?

Disclaimer: I do not know very much about Hackintoshes, but I have been
wanting one more and more for the last couple of months as more and more
successful AMD builds have been created!

~~~
paulcarroty
> at least like 5 years of continued support

only for old CPUs. Also, non-T2 chipset support will probably be discontinued
too.

Anyway, the Hackintosh shutdown is a matter of time, 'cause Apple naturally
cares about its closed ecosystem.

~~~
mastercheif
They're still shipping non-T2 Macs in 2020 (iMac)... They will be supported
for a long time.

~~~
abrowne
The 2017 model MacBook Air is still available to educational institution (and
probably corporate) purchasers as well.

------
mxcrossb
What interests me is how many custom chips they’ll pack into their processors
once they have total control. Phones continue to gain more and more
specialized accelerators, and I could see computers following that path.

------
hiby007
Can someone expert in this field explain what this means for developers and
what it means in the long run?

~~~
m12k
Mostly, it means apps will need to be re-compiled for the new CPU
architecture. For a while now, apps uploaded to the Mac App store are uploaded
in an intermediate representation, so the final compilation to the specific
architecture happens on Apple's servers. This puts Apple in a place where they
can launch a new computer with an ARM CPU, and (ideally) all apps on the Mac
App Store will "just work"TM out of the box. And they can know whether that
will be the case before the launch, maybe even reach out to developers to
solve issues ahead of time.

So the biggest potential problem will be for apps outside the App Store.
Developers will need to re-compile their apps themselves - I hope this will be
possible with cross-compilation, so it's not necessary to get a new Mac in
order to do so (that would severely limit how many devs would do so). Maybe
the norm will be 'fat binaries' like in the PowerPC->Intel transition period,
where apps ship with binaries compiled for both architectures. Also, apps that
haven't been updated for a long time might not receive updates and would get
left behind (although switching off 32-bit support in Catalina could be seen
as a test run for the same disruption).

Also, any apps that use x86-specific instructions for optimization might take
a while longer to port - e.g. game engines (luckily most game engines are
cross-platform these days, so Unity, Unreal and the others already have ARM
targets from iOS/Android/Switch to build on).

~~~
Reason077
> _" Maybe the norm will be 'fat binaries' like in the PowerPC->Intel
> transition period"_

Multi-arch "fat binaries" have been the norm much more recently than that. For
a long time, typical macOS binaries supported both 32-bit and 64-bit x86.

Likewise, fat binaries are used to support the various different ARM variants
on iOS devices (armv7, arm64, etc).

~~~
Someone
The first fat binaries on the Mac were 68k-PowerPC.

Back then one could also patch 68k OS calls with PPC code and vice versa.
Given that the 68k OS calls had about 10 different calling conventions (the
ABI of some of the callbacks seemed to be chosen by whatever registers the
arguments happened to be in in the first implementation), making that possible
was either a can of worms or a brilliant hack, depending on your viewpoint.

------
uyuioi
I reckon Apple actually will release iPadOS for fixed keyboard layouts. Or
something similar.

------
dbbbbbbbb
The big question is: will they run macOS, or will they essentially be iPad
Pro books? My guess is the latter. It's not what I'd want, but Apple could
gain a lot more in terms of control and revenue that way.

~~~
Unklejoe
I posted a similar comment and was met with a similar negative reaction - not
sure why.

I think it's perfectly valid to be skeptical of the long term plans of any
company.

In this case, it seems pretty obvious that they would eventually try to
converge the iPad and Macbook, especially if they're running pretty much the
same hardware.

I do think it would be a very long transition though.

------
fbn79
So no more Hackintosh and virtualized MacOs?

~~~
nutjob2
Apple will still be selling and supporting x86 machines for many years,
possibly forever.

~~~
fluffything
x86-32bit, PowerPC... nvm.

~~~
nutjob2
I think if they wanted to get to parity with Intel or AMD with the higher end
desktop processors they may have quite a bit of work ahead of them, and for a
relatively small market, so it's questionable if they would want to sink
precious engineering resources into that.

On the portable side, though, it's a no-brainer. I don't see Apple having x86
in new laptops at all after the transition.

But Apple may stop making high-end desktop systems altogether, so it is
possible. It's a very small part of their revenue.

~~~
Hamuko
> _But Apple may stop making high-end desktop systems altogether, so it is
> possible. It's a very small part of their revenue._

That's what they said when the Mac Pro was neglected and they still designed a
brand new machine to sell.

------
fastball
Would that mean no more Bootcamp?

Windows won't run under ARM, will it?

~~~
DeathArrow
There is Windows for ARM, however it lacks apps.

------
zanethomas
And then will Mac owners be forced to install only software available, at a
price, on the app store?

~~~
SahAssar
Apple has been moving in this direction for years, so while I find it sad, I
don't think anyone will be surprised.

------
DeathArrow
It's funny how Microsoft didn't succeed with the push for ARM.

Their latest try with the 8cx and SQ1 wasn't a big hit. The CPUs aren't the
best; MS doesn't want to pour as much money into CPU research as Apple does.

They are also trying a gradual move, targeting both x86 and ARM, as they
don't want to / can't afford to upset consumers.

If Microsoft made the next version of Windows ARM-only, most software makers
would probably target the last version of Windows, most users would sit on
the last version of Windows, and Intel and AMD would happily keep selling
their x86 CPUs.

------
jrobn
Hopefully Apple will go back to being an integrated hardware and software
company.

It makes absolute sense to me for a future MacBook Pro to have a great
AMD/Intel CPU plus one of their high-performance A13X chips with neural
accelerators to accelerate specific tasks and improve performance and battery
life.

I'm thinking audio/video encoding in FCPX, neural AI assisted tracking, color
matching, face detection, ProRes acceleration, H264/H265 acceleration.

Would make the price tag of a new MacBook pro much more palatable for
FCPX/Logic X users anyway.

------
dstaley
I'm not holding my breath, but I really hope Apple works with Microsoft to
ensure that Boot Camp still continues to work. Windows on ARM on a MacBook
with a powerful A14 sounds like a delight.

------
tecleandor
I'm not a Mac user anymore, but this might be interesting for moving the ARM
laptop market along. I'd like to have good options for high-performance,
Linux-supported ARM laptops.

~~~
paulcarroty
ARM Chromebooks exist, but putting real Linux on them won't be easy in 99% of
cases. Thanks, Google.

------
major505
This would close the gap between iPads and Macs even more... after this
transition is done, do you guys think that Apple is going to unify both
product lines?

------
tosh
A related question I was recently asking myself: would it make sense to put
two (or more) A13 SoCs in a MacBook (context: Apple must see enormous
economies of scale now that the A13 is also in the iPhone SE)?

Ask HN: Would it make sense to put two A13 into a MacBook or iPad?

[https://news.ycombinator.com/item?id=22955301](https://news.ycombinator.com/item?id=22955301)

------
osdiab
I wonder if this will have an impact on other parts of the computer besides
the CPU/GPU - for instance, connecting an iPhone/iPad to AirPods is buttery
smooth, but my experience connecting them to last year's MacBook Air is as
crappy as with any other Bluetooth device. Maybe whatever they're doing in the
mobile line will come to Macs too?

------
tl
Many suspect that ARM Macs are coming, but I would not trust reporting from
“The Big Hack” Bloomberg [1].

From the article, Bloomberg reported [2] that Apple would also do it in 2020,
so they’re planning on being eventually right.

[1]:
[https://daringfireball.net/2018/10/bloomberg_the_big_hack](https://daringfireball.net/2018/10/bloomberg_the_big_hack)

[2]: [https://www.bloomberg.com/news/articles/2018-04-02/apple-
is-...](https://www.bloomberg.com/news/articles/2018-04-02/apple-is-said-to-
plan-move-from-intel-to-own-mac-chips-from-2020)

~~~
Traster
I would really love it if we could try not dismissing entire news sources out
of hand based on one bad piece of reporting years ago. It's not like this
article is even by the same journalists. We should be ignoring Mark Gurman
because John Robertson got something wrong in 2018?

Also, if you'd actually read the article instead of dismissing it out of hand,
you'd realise your big gotcha in your second citation is actually directly
referenced in the article, with details about exactly why they made that
prediction. Frankly, there's no way even Apple knew whether they'd be shipping
ARM Macs in 2020 or 2021 back in April 2018. In fact, this reporting actually
backs up that reporting - given where we are now, it seems highly likely that
Apple was planning to ship ARM Macs in 2020 when that report was done in April
2018.

~~~
klohto
We should do exactly what the OP says, because that's the whole point of news
outlets. Bloomberg stood by that piece and never commented on it; they just
ignored it. I don't see a reason why I should trust the same outlet.

I'm happy reading their opinion pieces (by Matt Levine), but I see no reason
not to dismiss their usual let's-see-if-we're-right stuff.

~~~
ericlewis
In this case, it is Mark Gurman, previously of 9to5Mac. He has a stellar
reputation for being correct on this stuff, and has for nearly a decade.

Bloomberg sucks, but he’s still Mark.

------
mistrial9
If MacOS "hides" certain files, prevents mounting `ext4`, requires signing of
apps with Apple SSL keys to run any binary, tracks every user's app installs
and usage, limits application installs to only the App Store, or does any
number of other dark-pattern consumer behaviors, this loyal Mac user says NO.
~~~
rswail
What has that got to do with what processor is in the Mac? This is not about
i(Pad)OS vs MacOS.

------
runeks
> Apple Inc. is planning to start selling Mac computers with its own main
> processors by next year [...] according to people familiar with the matter.

Sounds like it’s just tech experts speculating about this. Which they’ve done
for a decade or so.

------
jamesu
Considering the Xcode-on-iOS rumors, I'm still waiting for a 27-inch iPad to
be released myself.

~~~
Hamuko
What on earth would be the point of a 27-inch iPad?

~~~
macintux
I’ve long dreamt of an iPad scaled up to the size of an artist drafting table
top.

~~~
Hamuko
We already have the Wacom Cintiq Pro 32, which is an incredibly niche product.

~~~
macintux
I don’t want it as a drafting table, although I might use it that way; I want
it as a computing experience.

------
starpilot
Considering how they came up with Swift, I'm not surprised. They just "get it"
when it comes to efficient yet practical approaches to technical problems.

------
mkchoi212
I’m wondering how Apple got to this position in the first place. How does a
company that doesn’t specialize in CPU design beat a company that does:
Intel?

~~~
devxpy
By hiring the Silicon Ronin, of course.

[https://en.m.wikipedia.org/wiki/Jim_Keller_(engineer)](https://en.m.wikipedia.org/wiki/Jim_Keller_\(engineer\))

------
thadk
I'm curious when Apple had anticipated this switch. Why deprecate 32-bit in
Catalina? Do their 64-bit APIs help them emulate x86 apps more easily?

------
nicoburns
ARM-based Macs are going to be a pain if they're as locked down as I suspect
they might be. But damn, are they likely to be fast if the rumours in that
article are true: 8 high-performance cores on a next-gen chip, when Apple's
A-series chips are already beating Intel chips in single-core performance.

I suspect ARM compatibility will be a problem for ~6 months until major
software houses get themselves sorted out. Most open-source software already
has ARM support anyway. And of course it's possible that Apple will actually
include some level of hardware support for x86 emulation.

~~~
ngcc_hk
Windows on ARM has x86 emulation. In fact, Apple has done
simulation/emulation/recompilation/fat binaries for the last few transitions.
It would be the same. The problem is that even if it stays the same, many do
not change (like Adobe, etc.), which drags it out. Hence, might as well move
on.

------
altmind
So does it mean Apple users will need to migrate to ARM builds of their
software? It took something like 3 years with the PowerPC -> x86 migration.

~~~
fluffything
If you are shipping Apps through the AppStore, they are already compatible
with any hardware architecture Apple might decide to use in the future.

(You do not ship binaries to the AppStore, but LLVM bitcode, which Apple
compiles for you for whatever hardware they decide to ship next.)

If you are not shipping Apps through the AppStore, you are on your own.

~~~
altmind
So shipping bitcode is not mandatory? Interesting; what % of developers
actually send the bitcode in, then?

~~~
fluffything
Shipping LLVM bitcode is mandatory for the AppStore; Apple can then compile
that bitcode to binaries for different hardware.

That's why if you are shipping code to the AppStore, you probably don't really
care whether Apple uses x86, ARM, RISC-V, PowerPC, or all of them in different
devices. Producing working binaries becomes Apple's job.
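
As a concrete aside (a minimal sketch: ENABLE_BITCODE is the real Xcode build
setting name, though Apple has historically required bitcode only for
watchOS/tvOS submissions while keeping it optional for iOS apps):

```
// Sample .xcconfig fragment: embed LLVM bitcode in the binary so the
// store can recompile it for other targets later.
ENABLE_BITCODE = YES
```

With this set, Xcode passes -fembed-bitcode to clang, and the resulting
Mach-O carries an __LLVM,__bitcode section alongside the native code.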

------
HugoDaniel
This makes me sad because I just recently got a MacBook Pro, and it was not
cheap.

I still had hopes that it could last >10 years, as my last one did.

~~~
jojoo
The switch from PPC to Intel definitely contributed to the fact that my
PowerBook only held up ~8 years.

~~~
MobileVet
Can you imagine a Windows based PC lasting half of that time?

Outside of the silky OS and tight Unix coupling, it has been a joy to buy
computers (albeit at a premium) that last more than 2 years.

I still have a 2009 iMac that is perfectly fine for everything except iOS
development (due to an Xcode OS minimum), and a 2015 MBP for that. While my
new 16" work MBP is a performance beast compared to the 2015 MBP, I can
certainly be productive, and expect to be for years to come, with my personal
kit.

~~~
foepys
> Can you imagine a Windows based PC lasting half of that time?

Yes, my Thinkpad T420 still does. Don't compare consumer-grade $300
laptops/PCs to Apple hardware that costs $2,500+, please.

A modern Windows 10 is also still able to run applications written in 1999
and even earlier. Apple only supports the last 3 releases of macOS, which
puts the burden on developers who often don't even support the applications
anymore.

There are many reasons to dislike Windows(-based PCs) but your argument is
very, very weak.

~~~
slantyyz
> Don't compare consumer-grade $300 laptops/PCs to Apple hardware that costs
> $2,500+, please.

If you're the type of person who doesn't put your hardware through a beating,
I find name-brand consumer PCs to be fairly durable. But like you said, they
are obviously not as nice as Apple hardware.

Another thing about PCs is that you don't get forced obsolescence from macOS
upgrades that stop supporting old hardware that is still usable, aside from
having a weak GPU.

~~~
hypervis0r
I have seen this happen so often. People who evidently don't use, or haven't
used, a Windows-based machine in the last 20 years complain more than anybody
else.

There are plenty of reasons to hate on Windows(-based machines), but what
would they know.

------
dis-sys
This is going to be very interesting if true.

I'd buy two such ARM MBPs on release, one for the regular macOS setup and one
for Ubuntu Linux.

~~~
ubercow13
I expect it won't be much use for Ubuntu at release... or for quite a while
afterwards.

------
DeathArrow
Apple: Motorola 68k > IBM PowerPC > Intel x86 > ARM

Microsoft: x86 > x86 > x86 > x86

~~~
rsync
Forgive my pedantry, but Windows NT ran on both PowerPC and DEC Alpha, and I
believe there were Windows Server variants that ran on Itanium as well...

~~~
DeathArrow
You are right. I only mentioned the architectures with most user base for both
Apple and Microsoft.

------
vinniejames
Didn't Apple try this already with the PowerPC chip, then switch to Intel?

Why the 180-degree reversal of course?

~~~
detaro
PowerPC wasn't really "Apple's own", even if they had some input in the
design. And ARM overall is a bigger ecosystem than PowerPC ever was, and
they're not bound in fabrication now.

I think it makes more sense to see it as part of the same line, not as a 180
reversal: they got PowerPC because it was a good solution at the time and the
68k they used beforehand couldn't keep up. Then 10 years later (during a time
when that meant a _massive_ increase in speeds: 10 years of PowerPC went from
60 MHz to 2.3 GHz) they saw that IBM couldn't keep up with Intel (and small
unit numbers meant that it was expensive) and switched platforms to the then-
strongest choice. Now it's 15 years later, they already have expertise and
designs for ARM from the mobile market, Intel is less clearly _the_ leader in
x86, ... and they are reconsidering again. To me, the connecting line is that
they are willing to break the ecosystem to follow wherever the best option
is.

You could see it as a reversal in that they go from niche (PowerPC) to
mainstream (Intel) to niche (nobody makes ARM desktop systems), but I think
that ignores that they already have leading expertise with ARM, that ARM is
not a niche overall (tooling support and ecosystem), and that they're no
longer bound to one company's fab tech (they can now have whoever is best in
fabrication make their designs, which wasn't really an option in PowerPC
times), ...

~~~
vinniejames
It makes more sense alongside the rumors that iOS and OSX will merge into a
single platform.

One of the issues back in the PPC days was the lack of applications on Macs,
partially due to the chip, as more apps supported the Intel architecture. As
soon as the switch to Intel happened, porting existing software became much
easier.

Assuming they move to ARM, I would bet my morning coffee that the plan is to
merge iOS and OSX, then start porting mobile apps over to the desktop system,
as opposed to requiring existing software vendors to rewrite their desktop
software for the new platform.

------
Geee
What kind of memory modules would it use? Are the standard PC memory modules
x86-specific?

------
edeion
If you'll pardon a naive question: would that be the end of hackintoshes?

~~~
DeathArrow
Pretty much, yes, as there aren't many ARM PCs. And even if there were, Apple
would probably make the Macs incompatible.

Hackintosh works now because Macs are PCs with custom firmware. But that will
change.

I guess when Apple switched to x86, they didn't have the money for custom
hardware and relied on implementing PC specifications. Otherwise they would
have made it incompatible from the start.

------
diryawish
A MacBook Air with a touchscreen would be a nice introduction to these chips.

~~~
api
I don't like touchscreens on laptops. I call them smudge screens because your
screen ends up looking like the bottom third of your windows when you have a
dog or a toddler.

Touch screens belong on small form factor devices.

------
DeathArrow
It seems many people wonder why Apple's CPUs do so well compared with x86 and
other ARM makers in Geekbench.

Here are my guesses:

1\. Intel is still leading everyone else in single-core performance, but it
does that at very high frequencies and with a very high power envelope, due
to its process node.

If Intel could use 7nm, they could have much higher performance if they
desired. I can speculate that they wouldn't have gone for the absolute best
performance anyway, because they don't have to. Their goal is to make money,
not to make the fastest CPUs they can. When Intel had a clear lead, they
improved their CPUs little by little, each generation.

2\. AMD has the best performance per dollar and per watt, and also the best
multicore performance. That brings them a lot of sales and they are happy
with that. Their goal is to stay a bit ahead of Intel so they can sell more
CPUs; their strategy is to have better value per dollar. Also, even where
they are technically ahead, that's due to Intel being behind TSMC.

3\. ARM makes CPU designs, and it seems it doesn't care too much about having
a clear lead in performance. Maybe with their investment in Ampere and the
push towards ARM on servers from the likes of Amazon, they will start to care
more. But right now, ARM makes a ton of money by licensing slow,
power-efficient cores to hundreds of licensees.

4\. Qualcomm: it's not clear whether Qualcomm has the know-how to push
performance, or whether they are willing to allocate the resources for it. I
can speculate that they would have done so with the 8cx and SQ1 they built
for Microsoft, if they had the know-how and it made business sense.

Qualcomm doesn't see Apple, Intel, or AMD as threats. They compete mostly
with MediaTek, Samsung, and Huawei, and there they are very safe. MediaTek
optimizes for the absolutely cheapest CPUs they can make, Samsung doesn't
care much about CPUs and most of their phones use Qualcomm chips, while
Huawei doesn't threaten Qualcomm much yet, as their best CPUs are only on par
with Qualcomm's and they don't sell CPUs to other phone makers.

So Qualcomm thinks they can keep doing the same thing for the foreseeable
future: get some cores from ARM, optimize them a bit, bundle them with their
modem, and make a nice sale to the Android makers.

The disruption of this model where most CPU makers are happy with slower CPUs
can come from ARM server space.

If the companies pushing ARM for servers are smart, capable, and hungry
enough, they might start to eat into Intel's and AMD's pies. That would push
Intel and AMD to optimize harder, and the tech might trickle into mobile
CPUs, too.

------
prirun
"Apple is exploring tools that will ensure apps developed for older Intel-
based Macs still work on the new machines."

Part of their release timing might be to ensure that emulating Intel on Arm
has acceptable performance for most users.

~~~
jonplackett
If they really can do this at decent performance, the ARM chips must be even
faster than we'd expect. The reason that worked OK going PPC->Intel was that
the Intel chips were mostly a lot faster.

------
msoad
I always wondered how much of my development environment would keep working
on ARM. Electron apps such as VS Code will work fine. Node.js, Python and
Ruby will also work okay. Docker can work too. What else?

~~~
aembleton
I think you'll need ARM-compatible Docker images, though.
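
A rough sketch of why most images carry over (the base image is illustrative;
the buildx invocation below is real Docker CLI syntax): popular base images
are published as multi-arch manifests, so the same Dockerfile builds
unchanged on either host:

```
# Hypothetical Dockerfile: node:14-alpine is published as a multi-arch
# image, so Docker resolves it to the host's architecture (amd64 or
# arm64) automatically at pull time.
FROM node:14-alpine
WORKDIR /app
COPY . .
CMD ["node", "index.js"]
```

Images that only publish amd64 layers are the ones that would need
rebuilding; `docker buildx build --platform linux/amd64,linux/arm64 .` can
cross-build both variants from one machine.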

------
bobongo
I wonder what the impact of this will be on AMD.

------
jjuel
I for one cannot wait for an ARM MacBook! There might be some pain early on,
but in the long run it will be great for laptops and computers in general.

------
meerita
How would that affect all the packages? My entire workflow depends on Ruby
and npm packages oriented to Intel CPUs.
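
A hedged way to gauge the damage: pure-JS (and pure-Ruby) code is
architecture-independent, so only dependencies that compile native code need
ARM rebuilds. For npm, a `binding.gyp` file marks a package that builds a
native addon via node-gyp. A self-contained sketch using a simulated
`node_modules` tree (package names are made up):

```shell
# Simulate a node_modules tree: one pure-JS package, one native addon.
mkdir -p node_modules/left-pad node_modules/native-addon
touch node_modules/native-addon/binding.gyp

# Only packages that compile native code (marked by binding.gyp) would
# need an ARM rebuild; the pure-JS package never shows up here.
find node_modules -maxdepth 2 -name binding.gyp
# -> node_modules/native-addon/binding.gyp
```

(Packages shipping prebuilt `.node` binaries would likewise need arm64
variants from their maintainers.)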

------
gigatexal
A big A-series chip with a lot of thermal headroom and active cooling? I can't
wait. Take my money now!

------
kalvisk
PowerPC back in the game? :D

~~~
jaboutboul
Most likely ARM, as that is what the iOS devices use now.

------
mportela
Without paywall: [https://outline.com/ubJCBg](https://outline.com/ubJCBg)

------
einpoklum
There's a paywall on that site. Grrr.

~~~
mportela
[https://outline.com/ubJCBg](https://outline.com/ubJCBg)

