Apple Plans to Announce Move to Its Own Mac Chips at WWDC (bloomberg.com)
325 points by sleepyshift on June 9, 2020 | 456 comments



Don't get me wrong. I love the potential native performance gains from this transition, but I can't help but be just a tiny bit scared for the future of my day-to-day work.

For better or for worse, my company uses Docker for Mac for the vast majority of the stack that I don't work on but need to actively develop. I'm already paying a huge VM cost and it's pretty terrible. I don't see Apple working on any kind of native containerization solution. Does that mean that I'm going to be eating the current VM cost + x86_64 virtualization cost in the future?

I really want to keep using macOS as a platform. I know I can just stay on the hardware I have, but it's not really practical to be on an end of life architecture. It seems just a tad shortsighted to ditch x86_64 when a lot of people depend on it specifically because it's a shared architecture with other platforms.


> I don't see Apple working on any kind of native containerization solution.

It's a shame really. Apple doesn't do containers. They don't do virtualization. They don't do open source. It's all sort of like the Jackling House. [1]

It would be SO cool to be able to write FROM macos:10.9 in a native Mac Dockerfile.

1: https://en.wikipedia.org/wiki/Jackling_House

(don't do = don't do in a direct committed way)


I've given up on the dream of working in a place where I can still run "all the things" on my laptop, at least in a place that has embraced cloud services, Kubernetes, micro services, etc.


Yep. We're hanging ourselves with all this crap. There has to be a better way.


VS Code has an SSH extension that lets you seamlessly work on a remote server. It has been invaluable with my old dual-core MacBook: I first offloaded Docker and my work to a vultr.com virtual machine, and then to a 6-core Intel NUC on my local network, which just feels like using the same computer.

The only drawback I've found is that I use "puppeteer" a lot, which is an API for driving Chrome and Firefox programmatically, and sometimes I want to see what's going on.


Yes, that's the IDE I use as well, with WSL 2 on Windows 10. Weird that Windows turned out to be the best Linux distribution I have used.

I've got an 8-core Ryzen 3700X desktop with 32 GB of RAM, and I can bring it to its knees easily. It's ridiculous.


You're a bit RAM starved at only 32 GB.

My preferred environment is yours, inverted. Native Linux (likely Ubuntu 20) with Windows in a VM as necessary. That is making much better use of hardware IMO. It also gives you native Docker, and a ton of easily obtainable software. Much new software development is beginning in Linux or Linux-like environments, including most language development.

Perhaps more importantly, Linux is the deployment target for most cloud based software. The company I work with (Fortune 100) uses Linux as the preferred deployment platform simply due to cost, along with sufficient quality. It's nice to develop, test and deploy all on the same infrastructure.

Software turns out to be very interesting in terms of real lifecycle. Here in 2020, COBOL skills are in (relatively) high demand! While I'm interested to see the "Son of Unix", based on something like Rust and completely cleaned up, rationalized, and secured, Linux and Linux skills will be valuable for decades longer. Windows, I'm not so sure about...

My next development system will likely be a 16/32 core/thread AMD chip. I'd hate to hobble it with the Windows kernel; it's really not aimed at that class of system. Linux is.


nix is pretty great for decoupling the build process from the deployment format, and it gives most of the advantages of containerization without the penalty (on Macs, at least) of running a VM.


> It seems just a tad shortsighted to ditch x86_64 when a lot of people depend on it specifically because it's a shared architecture with other platforms.

It would probably be shortsighted to do exactly what you describe.

The long-sighted play is that they don't see x86_64 as the long-term future, so they might as well eat the migration cost now.


I wouldn't think you'd have to pay double in the long run, because you would be virtualizing ARM containers.

If anything, having a major platform completely switch to ARM would be a boon for cross-platform support. Something has to bring it into the mainstream, and it might as well be the company that gets devs to move mountains for them.


> I don't see Apple working on any kind of native containerization solution.

You don't see Apple's solution to a problem which doesn't exist yet on an architecture they haven't announced. Nor do I. I'm not sure why you'd expect to see anything at this point.

> It seems just a tad shortsighted to ditch x86_64 when a lot of people depend on it specifically because it's a shared architecture with other platforms.

A lot of people are interested in how Apple is going to tackle this. There are a lot of ways they might address it, including backward compatibility. While Bloomberg has been very forthcoming about rumors like this one, they haven't had much in the way of information about how Apple is actually planning on transitioning.

Apple has done this twice successfully already, so it'll be interesting to see how they tackle it. Speculating about whether their as-yet-unknown solution will work is a bit premature.


> It seems just a tad shortsighted to ditch x86_64 when a lot of people depend on it specifically because it's a shared architecture with other platforms.

What exactly depends on x64? Hypervisors? JITs? I can't think of that many examples.

The most important software already runs on ARM, because ARM is a really important platform.

Sure, adding yet another platform (macos-aarch64) to everything is going to be some work and some things will fall by the wayside.

It'll still happen for the most part because you know there will be a significant installed base. That's something that other companies (like Microsoft) can't pull off.


Might also be interesting for targeting ARM servers, e.g. running ARM containers natively.


With Thunderbolt couldn’t you have external coprocessors?


Leave it to Apple to make the CPU a dongle.


Why wouldn't you just build your docker images targeting Arm instead of x86?
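
For example, something along these lines should do it (a rough sketch; assumes Docker's buildx plugin is available, and the image name is made up):

    # one-off: create a builder that can target multiple architectures
    docker buildx create --use
    # build and push a multi-arch (x86_64 + ARM64) image in one go
    docker buildx build --platform linux/amd64,linux/arm64 -t myorg/myapp:latest --push .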


ARM doesn’t really have better performance though.

They should just buy AMD or something.


AMD’s x86 license isn’t transferable


Wait, even if the company is wholly bought?

What if I as a private citizen just decided to buy a majority stake in AMD? Is that different than a corporation doing the same?


Moving to its own chips may be fine and all, but I'd much prefer Apple to deal with the rotting quality of its OSes first.


It does say something that, even with some of its flaws (e.g. ads, telemetry, patchwork leftover UI from over the years), I vastly prefer working on Windows for most things these days, especially with WSL and the new terminal in the mix.


I've been developing on Macs for 10+ years. I recently got a Windows desktop for testing stuff, gaming, etc. It turned into my main development machine thanks to WSL 2. Everything works better and is more stable, which is extremely ironic given that I switched to Mac a decade ago for the same reason. Also, I have many more hardware options (which is overwhelming).

Meanwhile, my MacBook Pro 16" has daily kernel panics when it is connected to an external display. It has been reported by many people on Macrumors forums and has been happening for months now. No fix...


I'd support Windows in a heartbeat if they would stop treating users as the product (by showing ads and tracking them). This even extends to the Pro version. Also, they should just remove the new Settings UI. It's confusing.


I'm returning a Surface today due to the egregious telemetry and the additional upgrade fee they charge for full-disk encryption/BitLocker.

I counted six toggles related to tracking that were enabled by default in the OOBE (out-of-box setup). Forget one toggle and you'll spend 20 minutes digging through Control Panel for it. The "NEW" Microsoft, right.


I haven’t seen any ads yet. Might be geographic though.

Also setup asked about some tracking stuff and I said no but haven’t done any research to see how they treat it.


Usually you'll have ads for OneDrive (unless you already use it). And things like Candy Crush preinstalled.


https://old.reddit.com/r/TronScript/ or https://wpd.app or https://www.oo-software.com/en/shutup10

Shutup10 is the best all-in-one, most-tested tool I have found for Windows 10 that doesn't break stuff.

There are also custom ISOs with binaries and entire product suites removed, which may break some stuff that runs on Windows 10 (e.g. games that require the Store) - but I'm not comfortable with the level of transparency custom ISOs offer, so I don't do this: https://ameliorated.info/

My recommendation is to get a copy of Windows 10 LTSC (which doesn't include Candy Crush etc.) and run Shutup10 on it for your work environment. It's the least risky option for an everyday driver.


Thanks for the recommendations. I used a similar tool when I first installed Win 10 and it worked well until one of the bigger updates got installed and removed most of my fixes.


I think the ISO's purpose was to remove Windows Updates for that reason. You need to run these tools semi-regularly, at least after updates, to ensure Microsoft hasn't re-enabled stuff. Which they do. All the damn time.

You can use /quiet and /force flags on shutup10 and create a scheduled task to re-run it occasionally.
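
For the scheduled-task part, something like this should work from an elevated prompt (a sketch; the install path is just an example):

    schtasks /Create /TN "OOSU10-reapply" /SC WEEKLY /RL HIGHEST /TR "C:\Tools\OOSU10.exe /quiet /force"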


That was me after I got the iMac Pro. Daily kernel panics pointing at the T2 chip. It took about six months before they finally got it resolved, but that was the last Mac computer I've been really happy with.


Has performance improved in WSL?

I was playing around with it a few months ago and filesystem-intensive operations (like running npm install) were horrendously slow. Has that changed with WSL 2?

I've been considering picking up a Ryzen laptop and running Linux on it, and trying that out as a daily driver. But I don't have any patience these days for the hacky wifi driver installation dance. And I'm nervous after having a Linux laptop cook its own battery when it failed to sleep properly in my backpack. It's been a couple of decades - do Linux laptops just work yet?


I tried WSL a year ago. The filesystem was slow, and it was slow everywhere.

WSL 2 improves that by offering a fast filesystem on the Linux side. The boundary is still slow, though. For example, if you go into /mnt/c and run git status on a large repo, it has to cross the boundary. It is slow (not painful, but not pleasant).

My current setup keeps all my source code on the Linux filesystem (/home/user) and I access it using the VS Code remote extensions. The performance is fine; I can't really tell whether this or the native macOS experience is faster. I also tried JetBrains IDEA. It works across the filesystem boundary, and even though access is slow, it doesn't show up much in real-world usage. But you'll feel it if you look for it.
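
Concretely, that just means keeping the repo on the ext4 side and launching the editor from inside WSL (sketch; the project path is made up):

    # inside the WSL 2 shell: keep the repo on the Linux filesystem, not under /mnt/c
    cd ~/src/myrepo
    code .    # opens VS Code on Windows, attached via the Remote - WSL extension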

Another option is to run GUI apps on the Linux side and use an X11 server on Windows. This works fine, and they recently announced that WSL will get built-in support for Linux GUI apps.


> do Linux laptops just work yet?

Linux on laptops is fine nowadays. Recent versions of the kernel vastly improved battery life on Intel. It's much better than it was 10 years ago.


Agreed. I only have experience with Intel graphics, but all the recent machines I've installed Ubuntu on have just been "install and it works". If you pick a distro that doesn't include non-free drivers/firmware, you might have to do some more work to get Wi-Fi working.


What about Nvidia drivers with Wayland? If I remember correctly, Nvidia drivers don't work with Wayland, and for touch gestures I need Wayland. So because of the Nvidia/Wayland mess I can't have touch gestures, which seems totally strange - why should the GPU driver be intertwined with the touchpad?


Sadly, Nvidia has shown multiple times that it is extremely user-hostile and only concerned with keeping its hold on the computing space. I think there is almost no reason to refuse to write an open driver except to avoid exposing exactly how CUDA works. AMD's new open-source driver is plug and play, and a true joy to use. In general I avoid Nvidia as much as I can, especially on laptops, where I honestly don't see the appeal of a dedicated monster that wrecks your battery every time you power it on. New iGPUs are excellent, especially AMD's, and for bigger tasks assembling a desktop computer gives you much more processing power for a far lower price.


I understand where you are coming from but at the end of the day it works flawlessly on Windows and not on Linux and to the user that's all that matters. I really don't care whose fault it is.


Yes indeed. WSL 2 launched a few weeks ago with the latest Windows 10 version, and it has a different sort of filesystem setup. WSL 1 had a Linux-like view of a subdirectory of the host NTFS filesystem, so all the filesystem I/O went through the Windows kernel. In WSL 2, there's an ext4 image instead, so filesystem access is essentially as fast as with any Linux system in a virtual machine.
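
If you want to check which mode a distro is in, or convert one (sketch; "Ubuntu" is whatever your distro is named):

    wsl --list --verbose          # shows each distro and whether it's running as WSL 1 or 2
    wsl --set-version Ubuntu 2    # convert an existing distro to WSL 2
    wsl --set-default-version 2   # make new distros default to WSL 2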


Make sure to check how well supported your Wi-Fi card is. If you try something like Ubuntu, drivers should generally be less of a problem, though you may have to upgrade to the 5.7 kernel for better support of the CPU. I rarely sleep my laptop so I can't tell you anything about that, but general usage on my HP laptop (i5-7200U, Intel iGPU and Intel Wi-Fi) has been working fine for the last six months with ArcoLinux on it.


You really have to have a super crappy card to have Wi-Fi issues on Linux. Nowadays laptops mostly have Intel, Qualcomm/Atheros or Broadcom chipsets, which work flawlessly out of the box. The only problems I've had in recent years were due to Killer Wi-Fi cards or crappy Realtek-based USB dongles, both of which kinda sucked on Windows too in the first place.


Try Manjaro Linux on your laptop. Last laptop I installed it on, everything just worked.

And it's based on Arch, so you'll be able to take advantage of the excellent Arch software repo and wiki (if necessary).


I recently decided to try out a coworker's TB3 5K display.

Unfortunately, the late-2018 MBP with the 560X is apparently incapable of setting the right widescreen resolution (5160x2160 or so), an error that was reportedly "fixed" in a 10.14.2 beta... yet here I am on 10.15.4 and nope, you either get everything scaled too big, or you get big black bars on the sides.


Try installing SwitchResX. macOS hides resolution options from you, even though they're supported.

I had an old 27" Dell monitor that I thought was a blurry POS, until I plugged it into a Windows box and realised it was macOS that was the POS.


The issue is that something is broken in the handling of the display at the driver level, and even if you use low-level APIs the native resolution simply does not show up. AFAIK even SwitchResX (though I haven't tested it specifically; I tested other tools) won't be able to set the correct resolution.


WSL and the new terminal pushed the balance in Windows' favour for me too.

The value proposition of OS X 10.2 when I switched to a Mac was a clean, elegant UI with a Unix terminal for developer tasks. The Windows 10 UI is good, and these days the *nix of real interest is Linux, which WSL handles, so by the same logic I've moved back in the Windows direction.


Indeed - I was a pretty heavy Mac user for years but I genuinely find that Windows gives me less grief lately.


There is a good chance that macOS is being neglected precisely because Apple is moving away from x86.


FWIW, Apple had been building OS X on x86 for five years prior to announcing the last shift.

https://youtu.be/ghdTqnYnFyg

Not sure how this impacts the theory of neglect.

Is it correct that the best folks from any given team at Apple are assigned to new, big initiatives?


Unlikely. I think they are just spread very thin, internal communication between groups is still broken, the dev process is/was broken (but is supposedly being fixed), and departments probably still don't cooperate. Priorities lie with features that sell the ecosystem, not with fixing bugs.


And because they hoard profit instead of hiring more QA staff.


From personal anecdote, I find that it's just poor culture that prevents qualified people from getting hired. Even with tons of reqs open, no one new was added to the staff.


Engineers should do QA. QA is unnecessary


> Engineers should do QA

Yes, they should. The vast majority of QA should be automated unit and integration tests.

> QA is unnecessary

This is objectively false. Even if you perform 99.999% of all QA in an automated way, you're still going to miss those edge cases, like the infamous daily kernel panics due to buggy support for external USB-c displays that only real humans testing your product in the real world with 3rd-party devices can find. QA is still extremely important, and when companies like Apple neglect it, they end up shipping buggy and disappointing products.


I’ll take 99.999%. I doubt a normal sized QA team would have caught this bug you mentioned.


It is quite plausible that the quality problems of Catalina and iOS 13 at launch were a consequence of shifting the development teams around to also support the development of macOS on ARM.


It's possible, but I'm not convinced:

1. macOS has been buggy for quite a few releases already.

2. Outside of drivers and apps, iOS and macOS are basically the same kernel and userspace/libraries, so there isn't much to port.


On the surface, there isn't so much new in Catalina or iOS 13 that you would expect things to break so badly, but they did. Starting a new branch of development is always a disruption. Developers would be reassigned to new groups (probably even the best ones), and new developers would join the existing groups. This process probably started 2-3 years ago and intensified 1-2 years ago, especially once they had the first silicon to play with - and I assume the CPU will be sufficiently different from an iPhone's that you would want to at least optimize your code for it; there might also be entirely new features to support.


"at the launch"

macOS at large has a quality problem. I have gotten more kernel panics on my personal and work laptops in the last six months than I have on my Windows machines in the past five years.


I think the difference is that Apple threw everything they could at fixing iOS 13, as their survival depends on it, so it did stabilize, while Catalina still hasn't recovered.


I don't quite agree -- I think iOS has been decreasing in quality as well. Apple in general is producing software of lesser quality than it has in the past, in my opinion. Very disappointing as I used to be a huge fan of their products :\


iOS has also been buggy


There's not that much overlap between the sets of people who'd be able to work on those two things.


They will do it via iOS Core™️.


Hopefully the new Macs are faster. Here's something you can try yourself:

On my 10-year-old MacBook Pro running Lion, if I go to the Safari menu in the corner, click it, and move the mouse up and down really quickly over the menu items, the response is instant and there is no flickering.

Doing the same on the *max-spec* 16" MacBook Pro produces flickering. Easily reproducible in this very Safari window if you're reading this on a Mac.

Everything has gone downhill this decade quality-wise. If the focus was on ARM, then hopefully - if this really was a resource diversion - the quality will be good afterwards.

Well, I hope so at least...


This is not just happening to Apple. Every tech giant is losing sight of the fundamental experience in favor of marketable features, shininess and profit.

There are a lot of places you could assign blame for this nonsense, but perhaps a better path would be to just start a new company and release products that don't suck. Surely there is at least a small, yet viable, market out there for "how MacBooks used to be prior to 2014".

I'd throw my wallet at the first company to produce an exact clone of the late-2013 MacBook Pro running a non-shit variant of OS X. This is easy money sitting on the table, but there's probably more money in shoving 12-month product cadences down everyone's throats.


I'd argue the 2015 MacBook Pro was the best they ever released, but perhaps I'm forgetting something they changed between 2013 and then?


Trackpad cables are liable to fail about every 1-3 years on the 13", and the 15" had the battery issues.

Not the end of the world, but not perfect devices from a hardware quality standpoint.


For the 13", the mid-2014 is the best: rock solid for nearly six years, and even Louis recommends it [1]

[1]: https://youtu.be/xFIVZYevfGU


Wait, why not 2014? I have a mid-2014 13" rMBP and it's pretty great (except MacOS has gotten worse). What changed after 2013?


Well, the last non-shit version of OS X is Mavericks, which came out in 2013. :)

(I'm not 100% sure whether the 2014 MBP can run Mavericks; it depends on when exactly in 2014 it was released.)


I feel like it peaked in Snow Leopard to be honest. I can't think of a single time it ever crashed on me. Even with a 5400 RPM spinny disk it felt fast as hell.


I post this every time it comes up: I am convinced that Snow Leopard is remembered overly fondly because Lion was so comparatively bad. I did a lot of testing between Snow Leopard, Mountain Lion, and Mavericks in virtual machines side by side (because I wanted to switch my main machine over to one of the three), and could not find any reason to pick Snow Leopard. Mavericks really benefits a lot from memory compression in particular (although this is of course moot if you have memory to spare).

I admittedly didn't test with an HDD, though; Snow Leopard would probably have done better there. On the other hand, you give up a lot of good, useful features by going with Snow Leopard. I know a lot of Mac users were annoyed with autosaving when it was first introduced, but I remember typing essays with Pages on Snow Leopard in high school, and losing many hours of work due to the lack of any autosave functionality.


I agree on this, I never wanted to update beyond Mavericks, but was forced to because I bought an iPhone 8 and Apple's website lied about Mavericks being supported (turns out the version of iTunes required to use an iPhone 8 required Yosemite -- they updated the website a few weeks later)...


I had Mavericks on my 2014. I kept it on there until El Cap. I agree, everything that has come since has been, in most ways, worse.


Wrong, the last non-shit version is Leopard, because I can run it on PPC. You can't prove me wrong unless you escalate the purity spiral.


Well, for what it's worth, I practice what I preach—I'm typing this on a Hackintosh I purpose-built for Mavericks compatibility in early March. :)

But just for fun, if your benchmark is Power PC support then I am delighted to introduce the latest version of OS X that can run on your hardware: newly-PPC-compatible Snow Leopard! https://forums.macrumors.com/threads/snow-leopard-on-unsuppo...


Wow, really cool to see that it’s possible to get developer beta builds of Snow Leopard working on PPC; thanks for linking the thread!


Are you not worried about security vulnerabilities? Otherwise I love your idea here.


Here's the little security analysis I did for myself. This machine:

• Is running an up-to-date web browser (Firefox ESR).

• Has all important data backed up to cold storage regularly.

• Is behind a router. All incoming ports are closed, except one for ssh. I've looked through openssh's CVE list and there's nothing concerning.

• Is running local software which I consider trusted.

I was originally also planning to get a copy of Little Snitch, as an additional form of monitoring. I haven't actually done that yet, but I should.

Some possible attack scenarios I can imagine:

• There's a zero day in Firefox. My vulnerable OS does not offer the extra layers of protection that a newer one might.

• Someone exploits Spectre/Meltdown/etc via Javascript. My attacker is incredibly lucky, and the tiny portion of memory they retrieve just so happens to be the bit that contains something vital like Bitwarden's master password.

• Someone emails me a malicious image which isn't caught by Gmail (personal accounts) or Microsoft Exchange (work account), and it infects my machine upon being rendered by Apple Mail.

• A person I know/trust is tricked into sending me an infected document, likely a PDF or MS Office file. Even though I don't have Acrobat or Microsoft Office installed, the vulnerability is compatible with Preview and/or iWork '09.

None of these scenarios seem particularly likely. The key here is that I'm not important enough to get hit by a targeted attack, and frankly I don't think I would survive one on an up-to-date OS anyway. In exchange for this minor risk, using my computer makes me much happier right now than it did six months ago.


It's not retail, you have to modify it. No thanks.


I actually have no strong opinion regarding the 2014 year. 2013 was just when I happened to step into the store and pick one up. I believe they're pretty close in terms of features/quality.


This doesn’t make any sense.

Revenues for Apple and MS are at record highs.

It’s clear that they are valuing the experience of the majority while rightfully ignoring the vocal minority.


Alternatively, they just successfully kill any competition.


The annoying thing is that this is what the Mac was famous for, for a long time.

Often the price would get you considerably lower specs than a PC, but the Mac would be faster to the desktop, faster to start working in an application, with a smoother UI, longer battery life, etc.

The focus of Macs used to be "faster to work with, not faster to calculate with".

It was those things that warranted the higher price tag.


> Often the price would get you considerably lower specs than a PC, but the Mac would be faster to the desktop, faster to start working in an application, with a smoother UI, longer battery life, etc. The focus of Macs used to be "faster to work with, not faster to calculate with"

Linux is the new Mac, then. It's really fast even on otherwise low-end hardware, and the price tag is pretty hard to beat.


As a guy who had been trying Linux desktops for 2 decades before I finally switched, I fully recommend Manjaro Linux over any other distro. Based on Arch, it's the only system I've tried that never killed itself with updates.

Whatever you do, don't make Ubuntu your first distro. You'll regret it. You'll spend more time tinkering with it and rescuing it than anything. Manjaro is fast, simple and you won't have to hunt down software because the Arch repositories have everything you need.

I don't prefer laptops, but I did install Manjaro on one laptop - an Acer E5-575G, and it ran perfectly. Everything works: wifi, sleep, even the Fn keyboard controls to adjust volume, brightness, etc.


I've used Linux as a daily driver for close to 9 years now and it's decent, but still has a way to go compared to the polish of OSX.

Where I see it winning out in the future is that Proton/WINE on Linux is working pretty dang well and my gaming options on Linux are great. Graphics drivers are good, and my choice of hardware is very broad. GNOME 3 is decent enough, and there is KDE or MATE or Fluxbox/Openbox or whatever.

(sidebar: haven't tried any of the Wayland compositors but they look interesting)


> Linux is the new Mac

As a Linux desktop convert I find it hard to disagree. I used and owned Macs from the 68k era through PPC and one Intel (Core 2 Duo); they were all fast (to use), and that last Intel one from 2009 still felt slick... but that's where it stopped for me - everything got super bloated and slow afterwards.

I may not be 100% correct here, but it seems like most of the problems are with their OS and software, which is a shame, because they are the only remaining computer company that has retained control over all of their hardware and software - something that should have always given them an edge in fine-tuning, but they seem to be throwing it all away.


FWIW I wrote this on my blog the other day which supports your observations from another angle:

> I have seen developers starting to use Linux machines for a few years already. It kind of reminds me of how it felt like when devs started to adopt Macs around 2005/2006 or so when Ruby on Rails became popular. And just like when Macs became popular, mainstream adoption seems to follow: I've already seen a sales guy running Ubuntu Linux (by his own choice) over a year ago.

Note: the driving forces aren't exactly the same, but the feeling is reasonably similar in my opinion.

Also I should note that the adoption of Linux among sales and other non-devs has surprised me.


> Also I should note that the adoption of Linux among sales and other non-devs has surprised me.

That is interesting. I've seen colleagues go both ways - well, three ways:

Web devs going from Windows/Mac to Linux, then a couple going back to both Windows and Mac due to needing Adobe products (they used to dual-boot and now use WSL).

Most people pick up on and appreciate the difference in speed/responsiveness and the ease of doing dev (i.e. it has an actual, proper, built-in package manager that is fast - no brew crap). Whether they stay seems to have more to do with dependence: people who really like it even start to re-evaluate whether they are truly dependent on the software they liked, versus their newfound utility.

Media seems to be a big problem (Adobe et al.), but sales perhaps not; I mean, Office is all going cloud these days anyway.


Same history here, except I think you're wearing rose-tinted glasses. I regularly boot old Macs, and while the snappiness of Classic is unmatched, the ~2005 ones are incredibly slow to work with. SSDs were really a game changer for OS X. But I agree the bloat is real.


You are right that it's not as simple as pre-2009 = fast; there are ups and downs, some versions of Classic were a bit bloated too, and sometimes the software was trying to do a bit too much before the hardware caught up...

However, it's nothing compared to the super-bloat of today, and stupid inefficiencies like tens of GB of updates.


Agreed. After recently spending some time with GIMP and darktable, I have realized that when my 5K iMac eventually needs replacing, I'm custom-building a desktop Linux machine. My laptops already run various Linux distros, and I've gotten to the point where it's actually more painful to have to deal with a non-Linux OS than to derive any real benefit from the Mac ... Homebrew is great, but I'd rather just use apt.

The only reason I can imagine buying a Mac again would be for iOS development, but it seems like virtualized macOS has come a long way - anyone using a Hackintosh or virtualized macOS for iOS development? Do you run into weird edge-case problems due to Xcode not running on its "proper" hardware?


How is power management these days on laptops? That was my biggest issue with Linux. I'd like to run it on one of my older macs.


Make sure TLP is installed, and it should be fine. I'm using mainstream laptops (ThinkPad, Latitude), and with TLP I get better battery life with Linux than with Windows 10.

https://linrunner.de/tlp/
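
On Ubuntu or Debian it's roughly:

    sudo apt install tlp tlp-rdw   # tlp-rdw adds radio device (Wi-Fi/Bluetooth) handling
    sudo tlp start                 # apply the settings now instead of waiting for a reboot
    sudo tlp-stat -b               # check battery status and charge thresholds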


But that leaves out something very fundamental about Macs: They've always been very accessible and user-friendly. Even on Catalina, people tend to find their way around the OS really quickly, in part because lots of things they know from their phones are there as well (LaunchPad has all the apps, like on a phone, there's Spotlight, like the search on the iPhone, there's Preferences app, ...) – and you don't have to do anything to keep it running and working, everything auto-updates at night, it's hard to break things, just ... open that Macbook and use it. If you have other Apple devices and peripherals, they might even auto-pair on demand and things like that. Usually a very smooth experience.

Of course, there are lots of people who never had a problem with Linux at all ... except that when they share their screen in MS Teams, it's two 4k screens plus laptop display stitched together with a fairly complex fix that involves scripting virtual webcams, or there's no way to use the projector at something less than 100%-scaled 4K that can be done on-the-fly, or wifi stops working until the next reboot when plugging in the USB-C hub at the wrong moment, or the fingerprint sensor either never works, or nondeterministically sometimes works, or company e-mail works fine in Outlook and Apple Mail but not at all in Thunderbird, or the fans run all the time ...

I've never had a smooth ride myself trying Linux on a desktop or laptop, nor have I ever talked to anyone who didn't eventually admit to having issues like that – and while that may be tolerable for someone who values being able to use Linux a lot, it's unacceptable and broken for everyone else. Maybe there are people who actually do have a ride so smooth they never even have to touch a command line at all – but I have strong doubts.

Yet that's one of the bars to clear. With Macbooks, I never have to touch a command line (I can, if I want to, but I know no one outside of my dev/ops/research/nerd bubble who's ever actively used a Terminal). The vast majority (for a very large value of vast) of people out there have never touched a command line before and have zero inclination to change that. They don't have anyone to sysadmin for them, either.

Ubuntu on a MacBook-Air-ish affordable and somewhat stylish laptop - that'd be it. Except you can't use those AirPods. Or any other headphones. Or play any sound at all, because for that particular sound chip, ALSA needs this particular config to be set in such and such a way.

To be the "new Mac", it'll have to "(very mostly) just work" like macOS, Android, iOS, Chrome OS, or even Windows – bonus points if it has a walled garden with lots of high-quality apps and no really easy way to break out or actually break things, because that's a big feature for people who just want to _use_ a computer, not build, admin, secure and maintain a complex OS.

And all of this is deeply frustrating, because I _like_ Linux. But I hate having to put in all the work and still lose so much comfort, efficiency, dependability and ability to do things I can do easily on MacOS, on inferior hardware, without a Touchbar, without a proper touchpad – it's just not there yet, not by a long way. And if it's too much hassle for me – someone who's built a Linux router on a 200Mhz Pentium 2 when Pentium 2s were still decent machines and 10mbit LAN was fast – then I guess it's not going to work for the vast majority of people, either.


> Even on Catalina, people tend to find their way around the OS really quickly

Last week I had to use OS X for the first time ever, on a friend's iMac. It was irritatingly animated, and it required unintuitive steps to open a terminal: something called Finder (but I'm not searching), then clicking down into some folders and finally Terminal.app.

All whilst trying to ignore the distraction of various icons bouncing on the Dock. And occasionally I'd press something by mistake and Finder would minimise itself.

Thankfully I had my phone to hand to look up how to use this intuitive system...


There's literally a launcher that lists every installed application, and the launcher even has "Launch" in its name. There's also a built-in search in the menu bar that has a magnifying glass icon, which seems fairly self-explanatory.


I don't think you're the kind of user I had in mind; I was thinking more of non-technical people who know their way around their phone and enough Windows to use Excel and Word on a work PC that IT administers and locks down completely. Like people who take those training sessions when switching to the new Microsoft Office version and benefit from them, or maybe they don't need those, but have zero experience with admin tasks. I guess those would make up a very large part of Apple's customers.

On a new Catalina install, you'd have a hard time to not find crucial apps easily, like Safari, Mail, Calendar, something to write a letter in ... because they are in the Dock already, or prominently in LaunchPad, which works just like the home screen on a phone, and is in the Dock in the leftmost position – and the OS has an onboarding flow that pops up right away and that'll take care of explaining those things. So, if you, as someone with little expertise in IT, open that new Macbook Air, chances are you'll find your way around quite well with the tools you have. The weird @ placement will keep being an annoyance though. Chances are you know how to use the Preferences app on your phone, and you'll probably be able to figure out the System Preferences app in LaunchPad with that knowledge, and it can do everything important. Pretty big deal!

If you need a terminal, you're way, way ahead of those people in terms of computer literacy and certainly wouldn't have any serious problems navigating any sensible operating system; just some annoyance if it doesn't work like (I assume) the Linux setup you've spent a lot of time dialing in perfectly. Most people wouldn't know that their animations aren't ideal for them, the same way I couldn't tell if a certain dentist's drill had somewhat less than ideal ergonomics for a cavity in a specific place. And besides (n=1), I've been using Macs all day for way over a decade and I never found any issues with the animations; maybe I'm part of the majority and that's why people aren't very vocal about the OS animations.

If I extrapolate from my experiences (which are numerous pieces of anecdata, but of course still anecdata), there is a significant number of people who use Macs because they want something stylish and pretty that is easy to use for "casual" computing needs and is fairly frustration-free. If something is wrong, you dig through preferences some and there'll be a radio button to change that. If that projector is at native 4K, you open System Preferences, Displays, click on the icon that has the largest font – that'll do it. And it's built with the knowledge that most users will not be able to do much more than that – so having to resort to a shell and fixing up config files based on a forum post from 2012 isn't going to happen – they'll turn up at the Genius Bar and will be disgruntled customers if the people there can't help.

So, if Linux is to be the new Mac, it'll need to offer that kind of experience as well. Currently, in my experience, it doesn't offer this at all.


This is definitely not true. Macs were much slower UI wise than PCs. Most PCs came bundled with graphics cards. Most Macs used integrated graphics. It was especially bad in the 90s.


I guess you never used the original Mac


Oh come now, that's not fair. GP was talking about ten years ago, not 30.


Yeh I know heh. Was a joke. You will have to pry this macbook pro from my cold dead hands.


Surely not during the classic Mac OS days.

Working on those LC IIs was more a matter of nothing else being available at the computer lab than anything else.


A Quadra 950 feels fast running old Mac OS applications.


I concur. Booted up my G4 this week, on OS9. Feels so much faster than anything released this century. So much so, I'm keeping it as an air-gapped word processor.

Modern computers - SSDs, super-fast RAM, GPUs, etc. - are all supposedly needed for computing. But the older machines, with outdated hardware, small and slow RAM, and tiny, slow PATA or SCSI HDDs, feel so fast. I don't doubt that the Quadra feels just as quick. It's not just Apple; the same goes for Windows of the same pre-Y2K era compared to now.

The question is why. Why is the user experience so poor? Why is it all so sluggish now? Have we exchanged performance for developer convenience just to punt out more underperforming software? Is the art of creating fast UI code dying out? Is the shift from C/C++ to Swift/.NET interpreted, bytecode-type solutions the cause? Too many layers of abstraction, too much cross-platform code with compromises to make it run anywhere?

Whatever it is, we're churning out software quicker now but on faster hardware it runs worse than the older tools. It feels backwards to me.


> The question is why. Why is the user experience so poor. Why is it all so sluggish now. Have we exchanged performance for developer convenience just to punt out more underperforming software?

Generally speaking, yes. We prioritize speed to market over everything always.


Why?

I can maybe understand it in the early experimental phase, but there ought to be a point where the software's authors go back and do it properly.

It's worth the effort! If one application is significantly more responsive, people will think it feels nicer to use.


Because speed to market equals speed to market dominance.


> The question is why.

Simple answer: Abstractions.


Not when measured against PCs or graphical UNIX workstations of the same age.


You couldn't run Mac OS applications on a PC or UNIX workstation. The point is that older Mac software was written to run well on the hardware. WriteNow on the Q950 is particularly fast.

I haven't run any benchmarks, but I would expect the Q950 to be comparable to an early Sun SPARCstation; the Q950 has more RAM, too. I do run NetBSD on both of them.


I thought we were comparing platforms here; that was the whole point of my remark about the LC II usage at the computer lab.


Retina screens mean considerably more work. On Lion you only had a fixed size bitmap that could be moved around easily, but, with a high density screen, you have multiple densities that need to be computed and cached and that take at least 4x more memory.

Also, Catalina is doing a lot more work under the hood than Lion.

What I saw was not flickering, but that I could move the pointer faster than a bar could be rendered and that the UI opted not to render it in that case. Is that what you refer to?


Apple didn’t release devices with Retina displays until they could be confident that such devices were at least no worse than non-Retina devices in pretty much any regard. They were faster, not slower. The sole exception to this may have been battery life, which if I recall correctly was a bit worse despite them having bumped the capacity up. (OK, so memory usage for screen buffers was also necessarily four times as high. But they still made sure they had enough memory for most people.)


That’s not at all true.

A few years ago I replaced a 2012 MacBook Air with a much more expensive 2016 MacBook Pro (with Retina display) and there was an obvious regression in UI responsiveness. It was bad enough that I regretted giving away the older laptop so quickly.

The difference is almost certainly the higher pixel count. At least as of 2016, the onboard GPUs in Intel chips weren't fast enough to make the UI feel snappy. I ended up buying an eGPU for my desk to make my external displays usable. The experience out of the box using the onboard GPU was awful.


Can confirm. My data points:

1. MacBook 12" 2015 feels dramatically faster at 1x scaling vs. 2x, even with the same output resolution (2560x1440@1x vs. 1280x720@2x.)

2. Mac with 8GB RX 580 GPU. With >1x scaling, as I add load (i.e. Chrome tabs) it eventually runs out of GPU memory and starts paging to system RAM. The whole thing grinds to a halt at that point.

The rendered pixel count makes this effect worse -- dual 4Ks are slower than a 5K, which is slower than a single 4K, which is slower than dual HDs. This is to be expected, but we're talking about an 8-gigabyte GPU, and the framebuffer sizes should be a tiny drop in the bucket.

None of this happens at 1x scaling. Running a native (2x) scale factor feels a little faster than a downscaled (2x -> 1.5x) scale factor.

As far as I can tell, load and GPU mem usage scales with:

(number of GPU applications * total rendered pixels) ^ k.

At 1x scaling, k is close to 1. At HiDPI, it's greater than 1.


> This is to be expected, but we're talking about an 8 gigabyte GPU, and the framebuffer sizes should be a tiny drop in the bucket.

It depends on how many high dpi bitmaps at full depth are kept. Uncompressed HDR 4K gets large pretty quickly.

And then it's not one bitmap, but multiple layers - backgrounds, rendered text...


Right. Say -- pessimistically -- that I'm on a 5K display (~14 Mpixels) at 8 bytes per pixel, double buffered; we can fit about 35 copies of that display in GPU RAM. This seems about in line with my experience.
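
Back-of-the-envelope with those numbers (assuming a 5120x2880 panel):

    # 5120*2880 px * 8 bytes/px * 2 (double buffered) ≈ 236 MB per full-screen surface
    echo $(( 5120 * 2880 * 8 * 2 ))                              # 235929600 bytes
    # an 8 GB card fits roughly 36 of those before spilling into system RAM
    echo $(( 8 * 1024 * 1024 * 1024 / (5120 * 2880 * 8 * 2) ))   # 36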

And this is where I don't know what to think because "keep it all in vRAM" is clearly a terrible algorithm, and I have faith in the macOS developers, and yet my data suggests that this is exactly what they're doing. I don't see things swap out when there's memory pressure; it just keeps on allocating until it's full, then spills over to system memory and makes everything slow.

Counterintuitively, the iGPUs have an advantage here. They always run from system memory but they have a faster path to system memory -- they don't need to operate over PCIe -- so (a) you don't have a large perf gap between the 'full' and 'not full' cases, and (b) spillover doesn't hurt. You use a lot more RAM, but you can buy more of that. I can't easily add RAM to my GPU. At best I can double it, and that just means 70 Chrome tabs instead of 35.

I'd love to hear from someone at Apple who's seen the internals of this.


I cheaped out and got an external gpu with 4GB of ram. I think it was a mistake - those spills happen much more often than I’d expect, and I assume there’s much less bandwidth available over thunderbolt than you’d get with pci-e directly. So swapping in and out is worse.

I think successive software updates have made the situation marginally better - but I might have just acclimatised to it.

I’m very curious what the situation looks like with arm laptops when they launch. How will the graphics performance of an A14x chip compare to my situation at the moment? I assume they’ll be faster than intel iris graphics but slower than an egpu; but it’ll be very interesting to see!


Yeah, I imagine this is better with newer MacBooks, but my 2015 MBP can be noticeably laggy when opening mission control on a 4k screen. It used to be laggy on the built in screen until Metal was introduced.


I apologise to you and the others that have responded to me: clearly my fuzzy recollections of what I heard and read at that time are way off the mark.


Apple released the 3rd-generation iPad with 4x the resolution of the prior generation, despite the GPU being absolutely miserable. The net result was that many graphical operations were much slower. It was a few more generations before responsiveness caught up with the lower-resolution 2nd generation. We know that they've done the same with MacBooks, going to dramatically higher resolutions while often relying upon an iGPU that has only marginally improved.

There has never been a hard and fast criterion, beyond that ostensibly the benefits outweighed the detriments. I had a 3rd gen and enjoyed that super crisp display, so it was an okay compromise.

Add that recent software iterations have put dramatically more emphasis on power savings/efficiency, and are much more proactive about frequency scaling, etc.

As an aside, this post will be autodead, and that's cool. Love ya HN.


> Apple didn’t release devices with Retina displays until they could be confident that such devices were at least no worse than non-Retina devices in pretty much any regard

That's... not how it works. More pixels means more CPU, GPU and memory cycles to fill them and more DAC cycles to read them out and scan them to the display. That's just fundamental.


I really wish it wasn't doing more work under the hood. I can't really think of any features added to macOS in the last several years that were worth the bloat and instability that came with them. I guess I feel the same way about Windows, though; every iteration past 7 feels like a downgrade.


>Retina screens mean considerably more work. On Lion you only had a fixed size bitmap that could be moved around easily

Didn't the first retina MacBook Pro ship with Lion?


The late 2013, 13" retina MBP shipped with Mavericks. It looks like a "Mountain Lion" version shipped earlier that year. Unsure if there were earlier ones.


EveryMac.com says that the mid-2012 retina MBP shipped with 10.7.4.

https://everymac.com/systems/apple/macbook_pro/specs/macbook...


I think the first Retinas didn't have fractional scaling. I think it was hard coded at 2x scale. Not sure, though. My 15" presents 5 possible scaling options.


Technically, fractional scaling doesn't exist on macOS: it's integer upscaling to the nearest ceil(scaling), followed by downsampling.
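
For example (assuming the 16" MBP's 3072x1920 panel and its default "looks like 1792x1120" mode):

    # the backing store is rendered at 2x the "looks like" size...
    echo "$(( 1792 * 2 ))x$(( 1120 * 2 ))"   # 3584x2240
    # ...and then downsampled to the physical 3072x1920 panel every frame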


I still use the first-generation retina and it does have the same 5 scaling options.


With the original OS?


IIRC yes but the scaled options were so laggy they were hardly usable. It improved with updates.


Cool. Thank you. I didn't remember the UI details.


And yet the UE5 demo can run at a stable 30fps on a console.

Computers aren't slow any more. Even laptops can render a menu bar.


The modern Mac is blazing-fast hardware. The UI, however, still has a lot of ancient baggage that drags performance down. I am running a PC game written in DirectX on my iMac at full frame rates via CrossOver. Metal is incredibly fast access to the GPU, yet the Finder is still lame at doing lots of things. Like any 20-plus-year-old technology, it's a combination of old baggage holding on alongside massive improvements. Windows 10 is no different, just in different ways. I do wish Apple spent a little more time on general performance and usability improvements (looking at you, Xcode), but I know supporting a massive codebase with such a long legacy is really hard.

That said, Apple doing their own hardware has always been the ultimate goal. End-to-end control was always the dream.


Wasn’t Finder rewritten from the ground up in snow leopard?


Looks like it was ported from Carbon to Cocoa:

https://arstechnica.com/gadgets/2009/08/mac-os-x-10-6/18/


>The modern Mac is blazing fast hardware.

Ryzen 4000 has something to say about that ;)


Ryzen 4000 may be blazinger, but that doesn't mean those 10th gen Intel CPUs are slow.

I can order a machine that outspecs a MacPro from Dell or Lenovo, but it won't be as tightly integrated as a Mac


With 6/8 threads and a low/mid-end GPU there is nothing "blazing fast" about the hardware. They are common, average components for laptops.

Build quality and software are where macOS was always excellent when Jobs was at the helm. Now the software is finicky, lacks support for standards, and the build quality cannot be trusted as a whole.


> They are common, average components for laptops.

The point is precisely that. Average components are blazingly fast these days.


Even low-end hardware today is blazing fast. It's the slow software that drags it down.


Here is my 16" doing what you suggest: https://imgur.com/v4HNhZY

Moving frame to frame you see that the blue highlight lags the mouse by one or two frames. Seems okay to me, I don't see any flickering.


There is visible flickering during the last second. Moving frame by frame: the blue highlight goes unseen together with Clear History, then it's Settings for This Website, then Hide Others, then Quit Safari twice, the highlight reappears on Hide Safari, then disappears with Clear History. A little bit later Settings for This Website is highlighted but the text stays dark instead of becoming white.


At the 4 sec mark in your video the text "Clear History..." vanishes resulting in flicker. I have a screenshot if you want.


I see it now, thanks for pointing it out. Just for fun I recorded it with the 240 fps video mode on my phone and you can see that the text colour is changing from black to white in response to the mouseover but the blue highlight does not always render.

Taking a new screen recording I did a quick time-space plot in Matlab.

https://imgur.com/a/Pih5Q8m

It appears to be an issue when you move your mouse more than one pixel per frame of the refresh rate.


Nice work!


The video is full of problems: you can see 'Quit Safari' not even change to a white font despite having the blue highlight, and there are cases of disappearing text in your video too. There is a lot going wrong in the rendering for something that is so utterly basic.

You don't have to view this frame by frame to see it; you can notice it in everyday use.

The thing is, a decade-old machine running the same software does better - even analysed frame by frame, it's perfection. We've gone downhill.


Yeah this has got to be peak absurdity. Inspecting menu responsiveness frame by frame.


Yeah, this literally has zero impact on anything at all.

I simply don't care, even if someone can find an anomaly in frame by frame inspection of rendering.


I find it mildly interesting from a graphics point of view. macOS has always had lots of attention to detail in its presentation, and it's interesting to see how the wheels can come off.


That's a fair point -- but I would argue Apple has always focused primarily on user experience, which often, but not always, aligns with graphical presentation and perfection.

There are times -- and perhaps this is one -- where it makes sense to drop things in presentation perfection in order to make a UI more responsive and better for the user's overall experience. I'm not a UX expert by any means, but I know for me there is little as infuriating as when I issue some sort of command and get no response -- if the cursor is pointing at something in a menu, that thing had better be selected and I don't really care a whit whether or not the UI representation of it moving there was frame by frame perfection.


I get that, but then I would think they would also avoid changing the text colour to white if they don't render the blue highlight.


What you want is lower latency, not necessarily higher throughput. Unfortunately it's not just Apple; it seems newer hardware in general is less responsive:

https://danluu.com/input-lag/

iPhones and iPads are quite good, however; I suspect it's because Apple specifically optimised them to be.


For me that's one of the major advantages of iPhones over Androids. Android has always had those micro-delays. On a new device in the price range of an iPhone they're acceptable but it's (subjectively) never as smooth as an iPhone. I was always curious if it's closer integration between hardware and software or if Google just doesn't really care about it as it doesn't hurt sales.


I recently switched back from Android to iPhone with the new iPhone SE. I was just blown away by the speed and tactility of the iPhone - just how the hardware and software work together to make it one experience. Whereas my Android always felt slow and laggy. Granted, it was a mid-range one, but still, basic things like opening and switching apps shouldn't come with seconds of delay and loading times.


That's weird. What did they break in the 16 inch model? Surely driver related. For reference, my MBP 2018 model is lightning fast doing that test of yours. I am running Mojave.


Yeah, there's something weird with the 16in integrated GPU. Hard to tell if it's the OS, the driver, or the hardware. I've run into performance regressions in the JetBrains Rider IDE that make typing absolutely miserable. I suspect they are related to the slow iGPU.


Is that with or without an external screen connected? I have also had issues in the past with JetBrains IDE's when I had a HiDPI (retina) external screen connected. When only using the laptop screen, it was quick. It turned out to be a JetBrains issue (in fact an underlying Java graphics rendering issue).


Without an external display. Probably a Java graphics issue like you said.


They could be pinning the refresh rate.


Same Mac, OS and result here too.


I just opened Safari and tried this on my 16" MBP (8-core i9, 32GB RAM, 1TB SSD) and I don't see any delays or flickering. The menus draw the same as they do on my work 15" MBP (2019, 8-core i9, 32GB RAM, 512GB SSD).


That doesn't seem like a reasonable test.

I dunno about interface responsiveness, but things I actually want to DO -- server virtualization, Lightroom, etc. -- sure do seem faster on my new rMBP than they did on my 5 year old model.


This is a weird bug in Catalina, related to an external mouse. If you try the same menu responsivity test with the trackpad, then it’s OK and everything’s instant. Basically, everything you do with the external mouse is sluggish, until you re-login or restart the machine.

I blame a memory leak in the WindowServer for it. Once it has over a couple of GB of allocated RAM, the mouse is basically unusable. Especially text selection by mouse is a nightmare :)
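If anyone wants to check whether that's what's happening on their machine, here's a rough sketch (the commands are standard macOS tools; the "couple of GB" threshold is just my own rule of thumb):

  # Show WindowServer's resident memory on macOS
  top -l 1 -stats pid,command,mem | grep WindowServer
  # Or via ps (RSS is reported in KB)
  ps -axo pid,rss,comm | grep WindowServer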


If I remember correctly it was Lion (or Leopard?) that was an actual performance improvement over the previous instalment of Mac OS X. It was just so notably snappier. Haven't had a single update since then that did the same.


I believe you're thinking of Snow Leopard. Definitely not Lion, which was a somewhat poor release, and not Leopard, which had a lot of performance issues initially.

Mountain Lion is also a possible candidate (because it fixed the problems with Lion), or Mavericks if you were on memory-constrained hardware (because it introduced memory compression). But it's probably Snow Leopard.


Snow Leopard was essentially a maintenance release, and I think you're correct there with the rest.


You could be correct there with Snow Leopard, definitely not Mavericks. It was a release without a lot of new features, but more under-the-hood stuff.


It really could be Mavericks in that, if you upgraded a machine with 4 GB of memory or less (which were still fairly common at the time!), I'd expect you to see a significant speed improvement. https://arstechnica.com/gadgets/2013/10/os-x-10-9/17/#compre...

I've seen this work and it makes a big difference.


Is this due to Retina Scaling? Have you tried it after using an app like Display Menu to choose a non Retina scaled resolution?


My 3yo iPad Pro is ridiculously faster than my 3yo iMac at some tasks like photo editing. It's night and day. Also quiet, cool and thin. Very excited about the Arm transition, warts and all.


My favorite workflow comparison between desktops and iPads is rendering/exporting 4K video.


This is actually a trick comparison. A lot of desktop software doesn't make use of hardware encoding facilities, even if available, since a good software encoder will provide better quality.


> a good software encoder will provide better quality

Why is hardware lower quality? Isn't the encoding algorithm deterministic, and so the same whether you do it in hardware or software?


> Isn't the encoding algorithm deterministic

There is no "the encoding algorithm" when it comes to video encoding.

Think of a video codec as being analogous to HTML. HTML isn't defined as "this is the algorithm for producing an HTML file", it's defined as "these are the things a browser will do when you give it this data". How to combine those things to do what you want is up to you.

It's the same for video: the codec standard doesn't tell you how to encode the video, it tells you what you need to support if you want to _decode_ the video.

So different encoders can encode videos in entirely different ways, even when using the same codec. As such, they can have very different performance characteristics.

As for why a software encoder will provide better quality, it's because a software encoder has luxuries like a high-power general-purpose CPU and copious amounts of memory that allow it to do things that are difficult in dedicated hardware.

A hardware encoder on a mobile device for example will often be a tiny tiny piece of silicon in some corner of the SoC, so it'll have limited physical space for complex logic. It'll also have serious power and thermal constraints. These are things a hardware encoder has to work around that a software encoder doesn't.


The decoder is deterministic. The encoder is free to decide where to allocate more bits, which block type is better in a specific place, and other things. A software encoder can also be improved without replacing the entire CPU/GPU.


Ah so I guess hardware is generally a bit behind due to longer release cycles and more conservative in design due to cost of experimentation and mistakes.


Many video codecs allow you to pick quality settings. This allows live transcoding and streaming on things like phones and other low-power devices at the cost of quality (or size). This allows a lot of versatility with the same video format, because this way it can be used for both 4k movie footage and security camera streams.

You wouldn't want to encode a movie with phone settings. Encoders like NVENC and QuickSync would drastically improve encoding speed at the same cost, albeit likely at slightly higher quality, because those chips can be actively cooled and therefore process more data within a given time frame.


Hardware encoders are also often limited in features, often in relation to expected usage - which is often realtime video encoding from camera.

Software encoders allow complex settings or just spending more time on each frame, and they might also just have more memory (I'm not an expert, but IIRC the h.264 quality levels corresponded to huge jumps in memory usage, which had an impact on the available hardware encoders)


One part might be that hardware encoders target real-time encoding. Fire up ffmpeg and you'll find that you can't encode x265 with its better quality settings in real time.
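A rough comparison you can try yourself; a sketch, where the input file, CRF and bitrate values are arbitrary, and hevc_videotoolbox assumes a macOS build of ffmpeg:

  # Software HEVC: good quality per bit, nowhere near real time at slow presets
  ffmpeg -i input.mp4 -c:v libx265 -preset slow -crf 22 -benchmark out_sw.mp4
  # Hardware HEVC via VideoToolbox: easily real time, tuned for speed and power
  ffmpeg -i input.mp4 -c:v hevc_videotoolbox -b:v 6M -benchmark out_hw.mp4

The -benchmark flag prints timing at the end, which makes the gap pretty obvious.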


This just proves my point.

At the end of the day the end user doesn’t care about the technical differences.

They just want their videos. Which is why the iPad is so impressive.


If Apple doesn't provide some form of acceleration or support for x86 hypervisors I can see this leading to mass exodus of the Mac platform for web developers. It will be interesting to see what Apple does.

Given the technological steps Apple has made, it seems like it is only a matter of when, not if, Apple will switch over some computers.

I personally would predict the Macbook Air (potentially a new Macbook), Mac Mini, iMac and potentially the iMac Pro will switch over to Arm first. It seems like a poor risk/return ratio to switch the Macbook Pro and Mac Pro lines to Arm at this point in time. Who knows what the manufacturing yield will be on the initial 5nm chips.


> I can see this leading to mass exodus of the Mac platform for web developers.

Why _web_ developers?

I'd have thought that web developers would be some of the last developers to abandon Macs due to a change in architecture given that lots of web development is done in scripting languages which would need minimal support to move architecture, and the fact that Apple's ARM chips tend to perform well in JS benchmarks.

I'd expect that it would be system software engineers working in languages like C/C++ who would abandon the platform given that the majority of their tools and libraries may need extensive porting work.


Web developers frequently use Docker for Mac which is a way to run Linux containers (which are most frequently built for x86-64), which requires a way to run a x86-64 hypervisor.

Docker for Mac runs a Linux VM that in turn runs the containers on developers' laptops.


Doesn’t this answer your own question, though? The Docker VM is just virtualized x86, and that’s it.


It's virtualized for the same architecture, which is not very costly on today's CPUs. ARM emulating x86 is a whole different beast.

On the other hand, why not run arm-docker on a virtual arm-linux?? Why does it have to be x86?
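The multi-arch plumbing largely exists already; a sketch, assuming a recent Docker with BuildKit/buildx enabled and images that publish arm64 variants (the image and tag names are just examples):

  # Run the arm64 variant of an image explicitly
  docker run --rm --platform linux/arm64 alpine uname -m    # prints aarch64
  # Build an arm64 image from an x86 machine (QEMU does the translation)
  docker buildx build --platform linux/arm64 -t myapp:arm64 .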


Last time Apple did this (PPC -> x86), the new Intel CPUs were so much faster & more efficient than the equivalent PPC chip that programs ran at the same speed under emulation, and the system & native programs ran much faster, so it was still a worthy upgrade.


For power efficiency this may be true this time as well... but for sheer performance and latency under spiky web-style workflows, I'm not hopeful.


Virtualization ≠ emulation, for purely cpu-bound tasks performance can be near-native as long as the underlying processor architecture is the same. Not so if you're emulating x86 on ARM...
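QEMU makes the difference easy to see; a sketch, where guest.img is a placeholder disk image:

  # Virtualization: x86-64 guest on an x86-64 Mac, accelerated by Hypervisor.framework
  qemu-system-x86_64 -accel hvf -m 2048 -hda guest.img
  # Emulation: the same guest, but every instruction is translated in software (TCG)
  qemu-system-x86_64 -accel tcg -m 2048 -hda guest.img

The first is close to native; the second is the kind of slowdown you'd be signing up for when the host ISA doesn't match.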


Many web developers still need to test Windows-only versions of IE and Edge.


Windows 10 now supports ARM64. I don't know if it includes IE, though.


I think there were some Win8 IE builds for ARM but I don’t think the full range of versions people target is available


Yeah they had IE for ARM because of the Surface RT, I believe


Web development has a close relationship with graphics, either the creation of graphics or their consumption. The latter category requires the rich ecosystem of apps on the Mac platform.

Linux is a poor option in this regard.


Probably alluding to the current web developer love of Docker.


Not just docker, any virtual environment. If I'm deploying to a VM on Linode, it's x86. Unless you are serving static content, you want your dev environment as near as possible to your server.


I think web developers will be just fine. I am more worried for people who use complex software such as CAD, music software or even Adobe suites.

Porting to ARM is not impossible, but it's an additional cost they have to deal with.


It's so lame that they removed 32-bit app support via an OS update (I can't use a lot of the software I enjoyed OR go backwards) and are now doing this... like, if you're going to switch architectures anyway, at least let me run the damn software I was able to before the update. It feels like no one can make up their minds over there.

Overall I see how moving to ARM is a good move, but it's so annoying at the same time, especially since they just fixed up the MBPs, which drove a lot of people to buy them (including me). I feel more burned than excited. Wouldn't be surprised if the same folks who greenlit the butterfly keyboards were behind this.


I suspect they are removing 32-bit support now to ease the transition rather than doing everything at once. In a few years the pain of Catalina will have faded (somewhat) and presumably they'll only have to worry about backward compatibility with AMD64.


That’s exactly what the article says. It’s a slow transition starting with lower end MacBooks


Will ARM ever be able to emulate multithreaded x64 fast? ARM has a relaxed memory model that would cause issues.


It'll be interesting to see if some cloud players start migrating workloads to ARM as well in the future. Apple has a knack for picking up on trends a bit early or creating them. I think iOS really kicked off ARM / smartphone market share.


AWS has released Graviton2-based EC2 instances[0] with excellent performance[1]. It seems like a lot of workloads can be moved to ARM-based VMs with little fuss.

[0]: https://aws.amazon.com/blogs/aws/new-m6g-ec2-instances-power... [1]: https://www.honeycomb.io/blog/observations-on-arm64-awss-ama...
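Moving a workload over can be mostly a matter of picking an arm64 AMI and rebuilding; a sketch with placeholder IDs:

  # Launch a Graviton2 instance (AMI and key name are placeholders)
  aws ec2 run-instances --instance-type m6g.large \
      --image-id ami-0123456789abcdef0 --key-name my-key
  # On the instance, confirm the architecture
  uname -m    # aarch64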


I think the phone market was already entirely ARM when Apple joined the party.

You are probably thinking of 64-bit ARM, which Apple was very early to use.


Actually - Apple drove the smartphone market with ARM.


The iMac Pro currently uses some pretty high-powered x86 processors, so I have a hard time imagining that will switch over soon. The others though seem reasonable.


Have you seen the benchmarks between the iMac Pro and the chips in the recent iPads Pro?

Now take away the space, power, and heat constraints of the iPad Pro.

Edit: wow, this touched a nerve and got downvoted into oblivion. Well, here's a benchmark for the latest iPad Pro:

https://browser.geekbench.com/ios_devices/58

And the iMac Pro:

https://browser.geekbench.com/macs/imac-pro-late-2017

The single-core scores are about equal -- even though the iMac Pro is averaging a higher clock rate.

And at 18 cores, the iMac Pro trounces the iPad Pro's 8 cores, but again, heat and space constraints.


An interesting argument, but even if the 8-core iPad processor could achieve somewhat higher performance without thermal/power constraints, it's nowhere near enough to bridge that multi-core score performance gap.

Adding more cores isn't necessarily an option for Apple yet either, considering that (from what I understand) their yields are relatively low because it's hard to make chips that large on such a small node. Even more cores means lower yields and greater costs.

I'm not saying the iMac Pro won't get there, but I highly doubt it will be a part of this first wave.


Mac sales are a drop in the bucket compared to iPhones and iPads, and I strongly suspect they will share a chip family. Yields should be the least of our concern.


The problem is the larger the chip, the lower the yield. High end desktop/server chips are much bigger than mobile chips.


I'm really worried - there's software that I rely on such as parts of the JVM ecosystem that haven't had as much work put into them for ARM as they have for Intel. How long do we have to bring things up to speed? Just a year? Obviously everyone has known this is coming but I haven't seen much action yet.

If we get the worst case scenario and Apple ships only ARM hardware from January 2021, then I feel like there's going to be some serious problems.


> If we get the worst case scenario and Apple ships only ARM hardware from January 2021, then I feel like there's going to be some serious problems.

That seems extremely unlikely to me.


IIRC, and it's been 15 years, for the last transition they sold prototype towers that could only be purchased and operated by folks with an active and current Apple Developer account, and did not make consumer hardware available for 15 months after the June announcement.

So at the first WWDC the developers get a one year head start to work with Apple to get everything ready for the new platform, and then at the second WWDC Apple finalizes the launch plans and sets the launch dates for new hardware.

It seems likely that Intel editions of the 16" MacBook Pro and the Mac Pro will continue to be sold after September 2021, due to hardware DRM dongle issues — and it's likely that they'll keep around an Intel Mac Mini as well.

Given that ARM is now a datacenter option, it's probably time for the JVM ecosystem to stop being Intel-only in any case, but historical evidence suggests you have at least 15 months before general developers have any hope of buying one of these.


From the OpenJDK commit logs aarch64 seems to have been getting a fair bit of attention over the last year.

I do run OpenJDK on aarch64 and it seems fine but I don't run anything particularly serious.
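A quick sanity check of what you're actually running (works on any reasonably recent OpenJDK):

  java -XshowSettings:properties -version 2>&1 | grep -E 'os\.arch|java\.vm\.name'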

If Apple ship an AArch64 machine with plenty of RAM then it would make a big difference to what applications people try to use.


For me it's the new compiler - Graal - many people say it isn't as tuned for AArch64 and some people have claimed its architecture isn't brilliant in all places for AArch64 either which could be a longer-term problem. I push the Java compiler pretty hard so a few % regression over C2 on AArch64 would be pretty nasty for me.
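If anyone wants to measure it on their own workload, the JIT can be switched at the command line; a sketch, where flag availability depends on the JDK build and MyBenchmark is a placeholder class:

  # Graal as the JIT, on a build that ships JVMCI
  java -XX:+UnlockExperimentalVMOptions -XX:+EnableJVMCI -XX:+UseJVMCICompiler MyBenchmark
  # Same workload on the default C2 compiler
  java -XX:+UnlockExperimentalVMOptions -XX:-UseJVMCICompiler MyBenchmark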


I think the complaint was more about libraries using JNI than the JDK, itself. There's a popular sqlite library that ships with native libraries, otherwise falling back to transpiled code. I'm not sure if they've done much work for ARM.
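It's easy enough to check what a given JAR actually bundles; a sketch (use whatever JAR/version you have locally):

  # List the native libraries shipped inside the JAR
  unzip -l sqlite-jdbc-*.jar | grep -iE '\.so|\.dylib|\.dll'

If nothing ARM-ish shows up in those paths, you'd be relying on the fallback.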

Java's actually the easy case. ARM means a lot of docker containers won't work, your development architecture is different than your production architecture, cython libraries, etc.

This is probably the right move if you want to build a laptop with good battery life, but sort of like removing the escape key, it's problematic for a large segment of Mac buyers.


> I think the complaint was more about libraries using JNI than the JDK, itself.

No it's the compilers in the JDK, at least in my case.


I'm pretty sure the first device will be a low-end laptop. It should be great for that as the iPad shows. I'm really not convinced they have anything that can rival the iMac/Intel for extensive load.


yes, I can't imagine docker is going to be happy either.


It seems that the initial plan is that only "MacBooks" aka budget Macs will have the ARM CPUs. The Pros will remain x86... for who knows how long, though.


On the other side, an AMD APU code name was found in the latest macOS release, which should hint at the other side of the story.

https://www.tomshardware.com/news/apple-may-start-selling-ma...


I'd much prefer them to switch over to AMD if cost were the concern, rather than outright dumping the x86 codebase.


I doubt that cost is the concern. Both Apple and Intel are big players, they can find a fair price between them, and Apple always had the threat of switching to ARM to get better prices.

I'm pretty sure this move is for power consumption and maybe so all Apple products are on the same architecture.


The A13X costs $30, compared to the cheapest Intel chip used in the MacBook Air at $200+. I think it is quite a difference. That means consumers are paying $300+ for x86 compatibility.


> The A13X costs $30, compared to the cheapest Intel chip used in the MacBook Air at $200+.

You aren't comparing costs fairly here. A13X costs $30 each + $XXX million to develop. With Intel the development costs are part of the SKU. If Apple launches a series of desktop CPUs, the cost to develop those chips is going to be substantial. Some of that cost will be in common with the iPad/ iPhone, but a good chunk will be unique to their new CPUs. Since Apple ships far fewer Macs than iPhones, the development cost/ unit will be significantly higher.


iPad Pro is already beating some MacBooks in CPU benchmarks. Apple might just reuse the same CPUs.


> iPad Pro is already beating some MacBooks in CPU benchmarks. Apple might just reuse the same CPUs.

Maybe some. Just considering the size of the devices, I'd expect the 16" MacBook Pro would have beefier CPU options than the iPad Pro.


Sure, but I don't think the design cost is going to be that high--maybe even less than the design cost of having separate iPad and iPhone CPUs.


Maybe. But they will likely have at least 3-4 different CPUs for the various Macs and different clock speeds for those different designs (though clock speeds and core count will likely be handled primarily through binning). Development cost for each additional CPU will be spread over fewer and fewer units.

- MacBook Air

- High performance MacBook

- iMac / Mac Mini

- iMac Pro/ Mac Pro

If next gen Macs are going to support some kind of x86 emulation/ compatibility layer, performance isn't going to have to be comparable with Intel, it's going to have to be 2-3 times faster so I'm expecting something quite a bit beefier than what the iPad Pro ships with.


Yes, that is why I also wrote in another reply [1] that it doesn't make much sense financially. And I don't quite see how it makes any sense technically either. Even if Apple refuses to use AMD CPUs for whatever reason, Intel's investor roadmap (which tends to be more accurate than what they share with consumers) shows they are finally back on track. (It will still take a year or two to catch up though.)

Software is expensive: writing, testing, QA.

On the other hand, they are spending billions on stupid Apple TV dramas, so I guess they might as well make their own CPUs for the high-end Mac.

[1] https://news.ycombinator.com/item?id=23465728


> it doesn't make much sense financially.

This I disagree with. The Intel premium here is likely somewhere in the ballpark of $100-200 per CPU. Spread across 16-20 million Macs sold per year, we're looking at conservatively $2 billion/ year they can invest in CPU design.

More important, Apple will control what features get added to their CPUs and can integrate other functionality into the CPU the way they have with the A-series chips.


Yes, if you look at it from the perspective of the whole Mac lineup and they keep selling at the same price (which I hope they don't). But per unit, it would be the MacBook funding the development of higher-TDP CPUs, from 50W to 250W. Those are low volume, require a new node tuned for higher power, and possibly some design changes. If they follow the same chiplet design as AMD, that could be a $500M budget. If they are making the same kind of monolithic die, that could go up to $1B+.

And this is a recurring long term investment.


Source on that pricing?


Apple designs their own chips. They have a single fixed cost for the design work which gets amortized by the massive volume of device sales. The only variable cost is the cost of third party fabrication. AMD can’t compete with that.


I think it’s about time we relegate x86-only codebases to VMs.


That would actually make sense. I was kind of surprised to see many Hackintosh builds with Ryzen CPUs and reference motherboards working pretty much out of the box...


Yes, if you look up Geekbench, nearly all the top scores are from AMD Hackintosh.

https://browser.geekbench.com/v5/cpu/singlecore


I'm not familiar with geekbench - is it expected that a phone is at the top of those rankings? That seems a bit sketchy to me.


The top results are nonsensical.


Is a CPU with both ARM and AMD-sourced x86-64 cores possible?


It seems like it should be -- especially if Apple licensed AMD's Infinity Fabric. Apple could buy discounted dual or quad-core chiplets and add them onto their system. x86 performance would decrease to encourage shifting architectures, but it would allow a couple years of transition time.

All the talk about x86 emulation doesn't seem feasible. x86 is crufty enough when implemented in silicon and would be much, much worse being implemented by a team that hasn't spent their entire life learning all the weird little performance tricks for the architecture. Even if they somehow succeeded, Intel has deep pockets too and lots of lobbyists and would probably push for (and get) an injunction while in court. Even if Intel lost, the injunction would hurt Apple severely during the transition period. Apple would need x86_64, SSE x.x, AVX, virtualization instructions, etc. that are all still patented. In addition, if Oracle v Google decided in Oracle's favor, that would open yet another attack avenue.

Throwing in a couple hardware cores shouldn't cost a ton and would stop those legal concerns in their tracks.


Physically? Yes. The trick is finding someone willing to license the X86 IP to make it.


AMD already shipped one ARM server chip. At this point, I think they're more interested in their patents covering non-x86 parts of the chip that make it possible to pipeline data into the CPU.

If Apple is transitioning regardless, it's either lose out on potential profits completely or take what they can get for a few years. Making a deal would hurt Intel and get them money. AMD could probably hold out for a guarantee that Apple would buy their chips for the next 3-5 years too (at least on desktop).


Funny... A move to AMD would be less disruptive to the macOS ecosystem and would solve the roadmap issues. It seems AMD will have the lead for a good couple years right now.

Intel must be creating a lot of problems for Apple to warrant this move. Or maybe AMD is not willing to give Apple the same sweet deal Intel gave Apple to get the transition.


...Or more likely, they don't want to ever be relying on a third party again for chips, since whoever they go with holds great power over their progress and timelines.


> progress and timelines

There haven't been many interesting CPU changes in a long time, and they're still using TSMC for fabrication. Arguably, you're better off with two vendors. I'm not sure if Apple has genuine roadmap concerns or is falling in the not invented here trap.


Wouldn't be a first for Apple, but this is no longer Jobs' company.


I believe AMD has an incentive to give Apple a good deal. They got a lot of attention from some areas for their new processors but most non technical users are still Intel first. If Apple starts selling with AMD it could boost their image and help them massively in other areas (e.g. servers, workstations) as well.


Well, there goes the Hackintosh!

Well, eventually, I imagine that Apple will continue to support their x86 Macs for some time, especially as we've recently had the launch of the revamped Mac Pro which is not a cheap machine. But maybe ten years down the line they'll stop updating it and that will be that.


I imagine you're right, but G5 owners might predict differently. They only got one more major release (10.5) once Apple switched to Intel, although there was a longer release cadence than there is now.


Ahh yes, before we had yearly iOS compatibility updates for the mac... I'm still salty that Apple broke my reminders in Mojave just because I upgraded my phone. They will have to use a much larger carrot or stick before I consider moving to Catalina.


This recent event has converted me to a full blown supporter:

https://youtrack.jetbrains.com/issue/JBR-2310

I lost days of time to what very much appears to be yet another hardware bug in Intel's latest core. If you don't believe me that it's a hardware bug, read the whole thing. It's probably a zero day security vulnerability too.

The only catch is: if Apple also takes the opportunity to iOS-ify Mac and lock it down to the point that it is no longer useful for professional work, I will have to drop the platform entirely. I've seen some decent AMD Ryzen laptops showing up on the market and I could use Linux with a Windows VM for the occasional Zoom call or similar thing.

Honestly though... I think if Apple pulls this off well without alienating their user base, it probably spells the end of the X64 architecture outside cloud and servers. Given that people prefer to deploy to the same architecture they develop on, it probably means X64 will eventually die in those areas too. AArch64 could end up being the core architecture of almost everything by the 2030s.


> use Linux with a Windows VM for the occasional Zoom call or similar thing

The other way around works pretty well these days. Windows 10 and their new terminal app plus WSL2 for development is a pretty good combo. Most things work as expected, and if you use Visual Studio Code it has good integration between the two environments.


Except for bad font smoothing on Windows on standard DPI displays.


Subpixel antialiasing is disabled on newer OS X builds, so standard DPI displays look worse on OS X than windows now.


FWIW, Zoom runs reasonably well on Linux - as long as your webcam is supported.


For my use (occasionally joining someone's meeting), the Zoom web client completely eliminates the need to install the app. They don't make it easy to find, but:

https://zoom.us/wc/join/enter-your-meeting-id-here


Yep, when a computer manufacturer with 10% desktop market share switches processor architectures, surely that will dictate whether the architecture will die. <-- This is sarcasm, to be clear.


Strangely enough, Apple's architectural hardware choices have a history of affecting the broader market disproportionately to their market share.


AArch64 is not only chosen by Apple. There are so many Linux arm64 SBCs floating around. MS also has their own plans for the ARM platform.


Apple ARM chips probably have similar issues. They are just not under the microscope yet like the Intel chips are.

Modern processors are crazy complicated.


Yes, I'm sure ARM will never ever have a hardware bug so you will be safe!


>appears to be yet another hardware bug in Intel's latest core.

Does this bug also appear on Linux or Windows?


Yes. It also hard crashes the host Mac when run inside a Linux VM in Parallels.

Read the above sentence again. It escapes VM isolation.

It also crashes the whole machine on a Microsoft Surface tablet with the same chip in it.

So three OSes, and it escapes VM isolation. It's a hardware bug.

Mitigation so far is to try to use exotic JVM options to make the JIT be less clever and not emit whatever the offending code is... but we don't know what it is, so it's shooting in the dark.
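For the curious, the flags we've been throwing at it are along these lines (a sketch; MyApp and com.example.Foo::bar are placeholders, and none of this is a real fix):

  # Turn the JIT down, or off entirely
  java -Xint MyApp                          # interpreter only, very slow
  java -XX:TieredStopAtLevel=1 MyApp        # C1 only, skip C2's aggressive opts
  # Keep the JIT away from a suspect method, or away from wide vector instructions
  java -XX:CompileCommand=exclude,com.example.Foo::bar MyApp
  java -XX:UseAVX=0 MyApp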


You sure it's not a bug with the VM? VMs often have bugs.


If it's a VM bug then why does it also happen on other OSes without running a VM?

The same bit of software:

* Crashes the OS when running native, where OS is Windows, Mac, and Linux.

* Crashes the host OS from inside a VM running Linux or Windows.

I can't think of any explanation other than a hardware bug. How can the same crash happening inside and outside a VM on multiple OSes be anything else?


Apple ditching intel could lead to some great improvements in hardware, but I would be much happier if they made no changes to the hardware and actually started investing in OSX again. Catalina is a disaster. I've been using macs since OS 7 and I cannot believe how bad Catalina is.


What is bad about it? I'm still on Mojave


In my personal experience, I would say it's the buggiest and least snappy version of macOS I've used in quite a long time


I think that the lack of snappiness is mostly caused by the new sandboxing and security features. I have mixed feelings about this. Things can run slower, especially at startup, but extra security is a good thing.


For me macOS seems rock solid as long as you don't install certain drivers (Dell being the worst) or some security software, in which case it quickly turns into Windows XP levels of "blue" screens. They really need to work on making that more stable, but I'm not sure how that would work. Normally you would just sandbox it. For example, while working in government I had to use software that was basically government-mandated spyware: it installed global certificates, checks and uploads network traffic, and more. Mandated and buggy as hell.

My personal computer on the other hand uses pretty much only App Store and Homebrew cli utils and is rock solid.


“They really need to work on making that more stable, but I'm not sure how that would work”

Apple has outlined how it would work: by moving drivers into userspace (https://developer.apple.com/support/kernel-extensions/ , https://developer.apple.com/documentation/driverkit)

I guess that, with a move to ARM, they would enforce this for all drivers, if performance allows it (video drivers might be the exception)


I regularly have the terminal crash when switching between full screened and windowed mode. Other times it will freeze up and beach ball for 30 seconds. How you could possibly screw up an application that simple amazes me.


My wild prediction: Apple will remain using x86, but based on AMD chips and integrating stuff like ML, security, etc. Both of them use TSMC as foundry.


Came to say something similar. I’d love to see some kind of hybrid x86-ARM system that is able to retain the value of Apple’s x86 investments while also leveraging Apple’s deep ARM investments in the PC product lines.


Hyperscaler servers (eg AWS Nitro) have dedicated hypervisor processors, and then customer workloads run on another processor. Imagine this architecture in a PC: macOS as “hypervisor” and applications as the user workload.


A Ryzen MacBook Pro is a dream machine for me. I think it's unlikely though.


Could it be that Apple brings an x86 emulator to the machine like they did with Rosetta in the PowerPC to Intel transition? Most calls to native libraries like Metal and the UI frameworks would be handled natively, so we probably wouldn't notice slowness in most apps. Even Chrome and nowadays Adobe Photoshop are compiled to ARM versions. If so this will be a smooth transition.

https://en.wikipedia.org/wiki/Rosetta_%28software%29


It's early from a pure legal no-negotiation standpoint, but remember that we're approaching the point (which to me has gotten surprisingly little coverage) where a lot of core patents on x86-64 will expire. AMD64 was announced in 1999, 2000 saw the spec out, 2003 saw the first implementation with Opteron if I remember right, and in 2006 Intel had Conroe out. Even when Apple officially dropped support for early x86-64 Macs, it was often due to things like early issues with their EFI implementations or GPU drivers for Intel IGP, not because of some added instruction they were depending on. There were many examples of people doing some hacks to get non-supported versions of macOS to run long after it wouldn't install by default, and it would work fine. So in terms of the minimum needed to achieve essentially all the backwards compatibility Apple would care about, it's gotta be pretty close, since patent applications and priority dates are typically well before implementations. And even if not everything they wanted was expired, the mere fact of it happening would create leverage for other bits.

So it'll be really interesting to see what they've done; they may well have not merely an emulator but something involving a hardware layer as well. Or I guess maybe they'll just dump BC like some have pessimistically expected, but useful forms of x86-64 becoming open to any player seems like it might be a pretty big deal for the industry, doesn't it? As we've reached the flatter part of the S-curve and the ISA lifetime has stretched out far longer than anyone expected at one point, it's actually catching up with old IP law for once. 20 years is a long, long time in tech. But it's not forever, and maybe not so long as it once was even.


The hardware patent expiration situation is interesting. It will be exciting to see open hardware reimplementations (FPGA and ASIC) move beyond the usual, long-unencumbered 6502 and Z80 architectures and into stuff like 32-bit 80x86 and 68000 (In fact there's already a high-performance, pin-compatible reimplementation of the latter, and it's quite usable for retrocomputing). And the situation wrt. peripherals is just as compelling, with quite a bit of new, retro-compatible hardware being released and extending the usability of these well-understood architectures.


32-bit x86 has been patent-free for a long time. There's a small number of Chinese companies making "586"-level SoCs with them. Here's one of the more well-known: http://www.vortex86.com/


I wonder if AMD could, under the terms of their patent cross-licensing agreement with Intel, deliver Apple an "x86 processor" in the form of a mask ROM containing an x86 emulator for ARM. I'm sure AMD would be happy to do so to get any amount of additional revenue from Apple.


It would be much easier to sell them an Infinity Fabric license and defective chiplets (Apple probably only wants 2-4 of the cores to actually work anyway).


Right, I expect the seamless native library integration to be there too, but AFAIK that would be new secret sauce: For the Intel transition they just shipped fat versions of everything.
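Presumably the same old universal-binary tooling comes back for this round; a sketch, where the paths and names are made up and arm64 Mac targets would obviously need the new toolchain:

  # Inspect which architectures a binary contains
  lipo -info MyApp.app/Contents/MacOS/MyApp
  # Glue two single-architecture builds into one universal binary
  lipo -create -output MyTool MyTool_x86_64 MyTool_arm64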

My guess is that dropping 32-bit support last year was a way to front-load the developer cost / user pain of the architecture swap in service of a mechanism like this, which presumably would only be practical from 64-bit x86 to 64-bit ARM.


I’d like to know more about ARM Photoshop — is this fully featured photoshop, or are you referring to the iOS app?


As announced originally, the iOS version is compiling the same code as the desktop version. The unfortunate thing is that they have to slowly redo the whole UI to be touch-first. That disadvantage is not present on macOS, where they can use the current UI code and just compile it to ARM.

https://theblog.adobe.com/adobe-photoshop-aero-gemini-dimens...

Don't forget that most photoshop code also ran on PowerPC and is thus platform agnostic.


The iOS app should be a fully featured one - https://www.iphoneincanada.ca/news/photoshop-ipad/


Yeah, that didn't happen.

The iOS app had a tiny fraction of the functionality of the full Mac app. Adobe were simply lying when they announced that. Indeed, it's a very poor imitation, just with some of the same codebase underneath (and literally nobody is "wow, Photoshop's famous codebase! That's what I want because it's totally not an infamous piece of garbage that I tolerate because of its functionality!").


The problem was that they needed to redo the whole UI to be touch-first. That won't be a problem on an ARM macOS version.


> If so this will be a smooth transition.

That is extremely unlikely.

Anything involving gaming is unlikely to run at even 50% of its performance.

It is going to be an extremely, extremely rough transition.


Anything that uses a lot of CPU or pushes much data around.

I use Logic to make music. Many of the plugins run on one core, and can take up 100% of it. If performance via emulation is around 50%, that means a lot of drop-outs in the sound, and basically an unworkable situation. Been there, seen it, don't want to go through it again.

The bigger plugin makers will probably port their products, but it's going to cost the user; the smaller ones may simply give up macOS completely.


> The bigger plugin makers will probably port their products, but it's going to cost the user; the smaller ones may simply give up macOS completely.

The writing has been on the wall though - Catalina with all its dropped frameworks & notarization already pushed some developers to give up on the Mac (or tell their customers to just stay on Mojave). Loads of Waves users were unhappy having to upgrade to v11 for Catalina support, and won't be happy paying yet again for Mac ARM support.

Come join us on the dark side. Plenty of us music people already switched from Mac to Windows. Even Linux audio seems to have come a long way, with Bitwig, Reaper, Renoise all supporting Linux now, and even plugin devs like u-he.


The fact that Ableton doesn't work on Linux has been the only thing keeping me with Mac hardware. I've been so resistant to Windows. But my resolve weakens with the continual erosion of the Apple ecosystem.


You assume it is a lot of work to port the plugins. Most likely it only needs a compile for a new architecture. And they will probably have months to check that checkbox in Xcode. It is not like they have to switch to a new programming language.


Audio plugins are one of the few software domains where you'll still often find handcoded assembly in performance critical sections. I wouldn't be surprised if a lot of third party vendors don't make the transition.

On the other hand DAWs are more complete in the box now than ever so this is probably less of an obstacle to switching from the users' point of view than it would have been 5 years ago.


Video too. I paid attention to the commits to x265 for a while, and quite a lot of them were like this:

  asm: AVX2 version of saoCuStatsE3, (136881c -> 45126c)
Just counting lines, there's more assembly than C++:

  ~/x265> find source -name '*.cpp' | xargs cat | wc -l
     95358
  ~/x265> find source -name '*.asm' | xargs cat | wc -l
    168690
That said, x265 does compile for ARM, and has ARM assembly as well, though much less of it, as of when I last updated my copy (March 2017):

  ~/x265> find source/common/arm -name '*.S' | xargs cat | wc -l
     11014
Looks like, in today's codebase, the line counts for .cpp, .asm, .S are 111188, 203423, 12217 respectively—so proportionally much the same.


I think almost no audio plugins (other than those ported to the Mac from iOS) use a blessed Apple toolchain to the extent that they could just flip a switch in Xcode. And I wouldn't be surprised if some of them have very processor-specific optimizations in their code.


Yes, I do. And I remember the previous transitions: they were far from smooth.

I've been involved in more than one platform transition, and there are always problems. Plugins are complex.


DSP etc. are the kind of things likely optimized for architectures, using intrinsics for vectorization and whatnot.


Any big macOS game would have an iOS gaming equivalent. The code was probably already metal and ARM compatible and probably only needs a recompile. They will have more than six months to figure that out.

Other recent more desktop like games are probably using a big game engine like Unity/Unreal with Metal support and probably also only need a recompile. And older games which are less likely to be recompiled can likely run perfectly fine even at 50% performance on a new ARM MacBook which is very likely way faster than the latest non pro intel MacBooks.


> Any big macOS game would have an iOS gaming equivalent.

Lol. That isn't true, in any way. There is virtually no commonality.

> Other recent more desktop like games are probably using a big game engine like Unity/Unreal with Metal support and probably also only need a recompile.

Do you really think that's how game engine support works? Heck, if that was remotely the case why does Metal get so little support with titles even on x86?

> And older games which are less likely to be recompiled can likely run perfectly fine even at 50% performance on a new ARM MacBook which is very likely way faster than the latest non pro intel MacBooks.

Any game with graphics performance is not going to run at 50% performance through an emulator. If it's done very well it might run at 20%, much more likely 10%. A "way" faster ARM MacBook isn't going to run five times faster than the Intel equivalent. Different thermal windows aren't magic. A 10% performance improvement is unlikely.


Given the progress, depth, and variety of iOS games, it's probably not a massive concern to Apple, no? There'll be some enforced pain on OSX game developers, but from a "majority" user perspective, they'll probably be able to turn on a big enough library of titles that it'll make no difference (and arguably be better).


How are mobile games relevant?

There is a huge gap in gameplay, complexity and whatnot between desktop/laptop/console games to those in phones.


We're talking about Apple's transition to ARM here - it makes perfect sense to compare iOS games to OSX games, given the relative improvements in iOS gaming and their running on ARM.

Apple's consideration about gaming will be "we have this massive catalogue of titles customers can play already here that we've been marketing, investing in, and pushing for years (and which get us a nice 30% cut of profits...)"

> huge gap in gameplay, complexity and whatnot between desktop/laptop/console games to those in phones

I made no judgement on that, simply that Apple's rationale for gaming is going to be based on the millions of ARM optimised titles they already have ready for the jump.

The mobile gaming market is a lot bigger than the desktop gaming market for Apple - expecting that to not have an out-weighted influence on their decision making is absurd.


Why don't those titles have a bigger representation on MacOS now given Catalyst?

The answer is that they're not the right types of title for the Mac demographic, and that the business model isn't there to justify the porting costs (and moving to ARM does very little reduce those, which are mostly UX related).


I wouldn't call gaming relevant in the macOS sphere at all, really. Certainly in the mobile sphere it is, but there's not really any crossover to desktop OS X.

There are games that run on OS X, but even for me as a relatively hardcore OS X type, I go sit down at my Windows 10 desktop machine when I want to game.


If they market this right, I don’t see big problems.

Many users live mostly in the browser, anyways. Photos, Pages, Numbers and Keynote will be native. If they convince Dropbox and Microsoft to have native apps, support iOS apps better than they do now (that should cover casual gaming), and make it cheaper than the ‘equivalent’ x64 one, I think they would have a product.


Microsoft already builds ARM Office for the Surface Pro X, and has the only official ARM-based Chromium browser, Edge, as well, so I suspect they'll be well positioned.


There are official ARM ChromeOS devices (and therefore Chromium).


Indeed – Edge is based on the ARM Chromium branch – though oddly enough, you can't get an official Chrome ARM build yet. I'm assuming Apple's push to ARM would make them actually release it. Also just remembered that Visual Studio Code is now building for ARM as well, which is interesting timing...


My only concern here is that the Mac line still only represents 10% of Apple's revenue, and they might not give these desktop processors the attention that a supplier like Intel or AMD would give to their own processors. I hope I'm wrong, but I feel like Apple has been making serious missteps in the Mac line for the past 10 years because it's no longer their core product.


The Mac (a PC-style computer) is a strategic investment, due to its role in software creation. The brand is also part of the core image of the company.

Apple still commands only a fraction of the overall PC market, and while that market is not growing as a whole, their portion of it could grow by a great deal.

You’re right that there have been major missteps. But there have also been major corrective steps as well, which are just as important in gauging how the company will behave in the future.


Having tried Windows on ARM with the Surface, I was surprised that many apps I use every day were not available. Perhaps Apple will be better at onboarding developers to the transition, but it will also be interesting to see how long developers continue to support Intel. Can anyone speak as to the difficulty of working with ARM and Windows?


> I was surprised that many apps I use every day were not available

Why's it surprising? I wouldn't even know where to get an ARM workstation to test my software on.


They announced the move to Intel in 2005 and dropped PowerPC support in 2009, so I'm guessing about the same period for the ARM transition.


Unless they pull a rabbit out of hat, they will not have a x86-64 emulator this time (for the same reasons why intel is struggling with speed/power efficiency ratio for years now) so it won't be that easy. It will be either ARM or Intel hardware and people will have to choose.


> Unless they pull a rabbit out of hat, they will not have a x86-64 emulator this time (for the same reasons why intel is struggling with speed/power efficiency ratio for years now)

I don't understand what you mean - you absolutely can emulate AMD64 on AArch64. You can emulate any instruction set in any other instruction set.


They're saying it won't be fast or power-efficient.


Oh I'm sure they'll have one. It'll just be slow.


> They announced the move to Intel in 2005 and dropped PowerPC support in 2009, so I'm guessing about the same period for the ARM transition.

Zero chance. It's quite likely they don't even have a roadmap for how they'd develop silicon for the higher end chip sector at this point in ARM, never mind getting it done and moving software support in four years.

If this happens it will take fifteen years, and I think Apple may well abandon the entire endeavour (or MacOS) before then.


Pro-tip: Multilib. Adding a cheap binary translation VM for non-native apps running in a multi-lib fashion should be no big effort.


So you push the emulation boundary up as high as possible, rather than at the syscall interface? Good idea!


>Apple’s chip-development group, led by Johny Srouji, decided to make the switch after Intel’s annual chip performance gains slowed

I'm just not really buying this as the justification, Mac has almost never been about competing on raw performance and moving to ARM could even mean a large performance hit for most software where performance counts for years to come.

Although I guess we also keep getting lectured on how amazingly powerful iPad Pros are yet we never really see them do anything beyond a paint program, GarageBand level music production, basic video editing and keynote.


Apple has successfully done this twice before. Each time, major developers ported within the first two years and in the meantime, Apple sold computers with the old processor and new processor.


It’s more to do with Intel missing deadlines.

A good example is when the Core 2 Duo was delayed and Intel forced Apple to use 32-bit chips in the MacBook Pro.


I think this would work well for me. Most of my day is spent with SSH or RDP sessions to systems that do the heavy lifting. So I'll welcome the power and heat savings.

Though I don't think I'll be buying one soon. When I travel I have a 2018 MacBook Air and when I'm at home I have a 2013 Mac Pro. Both machines still work great for my needs, and I plan to keep the Mac Pro until Apple stops OS updates for it. When it comes time to replace it, I'll replace it with a Mac Mini, and I don't need a machine that powerful anymore.


I guess I'm one of the smaller minority who prefers a hybrid with a significant amount of local compute power and remote compute power. Going all-in with thin client architecture is something I've never been a fan of, namely because it puts a fundamental resource I need to do my work in someone else's control/hands.

Much of the time, this is advantageous (and I prefer someone else managing things for me when it works) but I run into far too many snags where it's nice to know I can get something done with local resources that I have control over when needed.


I am quite excited about this rumor. If only to finally find out what Apple's plans with respect to ARM-based Macs are :). Also, just in general it would be exciting to have a real contender for Intel-compatible chips on the desktop. I still can remember the times when there were several competing architectures. And obviously, Apple has the potential to create really game-changing chips, considering what they are doing with the iPhone hardware.

One thing I still find peculiar is that Apple could have had nice ARM-based computers for quite a while. They are actually selling them in the form of the iPad, especially the Pro. But what keeps people on the Mac vs. the iPad is less the hardware and more the software. The decisive difference in practical terms between macOS and iPadOS is the mostly artificial software limitations of iPadOS. While Apple loudly advertises the ability to copy files from a USB stick to "Files", the fun usually stops there. App support for file exchange is still very limited; you cannot even copy music to Files or your iCloud drive and add it to Apple Music or the TV app. So I find it a bit odd that they have to create ARM-based MacBooks just as a solution to a basic problem of their software.


> I still can remember the times when there were several competing architectures.

That was when having detailed manuals and being relatively open about your architecture was the norm. I'd bet whatever Apple releases will not be as forthcoming.


I expect that Apple will gradually phase out MacOS, and add more features to iPadOS that will make it suitable for content creators. It never really made sense for Apple to maintain two overlapping but incompatible product lines. I predict in a few years they'll launch a laptop running iPadOS.


The problem is, and that is what I tried to point out with my comment, that the only thing preventing the iPad to be useful for more people is, how artificially limited iPadOS is. It is a great device, but not a replacement for a computer.


If you look at the direction of recent releases then you can see those features being added. The latest version has trackpad support for instance.

I don’t imagine much is going to change right now but I wonder what they might be planning for the next decade.


Yes, since they separated the OS from iOS, development of the iPad has somewhat gone in the right direction. But it took them a decade to add mouse/trackpad support; this is moving at far too slow a pace. And the fact that you cannot add music to your iPad on your own shows how happy Apple is with its limited usage scenarios; there is no good reason for this behavior. They could have made the iPad a real MacBook competitor, the hardware is up to it. Apple has decided to keep the software limited and even prohibit third parties from closing many of the gaps.


It's not trackpad support. It's the ability to run your own software. If I can't run my own code (without jumping through hoops) it is not a real computer. iOS devices are consoles.


Personally I'd prefer a release of macOS for iPads. That would actually get me to buy an iPad Pro.

There's been some speculation on Twitter that for WWDC, instead of releasing an ARM Mac for development purposes (for the Intel transition they loaned people Intel motherboards mounted in Power Mac G5 cases), they'd release a version of macOS to run on your iPad Pro.


They're really not incompatible -- iPadOS is really just a reskinned macOS. Under the hood they're essentially the same thing.

There won't be any phasing out of macOS -- it really wouldn't make any sense at all. They'd alienate all their devs.


I read this article very carefully, and I still have not yet seen any confirmation here or prior that rules out a semi-custom solution involving the other x86 vendor.

Perhaps this is just a game of semantics?

"Its own mac chips" vs "x86+ARM chips co-designed by AMD & Apple, fabricated by TSMC, and slapped with an Apple logo".

From AMD's semi-custom page:

"We are not bound by convention and don’t subscribe to “one-size-fits-all” thinking. We develop customized SOCs leveraging AMD technology including industry-leading x86 and ARM® multi-core CPUs, world-class AMD Radeon® graphics, and multimedia accelerators. We offer the option to further customize each solution to include third-party and/or open market IP or customer designed IP. Discover how you can differentiate with AMD Semi-Custom solutions."

https://www.amd.com/en/products/semi-custom-solutions

I still cannot see a hard switch to ARM without any HW x86 capability in the mix. The impact to user experience would be very dramatic and the PR would be a nightmare to deal with. The way I see this playing out is that the next gen of Apple hardware provides both an x86 and an ARM stack, with subsequent generations potentially being ARM only (i.e. w/ x86 emulation). There is just too much software investment in the x86 ecosystem at this point. You have to give people a path to migrate peacefully or they will never return. This isn't like prior architecture switches. The impact with PPC->x86 was not even 1/100th what the impact would be today if Apple forced a hard x86->ARM switch.

All of that said, I can understand why they would want to keep something like this under wraps until T-minus 0.


What you said makes sense, particularly:

> There is just too much software investment in the x86 ecosystem at this point.

But I don't have any confidence in Apple's leadership anymore. There was a lot of software investment in 32 bit apps, too, and they merrily launched a nuke at that entire library.

Do I think they're going to drop X86? No. But is it a possibility? Absolutely.


Have you been following Apple for the last two decades? They have always dropped legacy support - 68K, Classic MacOS, PPC, 32 bit software.

Keeping compatibility forever has its own drawbacks - maintenance, performance, increased vulnerability surface, regression testing etc.


68K support on the hardware side happened in OS 8.6. On the software side, it wasn't dropped until the Classic environment was.

Legacy support has deteriorated more rapidly recently, it seems.


The last 32 bit Mac was shipped in 2006. They announced they weren’t going to port Carbon to 64 bit about a decade ago. Was there really any great surprise that you shouldn’t be writing 32 bit software in 2015 let alone 2019?


I think the bigger surprise is that the first Intel Macs were 32bit and/or had 32bit EFI.

My gripe is more with how long they supported PowerPC on the Intel side.


I think there's a better-than-average chance that they'd cut compatibility just to show that they're capable of doing so. They'd probably get away with it, too.


It is easy to make the case for switching to their own CPUs in MacBooks or MacBook Pros, at roughly 16M units per year. But what about the Mac and Mac Pro? Combined, less than 2M units.

Are we going to have a split in the platform where developers are expected to debug on both architectures? This isn't the same as moving from PowerPC to x86, where the majority of Pro apps were already on WinTel. ARM is still relatively new for many Pro apps. Adobe may be slightly better equipped, but not AutoDesk.

If not, would Apple spend additional hundreds of millions on 100W+ CPU designs that are sold in tiny quantities?

It is also worth pointing out Mark Gurman has been saying this since before he joined Bloomberg when he was at 9to5Mac. And since 2016 when he joined Bloomberg the rumours were taken more seriously.

And the first rumours suggesting Apple was working on an ARM Mac go back as far as 2010.


There were no separate PPC Mac apps; developers shipped “fat binary” apps that worked on both 68K and PPC Macs.


They don’t want to be dragged along by Intel failing to meet deadlines.


Or they just want to fold the hundreds of dollars per machine going to Intel into their own profit margins.


Intel forced Apple to support 32bit computing for almost 10 years because they couldn’t meet deadlines.

Apple has been dreaming about this day for years.


It's hard to imagine this is really Intel's fault and not just Tim Cook looking at a multi-billion dollar per year expense as an opportunity: they buy something like 20 million processors a year, which is a ton of money even by Apple's standards.

Even Intel's lacklustre results in recent years seem like an unlikely catalyst - Apple couldn't have foreseen 14nm++++++++++++ back in 2015. In 2008 though, Apple acquired PA Semi to design chips. This project has probably been percolating since shortly after that.


> the company plans to eventually transition the entire Mac lineup to its Arm-based processors, including the priciest desktop computers, the people said.


It's Bloomberg, they don't have a great track record on Apple.


It’s Mark Gurman, he has one of the best track records on Apple. He and Ming Chi Kuo are the two most consistently accurate Apple leakers/predictors.

It’s better to view Gurman as talent writing for Bloomberg than as Bloomberg itself.

https://www.vox.com/2016/6/1/11835514/bloomberg-mark-gurman-...


Given who wrote this, and for which publication, this article definitely needs the Daring Fireball disclaimer.


Here's the disclaimer that appears below any post on Daring Fireball that links to Bloomberg.

> Bloomberg, of course, is the publication that published “The Big Hack” in October 2018 — a sensational story alleging that data centers of Apple, Amazon, and dozens of other companies were compromised by China’s intelligence services. The story presented no confirmable evidence at all, was vehemently denied by all companies involved, has not been confirmed by a single other publication (despite much effort to do so), and has been largely discredited by one of Bloomberg’s own sources. By all appearances “The Big Hack” was complete bullshit. Yet Bloomberg has issued no correction or retraction, and seemingly hopes we’ll all just forget about it. I say we do not just forget about it. Bloomberg’s institutional credibility is severely damaged, and everything they publish should be treated with skepticism until they retract the story or provide evidence that it was true.

[0]: https://daringfireball.net/2020/05/bloomberg_publishes_click...


Unsurprisingly, DF commented today on the subject of Apple transitioning to ARM: https://daringfireball.net/2020/06/on_apple_announcing_the_m...


It's interesting how Apple takes the opposite approach of PCs. PCs have been on x86 variants forever, to the point that MS-DOS will run on a new PC without much fuss.

For Apple, this makes, what, the fourth architecture for Macs?


Imagine the cruft and patches that have accumulated in the past 42 years since x86 came up...


On the flip side, how many generations does it take before a new CPU architecture becomes robust and mature enough to be relied on for critical work? We're still using crufty Wintel at work (vaccine R&D); it's simply the best platform for preserving our investment in existing software.


Luckily ARM isn't anything new. Even Aarch64 is almost 10 years old.


By including even the Mac Pro in the eventual transition, Apple seems to be expecting to have their own chips beating out Intel/AMD compute performance for workstation-class tasks within the next 5-10 years. You'd assume they'd keep the "halo products" running whatever chips are best-of-class, rather than whichever are most cost-effective to put in; so if they're switching for even those product lines, they're seemingly expecting their own chips to become best-of-class.

That's an interesting bet, given how long the two giants have been at this.


Is it? How much faster are CPU cores going to get? Also, Apple has many billions of dollars to throw at this.


> How much faster are CPU cores going to get?

Whenever we hit a Moore's Law bottleneck, we see a transition to new CPUs being increasingly optimized for power-efficiency instead. Whether or not FLOPS remain a moving target, FLOPS-per-watt will very likely continue to grow for a few more decades.

> Also, Apple has many billions of dollars to throw at this.

Unless they're planning on selling these chips on the open market, I don't see how "throwing billions of dollars at this" project can be justified to their shareholders, even if it's something they can technically afford to do. As it is, it's a pure cost-center optimization (i.e. removing the need to pay Intel, at the expense of now needing to make the chips themselves.) This presumably balances out slightly positive on their books, not mind-bogglingly positive like a new product line would be. "So"—the prototypal shareholder asks—"why are you putting $bns/yr worth of silicon engineers to work on this, rather than putting them to work on feature cores to create+differentiate a new product line?"

In my mind, it only works out for Apple if it's actually not all that expensive for them to reach parity with Intel/AMD; i.e., if it's something they can do while still having silicon-engineering talent left over to keep doing feature engineering for new hardware. Which is what I find interesting: how did Apple reach this point, where they can leapfrog Intel/AMD without it even being a "drop everything" moonshot project for them?


AMD's entire revenue is very small compared to Apple's, and the Mac and iPhone will share the same big-core CPU microarchitecture.


Whelp....that's the end of Apple for most/many professional developers. Apple is working really hard to give up their PC market share again like in the PowerPC days.

Great OS (although worse than usual recently), doesn't run any (hyperbole but rooted in truth) software.

If they would just focus on running MORE software, especially games, they could probably grab so much more market share, but they are happy at 10% it seems.


I agree it's pretty frustrating from a developer standpoint.

However, the majority of people aren't developers and just want a computer that works. I could see this as a way to _increase_ market share and reduce the barrier to entry to expensive Apple products.


Why the end for developers?


I'm curious to see if they'll open up access to the T-series chips in our existing Macs to at least experiment with or use as a co-processor. The T2 is no slouch—it's based on the A10.

It also makes me wonder if they'll ship a lower wattage Intel part alongside their Arm chips in a transition period. I think that would be kinda cool, and would ease a lot of backwards compatibility woes. Or they could keep things more or less the same and just beef up the T2 with more cores and interconnect bandwidth.

It might not make much sense to ship a dual CPU Macbook Air, but it would certainly be cool to see Arm PCIe addon cards for the Mac Pro, where power and heat concerns are not as significant.


I've seen this headline for a decade. Is there anything that makes it more believable now?


The ARM Mac rumours have been quite hot lately if that's any indication of their accuracy.


With WWDC approaching and Apple doing it online, they're likely pre-taping it, which makes it more likely that some of the info in it has leaked.


I would like to see a return of the Macbook. I loved the form factor on mine, but after a couple OS updates the anemic processor became a painful bottleneck. On the flipside my several years old iPad pro still feels blazingly fast.


Really the new MacBook Air is essentially a slightly larger continuation of that design that also has a fan, specifically because the processors in the 12" Macbook weren't that great. I could definitely see a future ARM based Air returning to a fanless design though.


I don't use a Mac any more but from what I see and hear, most Mac users aren't clamouring for more speed or for even thinner laptops, but for a more stable, less annoying operating system.


I want a thicker, heavier Mac with lots of ports.


Truthfully, and to go against the grain a bit, whatever makes the platform faster sounds good to me. If my IDE, browser, and terminal tools continue to run just fine, I'm not going to be up in arms about this change. We've been married to x86 derivatives for too long, even more so to Intel's critically broken implementations of them.

Shitty keyboards and useless touch bar aside, Apple has had a long history of pushing the envelope in radical and beneficial ways.


That means they have to keep up with CPU performance for the next decades. It might be easy now but let’s see if they don’t hit a pothole and have to go back..


This is great news for software engineers, IMO. More battery life (hopefully), comparable performance (hopefully), and lower cost (hopefully). Contrary to popular belief, I think Apple has embraced those three principles in recent releases more so than they were 5 years ago.

Everyone who writes software on macOS is probably using virtualization already. There really shouldn't be any downside to this for the vast majority of programmers.


Notably missing: Any hints about hardware for developers.

My conspiracy theory is that the sketchily rumored “gaming laptop”[1] is actually a hot rod ARM MacBook focused on developers to get the transition off with a bang.

[1] https://www.macrumors.com/2019/12/30/sketchy-rumor-gaming-ma...


Imagine a cheap, low-power, high-performance Apple ARM Mac mini for devs... Apple would never do it, unless as some sort of educational campaign to get a low-cost education and coding platform into the hands of billions of kids.


That article describes a high-end gaming machine, with a potential price tag of $5,000. Given that the Intel-based developer transition kit was only $999, I can't see their developer model being that expensive.

For the developer model, perhaps they'll revive the Macbook 2015 form factor with an ARM chip instead, to distinguish it from the Intel-based Airs and Pros.


The Apple of 2020 is in a very different market position than that of 2006.

I also expect that you won’t need this box to do ARM development: Why should ARM macOS development on an existing x64 Mac be any different from ARM iPadOS development there?


I mean, ideally it shouldn't. The vast majority of apps should just recompile. But I don't think you can compare them quite like that; remember that the iPad/iPhone simulators in Xcode run x64 code, not ARM. If you're doing anything that depends on architectural features or drops into inline assembly, you're going to need either an ARM emulator on your x64 box (which Apple doesn't offer right now for iPad/iPhone dev) or an ARM box to test on. Right now for iPad/iPhone development, if you wish to use some assembly you can test it on device.
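
To make the inline-assembly point concrete, here's a minimal C sketch (mine, not anything Apple provides) of how such code is usually guarded per architecture; the second branch only makes sense on real ARM hardware, which is why a device or an ARM box ends up being needed to exercise it:

    /* Architecture-specific code path with a portable fallback.
       In an x86_64 simulator build the first branch compiles;
       on an arm64 device/Mac the second one does. */
    #include <stdint.h>
    #include <stdio.h>

    static uint64_t cycle_hint(void) {
    #if defined(__x86_64__)
        uint32_t lo, hi;
        __asm__ volatile("rdtsc" : "=a"(lo), "=d"(hi));   /* x86 time-stamp counter */
        return ((uint64_t)hi << 32) | lo;
    #elif defined(__aarch64__) || defined(__arm64__)
        uint64_t v;
        __asm__ volatile("mrs %0, cntvct_el0" : "=r"(v)); /* ARMv8 virtual counter */
        return v;
    #else
        return 0; /* no cycle counter available on this target */
    #endif
    }

    int main(void) {
        printf("counter: %llu\n", (unsigned long long)cycle_hint());
        return 0;
    }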


It may help that clang/llvm is a cross-compiler by default.
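
A quick sketch of what that looks like in practice (the target triples below are illustrative; whatever triple and SDK Apple settles on for ARM Macs is an assumption on my part). The point is just that the same clang front end and the same source build for either architecture via -target:

    /* hello.c: build the same file for either architecture by changing
       clang's -target triple (plus an SDK with the right slices, via
       -isysroot). The invocations are illustrative:
           clang -target x86_64-apple-darwin -c hello.c -o hello_x86_64.o
           clang -target arm64-apple-darwin  -c hello.c -o hello_arm64.o
       `file hello_arm64.o` should then report an arm64 Mach-O object,
       even when built on an Intel machine. */
    #include <stdio.h>

    int main(void) {
        puts("hello");
        return 0;
    }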


$999 buys a low-spec MacBook Air and could buy a slightly higher spec ARM-based machine for the same margins. I still think they'd be very constrained. That original PowerMac chassis with a PC inside also won't happen again unless they have a ton of surplus MacPro chassis they need to offload.

$999 can buy the MacPro monitor stand...


I'm not saying it will be $999, but it will probably be a lot less than the $5,000 in that article. I'd say in the $1,500-2,000 range. Something similarly priced to an iPad Pro + Magic Keyboard + Pencil combo.


Apple is on a very different track today than it was back then and buying hardware for developing for the platform is much less risky than the $999 price tag implied.


My conspiracy theory is that there will be a dual-boot iPad Pro + Magic Keyboard combo for a few $K…


If Apple wanted, they could just offer a developer version of macOS for ARM that runs on an iPad Pro. This would completely solve the question of hardware availability for developers. Especially as lots if not most developers already have one of these.


Mightn't even have to dual boot if Hypervisor.framework were ported to ARM.


I don't know about dual boot but I wouldn't be surprised to see a clamshell iPad.


Apple has never been good at gaming.


I guess it depends on the development tool chain, but when I tried compiling some C++ stuff on the AWS ARM instances, I found it was like 50% slower.

I suspect it was from some combination of branch prediction and OOO execution being weaker.


Were you using a1 instances (Graviton) or m6g, and c6g, r6g (Graviton2) instances?


a1, this was before Graviton2 was released.


I'm wondering if this is an unpopular opinion, but I wouldn't want to use anything other than a desktop for development.


Well, the rumor also mentions a “gaming iMac” which is just as preposterous, so maybe they roll out one of each.


Could the new iPad Pro work as a development machine?


Yes, but that’s so awkward technically and from a marketing / product segmentation perspective that I’d be shocked if Apple went that way.

They want to sell this transition as letting them build the best Macs ever. A kitbash iPad, despite its virtues, is definitely not that.


I don't mean as a product to sell to the masses but something that could be used to develop / test ARM apps before the new shiny laptop arrives sometime in 2021. Not the most ergonomic way to develop apps but maybe possible?


Absolutely, but iOS and its "security" is holding it back.


I meant iPad running a version of macOS


It already does. Pythonista, Continuous, Codea, Shaderific, and Playgrounds are enough to keep me busy.


Try out code-server[1]. It lets you run VSCode in a web browser. I have a $10-a-month Vultr server set up for personal Git hosting and running this, and it works great on my iPad Pro (as well as my Pinebook Pro).

https://github.com/cdr/code-server


I've said it before but I really love code-server. Recent releases have been a bit messy (change of deployment model didn't really work) but it's amazing. I got it running on a $5 VPS to access my dev environment from any machine without the need to sync anything (no more forgetting to commit code before I leave for a trip).

As long as I'm <10ms from the VPS I can't tell any difference between code-server and native VSCode. I really hope their business model is successful.


Thanks, but there is already the Azure version of it, where VSCode actually started.

https://azure.microsoft.com/en-us/blog/cloudshelleditor/


How well does this run on an iPad Pro? I assume that this is a web app. Does it run OK on iOS Safari? Thanks in advance.


I never used it, just the apps mentioned above; I just wanted to note that that's where VSCode came from.

I usually only use tablets when flying, which is not the ideal place to access the Internet (very few airlines offer it on European flights).


The article mentions higher GPU and NPU performance and higher efficiency. All of this I would expect, since the SoC will likely incorporate much of their mobile experience/IP.

However, it doesn't mention CPU performance or IPC, both of which will be extra important given the binary-level x86 compatibility I would expect them to ship.


There have been a few models of Windows-ARM snapdragon-based computers for a little while now. Somehow, mindblowingly, they didn't bother to ship the first ones with a native Chrome port (only Edge). Now that they have native Chrome, and Apple is moving macbooks, I wonder if the tide will shift towards ARM for all laptops?


How many times have I heard about an ARM MBP, an Apple TV (a real TV with screen), an Apple Electric car, or Apple Glasses?


I read somewhere that emulation of x86 on ARM is much worse than emulation of ARM on x86. Can someone confirm this?


I don't know the answer to that, but keep in mind that Apple is in control of the entire CPU design. They could, for example, put an x86 decoder in front of the ARM cores.

After all, modern Intel processors decode x86 to a simpler instruction set used internally anyway.


Doing something like this would be the smart move. If they do this, they're going to need to give developers lead time to get their software up to date.


x86 has total store ordering (TSO), which means an emulator running x86 code on ARM has to add barriers to keep that ordering respected. On newer ARM chips barriers are much cheaper, which largely solves the problem.
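
A minimal C sketch of the pattern in question (mine, not taken from any real emulator): on x86 the two stores below stay visible in program order even without explicit ordering, because of TSO, so translated x86 code can rely on it; a translator targeting a weakly-ordered ARM core has to emit barriers or release/acquire instructions to preserve that, which is exactly where those barrier costs show up.

    /* Classic message-passing pattern. Under x86 TSO even plain stores keep
       the "data_word before ready" ordering; on ARM the release store below
       is where the compiler (or an emulator translating x86 code) has to emit
       a barrier or an stlr, otherwise the reader could observe ready == 1
       while data_word is still 0. Build with: cc -pthread example.c */
    #include <pthread.h>
    #include <stdatomic.h>
    #include <stdio.h>

    static atomic_int data_word = 0;
    static atomic_int ready = 0;

    static void *writer(void *arg) {
        (void)arg;
        atomic_store_explicit(&data_word, 42, memory_order_relaxed);
        atomic_store_explicit(&ready, 1, memory_order_release); /* publish */
        return NULL;
    }

    static void *reader(void *arg) {
        (void)arg;
        while (!atomic_load_explicit(&ready, memory_order_acquire))
            ; /* spin until the writer publishes the flag */
        printf("data_word = %d\n",
               atomic_load_explicit(&data_word, memory_order_relaxed));
        return NULL;
    }

    int main(void) {
        pthread_t w, r;
        pthread_create(&r, NULL, reader, NULL);
        pthread_create(&w, NULL, writer, NULL);
        pthread_join(w, NULL);
        pthread_join(r, NULL);
        return 0;
    }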


Not only that, but Apple if they wanted to could strengthen the memory model of their custom chips, meaning the barriers wouldn't be necessary during emulation.


I see in the comments that a lot of web developers are frustrated. What programs do web developers use that only work on macOS? Or is it also because of the Retina screen? I've been developing on Linux for years without issues, so I'm trying to understand why devs choose macOS.


Haven't we been here before?


a couple of times, yes. apple has a pretty good track record overall when dealing with these transitions. typically it's been to a platform that has been a strong improvement.

if apple's making the decision to transition again, they must feel that it's for a very compelling reason once again. we don't have our hands on their current tech to agree whether it makes sense as a cpu upgrade, but likely it would make sense as a battery upgrade. whether that's a good enough reason to risk a transition, we will have to wait and see.

personally, I've been burned too many times on first-gen apple hardware to jump on the bandwagon immediately, so am taking a wait and see approach.

previous transitions that have been successful:

m68k -> ppc

ppc -> x86

x86 -> x64


The ARM rumors, or Apple performing a major platform transition? (In both cases, yes, but these rumors have been gaining credibility for quite some time.)


This would be the third major CPU change for the mac architecture: 68xxx to PowerPC, PowerPC to Intel and now Intel to ARM.


I'd say technically fourth, since x86 -> x64 was also a hard cutoff eventually with Catalina.


And iOS's ARM -> ARM64 cutoff was another architecture transition Apple (and developers) had to weather.


ah yes. I don't tend to do iOS development, so completely forgot about that one other than having a lot of the apps I loved stop working. still grumpy about pirates and civ rev being gone forever.


iOS apps on the desktop and the switch from x86 to ARM have uncomfortably many parallels with Windows 8 / RT.

How is Apple's approach going to be different from Microsoft's? Will they keep backward compatibility? Will the CPU architecture really be a day and night difference or are our expectations too high due to the ubiquity of inflated Geekbench numbers?

This will either make or break the Mac as a platform and I feel like currently, any further investment would be a gamble.


One thing that would be very different from a windows transition is that Apple is much more aggressive about pulling support for backwards compatibility. MS spends a lot of energy to support really old software for businesses whereas Apple typically gives a 3 year window then pulls the plug.


I wonder if there was consideration within Apple of switching back to PowerPC (power9/power10) now that it is open (for whatever that means, I am not sure)? They would appreciate the control that would give them.

Does somebody have an idea of roughly how much it would cost Apple to switch the Apple A14 from ARM to POWER? It is usually said that the instruction decoder is a small part of a CPU core, and the two ISAs are not hugely different (compared to AMD64/Intel, at least).



I wonder why they're deciding to go back; after all, they used to develop their own chips back in the day.


Apple have never used their own CPUs for desktop computers, only for phones and tablets.


Unless you count the AIM alliance - https://en.wikipedia.org/wiki/AIM_alliance


It says it right there in the article.

> Apple’s chip-development group, led by Johny Srouji, decided to make the switch after Intel’s annual chip performance gains slowed. Apple engineers worried that sticking to Intel’s road map would delay or derail some future Macs, according to people familiar with the effort.


Interesting. It's funny that Apple's own design constraints have, in the past, kept them from getting the most out of Intel chips anyway.

Take the cooling/throttling problems as an example.

Good luck to them on building a better chip.


"Project Kalamata"


The great unification continues between iOS and Mac!


A lot of discussion here about Windows/WSL2 or Linux on laptops, how it's gotten good enough for the HN crowd.

The overloading of the Control key as the system menu shortcut ("accelerator"?) key, when it's also the default Emacs bindings for readline in Bash, drives me utterly insane.

If it were consistent, great, but on Windows there are many different text widgets, from PC console, Win32, and other layers. I simply can't develop the muscle memory.

I have a very cheap HP laptop, the trackpad driver is nearly unusable. I installed the Synaptics Control Panel, and use it to "reset" the trackpad each time it wakes from sleep so that I have a chance at scrolling without randomly selecting the entire document's text and deleting it or dragging it to random places. It's horrible.

On the Lenovo x230, the tiny trackpad is a bit better, but tiny, and the physical trackpad buttons give me the chance at dragging etc in the face of such madness. It's all very nerve-wracking.

The trackpads on the MacBooks have never been a problem for me.

Then there's text encoding. The Win-1252 Code Page. Turning UTF-8 into unreadable line noise in unpredictable situations. The CRLF madness that grows back no matter what.

I use WSL2, Terminal, and VS Code, but it's unbelievably exhausting. Digging out of config issues with Code Signing certificate policy required a re-pave and 12 hours of re-installing to get back to sanity. Something needed an old VC Runtime DLL, which installed fine, but also seemed to overwrite a Microsoft root CA cert. Even with differential analysis against a working Windows install, we couldn't find the broken cert. Various msc tools, System Policy analysis, Troubleshooters and so on couldn't find it. The CERT: filesystem "provider" stopped working. It was a dead machine.

Linux and macOS configs, I generally know where config files are, and can restore from backup.

Want to restore from a Windows Image Backup? Or a copy of the file system from another Windows installation? Go ahead. Try it.

I got heavy into PowerShell, there are some nice bits there, but it still has to fall back to text processing in pipes if some other tool doesn't output the correct data. Usually JSON, but is that schema documented? Certainly have yet to invest the time in Bash tools that might interact with PowerShell.

It just doesn't stop. It just doesn't. I must accept that people are actually getting work done by adding WSL to the mix, but I guess I just break things.

I don't seem to break things as badly on Solaris or Arch or even Gentoo or any of the BSDs.

I have not given up on macOS. On the contrary, I will get another Mac laptop.

It's a lot of work.


Thus far, the only source for this is Bloomberg, which published the story "The Big Hack", which was shown to be complete bullshit. They have still not issued a retraction, an apology, or any kind of acknowledgement that their reporting was so completely wrong.

Their credibility on issues of tech—particularly Apple—is very suspect. (See also: Gell-Mann Amnesia Effect)


That’s me going bye bye then. It was fun while it lasted. After the keyboard fiasco and “Vistalina”, this would be the last straw.


I, too, make major decisions based upon limited information.


I used the conditional, didn't I?


The keyboard "fiasco" was fake news. Overblown by the loud minority and with later iterations probably not an issue at all. But it's a stain that couldn't be washed away (they tried with the replacement program, but likely wasn't effective enough).

At worst the keyboard "issue" is just a preference.


>At worst the keyboard "issue" is just a preference.

Yeah, I prefer to have my B key registering 100% of the time instead of 80% of the time.


What segment do you envision preferring keyboards with keys that randomly fail?

I've had 3 Macs with these keyboards (should have been two, but one died completely and had to be replaced); all of them had this issue to varying degrees. This was not uncommon for the other Mac users at my workplace.

Even if "fiasco" is an overstatement, I think the initial denial of the problem is the major reason it became a media issue.


Next in line: macos on ipad.


iPadOS on Macs seems more likely given Apple's trajectory, I'd say.


Would be cool


This transition would be fun to watch. The Mac has a huge legacy and an enormous number of apps, and it is sometimes embedded so deeply in workflows that it would be very challenging to displace. Unlike the PowerPC -> Intel transition, this time round Apple has the iOS ecosystem to tap into.

We'll probably still have 4-5 years before the Intel Macs are completely abandoned but then this is Apple we are talking about. For all intents and purposes, they might cut the umbilical cord in 6 months.

Adobe is probably the only company which can delay the complete transition for some time.

Edit - removed the line about electron. As others have pointed out, electron already runs on ARM.


> This probably might mean the death knell for Electron

Electron already runs on ARM-powered Windows laptops [1].

[1] https://www.electronjs.org/docs/tutorial/windows-arm


Apple threw away a lot of that legacy by not supporting 32 bit apps anymore and the world did not stop turning.


OTOH, the world continues using 10.14 Mojave. At least, I do.


Yeah, I've transitioned my main Mac laptop to Catalina, but I keep my Mini on Mojave because it does happen every once in a while that I need to run a 32 bit app. I can't afford to be all in on Catalina at this time.


True, but people are still buying new Macs that start on Catalina, and the non-HN crowd is almost entirely oblivious.


> For all intents and purposes, they might cut the umbilical cord in 6 months.

No, they're selling the Mac Pro today. Intel support is not going away anytime soon.


Electron is basically Chromium and Node.js; if those two won't run, it won't be much of a Mac.



