[dupe] Apple will announce move to ARM-based Macs later this month (theverge.com)
141 points by mike_ivanov on June 9, 2020 | 168 comments




This is just a rehash of Bloomberg reporting.

https://www.bloomberg.com/news/articles/2020-06-09/apple-pla...

older: https://www.bloomberg.com/news/articles/2020-04-23/apple-aim...

---

Personally, I gave up on the Apple ecosystem years ago and moved to Linux. But I feel ambivalent about this.

On one hand, the iPhone/Pad chips have impressive performance and power consumption, at least for the mobile use case. Some competition in this space can't hurt, and it will be interesting how well they fare against Intel and AMD in a different environment.

On the other hand: this seems like a golden opportunity for Apple to continue down the path of merging iOS and macOS, turning the desktop platform into an equally walled and locked-down garden.

This will also be a rocky road for the very performance-sensitive applications Macs are used for a lot, like video editing, Photoshop, etc.


Funny how every regurgitation of the same article becomes more certain: "may" - "aims" - "plans" - "will".

"According to people familiar with the matter."


A lot of reporting on every different type of topic is sourced like this, but from the quality outlets, the facts tend to be right more often than they're wrong, blunders notwithstanding.


A lot of the time it is because the information comes from insiders who actually do know what is going on, but they'll get in trouble if they are identified, since they were not authorised to talk to the media. (Or, more rarely, they actually are authorised to talk, but on a no-attribution basis only.)


> On the other hand: this seems like a golden opportunity for Apple to continue the path of merging iOS and Mac OS, turning the desktop platform into an equally walled garden.

They can't turn macOS into a totally walled garden, without making it impossible to use as a development platform. If I can't compile and run my own software, or software I downloaded from others in source form, I can't use it as a development platform.

And Apple needs a development platform, for people to develop iOS/macOS apps on, and for its own OS developers to use in developing iOS/macOS.


You can run your own code on an iPhone if you have a developer account. They could easily do the same for the Mac.


...for 7 days. Then you have to plug your iPhone back into your Mac and re-compile the software and reinstall it.

It’s an extremely shitty limitation that Apple put in place to give the appearance that they allow sideloading.


Sounds like the future for Mac on the desktop.


They don’t claim it’s to allow sideloading.


> And Apple needs a development platform, for people to develop iOS/macOS apps on, and for its own OS developers to use in developing iOS/macOS.

Have they shown developers much respect in the past? You can't even compile for the mac on other OSes, from what I hear.


> Have they shown developers much respect in the past?

I totally agree Apple is not the most developer-friendly company, and I wish they'd focus on developer needs more. They should copy a page out of Microsoft's book and take Steve Ballmer's "developers, developers, developers" to heart.

> You can't even compile for the mac on other OSes, from what I hear.

I can compile Windows software on Linux or macOS using MinGW. Did Microsoft do anything to enable that? No, independent developers cloned the Microsoft API headers, etc, to build a cross-platform compilation environment for Windows.

There really isn't a MinGW-equivalent for macOS. There should be. But, just as Microsoft didn't do anything to produce MinGW, I don't know why we should expect Apple to do anything to produce the equivalent for macOS. (It should actually be an easier job than the MinGW developers had – macOS uses the open source LLVM toolchain which is already cross-platform – it is just a matter of providing header files and library stubs for use when compiling on other operating systems.)


Why would you need mingw on a Mac?


Suppose you have an application written in C, which is designed to be portable to macOS, Linux and Windows. Using your macOS laptop, you can compile it and run it under macOS. You can also compile the Windows executable from your macOS laptop, although you still need to move the executable to a Windows VM or machine to actually test it. But it means you can at least check your code change compiles correctly for Windows (which does have various differences in APIs, headers, etc) without actually having to go the Windows machine/VM.

(You might try testing the Windows executable under Wine on either macOS, or even a Linux Docker container with Docker for Mac – however, that doesn't work well in practice, because Wine is full of bugs and gaps, and a lot of the time the executable doesn't work properly under Wine due to Wine bugs/gaps but works fine on real Windows. So you still need the Windows VM for testing. But you can stay away from it during compilation.)


Ah, ok. Fair enough, I misunderstood what you were trying to do.

MinGW on Mac is evidently in Homebrew. I've never used it, no idea how well it works. I don't write Windows software if I can ever help it.


> Mingw on mac is evidently in homebrew. I've never used it, no idea how well it works. I don't write Windows software if I can ever help it.

Well, it works just as well as MinGW on Windows does. If something isn't working with MinGW on macOS/Linux, it isn't going to work in MinGW on Windows either.

And MinGW on Windows works pretty well, but occasionally you hit some odd problems you might not have had with the Microsoft SDK (e.g. [1]).

[1] https://stackoverflow.com/questions/57885666/mingw-localtime...


Maybe they could go the route of herding most users to a walled garden, but giving developers more freedom? (yuck)


It seems absurdly implausible to me that Apple can produce an ARM chip that can compete with Intel and AMD performance wise.

Maybe they can get close enough on power consumption or instructions per dollar to be worthwhile, but I can't imagine any way an ARM Mac is going to be competitive for a workstation or gaming machine.


The new iPhone SE outperforms all of the Mac laptops in single-core performance, which is crazy. Even if you have a fast MacBook Pro, you will get hit by thermal throttling all the time. So even if this ARM chip can't compete on multicore performance in bursts, it may be able to beat it on sustained performance, which may actually make it quicker.

It's not going to compete with the Mac Pro for a little while, but who knows.

I think this is the result of two trends: firstly, ARM getting so much better, so much faster (especially Apple SoCs); secondly, how dire Intel has been at improving performance. They really haven't got very far over the last decade.


I remember when they went from PPC to x86, and you could not only run your Mac software, you could dual-boot Windows on the same hardware (a reasonable desktop AND good games).

The future was very bright when that happened.

Now, it seems apple is gazing ever more inwards.


Windows is also on ARM, so I don't think they have to get rid of dual boot. Windows on ARM also supports emulation of x86 apps (though strangely not x86-64).


Discussed https://news.ycombinator.com/item?id=23465364

Does this article provide any actual new info compared to the one from Bloomberg?


I feel like this would probably be a good thing for a lot of Apple customers, especially if they started with their cheaper models.

Switching to ARM might improve power efficiency (though surely most power goes to the screen?), but it could have a bigger impact on CPU temperature and reducing thermal throttling.

Bloomberg suggests that other laptop makers might also switch to ARM. I think this is a place where Apple could have a key advantage, because their ARM chips seem to always be significantly better than the competition (at least for mobile).

Most people will do computing consisting of:

- browsing the web/using complicated web apps

- watching video

- word processors or similar programs

I think these days we should be less concerned about some traditionally compute-heavy tasks, because those have been encouraged to switch to GPU compute for performance gains for a while now. Most things that care about performance will have been ported to GPU already, if possible.

I feel like programs like word processors are probably more memory- than CPU-constrained, and a small potential drop in CPU performance would not matter so much to programs that were designed to run on much slower hardware.

Web browsers (well, Chrome and Safari) have already been seriously optimised for running on ARM chips.


I envision there being a lot of ARM support issues across apps that I rely on. We finally got over the keyboard drama, I don't have confidence this will go well either.


Well, they've already figured out multi-targeting in Xcode and have done this once before, so I would imagine for a lot of developers it will just be a recompile.

I saw a really cheap MacBook Air deal the other day and was tempted until I saw they're i3-based. Assuming app support, I could see them replace that with an ARM-based CPU without anyone noticing.


Yes, Apple has mastered multi-targeting. Apple has completed many major transitions: 68k -> PPC -> Mac OS X -> Intel -> a near-perfect 64-bit transition.

Then there is the NeXT side, which ran on 68k, x86, PA-RISC, and SPARC.

Of course, OS X was the foundation of iOS.

I think the forced death of 32-bit apps last year was entirely in preparation for moving to ARM.


> I think the forced death of 32-bit apps last year was entirely in preparation for moving to ARM

You probably mean this is so they're not in multiple transitions at once. Something to note here is that Microsoft's x86 emulation only works for 32-bit apps (a processor limitation?), but with Apple having more control over their chips (and without me knowing the exact issue) I could imagine them having it solved.


No, what I have heard from a Mac developer friend is that Apple's Xcode toolchain is likely able to cross-compile x86-64 applications to ARM with relative ease. By forcing developers to move to 64-bit last year, much of the work will already be done for the ARM transition.

(Of course, there are always difficult cases that will take lots of work)


They've done this 2 to 5 times before, depending on whether you count the 32-bit to 64-bit conversions and iOS.

68k->PPC, PPC->PPC64, PPC->x86, x86->x86-66, arm->arm64


I'm not sure you could say they fully transitioned from PPC to PPC64, as only the G5s were 64-bit, and laptops never got the G5. I think the x86 transition would have been easier if it had started out 64-bit, but that's probably Intel's fault.

More practically, the transitions so far have been like this: 68k -> PPC (With full 68k emulation) -> x86 (with partial PPC emulation) -> x86_64


what is x86-66?


x86-64 is the 64-bit version of x86, also known as AMD64 and Intel 64. The first computers that Apple shipped with Intel processors only supported x86 (32-bit). The most recent version of macOS removed support for 32-bit x86, only allowing 64-bit apps to run.

https://en.wikipedia.org/wiki/X86-64


Yes, but what is x86-66?


A typo


You say it like it's a bad thing. I can't wait to have more competition as far as instruction sets go. Of course things will break but it'll give us a reason to make our apps and toolchains ARM-compatible beyond Raspberry Pis and small devices.


Is there a way to say "a lot of ARM support issues across apps that I rely on" as though it isn't a bad thing? Hahah. I do agree some instruction set competition would be a good thing, especially if this can further improve battery life on portable devices.


Depends on how fast they are, and who knows, there might be custom hardware to help the emulation. I'm pretty sure that while there will be issues, 4-8h of battery life and your laptop working at full speed when not on the mains will be great wins and eventually worth the change.


Interesting. I imagine they'll have to provide some sort of x86/amd64 compat layer as otherwise this kills every application available for macos right now.

I hope they don't also continue to lock down macos (though I imagine they intend to) by restricting applications to only being available from the app store. That would set a terrible precedent for personal computing.


It kills every application whose developer doesn't issue an update in the next 12 months. If you buy an ARM based Mac and your favorite app doesn't work it was abandoned by the developer.

As a developer there is a nuisance of one more damn thing to test, but compared to the nuisance iOS developers go through to make sure it looks nice on all the different sized devices, testing on an ARM before release isn't bad.

Having developed through the 68k-to-PPC and the PPC-to-x86 and the x86-to-amd64 days, I think I can safely say I never had a problem with any code in a transition¹ ². There are developers that will: if you have written optimized vector code or some such, and you care to also be highly performant on ARM, you will be spending resources to make sure you map onto whatever the hardware provides, but you've been doing that for the various Intel instruction-set changes anyway.

¹ I'm not saying architecture doesn't matter; doing early ARM porting for Debian, I can tell you having C "char" be unsigned was a huge pain in the codebase. But Apple has control top to bottom and has every incentive to make it easy for developers, if only for themselves.

² At least not any problems bad enough to leave a mark on my memory. I'm sure I had to learn about intptr_t sometime after it was invented and probably learned a lesson about size_t at some point.


An app that isn't updated in 12 months isn't necessarily "abandoned". It can also be complete.


Yes, but recompiling it for a new processor architecture shouldn't be a huge energy expenditure. For better or worse, applications meant to run on "moving target" operating systems like macOS need to be updated.


Except if it used atomics in a slightly wrong way and just happened to work on x86, but will now lead to all kinds of unexpected failures on ARM, which has weaker default memory-ordering guarantees.

If you run into this it can:

1. take a while until you are aware there is a bug (as it likely only happens sporadically, potentially with different effects every time)

2. take quite a bit longer to pin down

3. and, if you have no experience with atomics and the bug is in a library, be outside of your skill/pay level to fix, requiring major rewrites. And all of this in parallel to all the other work you have to do.

=> While this should be rather easy for many apps, for some it might be a nightmare which, combined with the regularly planned work, might very well take many months, after customers start using it and run into the bug.


I think the first two steps of this process are to pay Apple $100 and try to build the software under a newer version of Xcode. Based on my experience trying to build XLD under any version of Xcode, this seems like it will probably take a person-week or so.


> It kills every application whose developer doesn't issue an update in the next 12 months. If you buy an ARM based Mac and your favorite app doesn't work it was abandoned by the developer.

Ah, so just like the Catalina 32-bit Armageddon.


>> Apple has control top to bottom and has every incentive to make it easy for developers, if only for themselves.

Uuuh, but they have a track record of terrible developer support, so I don't see this actually happening.


> I imagine they'll have to provide some sort of x86/amd64 compat layer

Just like they did with the 68k->PPC transition, just like they did with the PPC->x86 transition.

> I hope they don't also continue to lock down macos

Let's not confuse moving Macs to ARM processors with turning Macs into iOS devices. Apple has always treated the OS families differently and will continue to do so because they know more than anybody that they meet different needs.


> will continue to do so because they know more than anybody that they meet different needs

Will they, though? Because Catalina seems like a giant step in the wrong direction if you’re thinking they’re acknowledging the developer - or even power user - need. Seems to me they’ve doubled down on the iOS approach to macOS.


In Catalina, I see the move towards these macOS & iOS compatible apps ("Catalyst") as making it easier to develop apps for both. It's a service for developers, if you will (I am not an app developer!)

I don't necessarily see it as trying to create a single OS for both desktop and mobile.


To be clear: I’m not saying they’re trying to create a single OS. I’m saying the “Apple knows best” concepts they’re applying to macOS that are seemingly inspired by iOS are very hostile to a large part of their user base.


Fair, thanks for clarifying!

I, in fact, find the catalyst apps on macOS still needing work. And I’m talking the Apple ones: Music, News


Music is not a Catalyst app. Did you mean the new Podcasts app? Music is a trimmed down version of iTunes which is a Cocoa app.


Trimmed down in some respects, buggier and slower in others. It may seem to be a Catalyst app to some because many aspects of the UI severely regressed, replacing the standard Cocoa controls that iTunes 12 used with custom implementations.


I wouldn't be so sure. Apple was happy to leave behind "pro" customers who needed floppy drives (1998), optical drives and ethernet (2013), and USB ports (2017).

Sure, the industry ended up following them on a lot of this, but that should be as much a cause for concern as the rest of it. An awful lot of the pro audio / video work that happens on high end Apple hardware could absolutely happen on a machine where the only software comes in via an app store.


> Apple was happy to leave behind "pro" customers who needed floppy drives (1998), optical drives and ethernet (2013), and USB ports (2017).

sure, but all that functionality remained externally - just not as part of the laptop. locking down the laptop is fundamentally different.


I don't know if those are the best examples. You can still plug a floppy drive or an ethernet adapter into a 2020 macbook pro's usb port.

I think what they did was change what it looks like to be a pro user. The average user gets a sleek experience while a "pro" user is going to have a docking station.

I actually like that they transitioned to the docking station approach. It means two things:

1. I have a lighter MacBook Pro for work travel.

2. At my desk, one single cable charges the computer and connects two additional monitors, ethernet, keyboard, mouse, SD card reader, high-speed storage, and a scanner. I much prefer that one single cable to the bundle of cables I used to have to plug/unplug each time I wanted to take or return my computer.


Exactly. The exec team have confirmed that, including Schiller in recent interviews.

macOS is for everything, including dev. They need that for the ecosystem. Look at the new Mac Pro, ffs.

Can't most apps just recompile to support the new chip? We don't need fat binaries, right?


And they'll continue to say that right until the moment they yank the rug out from under you when all the pieces are in place.


The trap that your logic falls into is that there's no way to disprove that Apple will ever do this.

10 years from now, we'll probably be having the same argument, and people will be pointing to the latest release of macOS and complaining that surely this one means the end is nigh.


Indeed. I've been hearing people crow on about this since Lion. That was 10 years ago. OS X might be macOS now, but it's still fundamentally the same OS.


But there is an easy way to argue that they will, and that 10-year horizon is short enough that I believe it will happen. Apple never cared about developers; they care about moving units and profits. Right now they are not getting a slice of all software sales for the Mac, and that must be a temptation they find hard to resist.

Remember how they turned their backs on the creative people that kept them alive for over a decade when everybody else had switched to windows? Similar story, similar ending.


> Let's not confuse moving Macs to ARM processors with turning Macs into iOS devices.

I wouldn't be surprised if they just do that. Ship the lower-end laptops with ARM and either an extended iOS or a constrained macOS.

> Apple has always treated the OS families differently and will continue to do so because they know more than anybody that they meet different needs.

But the "different needs" part only really applies to the high-end MacBooks, and from Apple's POV it might no longer be a thing if you can develop Swift/Objective-C and similar on iOS.


How would we expect a amd64->ARM compatibility layer to work? Emulation? Do we have any reason to think that would achieve acceptable performance?


Windows already does x86-to-ARM (not 64-bit yet) in Windows on ARM, and it works fine. You can play games and stuff OK on it, so definitely "acceptable". Not sure if great, but it's definitely usable.


Likely you'll just be able to ship fat applications that contain binaries for both x86 and ARM with one click in Xcode.


That is, if you have an Apple-only application using Swift/Objective-C, maybe C/C++.

Also don't forget that ARM has weaker memory ordering, so expect to see some fallout from atomics/lock-free code that wasn't written completely correctly.


I would guess that they'd target these at light-usage consumers first, i.e., the people who are somewhere in that weird place between iPads (but not Pro) and MacBook Airs.

I tend to think that Safari, iPhoto, iMovie, iWork, Mail, Calendar and maybe Office support (initially) would be more than enough for that segment of users.


Shipping for the high end first may also be a statement. It would motivate flagship app developers to invest in arm optimizations.


They did it before with Rosetta when they switched to Intel. Times and uses are different, but it'd be hard to imagine otherwise.


> That would set a terrible precedent for personal computing.

That precedent was set about a decade ago, and even here on HN there is a vast crowd of people who happily support(ed) that.


Honestly, as a dev, this would make life a lot easier for a lot of engineers. People who care about openness will still have Linux and its variants. The general population does not need that level of flexibility, which just adds to the cost of developing/managing software. Web browsers wouldn't have taken off if writing thick apps and distributing them for Windows/macOS was as easy as writing an iOS/Android app is today.


On the one hand, locking down the Mac even more would be the worst idea Apple could have, because it would push even more users out of their ecosystem. On the other hand, looking back at the past few years, it is quite possible that they do so. The recent changes in both their OS and hardware indicate that they absolutely don't care about their pro users.

Hard to predict which path they choose, but I try to stay positive and assume that they have at least one manager capable of thinking rationally and in favor of the pro users.


I would love to see new MacBook Pros with the latest speedy and power-efficient AMD Ryzen processors. I can understand the move to ARM, since iPhones and iPads are already there and it makes sense to unite the ecosystem.

Fun fact: Kalamata is the second-largest city in southern Greece. I tried to find out why Apple named the project this, but couldn't find anything relevant. Any ideas?


Olives are to apples, as arm cores are to intel cores.

(Kalamata is a kind of olive, a smaller, tougher and pithier version of its blander behemoth brethren.)


I was thinking more like Kalamatas are black instead of green.


> MacBook pros with the latest AMD Ryzen

Looks like that ship has sailed unfortunately. ARM has won, and it won't be long before it conquers the minds of other laptop (and desktop) manufacturers.


Re: Kalamata, perhaps after the olive?

https://en.wikipedia.org/wiki/Kalamata_olive


After the last few years of hardware problems with Macs, let's see who's brave enough to buy a v1 of any new Apple product (meanwhile, there are probably people already queuing up to buy one).


If you look at the problems with the 2016-2019 MBPs, and also the problems with the first-generation Intel MBPs, the biggest issue in both cases was heat. (Although there were clearly other issues as well.) The new Apple chips are going to be much more power efficient, so I think there is actually hope here. I still wouldn't buy a v1 because I think they'll have problems, but I think whatever problems they have will be smaller than previous v1s'.


> the biggest issue in both cases was heat.

I don't think Apple was in any way naive about how hot Intel chips can run under load. It's also not impossible to make a laptop body that can run those chips with decent temperatures... if you're willing to make compromises on size.

IMO, the biggest issue was they still chose to make the systems thin in spite of the thermal tradeoffs that would result from that.

Yes, thinner is probably more desirable, but would people have stopped buying MBPs if they were a little less thin? I don't think their sales would have shrunk by much, since most Mac users are a captive audience: they love macOS.


> I don't think Apple was in any way naive about how hot Intel chips can run under load.

It seems like for the 2016 MBP, they either made significant last minute changes or moved up the release date by several months. When they released the computers the Apple stores didn't even have the right screwdrivers to open them for several months. That doesn't seem like something that was planned, but rather the result of some sort of timeline shift.

My guess is that they decided they needed them out the door before Christmas and so skipped the last six months of QA. Maybe it was something else, but I think it was more than just being unwilling to compromise on size.

Regardless, the 2020 models are basically flawless so far, so at least there's that.


One of the biggest causes of total data loss on current Apple machines is the T2 chip dying.


Can you expand on that?


There seems to be a bug where the chip will effectively change the master password for your encrypted disk to something random and leave you unable to access any data. I'm not sure on the details, but I've seen it happen. You might still be able to recover the data if you have iCloud reset enabled for FileVault, but that obviously comes with its own risks.

Basically just never go more than 24 hours without backing up, one is none, keep multiple offsite backups, etc.


Speaking of iCloud, Apple now is able to make money when your machine dies in two expensive ways: large capacity iCloud subscriptions and a brand new machine. Now that's courage.


T2 encrypts everything, so when it fails, you lose your stored data.

EDIT: also see https://news.ycombinator.com/item?id=23076929


Their iPhone chips blow everything out of the water so this is exciting.

Also adds more fuel to the merging of iOS/macOS.


I am having a hard time deciding which name for the result of this merge would be better: iOS XP or macOS CE.


I think it would still be normal macOS, just on ARM chips. I find it hard to believe that they would get rid of macOS too given that the iPad Pro exists.


I wonder whether they have considered switching back to PowerPC (POWER9/POWER10) now that it is open (for whatever that means; I am not sure). I am sure they would appreciate the control that could give them.

Does somebody have an idea approximately how much it would cost Apple to switch the Apple A14 from ARM to POWER? Usually it is said that the instruction decoder is a small part of a CPU core, and the architectures are not hugely different (compared to AMD64/Intel, at least).


They have already heavily invested in customized ARM chips. They do all they need. So no reason to look back at POWER.

Sure, for the high-end Macs the ARM chips probably wouldn't be the best fit for now, but staying with x86 for those is just way easier. POWER might only be interesting if they did a server OS, which they have made pretty clear they are not interested in in the near future. (The rack-mounted Mac is still a rack-mounted workstation. Rack-mounting high-end workstations isn't that uncommon in certain audio/video processing job areas. For a server it is missing some pretty basic things: a redundant power supply, a management interface, RAM/disks that are trivially easy to swap without removing the machine from the rack, and probably more.)


Last time they did this they managed to pull it off really well. From memory, there was the ability to package apps for both architectures so they would run on either machine. And to bridge the gap there was Rosetta, which allowed PowerPC binaries to run on Intel Macs by translating in real time with a small performance penalty.

How close are ARM chips to PowerPC? They're both RISC architectures, right?


Eh, really well is a matter of opinion. Compared to the earlier transition from 68k to PowerPC, the PowerPC to x86 transition was much more jarring.

The 68k emulation was present up until the death of Mac OS 9, so there is some software that could run on the original Mac all the way to the Classic environment in OS X.

Rosetta was sufficient for most applications during the transition, but there will always be software that doesn't get re-written. Rosetta was introduced in 10.4, and dropped in 10.7. Only 3 versions.


This seems to be the source for the verge article: https://www.bloomberg.com/news/articles/2020-06-09/apple-pla...


I think this is good news. The vast majority of people use their laptop for little more than running a web browser, and AArch64 chips will make those laptops cheaper to produce (Intel charges Apple upwards of $400 for high-end chips [1]).

Then there's everyone doing local development in interpreted languages, most of which have stable support for AArch64. Java, Python, Ruby, Go on AArch64? No problem.

The things hit hardest will be Mac-specific programs and software built specifically to work with BSD syscalls. We'll see about that stuff.

1. https://www.fool.com/investing/general/2014/04/24/how-much-d...


What is special about BSD syscalls on AArch64? Are you saying they will be different or absent?


Presumably there would be some incompatibility.


What would be an example of such an incompatibility?


What's been going on with the embedded ME/PSP in x86-compatible hardware for the last 10 years is ugly. If this is real, I'm hoping it evolves into a real alternative to that architecture, though it's possible Apple could end up doing the same thing.


Do you see any major differences between a T2-backed system and an ME/PSP one? How about TrustZone?


I really have to do some research on this. I haven't, because I have not really been planning on purchasing a Mac. Here are some questions I would want to have answers to.

- Does the T2 have a private uncontrolled connection to system RAM? I know it has some connection to the onboard SSD, other IO and serves as an enclave for platform keys.

- Does the T2 have a private bus/connection to any onboard Ethernet adapter or Wifi like ME does?

- Does the T2 serve as the foundation of remote management features (that are present in all CPUs even if not vPro enabled) like ME does? This is important because it means that it's supposed to be accessed remotely by design.

- Does the T2 serve as the foundation of anti-theft features like ME does? This facility also has remote-access-by-design components.

- Does the T2 serve as the foundation of Protected Video or Audio path type features?

- What can the T2 make the CPU do? Can the T2 freeze the CPU, redirect it to other code, then restore CPU state? (I'm fairly sure the ME can do this.)

- Intel CPUs have a built-in display adapter and you can't really disable or sidestep it. Hence, there is no separation between what is on the screen and the CPU, and by proxy the ME. To what extent can the T2 modify video RAM without the CPU knowing?


But don't ARM chips have TrustZone, which is quite similar to ME/PSP with respect to the danger it poses? And isn't the T2 additional to the TrustZone on their ARM chips? Or did it replace TrustZone?


What's going to happen to Boot Camp? Are they planning to support Windows on ARM, or?..


I have a hunch they'll use the opportunity to kill Boot Camp, which has been unloved for years. However, I hope to be wrong here; supporting ARM-based operating systems would be a super cool thing for Apple to do.


Time to urgently make sure your favourite backend is generating decent AArch64 code and your app is compatible with a weak memory model...


The memory model of the ISA != the memory model of your code. C has a memory model and any correct C code is compiled to the correct ARM assembly. It's only really an issue if you are translating over x86 assembly, and then you probably have lots of other issues to worry about.


A lot of code (especially in old code bases) does not actually target the C11 memory model but just happens to work on x86. And a lot of code that does target the C11 memory model, but primarily targets x86, still assumes cheap (or free) release/acquire memory barriers (although admittedly ARM64 should have reasonably cheap release/acquire, and Apple could make them as cheap as they want).


Arm provides a different model than Intel for the consistency of memory changes made by different cores/CPUs. Concurrent (typically lock-free) code may be relying on Intel’s behaviour


If you’re the one implementing the language’s memory model then it’s your problem isn’t it?

If you’re generating the machine code then it’s your problem isn’t it?

If you’re writing lock-free code that doesn’t use the C memory model then it’s your problem isn’t it?

> then you probably have lots of other issues to worry about

Yes, yes we do.


Does C really have a memory model for multithreaded applications? I tried googling but could only find references to C++11.

Also, failing to adhere to a memory model is one of those tricky things where things seem to work alright on architectures with a strong SMP coherency. But then it all breaks down and you end up chasing heisenbugs all over the place when those assumptions no longer hold. It's not easy to prove a lack of race condition bugs. You usually don't get a compiler error if you forget to mark things volatile, forget memory barriers or forget proper locking.


Can you ELI5 this?


Not all compilers and other machine-specific programs are as tuned for AArch64 as they are for AMD64. Some may not be designed with AArch64’s properties in mind.


Oh sure. Thanks that makes sense.

I guess though simple apps should be easy. It’s just heavy apps that would require more than a simple recompile right?


With this and AMD's Threadripper benchmarks, Intel really needs to step up their game.


So is this an underlying reason for Catalina jettisoning so much older software? Is there a successor to Rosetta, one that will only support 64-bit software?

Going to be interesting to see which models actually debut this month and their availability. Currently I am not in favor of moving to ARM, but I leave the door open for Apple to surprise and tempt me.


I wonder what this will mean for Bootcamp. Stupid question time: Does MS make an ARM version of Windows 10?



What does this mean for running Docker containers on macOS?


It means you better think about buying a thinkpad.


I moved to an X1 2 years ago when I refused to get the new keyboard and my 2013 MBP was on the way out, very happy to have Linux as the native OS for dev work, but I don't love the added time I do have to spend maintaining my own OS now.

Also slack and zoom for linux suck :(


Docker runs on Arm chips.


Yes, but Docker on ARM won't run your x86 containers.

Docker can run a different OS but not a different arch.

The point of Docker is kind of having the exactly same thing here and there.

With Docker, I can compile, run, test and debug my backend service on Windows or Mac and deploy to Linux, knowing that I ran and tested the exact same binary.

If you deploy on x86 (as you do), having ARM Docker doesn't help you much.


Good point!


but the image you want to run in it might not.


why can't I upvote you twice?


Most of my snark gets downvoted, so the one upvote is sufficient - thx!


indeed, my comment was downvoted four times. WORTH IT.


Docker containers on macos have such bad performance that I've given up and moved to remote development with VSCode. The killer for me was the innumerable bugs related to bind mounts causing 99% CPU usage. It just wasn't worth the trouble. I'm much happier now with a remote dev box. Really the kudos belongs to VSCode; remote development is implemented so well that I hardly even realize I'm using a different machine.


It means people need to start using BuildKit and ship multi-arch images. Insane that this is not already the norm. There are other arches besides x64.


What’s the server adoption of ARM?

Apple switched to x86 in the first place because it had become a standard. Adopting a common standard always has advantages. The advantages of supporting or adopting ARM have to outweigh those to be worthwhile. This is true not just for Apple but for the server market as well.

For Apple, there are two clear advantages: power efficiency (which is the same thing as heat) and vertical integration. AWS can enjoy the same advantages from ARM. But these are both essentially hardware providers. AWS still offers x86-based EC2 instances and while they might encourage users to migrate to ARM (by perhaps passing on their own savings), there’s a lot of inertia there. Apple is a company that bites the bullet and forces these migrations, but how many developers will support or migrate to ARM just because that’s what’s running their MacBook?


My comment wasn't really related to Macs; imo if you ship binaries you should aim to provide builds for, say, the top 3 arches, and this is especially relevant for Docker images since the de facto standard for deploying them is just to pull them from Docker Hub. Docker also fails in this regard: even though it knows the arch (it says it on Docker Hub), the Docker client will still happily pull an amd64 image on ARM with no warning whatsoever


Right, what I’m saying is that until there’s a lot of movement in the server space away from amd64, I can completely understand why multiarch support isn’t a priority. It’s less of a fundamental principle and more of a tradeoff.


The support for multi-arch docker images is getting there, but there's not much of an ecosystem. Between Raspberry Pi, ARM instances on AWS, and ARM-based Macs, it ought to get to critical mass before too long.


Presumably, two options:

1. Build docker containers for ARM.

2. Run Intel containers under qemu.


Sounds like a “pro” use case.


> The shift to ARM will eventually include the entire Mac lineup, Bloomberg reports.

Much like Apple turned Final Cut Pro into iMovie Pro, this is looking like every Mac Pro will turn into an iPhone Pro. I could be mistaken, maybe they'll figure out some way of handling x86 with real hardware, but if not this is a massive gift to Microsoft.


I don't think they will switch the whole product line to ARM. My guess is that they will have a line of Intel computers for a long time for professionals who need them.


the article says it's eventually everything


eventually could mean one year as well as ten years.


there is no reason to think it would be a long timeframe given their history with similar transitions


Apple already supports devices on different chips (Intel on laptops, ARM on iOS). There is nothing strange about them keeping laptops with different processors, depending on the intended use.


It's not just the hardware, it's the third-party software too.

If the hardware is available but you can't run the usual software, no one is going to buy it. That is what I meant.

The hardware might be announced but it might not come until a decent enough number of key apps are ported.


As I said before, the article states expressly that they're moving everything:

> Like it did then, the company plans to eventually transition the entire Mac lineup to its Arm-based processors, including the priciest desktop computers, the people said.


Docker on Mac already runs a VM, so not much. There will need to be a CPU emulator or translator a la Rosetta if your container needs to run x86-specific code.


People seem to say it will be hard to transform apps to new architecture. Is that really true?

My understanding is when you develop for Mac, you already have to compile to bytecode which means it's already compatible with other architectures.


No way. Even if the Swift language works that way, virtually all Mac software relies on being compiled in some fashion.

The only software platform that might be the case is Android.


It seems like these new Macs might as well be a phone with a keyboard in the way they treat their software and "pro" consumers.

Wake me up when they use less glue to hold their computers together...


Is this going to be like the Surface Pro X? I.e. a copy with ARM that is kinda interesting but no one buys because it's not very good - but v3 probably will be. https://www.theverge.com/2019/11/5/20948092/microsoft-surfac...


Ugh I’ve been holding out on my 2012 retina MacBook Pro and was just about to upgrade to the 16 inch MacBook Pro. Now I guess I need to limp along another year to see what comes next rather than pay 3500 big ones to end up orphaned.


It's fine that Apple is moving to ARM. It's not okay that this is going to be a proprietary, locked down CPU. T2 has shown that Apple customers no longer own their hardware.


> proprietary, locked down CPU

Is it somehow more proprietary and locked-down than the Intel ones?


I think the worry is a locked bootloader.


That and Apple being able to introduce security via obscurity means we'll probably never see a non-Apple OS on the platform.


It's already quite difficult, and not ideal, to install a non-Apple OS on a Mac, aside from Windows via Boot Camp. Not being able to install a custom OS is a long way from "not owning your hardware". Apple OSes are part of the sum package; if you don't want that, you aren't the kind of person who's buying Apple hardware in the first place.


3-4 years ago it wasn't difficult at all to replace macOS with pretty much any EFI-enabled OS. Moving to a proprietary ARM CPU/GPU will probably result in not being able to write applications unless it's using Apple's toolchain. If you have no say over what software you can run on your computer, you no longer own that computer.

We really haven't seen such a large technological separation like this. At least back in the old days you could still run what you want.

macOS moving to the iOS route where you have to jailbreak your device to get any sort of usability out of it is not a space I want to participate in, and yet it seems that's how the entire Apple line is moving. They're almost the complete opposite compared to Microsoft these days.


As long as developers continue buying Macs for building non-Apple-native software, there will be ways to get them to run whatever you want, because that's required for a dev machine. If Apple ever decides it no longer cares about selling Macs to developers, well, I guess that's a possibility but it would be incredibly stupid, because that's a significant fraction of their current market.


Apple customers haven't owned their hardware in quite a while.


Whats wrong with the T2?

I just heard some stories that you can't recover your data when your hardware fails (which is not that huge of a problem; who doesn't sync their files nowadays?), and that some laptops get 'improperly' resold (not being deactivated), but that's it. I probably missed something.


The problem with the T2 is that if it fails in any way, you lose access to all of your data and your motherboard is rendered useless. At best it turns your computer into a large piece of e-waste. It's also a fairly powerful ARM CPU with access to nearly everything within the system, which paints a very large target on its back for exploits.


who makes the non-proprietary CPUs that will provide competitive performance here in 2020?


If only LLVM bytecode were actually portable instead of "looks like it should be portable but definitely isn't."


I asked this on another thread when it came up, but didn't get any replies.. What does this mean for Intel as a business?


Thanks for playing, Intel.


The Verge is a pretty good source, isn’t it?


Isn't this the Verge's summary of the Bloomberg article from Mark Gurman? Why not link to the original article?


, according to leaks


Big if true.


Welp, I'm out, Apple. No interest in an ARM-only world. Like my x64 apps too much.


Now will be a good time to stock up on Intel macs. At least we will still have hackintoshes.



