Apple's long processor journey (liam-on-linux.livejournal.com)
162 points by AndrewDucker on Feb 15, 2020 | 83 comments



BeOS seemed really nice at the time. I'm not sure if it would have worked out better in the long run, but it sounded great.

I always had the sense that consumer-focused computers should be engineered with real-time capabilities; instead, they ended up being based on time-sharing systems, which seems to hamper UI and media processing. For decades, and especially presently, I get so frustrated with lags and hiccups no matter how fast the CPU and storage are.


> worked out better in the long run

The bigger obstacle is that it would not have worked at all in the short run - it was closer to a technology demonstrator than a complete OS. It's often presented as some sort of key decision between similar options, perhaps because Jean-Louis Gassée was also a former Apple exec and remains a well-known commentator. But I don't think it was anything like that; Apple already had plenty of its own half-finished OS tech.


I had both NeXTSTEP and BeOS, and you are spot on about BeOS. It just wasn't finished and had a lot of things it needed to do to get to the level of NeXTSTEP.


I occasionally hear this, but it's at least worth noting that there are people -- like me -- who ran BeOS full-time. I did for over a year. Gobe Productive was an AppleWorks-like office suite (by the original authors of AppleWorks, no less), Pe was a good BBEdit-ish code editor, the image editor e-Picture resembled Macromedia Fireworks... I could go on with many other apps. As subjective as it is, I preferred BeOS to Linux as a desktop OS at that time because the nascent app ecosystem was already nicer to use and, at least to me, more complete. (It's important to remember that "at that time" was 1999, which predates the first release of OpenOffice by over two years!)

In any case, NextStep didn't do anything in the short run for Apple -- it took dozens of engineers about four years to turn NextStep into OS X, and it's hard to believe that they couldn't have done the same thing with BeOS. Most of the reports I've read suggest BeOS was actually Apple's first choice, but Be, Inc. demanded what they considered an unreasonable price.

I think the biggest difference was vision: BeOS was shooting for the creative media market, while NeXT's biggest successes had come in very enterprise-ish verticals. BeOS was arguably a closer fit to the Mac's "prosumer and creative professional" vision; my unconfirmed, probably off-the-wall suspicion has always been that The Enterprise (tm) was closer to Gil Amelio's button-down, old school semiconductor industry heart. Buying NeXT clearly turned out to be great for Apple as a company -- just not for the reasons Amelio had in mind.


Sure, there were people who ran BeOS full-time. But 'preferable to Linux for an enthusiast in 1997' is too low a bar for 'basis for a consumer OS'.

> In any case, NextStep didn't do anything in the short run for Apple -- it took dozens of engineers about four years to turn NextStep into OS X, and it's hard to believe that they couldn't have done the same thing with BeOS.

I think it's very easy to believe if you compare them technically. I'm happy to get into that discussion if you're interested, but it seems obvious to me that a complete, mature OS - one that was (at the time) considered fairly advanced technologically, that had a track record of being ported to every architecture available (along with a track record of running inside/alongside other environments), and that came with the team that built and maintained all of it - was a saner choice than an unfinished OS. Finishing an OS was precisely the problem Apple had had for years and was trying to buy its way out of. BeOS and NextStep were not anything close to the same starting point.

> I think the biggest difference was vision: BeOS was shooting for the creative media market, while NeXT's biggest successes had come in very enterprise-ish verticals.

I think there's some truth to this, in that Apple's computers had long been pariahs in business settings and getting out of that rut was something Apple wanted to do. I imagine it was a factor. At the same time, there was probably far more actual prosumer and creative-professional software for NextStep than there ever was for BeOS, and, much more importantly, Apple was looking to buy technology, not vision.


Uh, you do know there are people who ran BeOS for real work and swear it's better than anything else out there, even today? There's a reason why Haiku exists.


> Uh, you do know

I do! Although I'm not sure how it relates to what I wrote, nor am I sure why you're taking that condescending tone with me - I assume you do know it's less than constructive.


I share that frustration. Example: opening a MacBook in public after closing it with a video playing, and mashing the Touch Bar mute key.

Another example: open the MacBook and start typing your account password, but 8 or so characters in, the OS wakes up and deletes everything you’ve entered already, so you have to start again.


I feel the same. I fondly remember doing hour-long raytracing on the Amiga; having set the priority to something low, I never noticed the rendering going on.

On the Mac, it seems that no matter what I set the priorities of the involved processes to, long compiles make it impossible to game at the same time.
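(For anyone wanting to try the same experiment: a minimal sketch of what "setting the priority" means at the BSD layer that macOS exposes, using the standard setpriority() call. The nice value of 20 is just an example, and note this only influences CPU scheduling, not the disk I/O and memory pressure a big compile also generates, which may be part of why it never seems to help.)

    /* Minimal sketch: lowering a process's CPU scheduling priority via the
       standard BSD setpriority() call (available on macOS). 20 is the
       conventional "nicest" value; 0 as the `who` argument means the
       calling process. */
    #include <stdio.h>
    #include <sys/resource.h>

    int main(void) {
        if (setpriority(PRIO_PROCESS, 0, 20) != 0) {
            perror("setpriority");
            return 1;
        }
        printf("now running at nice %d\n", getpriority(PRIO_PROCESS, 0));
        /* ...start the long-running work (render, compile, etc.) here... */
        return 0;
    }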


I think not being a real multi-user OS would have led to trouble sooner or later, even if it hadn't felt so unfinished in the first place. This was at the dawn of the internet, so you didn't really think about privilege separation on a desktop OS.


> I think not being a real multi-user OS would have lead to trouble sooner or later

iOS is still doing great. :)


Responding before the horde of Jobs fanboys claim the underlying OS didn’t matter. They like to forget that NeXT alone didn’t matter, nor did PowerPC Macs or the initial scroll-wheel iPods, all under Jobs’ stewardship. The company was saved by the following:

- the transition to x86, with Boot Camp as a fallback

- the iPhone, with a decent touch interface, a native (non-Java) API and an open App Store

- the iPad, at the right time

Without this trifecta, Apple would be extinct like SGI, Sun, and the others.

BeOS was what the geeks wanted. It was modern, multicore, C++-based, and, with the backing of a big company, would have seriously dominated the desktop computing landscape. I’d be willing to state that Apple would have had a bigger market share with BeOS than with NeXT.


Apple was robustly profitable by 2003, when I purchased my first Mac running OS X and a third-generation iPod. They turned that corner at the beginning of 1998, in fact [0]. They sold 50 million iPods in 2007.

[0]: https://www.cultofmac.com/461234/today-in-apple-history-appl...

It wouldn't be a trillion-dollar business without the iPhone, sure. I've no reason to think it wouldn't still be in business, and you haven't given any.

It's certainly possible that Apple would have thrived with a next-generation Mac based on BeOS. Personally, I wouldn't have switched; I was tired of dual-booting Yggdrasil Linux and Windows, and the prospect of buying a (relatively) affordable Unix workstation was exciting to someone who cut his teeth on SparcStations.

I'm not the median customer, then or now, however. Back then she was probably in desktop publishing; BeOS would have been just fine.


Apple wasn’t consistently profitable after 1998. For example, they lost $195M in 1Q2001, with revenue down a stunning 57% YoY: https://www.apple.com/newsroom/2001/01/17Apple-Reports-First...

The iPod + iTunes Music Store was the real turnaround. Before that, in 2001, it seriously looked like Apple might not make it, Jobs or not.


Well, most people have chosen to forget that whichever way Apple's NeXT vs BeOS decision went, Apple was close to bankruptcy anyway, and would have gone under had it not been for Microsoft bailing them out with $150 million, due to competition rules. The decision and terms were up to Microsoft. [0]

So Apple was very 'lucky' to get this generous investment; without it, they would have joined the local museums and nostalgia meet-ups, like Amiga, OS/2 and BeOS.

[0] https://www.wired.com/2009/08/dayintech-0806/


It was more than luck; Apple was involved in a long-running lawsuit against Microsoft, one which they might have had to settle, and were running the serious risk of losing.

Microsoft was also doing a decent business selling Office to Mac users, which they would lose if Apple went out of business.

Last but not least, Apple stock was trading for a buck a share in August of 1997. If Microsoft had held on to those shares, that $150 million investment would be worth almost 49 billion dollars today.

Not bad.


Apple wasn't leveraged back then the way we're used to companies being now - say, the way Tesla is. I think this is the disconnect between common assumptions and the way it really was.


This sounds to me like someone who didn't live through the period or wasn't paying attention, because it dismisses/leaves out the iPod and iTunes store.

When the iTunes store opened, that was the exact moment at which Apple was clearly a different company that was going to skyrocket. I did not have or maintain faith in Jobs, unfortunately for me. But that was the moment of "take all your money and buy call options".

As evidence of my opinion, you could look at the discontinuity in the stock price and how it proceeded to break out of the cyclical range it had been in for years and years.


Perhaps better titled "the Mac's long journey", since it's only about one Apple platform and not really about processors. And even then it seems to end around 2006.


Hi, original author here.

Yes, I'm talking about Apple Macs, specifically. I am not aware of any other Apple product line that has transitioned from any one processor architecture to another different one while maintaining any form of software compatibility; are you?

Non-computers don't matter. Nobody cares if a laser printer has the same CPU as the previous generation of laser printer, so long as it still prints and there's a driver. Nobody cares if a new iPod has a different CPU, because iPods didn't run apps.

The iOS line has never made a CPU transition, so there's nothing to write about. It's ARM as it's always been. 32-bit to 64-bit is no big deal; I did a FOSDEM talk on this theme a fortnight ago: https://news.ycombinator.com/item?id=22265615

Only Apple's Mac made a journey like this, so there's nothing else to talk about. The Apple IIGS had a different CPU but was backwards-compatible; no transition involved.

It ends around 2008 because by then Apple was making 64-bit dual-core x86 machines with 64-bit UEFI and they still are. That was their last transition so far. There's nothing more to say.


The 32-bit to 64-bit ARM transition was a huge deal. It was made somewhat seamless to end users because both copies of the entire system stack could be included for a few releases, but it was at least as big a deal as the 32-to-64-bit x86 transition. Moreover, it came with the Secure Enclave, which totally changed the security model for iOS devices.

T2 Macs essentially also moved to the iOS Secure Boot model. There is still a UEFI firmware, but it lives in the coprocessor now, and it doesn't do as much as it used to. Many of the things that used to run in the UEFI firmware now run in the T2.


You could make the argument that they've made at least a partial transition since the 2008 move to 64-bit UEFI, with the whole T2 ARM co-processor thing.


Mac software the way the original author is talking about it is still basically running like 64-bit Intel Mac software, though, right? Catalina is a transition of sorts in that it's dropping all 32-bit compatibility, so that could be a footnote, but I think that's about it -- until-slash-if Macs make the oft-rumored transition to Apple-designed ARM CPUs.


This is a little unfair. Yes, the notionally "normal" state of the system, at least from the view of the nanokernel, is to be running 68K code. However, by the days of 8.5, most of the OS was native, with the remaining legacy being all those obnoxious UPPs. In fact, porting the classic Mac OS to PowerPC by then was actually the low-effort move while Apple management scrounged around to figure out what to do after the demise of Copland.


Hi. Original blog post author here.

Not really, no.

I was there and used and supported these machines throughout.

There was a CDEV available at the time. I can't find it any more. It placed an "indicator light" in your menu bar, which glowed red when the OS was executing 68K code and green for PowerPC code. I installed it on all the machines I could.

It sat there red 99% of the time. Occasionally it flashed green briefly. Only a very few very-CPU-intensive apps made it stay green: applying large Photoshop filters, for example.

Even by MacOS 9.2.2, in my normal usage - browsing the web, doing email and spreadsheets and chat, and writing - it stayed red most of the time. PowerPC-native browsers such as WAMCOM called OS code all the time, and the OS code mostly stayed 68K.

As for it being the low-effort move, I specifically addressed this point.


I don't know what CDEV you're referring to, but what it's observing sounds like a distortion. Every UPP call into the OS looks like 68K code. This enables 68K apps to "just do it." There was always some sort of thunk to handle registers and calling convention, but by 8.5 and certainly by 9 the code on the other side was usually native.

Certainly some calls were still 68K. That's why almost every OS call still needed to go through that song and dance. If all the CDEV did was see the border at the Mixed Mode Manager switch, the machine would indeed appear to be running 68K code most of the time except for those few generally non-UI PPC-specific APIs, even though it isn't (and I challenge you to spend a little time in MacsBug and see that this is true).
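(For anyone who never wrote against these APIs, here's a rough sketch of what the Mixed Mode/UPP dance looked like from native PowerPC C code. The routine names are the real MixedMode.h ones, but the ProcInfo encoding is abbreviated from memory, so treat the details as approximate.)

    /* Rough sketch of a Mixed Mode Manager routine descriptor (UPP).
       Instead of handing the OS a bare function pointer, PowerPC-era code
       wrapped it in a descriptor recording the routine's ISA and parameter
       layout, so the Mixed Mode Manager could build the 68K<->PPC thunk
       whenever a call crossed an ISA boundary. */
    #include <MixedMode.h>   /* classic Mac OS Universal Interfaces */

    static pascal void MyCallback(long refCon)   /* ordinary native code */
    {
        /* ... */
    }

    /* ProcInfo encodes the calling convention and argument sizes so the
       thunk knows how to marshal parameters between the two worlds. */
    enum {
        uppMyCallbackProcInfo = kPascalStackBased
            | STACK_ROUTINE_PARAMETER(1, SIZE_CODE(sizeof(long)))
    };

    static void InstallIt(void)
    {
        UniversalProcPtr upp = NewRoutineDescriptor((ProcPtr) MyCallback,
                                                    uppMyCallbackProcInfo,
                                                    GetCurrentISA());

        /* Any later invocation goes through CallUniversalProc(), and that
           Mixed Mode switch is presumably the "border" the CDEV was seeing. */
        CallUniversalProc(upp, uppMyCallbackProcInfo, 42L);

        DisposeRoutineDescriptor(upp);
    }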


Full story on the transition from 68000 to PowerPC:

https://www.filfre.net/2020/02/the-deal-of-the-century-or-th...


That was one fine article. Thanks for the link.


I know this is about the Mac's transitions, and call me sentimental, but I'm a little sad there was no mention of the 6502 and 65816.


And quite a bit of MacOS was ported back to the IIgs, most notably QuickDraw. The IIgs had ADB before the Mac, and colour QuickDraw years before the Mac had colour graphics.

At the time it looked like the 65816 was deliberately speed-limited so as to not overshadow the Mac. The Mac was 8 MHz, the IIgs was 2.8 MHz, and the 65816 eventually got to 14 MHz+; if Apple had released an 8 MHz IIgs it would have compared very well against the Mac. Woz is even quoted as saying that.

Also, the Mac IIfx had two 10 MHz 6502-based I/O processors to offload work from the main CPU.


> The IIgs had ADB before the Mac

Marginally. The IIgs was released in September 1986; the Macintosh SE and II (which were the first Macs to support ADB) came out in March 1987.

The 65C816 eventually made it up to 14 MHz, but I'm not sure those speed ratings were available in 1986. Even when they were, integrating them into the IIgs took some significant effort by accelerator manufacturers.

> if Apple released an 8MHz IIgs it would have compared very well against the Mac. Woz is even quoted as saying that.

Perhaps against an 8 MHz 68000 -- even then, the 16-bit data bus, 32-bit registers, and hardware multiply/divide on the 68000 would have made it stiff competition for the 65C816, which still used an 8-bit data bus, 16-bit registers, and supported no arithmetic operations more complex than addition, subtraction, and comparison. Against the later 680x0 parts, there'd have been no contest.


Originally the Macintosh was going to be 6809-based, but Steve Jobs wanted more power and went with the 68000 series.

GEOS was available on the Apple //e and Commodore 64, showing that even entry-level 8-bit CPUs can do a GUI.


Hi. Author of the piece here.

The reason I skipped it is addressed in an earlier comment: there was no code transition. The 65C816 ran 6502 code natively. Yes, there was a bit of native code, but if Apple had run the CPU at full speed and devoted real effort to the IIGS, it would have killed the Mac off. It is, sadly, for the best that they didn't.

I speak as someone who never owned or used an Apple ][. I was born in Britain. When the Apple ][ was new, it cost as much as a car in Europe. My first and second computers were Sinclairs, and my 3rd was an Amstrad (who bought Sinclair).

The Apple ][ may have been the first home computer under US$1000 but in the UK that was a very large amount of money. Sinclair made the first home computers for under GBP 100, a far more important cost barrier for us.


Didn't IBM have OS/2 Warp for PowerPC CHRP systems coming out that later turned into vaporware? It was supposed to use x86 emulation to run DOS, 16-bit Windows, and OS/2 programs on the PowerPC Macs.

Don't forget Linux for PowerMacs. Apple once had MKLinux available.


I cut my Linux teeth on PowerMacs. OS X was going to be based on Unix, so I thought I'd get a jump start on this newfangled stuff (I was 11 -- "It's a UNIX system... I know this!"). I remember an issue of MacAddict arrived with a copy of LinuxPPC and an app that, through something that felt like witchcraft at the time, would let you boot into Linux directly from Mac OS.

(Also, some random OS/2-powerpc history- https://www.os2museum.com/wp/os2-history/os2-warp-powerpc-ed... )


> Didn't IBM have OS/2 Warp for PowerPC CHRP systems coming out that later turned into vaporware?

It was developed, and pre-release versions were available to developers, but the whole "Workplace OS" initiative and OS/2-PowerPC was killed when it was obvious it wasn't going to meet with commercial success. Memory is fuzzy as to the specific sequence of events, but it's of a piece with the Pink/Taligent failures as well. I also recall performance not being too great. By the time Warp 4 was released, I think the IBM Personal Systems/PC hardware business had already signed a major OEM deal with Microsoft for Windows 95 and NT.

I know I saw at least one book sitting on a major retail bookstore shelf that was geared toward developers on the OS/2-PowerPC platform (sadly, after the platform was already dead).

Some of us in the early 90s had big hopes for CHRP and PowerPC, outside of the Mac universe.

In addition to the pretty good writeup at the link posted by kylek, some others (including CD images):

https://archive.org/details/os2-powerpc

http://www.edm2.com/index.php/OS/2_for_the_PowerPC:_The_Deve...


Upstream Linux continues to support Power Macs today.


I was thinking today that we need more computer archeology, and then I read this short post on macOS and processor history that I was unaware of. We need more of this!


David Friedman has an interesting book, Legal Systems Very Different From Our Own. It's basically what it sounds like: a survey of legal systems from the distant past and from foreign civilizations that evolved in very separate conditions from what we know.

I think it'd be interesting if a technically minded person wrote an equivalent work, Computer Systems Very Different From Our Own. By necessity most of it would probably be confined to retro-computing. But I think it'd be a pretty worthwhile endeavor.

Not because those computers are likely to be better designed than our current systems. If anything, quite the opposite. Like Friedman's legal systems, most of them probably have glaring pitfalls relative to the modern state of the art. But it's a useful exercise to think about very distant points in the design space, and how and why those decisions were made.

Even if the comparison point is clearly inferior, by virtue of being very exotic it helps us better understand the tradeoffs we make in our everyday incremental designs.


Like ternary architectures? https://en.m.wikipedia.org/wiki/Setun


Article author here.

Thanks! I do quite a bit... https://news.ycombinator.com/item?id=22265615

This talk was 2y earlier: https://liam-on-linux.livejournal.com/56835.html


Thanks for this. It’s a great read. Looking forward to part II on more of OS X’s evolution (64-bit, Intel, DriverKit, etc.).


I hope to see AMD in this list soon. Apple is not using the best desktop and laptop processors nowadays.


The best laptop processor would be an Apple designed ARM chip.


Possibly, but it would mean transitioning to a new instruction set. AMD processors wouldn't require a transition.

I used Macs through the PowerPC to Intel transition and it wasn't a great time. Apple had to do it because PowerPC processors were so incredibly far behind. Right now, Apple isn't behind since all their competitors are also using x86 processors. Maybe they could be ahead, but there isn't the same desperate need.

Maybe Apple should move to their own ARM chips. However, the AMD Ryzen 4000 series processors give them an easy upgrade without requiring a lot of work. Switching to ARM is a big commitment. Third party developers will need to recompile their programs. Apple will need a translation layer for older programs that aren't updated - and users will understand that they'll have older programs they've purchased stop working in a few years. It happened with the PowerPC to Intel transition, the MacOS 9 to MacOS X transition, and recently with the 32-bit to 64-bit transition with Catalina cutting off 32-bit support.

AMD's Ryzen 4000 parts are based on TSMC's 7 nm process, and they're able to stuff 8 cores into a 15-watt part. For comparison, the MacBook Pro 13" ships with 28-watt processors.

Is the Ryzen 4000 the best? I can't say. Is it a lot easier and less of a commitment? Definitely.


I used Macs through the 680x0 (LCII with a 68030/40 accelerator and lots of peripherals) to PPC (6100/60 with the “DOS Compatibility Card”) transition, left the Mac for a while, and came back with the G4 Mini and, less than a year later, the Core Duo Mini.

The first time, the 6100/60 ran emulated code (especially with Speed Doubler) fast enough that there wasn’t a performance penalty. The second transition I had all native code.

But in general the strategy was the same. Wait until the software you need is compatible. During the first transition, my most used software was already shipping as a “fat binary”. I just unplugged my SCSI hard drive from one computer to the other.

The second time, all of the software I cared about was native.

It’s even easier now since the two most popular pieces of commercial software, from Adobe and Microsoft, are both subscription-based. As soon as they port it, you won’t pay extra for it. Much of the other software that nerds use is open source and can be compiled easily for ARM (or is Electron-based).

No one is thinking that Apple will go full ARM. Most are thinking that the high end will be x86.


You are basing that on what metrics? It's all just speculation for now.

AMD 4000 mobile processors, on the other hand, already exist.


We know the performance/power trade offs of Apple’s ARM chips in iPad Pros.


And the odds are good that even that could be scaled up further for a laptop-class part. The iPad Pro is a passively cooled device which is expected to always run on battery power, after all; a laptop would have larger batteries and more thermal margin.

Not that it'd even have to be scaled up by much. The A13 (iPhone 11) already performs comparably to many Intel mobile parts; an -X variant (future iPad Pro) would surely improve upon that.


The scaling is really the question. ARM processors generally have better performance per watt because they're designed to prioritize power efficiency over absolute performance. The Intel parts Apple's ARM processors compare to on performance are the lowest power ones, which isn't exactly the sweet spot for Intel's microarchitecture.

If you gave the ARM designers the (higher) laptop power budget, almost certainly they'd increase performance at the expense of performance per watt. In other words they'd take more of the trade offs that Intel and AMD do. But to justify switching architectures it can't just be the same, it would have to be enough better to be worth the transition.


This has been exactly proven out by Amazon - https://aws.amazon.com/about-aws/whats-new/2019/12/announcin...

If Amazon can make a server-class ARM chip that is competitive with or better than their custom-designed Xeons, I take it as a truism that Apple could do the same for mobile.


There are a lot of reasons to expect that to work better for servers than laptops.

Most server workloads scale with core count, so winning in performance per watt is winning in general because you can scale performance by adding cores. A server with twice as many cores that are each half as fast is just as good, but that doesn't work for a laptop.

A server processor also doesn't have to win at every workload. If it can win at any workload, people can use it for only those, which makes it useful. That also doesn't really work for a laptop.

People run Linux on servers. It's portable. The architectural transition cost there is close to zero.

On top of that, Amazon is fudging the benchmarks in claiming it's faster. They're comparing performance "per-vCPU" which on Intel are hyperthreads rather than full cores. That's fine when Amazon is charging less for the ARM "vCPU" than the Intel hyperthread "vCPU", but you can't use that trick when running single-threaded workloads on a laptop.


And OS X is portable. It’s already been ported from PPC to x86 and parts of it back and forth between ARM and x86.


The operating system itself isn't the problem, it's all the applications. And the PowerPC transition was both easier and more worth it because the x86 processors were so much faster than the PowerPC processors of the time that they were competitive even when emulating PowerPC code, which wouldn't be the case this time.


“All the applications” don’t matter. Once you have the OS, Apple’s first party applications, Microsoft, Adobe, and a few Mac only niche apps, you cover most of your bases.

During the 680x0 to PPC transition, almost all of the applications that needed performance were already ported. The same was true for the x86 transition.

But more than likely, this time it will be the low end that would go ARM to optimize for battery life and the high end would stay x86.

Also the difference this time is that the two largest commercial vendors - Adobe and Microsoft already have ARM versions and Apple has Catalyst to make porting iPad apps to MacOS easier.

Also, during the last two transitions, there was a lot more assembly-code optimization in third-party apps than now. Especially going from 68K to PPC.

And sadly, there are a lot more Electron apps out there that would be easy to port.

Microsoft has struggled to get Windows ARM laptops going because they also thought the performance/power ratio made it worth it.


> “All the applications” don’t matter.

Getting the most popular applications is the easy part, the hard part is the long tail. Only 1% of people may use a particular obscure application, but when there are a thousand such applications, the average person uses ten of them.

You also have to deal with a period of people wanting to run the applications they already own, not the newer version they have to pay to upgrade to or which isn't the same version the rest of their family or company uses on their existing hardware.

There are a lot of people who need nothing more than a web browser, but that's the target market for Chromebooks rather than MacBooks.


That 1% of applications is probably not performance critical and would work fine under emulation. But really, do you know the Mac scene? This isn’t Windows with thousands of line of business apps that have been running without upgrades for a couple of decades. Apple has already killed a lot of old apps when they dropped PPC compatibility and later 32 bit compatibility.

There are two existence proofs that Apple knows how to do this. They’ve done this twice and took customers with them both times. What are the Mac users who don’t like the transition going to do? Buy PCs and lose all compatibility with their apps?

The last thing that’s different this time is that Apple had a lot more to lose in 1994 when the Mac was all of their business and 2006 when the Mac was half of their business (the iPod was the other half). This time the Mac is only 10% of their business.

Also, many apps are going subscription. The upgrade would be “free”.

As far as performance. The most popular Mac has been the MacBook Air. The people buying those aren’t doing performance critical tasks using niche applications that won’t be ported.


> That 1% of applications is probably not performance critical and would work fine under emulation.

Each application having 1% of the users doesn't mean it's 1% when you add them all together. And they're all different applications. Why would you expect these to be less likely to be performance critical than any other?

Even for applications that aren't performance-critical, if the point is to save battery, emulation does the opposite.

> This isn’t Windows with thousands of line of business apps that have been running without upgrades for a couple of decades. Apple has already killed a lot of old apps when they dropped PPC compatibility and later 32 bit compatibility.

MacOS X 10.5 supported 64-bit Intel applications in 2007. That's more than a decade for old applications to accumulate. I doubt even most of the legacy Windows cruft is older than that anymore.

> There are two existence proofs that Apple knows how to do this. They’ve done this twice and took customers with them both times. What are the Mac users who don’t like the transition going to do? Buy PCs and lose all compatibility with their apps?

When they did it before, the new processors were a lot faster. There was a large benefit to offset the transition cost.

Existing Mac users don't immediately have to buy a PC, all they have to do is not buy a new Mac.

> The last thing that’s different this time is that Apple had a lot more to lose in 1994 when the Mac was all of their business and 2006 when the Mac was half of their business (the iPod was the other half). This time the Mac is only 10% of their business.

If the argument is can Apple stop making x86 Macs without going out of business, sure they can. If the argument is that Mac users should want to buy the first ARM model that comes out instead of holding onto their existing MacBook, not so much. It would have to be a real upgrade from what they already have, not just something equivalent but worse because it has to emulate some of their applications.

> As far as performance. The most popular Mac has been the MacBook Air. The people buying those aren’t doing performance critical tasks using niche applications that won’t be ported.

The MacBook Air has a Core i5. It's not exactly slow.

I'm also not sure what you mean by performance critical. Maybe they're not doing 3D rendering where you care quite a lot about the difference between three hours and five, but when people upgrade their hardware they still want it to go from responding in 500ms to 300ms instead of the other way around. And in that sense basically everything is performance critical.


> Each application having 1% of the users doesn't mean it's 1% when you add them all together. And they're all different applications. Why would you expect these to be less likely to be performance critical than any other?

Care to give examples of “performance” critical Mac applications that people are using and are not likely to get ported relatively quickly?

> Even for applications that aren't performance-critical, if the point is to save battery, emulation does the opposite.

What are these specialized applications? Audio, photo and video apps are historically the first to get ported. Adobe and Microsoft have historically taken a year or two.

Active Mac-only developers like BareBones have been the first to port their apps.

> When they did it before, the new processors were a lot faster. There was a large benefit to offset the transition cost.

For reference, 68K applications ran slower on the 6100/60 and 7100/66 than on my accelerated 40 MHz 68030 LCII. Out of the first generation of PPC-based Macs, only the 8100/80 was always faster than my 68030 Mac. My Mac was about half the speed of the high-end 40 MHz 68040 Quadras. In this case, the offsetting benefit is battery life.

And back in the 10.5 days, most apps were still 32 bit and using the Carbon APIs.

> If the argument is that Mac users should want to buy the first ARM model that comes out instead of holding onto their existing MacBook, not so much. It would have to be a real upgrade from what they already have, not just something equivalent but worse because it has to emulate

And this is the same thing that happened the last two times. Users waited until the apps they cared about the most were ported. In the PPC transition especially, it wasn’t until the second-generation high-end 604s that PPC-based Macs could run emulated programs faster. Heck, the second-generation low-end 603-based PPC Macs were slower at emulating 68K software than the 601-based Macs.

> I'm also not sure what you mean by performance critical. Maybe they're not doing 3D rendering where you care quite a lot about the difference between three hours and five, but when people upgrade their hardware they still want it to go from responding in 500ms to 300ms instead of the other way around. And in that sense basically everything is performance critical.

I can’t speak to the architecture of the PPC emulator for the x86-based Macs. But 68K programs that spent a lot of time calling OS routines would get a significant speed boost, because they were calling native code - at the time, emulated programs saw a noticeable speedup just from calling native QuickDraw routines.

The other advantage this time is that Apple is designing its own chips. It can make architectural decisions in the processor design to make emulation better.

Also, people are already using Electron apps; they can’t care too much about performance.

For a contemporary report on the emulation speed of PPC Macs:

https://tidbits.com/1994/02/21/powerpc-reports-positive/

(reports were that the PPC Macs ran 68K emulation at the speed of a low-end 68040 Quadra without an FPU)


The most likely possibility there is that Apple will release the system with some form of dynamic binary translation, much like they did for the PPC/x86 transition. It's also possible that Apple might make some customizations to their ARM core to accelerate the translation process, or even to allow some x86 code to be run directly.


The problem is the use-case. The software Mac laptops run would perform horribly on ARM. A Macbook Pro is not going to have an ARM processor.


Why do you think the software Macs run is so much different from the software that iPad Pros run? They run some of the same frameworks. Since when you run iOS apps in the simulator for Macs you’re actually running x86 builds of iOS apps linked against an x86 build of the iOS frameworks, it’s not hard to compare performance.


It already does, in the T2. It seems to me like future T* chips will run more and more of the system, leaving the x86 to become something like the MBP’s discrete GPU option. And then, like discrete GPUs, eventually an x86 CPU is only available in top-end models.


Would these really be an upgrade? I thought we were hitting thermal and power limits these days? (Genuinely asking btw, have just bought a tricked-out MBP 16)


It depends on what you’re optimizing for. Having a laptop that has the battery life of an iPad would definitely be an upgrade.


Mac on ARM is in the works, apparently, and is probably significantly more power-efficient than AMD.


The T2 chip is probably what these rumors refer to. I imagine there's also ongoing engineering research to incrementally beef up the role of the T2 chip as a system controller and possibly also perform more system and background tasks in its ARM core.

That said, I've no doubt Apple is looking closely at an ARM future for the Mac. I'm hoping that Apple does this, but as a price & efficiency play, not to phase out x86. There's no reason the Mac couldn't be indefinitely dual-platform. We know macOS already has strong multi-binary support, so there's no technical impediment to that.

It could start with them releasing a MacBook Air with ARM as an option, ostensibly targeted towards developers and early adopters. It would need to have longer battery life and superior CPU + GPU performance at the same price point. From there they could progressively introduce ARM across their consumer range, guided by the performance numbers. And as software support broadens, they could begin progressively discontinuing x86 at the bottom of the product range.

Part of me wants to say that Apple shouldn't ever discontinue x86 support entirely, but if they're ever able to make chips that soundly trash anything from Intel and AMD, I doubt anyone would be too sad if they did.


> It could start with them releasing a MacBook Air with ARM as an option

I don't see this ever happening, it'll be all or nothing. Apple just doesn't differentiate its products on technical differences, it categorises them purely on what you can use them to do, as a branding/marketing strategy.

Despite developers and other people interested in the tech being a significant market for them, it's definitely not significant enough for them to change this.

Having a MacBook that can't do things other MacBooks can, with the tradeoff of different performance characteristics... I just can't imagine it's something Apple would ever do.


I strongly disagree and for exactly the same reason you gave: because there wouldn't be any need to differentiate on technical differences. Once all developers have cross-compiled their apps, the only difference between an ARM and x86 MacBook would end up being the ability to run Windows natively.

And yet, it's also ridiculously wrong to claim that Apple doesn't "differentiate on technical differences". While that's certainly true for their non-Mac products, Mac products have mostly stuck to the same formula used by Dell and HP.

Sure, Apple do place more emphasis on whole-product benefits and less emphasis on technical minutiae, but even that's splitting hairs. For example while they don't mention Intel's precise CPU product numbers in literature, they still differentiate SKUs based on tech specs like Intel Turbo Boost speeds. And when you buy an iMac, they'll still let you pick between the Radeon Pro 570X, 575X, 580X, or BTO the "Radeon Pro Vega 48 with 8GB of HBM2 memory".

I think they can handle a few products having an "Apple CPU".


> Once all developers have cross-compiled their apps

This is what I'm referring to specifically. If this were to happen, there would be a transition period where an obscure technical difference would mean they can do certain things on one mac model, and not on another.

And this won't even be 'the entire new generation of MacBooks can run this, the older ones can't', but 'this particular new model of MacBook Air can't run this, all the rest can, including all the other new ones'. Apple will never allow that to happen; it's completely antithetical to their product positioning strategy of simple computers that just work.

I concede that in the Mac(book) line they do go into detail on some technical specs, but these are pretty much sliding-scale numbers: bigger is better and more powerful and more expensive. It's very clear even to casual consumers (perhaps with a little explanation) why a bigger frequency number, more storage, or more memory is better. I.e. it's the same thing, just more powerful.

That's a very different thing than trying to explain to a consumer that the chip architecture is different, and why that means certain software doesn't work now, but will once the developer does this and that, etc etc. Again, Apple will simply not put themselves in that position.

To be honest I kind of hope I'm wrong - I'm interested in seeing what a macbook with Apple's own chips in it would look like, and if that's their plan then for me the sooner it happens the better. I just don't see it happening in any other way than all at once, though.


We already have the evidence of how well it can work with the Intel transition. If anything, adding ARM to the Mac platform would be dramatically easier than the PPC-Intel transition because the ground work is already done:

— Unlike in 2006, a greater proportion of current Mac apps are developed with architecture independence, whether intentionally or not.

— Unlike in 2006, nearly all Mac developers are already using Apple's Xcode as their IDE.

— Unlike in 2006, many Mac app developers will already be familiar with the ARM CPUs, because they'll have already worked with them on the iPhone and iPad.

— Unlike in 2006, Apple has an additional motivating force called the Mac App Store. Apple would almost certainly require all new/updated apps in their store to be cross-compiled.

So no, you're wrong.

We've seen this before. We know how quickly it played out in 2006, despite that being objectively more difficult. This would be easy. All it would take is a 6 month transition period where ARM Macs are only marketed towards developers (and informed early adopters) combined with an x86 binary emulation layer to cater for straggling vertical/corporate apps.


Yeah, those are some good points. You've brought me around to thinking this is more likely now!

I'm still just leaning towards them introducing these chips as an all in one product category, though - perhaps under a different brand of macbook model that is very clearly distinct from all the rest (looks different, etc) and a software strategy to match like a suite of ARM compatible software out of the gate that make it a useful standalone device whilst the ecosystem catches up. But yeah you're right, the transition looks like it'll be smoother than I'd assumed and it maybe wouldn't be a blocking obstacle from their point of view. Let's see!


If there's going to be a new product category, I imagine it would be an iPad Pro that runs iPadOS but can also run unmodified ARM-compiled Mac apps when attached to a keyboard and trackpad. (And the "files" app would look more like Finder.)


Windows can run natively on ARM.


You’re right, Windows can—but whatever app is important enough to justify the effort of running it within a virtual machine probably can’t.

(And you’re assuming that Microsoft will sell you a license for Windows on ARM in a VM, or that anyone will build that virtual machine software to support it.)


Some new AMD Ryzen processor identifiers seem to be present in the latest macOS beta... Really curious.


As Apple's rumored to be eyeing ARM for Macs, how impractical is it to make a more power efficient x86 CPU (like an Atom 2), or is x86 the 737 of CPU architectures?


I simply can’t scroll on this website

Safari iOS


I read it in reader mode, but I am on an iPad, which perhaps has a different Safari from the iPhone's.


What was the reasoning for transitioning from Pascal to C?


Pure Pascal is not really suitable as an operating system implementation language. The obvious design choices are to:

1. Transition to C completely.

2. Extend Pascal (like HP did - HP called their resulting language MODCAL).

3. Support in-line C code in the Pascal compiler.

4. Support in-line assembly in the Pascal compiler.

For superior library support and other reasons, Apple made the right choice here.


Pascal as used on the Apple II, III, Lisa, Mac, and IIgs had the minimal extensions needed to be a systems language and was pretty much isomorphic to C.

The “switch” from Pascal to C in the Mac market was less a “switch” and more a change in preference by developers—you could use either language and line by line your code would be equivalent. And the switch at Apple was likely just following the market.

The APIs themselves used a calling convention that wasn't tied to either compiler: Toolbox traps took their arguments on the stack in Pascal order, the low-level OS traps took theirs in registers, and calls were dispatched through unimplemented opcodes in the 0xA000–0xAFFF range rather than by JSR to a function pointer. So neither Pascal nor C really had an “advantage.”
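(A rough illustration, from memory rather than quoted from Inside Macintosh, of how the same Toolbox routine looked from each language; MoveTo is a real QuickDraw call, but treat the exact header text as approximate.)

    /* The C view, roughly what QuickDraw.h declared. Classic Mac C
       compilers (MPW, THINK C) provided a `pascal` keyword so the C
       declaration matched the Toolbox's stack-based parameter convention. */
    pascal void MoveTo(short h, short v);

    /* The Pascal view, roughly what the interface files declared:
           PROCEDURE MoveTo(h, v: INTEGER);                               */

    /* Either way, the compiled call pushes the arguments and executes an
       A-line trap (an unimplemented opcode in the $A000-$AFFF range); the
       trap dispatcher then jumps to the routine in ROM or the System file,
       so no language-specific linker glue is involved. */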


Hi. Article author here.

Pascal isn't ideal, but it has derivatives which are.

Pascal became Modula-2, in which several large-scale OS implementation efforts were made, including Acorn ARX.

Modula-2 became Oberon, which is an OS: https://news.ycombinator.com/item?id=15753138

There is also Euclid, which was used to implement a Unix. Yes, a Unix in Pascal. https://en.wikipedia.org/wiki/TUNIS



