Everyone here is very positive that the transition will go smoothly based on their last few transitions being fairly successful.
I'm a bit more skeptical.
Back then, their desktop/laptop business line was much more important to them. They needed it to go well. They were also a niche player and couldn't afford to lose more market share to PCs/Windows.
But now, they barely care about their desktop/laptop revenue (being only 10% of their revenue) and they don't seem to care much about their marketshare either because they don't seem to be doing much to expand it. They care a lot more about their iDevices now.
Also, if the transition from 32 to 64 bit is any indication, they don't care nearly as much about backwards compatibility as they used to. When they did the hardware transitions, they had a compatibility layer (Rosetta) that was supported for 5 years after the transition. But with the 32-bit change, they announced in 2017 that it would no longer be supported, and dropped support in 2019, just 2 years later.
I can no longer use a lot of my favorite apps because they were games written years ago with no current support, so they were not updated. Or they're things like the old Microsoft Office, which works perfectly well for me, but I can no longer run. And Apple provided no emulation environment or official way to do so.
So now my choice is to keep an old computer around just to run old apps (which could be a security issue if it's on the network), or buy something like Parallels, which is a ton of overhead for a single app at a time.
It would have been nice if they had provided a Rosetta for the 32/64 conversion. The fact that they didn't does not bode well for this transition.
> Also, if the transition from 32 to 64 bit is any indication, they don't care nearly as much about backwards compatibility as they used to. When they did the hardware transitions, they had a compatibility layer (Rosetta) that was supported for 5 years after the transition. But with the 32-bit change, they announced in 2017 that it would no longer be supported, and dropped support in 2019, just 2 years later.
Apple hasn’t sold a 32-bit-only Mac since 2007. To Adobe’s chagrin, they didn’t port Carbon to 64-bit with the rest of Mac OS X in 2007, and they deprecated it in 2012. Were developers really caught off guard that they shouldn’t be writing 32-bit software in 2017?
>> Were developers really caught off guard that they shouldn’t be writing 32 bit software in 2017?
Probably not, but end users shouldn't have to suffer because they bought software from bad developers. One reason why IE still ships with Windows is because a lot of companies are still using old enterprise software that never got updated for the modern web.
Have you read Raymond Chen's MS blog about all of the hacks that are in Windows just to keep backwards compatibility with one bad app? Maintaining backwards compatibility forever increases the security vulnerability footprint, the testing surface for regressions, code size, etc. Have you seen how badly Windows runs in low-resource environments like the low-end Surface line?
IE and the collapse of MS's browser market share is not exactly a shining example of how to successfully handle a product.
You’re sort of picking the worst example; the alternative view is that Windows has maintained a gargantuan desktop OS market share for decades in large part because of their dedication to application compatibility.
Is that true? While I get that this isn't scientific proof, there seem to be a lot of Youtubers out there making a decent living off videos of building and reviewing PCs and Macs.
Linus Tech Tips has >11M subscribers, Dave2D has 2.7M subscribers, etc.
> Making money off of YT videos is peanuts in the grand scheme of things.
You said "No one cares about PCs as a business anymore."
The popularity of YT videos on new laptops, PCs, etc. seems to suggest that a non-zero number of people (i.e., greater than "no one") still care about PCs.
If you want to be pedantic: citing a few YouTube channels as evidence of a thriving market is about like pointing to a record store selling vinyl as evidence of a resurgence.
From a business standpoint, IBM abandoned it years ago, and HP has gone back and forth for years over whether it wants to keep its PC unit.
Almost all of Apple’s former competitors are either dead or worth nothing. From a business standpoint, Apple’s strategy seems to have been the right one. Apple probably makes more on computer sales than anyone else, and those sales only make up 10% of Apple’s revenue.
So HN posters can complain about Apple’s strategy until the cows come home, but the market has decided it was the right one.
I didn't say there were not any tradeoffs made to support those users with legacy systems.
What do you define as a low end Surface line on x86? I have a Surface Pro 1 that I still use to this day. I also have a GPD Win handheld that's running an Atom processor that is surprisingly good for what it is. I also have some older Atom based stick PCs that are pretty bad (only 2GB RAM), but those are designed for signage and kiosks, for which they are... adequate.
I see you are ignoring the low cost Surface computers that are meant to compete on price with the $329 10.2 inch iPad or even the $499 iPad Air.
The iPads can actually run well with 2GB of RAM. Microsoft Office is a much better experience on an iPad without a keyboard than any hardware that Apple sells.
The Atom-based Surfaces with the Atom x7-8700 - that's the same processor I have in my GPD-Win. It's more than usable for light users. Heck, its built-in GPU outperforms the one in my Surface Pro.
While x86 Surface (not-pro) might have been "meant" to compete on price with iPads, I don't think that's how it worked out. I tend to think most of the people who will buy a low end x86 Surface want/need Windows in the first place. It's not a substitute for an iPad.
That's an apples and oranges test, and the performance difference doesn't matter. They're for different target markets.
I don't think anyone in their right mind considers a Surface as an alternative to an iPad. Just because they have similar form factors doesn't mean they're substitutes for one another.
The things I run on my atom based handheld are only available on Windows.
So the target market is people who want a really slow more expensive tablet with a horrible interface for touch?
Microsoft’s own products run better on the low end iPad than the low end Surface. On top of that, the interface is better for touch.
And Microsoft sells fewer Surfaces than Apple sells Macs, let alone iPads. This isn’t really saying much for the claim that started this thread, that keeping Windows backwards compatible forever was a great strategy.
This is ridiculous. Mac revenue was roughly $25 BILLION in 2019. The idea that they don't care about it is simply wrong. Same with the idea that they don't care about marketshare.
Of course they care more about iDevices than the Mac. It brings in roughly 5x as much money as the Mac. But Apple can chew gum and walk at the same time.
This was my argument for why Apple shouldn't have killed Aperture. Sure, it was a tiny percentage of Apple's business, but it was beloved by customers and independently profitable—more profitable than plenty of entire businesses.
Apple is the best (large org) at marshaling resources, focus, priorities, pushing the envelope. It's been fantastic.
But every decision has consequences. They abandon beloved products. They deprioritize code quality.
I wish they'd find a better balance, or clever new org strategies. Spinoff abandoned products. A separate cleanup crew to address technical debt, fix bugs. (I'm sure there's plenty of other devs, like me, who revel in house cleaning.)
Man, Aperture was really good, and I loved the fact that if you bought it from the Mac App store, you got to install it on up to 5 computers, which was a bigger deal back then than it is now.
25 billion is still only 10% of their revenue. And they won't lose it all by making it a shitty transition. Heck, they won't lose most of it. But if they lose 20% of that revenue because of a bad transition, that's still only 2% of the total company revenue. Why would they spend money on protecting that 2%?
Because the Mac and iDevices revenue streams are not as independent as you imply.
Driving the Mac off a cliff would alienate developers and have serious repercussions on the iOS ecosystem as well.
Why do you think Apple still builds Pro apps like Logic and FCP ? They are almost certainly losing money given their ridiculously low price point, but they help with brand image and keeping the Mac the go-to platform for creatives. There’s indirect value in that.
I mentioned this above—I feel like Apple's pro apps are highly analogous to what happened to the Mac. That doesn't portend a bright future.
Apple killed Shake, which was an industry standard at one point. They killed Aperture, which lots of photographers loved. They replaced Final Cut Pro 7 with the much more limited Final Cut Pro X, and ceded lots of ground to Adobe in the process.
There's also OS X Server, although there's a strong argument to be made it wouldn't have survived in the modern world anyway, and Apple just saw the writing on the wall.
Logic does seem to have done consistently fine, at least, so that's something.
> Apple killed Shake, which was an industry standard at one point.
Sure, and this might be skewed by hindsight because they did make new releases for a handful of years, but it was an acqui-hire to develop Aperture and other color tools for Final Cut Pro. Small shops used After Effects on Windows, medium shops used Shake on Windows, and large shops used Shake on Linux. Because of hardware costs, you'll be very hard-pressed to get medium or large shops to use macOS.
I also don't see Apple investing in or competing with the big transitions that happened in CG over the following 5 years: 64-bit, 4K/8K, and stereo. Apple had special deals where they offered Shake's source code when it was discontinued, and a few studios clung on and tried valiantly to accommodate it [1].
Logic pissed off a lot of composers with the lack of 32-bit AUs in newer versions. The few producers I know who love Logic aren't upgrading to Catalina and are already considering switching to Windows.
> Driving the Mac off a cliff would alienate developers and have serious repercussions on the iOS ecosystem as well.
For iOS developers, I am not so sure. If Apple came out with Xcode for Windows at WWDC, I'm pretty sure a lot of iOS developers would be overjoyed. For one thing, a build farm becomes much easier to run, among other advantages.
Xcode for Windows would quickly end up like all software Apple makes for Windows: buggy, slow, lagging, and unloved. Not that it is much better as it is on Mac...
I think a large part of why their Windows software was crappy was their need to make it look like the Macintosh version. I get the feeling that if they put Xcode on Windows, they will not feel the need to make it anything but a Windows app.
I would also imagine they would remove anything having to do with the Macintosh, except perhaps the Catalyst compile targets, which should decrease some of the neglect problems that cause bugs.
> Driving the Mac off a cliff would alienate developers
How many, though, and would they really switch to Windows or Linux, or retire from the industry?
Apple has always — or at least since 1984 — had an incredibly loyal fanbase. I think that the sort of slights which would make me leave a platform just roll off their collective backs.
Since they released their 2016 lineup, there has been a shift from Macs to Linux and Windows (just look at how many "I left macOS for {X}" posts have been published here on HN).
Windows is also making a very compelling argument with WSL2
A tiny vocal minority doesn’t mean much, if anything, in the overall scheme of things. The tiny vocal minorities of various tech/niche/geek sites rarely means anything.
You can’t extrapolate from that. No less when it’s not even correct for general trends most of the time.
Companies of all sizes (even FAAMNG) will act to protect that kind of existing revenue.
And my point was that you can do all of the things you listed for less than $500M; probably for less than $250M. Developers are more expensive than they used to be but $100M+ still goes a pretty long way.
Actions speak louder than words. They only acted when the sales numbers or data showed a problem, and by then it was already too late. That is not caring, by old Apple's standards.
Apple sells roughly 18-20M Macs per year. According to their own words, more than 50% of these buyers are completely new to the Mac platform (and most of them are from China). You would have expected 9-10M new active users every year if no one was leaving the platform; instead, from 2014 to 2018, over those 5 years they managed to add only 20M. (Genuine question: has anyone noticed there are fewer Mac users in the US?) There are also 20M registered developers, who add to the Mac's bottom line. The annual active-user reporting stopped in 2019, and they even stopped mentioning the Mac's satisfaction rate. The Mac Pro was done not because they cared, but because a few closely connected real professional customers threatened to leave the platform before they took action. (And many left before they noticed.)
It is like a marriage where your partner keeps telling others in public how much he/she loves you, but shows no care or action the rest of the time. For years he/she doesn't even talk to you or notice you. You try talking to him/her about your problems, but he/she completely ignores you. One day you publicly file for divorce, and he/she rushes out to tell others how much he/she cares and is working on it. That is not caring; it is called saving face.
>But Apple can chew gum and walk at the same time.
No.
Look at MS and the difference in quality.
They are both samey in the $$$ department, but MS's much broader focus and enormous dev base make things not so nice for its products.
Apple's reduced scope is what lets them maintain quality. They do not have that much manpower (they may be one of the few trillion-dollar companies, but their headcount doesn't look like it), and spreading their focus would harm quality. Their iDevices make 5x as much money; that's where their eyes are, and that focus can't be diluted for an "inferior product", money-wise.
You may have missed GP's point. Microsoft casts a very wide net, running on a huge array of hardware and trying to do everything from mobile to console gaming to datacenter servers. The dev experience is equally diverse: you don't even need a Windows box to develop for Windows.
Apple jettisoned all of that diversity to refine a few precious use cases to earn that "quality" aura.
I think this move to ARM is signaling a couple more use cases being tossed to the curb, like beefy workstations and PC gaming.
MS was on mobile before mobile was mobile (pun intended), and part of the issue was probably that they got in the game so early that, by the time tech was actually mature, they were out of gas. Ballmer thought he'd seen it all, and he had not; when he realized, they panicked and reacted like headless chickens. It's a cautionary tale for the ages, but it could have easily gone the other way around.
On the other hand, they entered the game-console game very late but still had a pretty good run. They entered the cloud game similarly late but they're now killing it. They won the Browser War I and would have likely done it again in BWII, were they not forced to behave by authorities.
So it's not like they cannot execute, or like they are fundamentally impeded by their fundamental structure. They can smash it with the best of them, when the lead is focused.
I agree with the thrust of your post but I want to nitpick one thing:
> They won the Browser War I and would have likely done it again in BWII, were they not forced to behave by authorities.
Rephrased: Microsoft won Browser War I by using illegal tactics, and lost Browser War II because authorities made them stop using illegal tactics. IE/Edge was always a bad product, and its success was due to external factors.
That said, Microsoft has lots of resources and I don't doubt they could have made a good browser. By extension, I don't see Chromium Edge as a loss for Microsoft so much as an informed decision to throw in the towel. How does Microsoft benefit from spending millions of dollars on a custom rendering engine, when Google gives Chromium away for free? Microsoft can fully rebrand it, and put in their own ads, tracking, and default search engines. From my vantage point, Microsoft just saved an enormous amount of money and lost very little.
There was nothing in the justice department ruling that forced them to stand still when it came to browser development. Apple moved faster with Safari/WebKit after starting from KHTML, and Apple wasn’t exactly a large company when they introduced Safari.
They also lost billions of dollars on the console market. Is it even as profitable as the AirPods now?
As far as getting in mobile too early. Failing at the Newton didn’t exactly hurt Apple getting back into mobile. You do remember that Apple was almost bankrupt in the 90s?
Apple was able to port OS X to the iPhone partially because it didn’t have all of the bloat of Windows.
I developed on WinCE devices using both .Net CF and C++/MFC. I am very much aware of the limitations.
> There was nothing in the justice department ruling
Between that and the EU ruling that forced them to implement the somewhat-silly browser-choice system, it was a "hostile environment" where the political message was fairly clear. Combined with a situation where IE was effectively dominant, it sapped any appetite to take it forward or take risks with it.
> Is [the console market] even as profitable as the AirPods now?
That is not the point, the point is that they pivoted just fine when they wanted to. They even had something genuinely novel with the gesture-tracking system. And how could they not? Windows and Office are a license to print money, even today.
> Failing at the Newton didn’t exactly hurt Apple
That was 1 device. Microsoft and their partners tried tons of different iterations of formats, some that didn't even exist at the time (kindles, book-like, palm-like - I had an HP that, without a stylus, looks now like a thick iPhone, but was produced in 2003)... but screen tech and cellular bandwidth simply were not good enough at the time.
I'm not talking of exhaustion because of financial losses (Microsoft was printing money, they could have done the same 10 times over), but of a shared weariness for the effort. At one point they basically gave up. I believe the famous Ballmer snark at the iPhone announcement was sincere, in the typically-disdainful way of his: Ballmer thought he had seen it all and this was just one more, nothing to write home about. They had failed so many times in the space that they had lost faith in the possibility of success.
> Between that and the EU ruling that forced them to implement the somewhat-silly browser-choice system, it was a "hostile environment" where the political message was fairly clear. Combined with a situation where IE was effectively dominant, it sapped any appetite to take it forward or take risks with it.
There was no browser choice in the US and I believe it lost its lead here first.
> When they did the hardware transitions, they had a compatibility layer (Rosetta) that was supported for 5 years after the transition. But with the 32-bit change, they announced in 2017 that it would no longer be supported, and dropped support in 2019, just 2 years later.
You are comparing the time a technology was supported with the time it was announced a technology would no longer be supported. 32-bit apps were supported in 64-bit Mac OS systems for at least 7 years.
From what I could find, the first announcement of dropping support for 32 bit apps was in 2017, and the support dropped in 2019. Why would an app maker update their old apps before an announcement was made that they would be deprecated?
> Why would an app maker update their old apps before an announcement was made that they would be deprecated?
Almost immediately after 64-bit support was added in 2005, it was the recommended path, not only for new apps but for existing ones to migrate to. For example, a document called "64-bit Transition Guide for Cocoa" [1] was first published in 2007. A few years later, another document [2] was more explicit:
> Should You Recompile Your Software as a 64-Bit Executable?
> As a general rule, in OS X v10.7 and later, the answer is probably yes.
OS X 10.7 was released in 2011.
Certainly, many – most – app makers did update their old apps. For example, Apple's refusal to support Carbon on 64-bit famously forced Adobe to scramble to update their apps to run on Cocoa, because they considered 64-bit support to be essential. Admittedly, one of the chief benefits of 64-bit was the ability to address more than 4GB of memory, and pro apps needed that more than your average app. But your average app was also easier to recompile, so in practice, pro apps were if anything the slowest to migrate. Even then, most of them did, long before the official deprecation.
By the time the deprecation was announced, let alone by the time Catalina was released, most apps that were still 32-bit were the ones that had ceased to receive updates altogether. From that perspective, the exact timing of the deprecation doesn't matter that much: most unmaintained apps aren't going to become maintained again because of an announcement by Apple. It was going to hurt no matter what.
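(For the curious, you can tell which camp a given binary falls into from the first four bytes of the file. A minimal sketch; the magic constants come from Apple's `<mach-o/loader.h>` and `<mach-o/fat.h>`, while the function name is my own:)

```python
import struct

# Mach-O / fat header magic numbers, from <mach-o/loader.h> and <mach-o/fat.h>
MH_MAGIC    = 0xfeedface  # 32-bit Mach-O
MH_CIGAM    = 0xcefaedfe  # 32-bit, byte-swapped
MH_MAGIC_64 = 0xfeedfacf  # 64-bit Mach-O
MH_CIGAM_64 = 0xcffaedfe  # 64-bit, byte-swapped
FAT_MAGIC   = 0xcafebabe  # "fat" (multi-architecture) binary
FAT_CIGAM   = 0xbebafeca  # fat, byte-swapped

def macho_kind(header_bytes):
    """Classify a binary by the first four bytes of its header."""
    if len(header_bytes) < 4:
        return "unknown"
    (magic,) = struct.unpack("<I", header_bytes[:4])
    if magic in (MH_MAGIC, MH_CIGAM):
        return "32-bit"
    if magic in (MH_MAGIC_64, MH_CIGAM_64):
        return "64-bit"
    if magic in (FAT_MAGIC, FAT_CIGAM):
        return "fat"
    return "unknown"
```

Checking both the native and byte-swapped constants means the answer doesn't depend on the endianness of the machine doing the checking (fat headers are stored big-endian on disk, for instance).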
So then why were there so many 32 bit apps? 2005 is, what, a year before the Intel transition? Surely not all of those apps were written during that one year.
Not to mention, for better or worse, games in particular were 32bit through 2018 at least. I imagine this must have had something to do with cross platform compatibility, although I can't imagine what.
> Not to mention, for better or worse, games in particular were 32bit through 2018 at least. I imagine this must have had something to do with cross platform compatibility, although I can't imagine what.
Well, here are the Steam Hardware Survey results from some random date in 2014:
If you click on "Windows Version", you can see that a substantial percentage (~20%) were running 32-bit versions of Windows, which can't run 64-bit applications. Most of those were likely on 64-bit processors, but had 32-bit Windows installed anyway for whatever reason; IIRC that might include driver support and not needing as much RAM. So game developers were forced to ship 32-bit binaries if they wanted to target those users. (They could ship both 32-bit and 64-bit versions, but that's relatively difficult on Windows; there's no equivalent of Mac "fat binaries". And there wasn't much point.)
That's Windows, not macOS. But then as now, game developers targeting PC primarily cared about Windows, with macOS as an afterthought at best. If the Windows version was built as 32-bit, the macOS version likely would be too for simplicity's sake.
There's also game consoles to consider. PlayStation and Xbox both switched to 64-bit in 2013, so most games released from then on would have to at least have 64-bit-clean codebases (even if they weren't tested specifically on 64-bit PCs). But earlier games might well not be 64-bit clean; the kind of low-level hacks that make porting to 64-bit difficult are probably more common in game codebases than your average app.
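(To make those "low-level hacks" concrete: the classic 64-bit-cleanliness bug is C code that round-trips a pointer through a plain int, which was harmless on many 32-bit systems where the sizes matched. A quick sketch using Python's ctypes just to inspect the platform's C type sizes; the variable names are mine:)

```python
import ctypes

# On an LP64 platform (64-bit macOS/Linux), C's int stays 4 bytes while
# pointers are 8, so old code that stashed a pointer in an int silently
# drops the top half of the address.
sizes = {
    "int": ctypes.sizeof(ctypes.c_int),
    "void*": ctypes.sizeof(ctypes.c_void_p),
}

# A pointer value that needs more than 32 bits...
addr = 0x1234_5678_9ABC
# ...loses its high bits when forced through a 32-bit int,
# which is what `(int)ptr` would keep on LP64:
truncated = addr & 0xFFFF_FFFF
```

Code like that compiles without complaint and even appears to work until an allocation lands above the 4GB line, which is part of why game codebases full of such tricks were slow to become 64-bit clean.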
While you are talking about games, it's relevant to remember that Unity had different licenses for their 32- and 64-bit engines until just a few years ago. I think the free license was 32-bit only, which locked a lot of people into it.
Were there? On iOS, I lost access to a lot of games due to the 32 to 64 transition. On Mac OS, nothing I used was impacted. If your app wasn't 64 bit, it was abandonware. Even the abandonware that I actually use, like Gitbox, was already 64 bit.
Yes, lots! I won't go through all the individual 32-bit apps I use because it's idiosyncratic, but this has been a major complaint about Catalina, aside from the OS's general bugginess.
And that's why in 2006 they introduced Macbook Pros which were 32-bit only?
If you want to switch to 64-bit, you don't introduce machines that cannot do that, that need their 32-bit ABI and have to be supported for years to follow.
Apple didn’t really have a choice, and the choice they did make committed them to a fairly decent amount of legacy support for the next decade or so; I doubt they really wanted that commitment if they didn’t have to make it.
1. Wait a few months; the Core 2 Duo was on its way. The previous NetBurst generation did support 64-bit, so obviously Core SKUs with 64-bit support were on the way. This way, they could have skipped the 32-bit x86 ABI entirely.
It's not like Apple doesn't have long periods without updates of specific models.
2. If they really had to release a model that introduces 32-bit ABI, they could support it properly.
So by doing it the way they did it, I have to agree with Jasper_. The only difference with his conclusion is, that in my opinion they just took the option 2.
Easy with 20/20 hindsight, but back then Apple didn't know it would be a few months. Had Intel run into unforeseen trouble with the Core 2 and decided to delay the release, Apple would not have been releasing any hardware.
Also, availability of enough processors and ease of porting existing 32-bit x86 Windows software could have been factors.
But they did take that option and now refuse to pay the price. They are forcing paying the price on their users now.
They got an additional benefit at the time, if I remember correctly: the Adobe suite was 32-bit + Carbon, so having a 32-bit x86 ABI allowed them and Adobe to have the suite running natively on the new CPUs, buying Adobe time to port to 64-bit and Cocoa, and letting Apple axe Carbon without waiting to axe Intel entirely.
Given the benefits they got, they should be called out for not wanting to pay the price.
32 bit Carbon was deprecated in 2012. Did it really take a rocket scientist to know that you shouldn’t be writing 32 bit software in 2017? 10 years after the last 32 bit Mac shipped?
So enlighten us: what were the reasons these devs and businesses chose to only support 32-bit? Other than that they found it cheaper to do nothing while Apple and the users pick up the tab, of course.
>> their last few transitions being fairly successful.
Their last transition is specifically what brought me to the Mac after having been away for a decade.
I switched to Mac exactly because they were going x86, because I knew I'd be able to virtualize Windows (although it took a year to happen), which was a win-win for me. I'm guessing a lot of people who used Windows at work plus developers on Linux wanting a nice GUI (remember, we're talking mid to late 2000s) were responsible for a lot of Mac growth back then.
I have to wonder how many people will switch back if x86 were to get totally abandoned.
I will likely switch back. I'm not an avid gamer but there are a couple of titles I really care about - and I'm pretty sure at least one of them will never make whichever transition Apple will impose, having only barely survived the 32bit massacre.
The last few years have really felt like enduring Apple more than enjoying it. It's a shame, because OSX remains great, but there is only so much it can compensate for.
> Back then Mac software support wasn't nearly as good as it is now
It might be a chicken and egg thing though. With the influx of people who wanted Macs on x86 and the halo effect of them spreading the word, the app support got much better.
Also, because the machines were x86 compatible, I'm sure a lot of enterprise buyers like IBM started to allow their users to get Macs instead of Windows laptops.
I obviously have no idea what will happen, but as someone who switched back to Windows a few years ago, I am definitely curious to see how it plays out.
The early 2011 15" MBP fiasco. Long story short, the logic boards had a design defect that caused them to fail from overheating. By the time Apple decided to address the issue and do a recall, Applecare had lapsed for many owners, who had already disposed of their computers.
I had my logic board replaced under AppleCare and it had the same issues within a few months.
I managed to keep my bricked MBP long enough for the recall, but the board (which I guess had the same defect) still eventually died (same for a friend of mine with the same model).
That laptop wasn't cheap, and is the only laptop I've ever had that died. I've had a lot of Mac and Windows laptops since the mid-90s (first was a Powerbook with a trackball).
So needless to say, while Apple has the rep of having the best customer service, there are exceptions, and I was unlucky enough to fall under one of those exceptions, and I will probably never go back again.
Having said that, when I switched back to Windows, it wasn't nearly as bad as I had remembered it, and has gotten better since. I only miss a couple of things from Mac, and the biggest one is iMovie.
Including 64-bit support until Catalina was effectively the same as Rosetta: support for deprecated tech. A Rosetta for 64-bit is just kicking the can.
The difference, to me, is that Rosetta was an inherently bad experience for users. The apps ran slower and used more battery, because despite Apple's best efforts, it was still an emulator, and emulators are slow.
The amd64 architecture, by contrast, innately supports i386 code—that's a key feature of the platform! Removing support from macOS does not benefit users; it only benefits Apple by lowering Apple's maintenance burden.
Given that Apple is one of the richest companies in the world, and they sell Mac hardware at a large markup, I'm not particularly sympathetic to their plight.
> But now, they barely care about their desktop/laptop revenue (being only 10% of their revenue) and they don't seem to care much about their marketshare either because they don't seem to be doing much to expand it. They care a lot more about their iDevices now.
The selling point of iDevices is that so much content/code works perfectly on them, and that is because it was originally created on the OSX platform. If Apple thinks they can maintain that experience for customers while developers flock to WSL or Linux, they are kidding themselves.
32/64 isn’t a good analogy, because the only reason Apple supported 32-bit for so long was that Intel missed their deadline for the Core 2 Duo, which caused the first-generation Core Duos to be 32-bit only.
There shouldn't be too much of an Osborne effect with ARM Macs. If anything, the sensible approach is the opposite: get in there quickly and get yourself an x86 Mac while you still can.
ARM Macs make sense long-term, but the experience is likely to be pretty rough at first while software support catches up.
As a very, very long time Mac user, the transition from 68k to PPC, then Mac OS 9 to OS X, then PPC to x86 was fairly smooth for the most part. Apple's iron grip on their ecosystem, thanks to their vertical integration, gives them a lot of latitude to make moves like this minimally disruptive to their user base. Now, this is of course often annoying and disruptive to its developer base, and I've been on that side of it too. But Apple knows how to pull this off about as cleanly as one could expect for such a significant overhaul.
Also, they are not afraid of ditching old stuff when they need to move forward, e.g. older-generation hardware (Core 2 Duo MacBooks) and frameworks (Carbon, the 32-bit kernel).
Apple has never pulled off a transition like this before. None of your examples involved going from a popular architecture with broad software support to a less-popular architecture.
Transitioning to x86 brought additional customers who prefer Mac but also need to run Windows or Linux or who do x86 development. Transitioning away from it may lose them.
> None of your examples involved going from a popular architecture with broad software support to a less-popular architecture.
Not sure how unique this architecture will be, but by device count, ARM architectures dwarf all others significantly. Calling it 'less-popular' is wrong.
Less popular by count of non-mobile software. Which is likely a lot of business software. Still, Mac software seems to generally keep up to date with Apple's changes. Drivers on the other hand... not so much
68k to PPC was exactly "going from a popular architecture with broad software support to a less-popular architecture". ARM today is waaaaaaaaaaaay more popular than PowerPC ever was, and nearly infinitely more popular than when Apple switched from 68k, and 68k was very popular back then.
Commodore (known for the Amiga line) did go bankrupt due to the death of 68k. And Atari exited the home computer biz. It was a huge issue back in the day.
> Apple has never pulled off a transition like this before. None of your examples involved going from a popular architecture with broad software support to a less-popular architecture.
That's exactly what the 68K-to-PowerPC transition was: Going from a widely used architecture (Mac, Amiga, ST, and others) to a brand new one.
I think 68K and PowerPC were both clearly much more of a dead end when they were abandoned than Intel is today.
Also, I don't recall stories of Macs being able to dual boot AmigaOS, and there was nothing like virtualization in those days. There were emulators, I guess. All of those other platforms were less common than Windows is today.
And there was never any threat that they would enforce sandboxing and code signing like a lot of people speculate for ARM Macs.
It really does sound like the ARM Macs are going to be less capable. It would be nice if that turns out to be incorrect, but it sure looks that way.
> Also, I don't recall stories of Macs being able to dual boot AmigaOS
They weren't. Amiga 68K systems had significantly different system architecture from Apple 68K systems, most notably in terms of their graphics and audio hardware.
> and there was nothing like virtualization in those days.
Weeelll... there were some special-purpose tools, but they were much more special-purpose than anything we call "virtualization" today. The Amiga version of Basilisk II [1] ran 68K code directly on the host processor, for instance. Later on, there were tools like Mac-On-Linux [2] which would run PowerPC MacOS on a PowerPC Linux system.
At the time of the announcement, there were G5s in the towers and G4s in the laptops. One of the big problems was how to put a G5 into a laptop: it was too power-hungry and produced too much heat, and Motorola and IBM were not going to do anything about it. Moto went out of the PPC business, and IBM was focusing on servers.
We're talking about Apple here, the company that pulled off two seamless architecture transitions for its desktop platform -- something I believe no other desktop vendor even attempted.
ARM Macs will do just fine. Backwards compatibility will be close to 100%. x86-64 apps may run a bit slow, but native software will benefit from running on the most powerful laptop money can buy, period.
I don't think the concern is about performance but compatibility.
Even if all major developers (Autodesk, BlackMagic, Ableton, Steinberg, etc) are able to switch to ARM there is a myriad of plugin developers out there who might not be able to do it.
Catalina and the 32bit EOL hit the audio world very hard.
They'll probably run the entire gamut of mobile apps, with desktop apps more of an afterthought. I have a feeling this is more about… 'Mac Pro-ing the iPhone'?
Seems like it's about encouraging the development of way more ambitious ARM-based stuff that will tend to eclipse the x86 and 'desktop' things, on the basis that Apple is far bigger in the mobile space than they are in desktops. Maybe there's the prospect of super-powerful computers in a sort of cluster form… a dense block of iPhones, rather than a massively multi-core PC? That's sheerly hypothetical, but given the characteristics of ARM stuff it seems reasonable.
I don’t think the switch is about Apple providing better performance, thermal efficiency, or anything like that. I think it’s simply cost control. Consumer Intel chips haven’t improved much in a long time, but they still command an enormous profit margin. And since Apple controls the compiler toolchain along with the OS and much of the other hardware, it can make the switch without the consumers it cares about the most noticing.
Apple have often suffered when it comes to external dependencies, whether that was IBM failing to keep PowerPC competitive, Microsoft Office lagging on the Mac, or now Intel struggling. The rumours are that several Mac releases have been delayed because of supply problems; the newly released 13" MacBook is shipping with previous-revision chips in the low-end models.
When you take this into account it's easy to see why dumping Intel is attractive to them.
Security/boot might also be more under Apple's control. There's more unknowable attack surface with someone else's management unit sitting in the boot path right next to your T2.
I'm pretty convinced Apple's transition strategy for existing software will be "Recompile, retest, republish":
No built-in emulation or other support for x64 binaries.
In the past transitions they had to have a strong compatibility story. But I think that's not nearly as important anymore.
For software that is being actively developed, releasing fat ARM/x64 (or maybe fat bitcode/x64) binaries can be worked into a release over the upcoming months.
Software that is not being actively released... was pretty much kicked to the side of the road by the completion of the transition to x64 (if not sooner -- death by a thousand small transition cuts).
There will certainly be casualties to this approach, but I'm guessing Apple will prefer to reap the benefits of a much simpler and straightforward transition. They may also see it as no great loss. E.g., software that is no longer being recompiled is also no longer receiving security updates.
If this happens, the big kicker will be dependencies. Your hyper-maintained, always-updated app will probably contain at least a few dependencies that stopped moving a long time ago, and you'll be forced to find alternatives.
Also: x64 emulation (slow though it will be) will happen in any case, but from third parties: VMware Fusion, etc. I think there will be a market for some kind of x64 container that "shrink-wraps" an x64 app or library with an emulation core, as an alternative when recompiling can't happen. There's a lot of existing tech to build this on top of, so I think it will come pretty quickly, and in different forms for different migration scenarios. Because of the performance/resource penalty, you'll want to wrap as little as possible.
edit: I meant to add: It will be funny to come back to this after Apple's ARM announcements and see how bad (or good?) my speculations are...
> But I think that's not nearly as important anymore
I used Macs during Classic to OSX and PPC to Intel transitions but I'd argue it's actually way more important this time.
Back during previous transitions fully native apps were really the only option but now...
Almost all modern, actively developed software on Macs is just Electron wrappers. Only a few exceptions of actively developed Cocoa apps come to mind: Sketch, Omni Group, etc.
The other half of the Mac ecosystem exists outside of Electron apps and Cocoa good citizens: old but essential power-user applications (the Adobe suite, 3D software, science software, and audio software) which all have extensive custom Intel code, and all the plugin infrastructure runs on Intel.
To bring After Effects, for example, to ARM is going to be an insane amount of work, and a lot of it is going to sit outside of Adobe's control, because so much of the application relies on plugins, which all have to be recompiled too.
I feel Apple might have lost the plot here a bit and convinced themselves that this functionality vacuum is going to be filled with Catalyst iPad ports. The reality is that if the computer can't run the apps people require for their jobs, they'll just buy Windows machines.
I think also code using x86_64 vectorization (AVX, SSE) is going to be non-trivial to port. Same applies to JITs, which I guess are a lot less mature on ARM due to lack of users at the minute.
32 bit is going back a lot longer, although my machine which I haven't upgraded to Catalina still has a few 32bit applications kicking around on it including one that I was using in my professional work only 6 months ago.
Plenty of 64 bit software exists that isn't being updated often or at all anymore.
I'd agree 100% but the state of how janky and alien their own Catalyst ports feel don't give me much hope.
If they want this platform to remain the world class computing experience they actually have to invest in it fully not just hope that some work can be skimmed off the top of their iPad investment.
Where is the example of a best in class 2020 Mac desktop app? Because it certainly isn't News.app.
With the App Store, would fat binaries even be needed for the most part? Just upload all the compiled versions to the store and it'll distribute the right one to the right devices.
For non-store software, I don't see a big deal in distributing x86 vs ARM builds separately, but that might be the developer in me speaking.
I buy iMacs for my personal use, as they spend the majority of the time running Windows 10 for gaming. Sure I could build/buy a dedicated gaming machine, but I like the hardware (Great monitor, quiet, sleek, mostly trouble-free) and my iMac can play any game at 1080p. Without bootcamp support, I’m going to abandon the Mac hardware for personal, and then for business use. There’s no way an ARM processor is going to be able to emulate x86 to run windows and my steam library at anything close to usable performance.
My prediction of Apple’s moves here are far less dire. These are just predictions mind you, but the point is Apple could do this without ruining the platform for enthusiasts.
1. They’re going to have some kind of breakthrough with x86 emulation where the Apple ARM CPU has a translator, or an accelerator, or a co-processor, or something that will make x86 run at decent speeds.
2. Apple will move to a long term two-CPU strategy with ARM on economy and low power SKUs, and Intel chips for the “Pro” SKUs.
3. As an alternative to 1 and 2, the direction might just be a killer next-gen iPad Pro and new iPadOS which substantially raises the bar for multitasking, trackpad, etc on that platform. I wouldn’t even dismiss the possibility of an “iPadBook” for want of a better name.
Every iOS app is currently tested in x86 on the Mac simulator and then ARM on the device.
If you ship a Catalyst app that runs on iPad and Mac you are shipping and testing on x86 and ARM.
Apple already splits binaries in the app store and only downloads what is needed for your device (including resources for different device resolutions), so the end user would likely not see an increase in app size.
I see no issue with developers shipping universal binaries indefinitely; for most devs it’s just part of the Xcode build process. I suppose it would double the compile time for release builds.
Testing is a legitimate concern but I don’t think it’s as consequential as you imply. (For most developers.)
> 2. Apple will move to a long term two-CPU strategy with ARM on economy and low power SKUs, and Intel chips for the “Pro” SKUs.
That makes sense to me: a two-CPU strategy (ARM for laptops and x86 for desktops where power usage is less important) would lessen the Osborne Effect. Mac developers would have a strong motivation to support both laptops and desktops (and thus ARM and x86 macOS) so consumers would not have to fear buying a soon-to-be-obsoleted computer.
Shipping bitcode instead of ARM/x86 universal binaries would also mean that eventually Apple could drop x86 and those bitcode applications would be forward compatible with ARM desktops.
In my imaginary universe, it’d be ARM for the MacBook Air and a “Mac nano”, shortly followed by an economy-spec iMac. If the performance is there, it could eventually extend to the entire iMac line, with a broadened iMac Pro line staying Intel.
It would actually be quite tidy if “Pro” Mac meant Intel, non-pro meant Apple ARM.
What would keep Apple from shipping machines with both ARM and Intel CPUs in them? The ARM CPU would run the OS and decide when to ship jobs over to the Intel CPU. I can’t imagine that the home-grown ARM CPU would add much to the total price of the computer.
Recall that at one point Apple shipped laptops with two GPUs; one for low power work that was more power-efficient, and a more powerful one that was turned on as needed.
They already ship ARM co-processors in Macs: the T2 chip.
An alternative possibility to my earlier musings is that the T2 chip gets expanded to take over more system functionality—perhaps to the point where the Intel chip looks more like an accelerator add-on card than the primary CPU.
While they might like to go that way, I would say the subpar UX of iPadOS with keyboards/pointing devices suggests that they will need to pour a lot of money into the effort.
I look at two things Apple did recently: giving iOS on iPad its own name, and releasing the Magic Keyboard. To me, both these things look like moves in preparation for Apple seriously scaling up iPad as a computing platform.
I lived through both major Mac CPU transitions (68K-PPC and PPC->X86) and in both cases the assumption was that it was impossible for Apple to ship a usable emulator. In both cases Apple proved everyone wrong and in fact the emulator was a major contributing factor to a smooth transition. It simply takes a lot of time for all 3rd party software to be released for the new architecture.
Given how they handled it in the past my guess is that Apple will release a very usable X64 emulator for the ARM and really brag about it at WWDC.
A theoretical possibility: Apple could make Bootcamp support Windows on ARM. I don't know that they'll do it, but they could and it would be interesting.
The choice will also say something about how Apple views these ARM Macs: Are they iPhone-esque devices made exclusively for Apple software, or are they (relatively) open-ended computing platforms capable of booting alternate operating systems?
The problem is that Apple has moved to a home-grown stack for its graphics, deprecated OpenGL, and moved wholly to Metal.
There is no way Apple would create a Windows driver for that custom silicon: it has always relied on AMD or Nvidia for those drivers in the past, it has no in-house codebase or expertise to write the requisite code for Windows, and none of the silicon has been optimized for it.
This is an interesting point—but, are you expecting Apple to move entirely to in-house GPUs? On the laptops that currently have integrated graphics, sure, but I'd expect the higher-end Macbook Pros and iMacs to still have AMD cards inside. Perhaps that just becomes a prerequisite for Bootcamp—there's probably a lot of overlap between the types of people who want Bootcamp and the types of people who buy Macs with dedicated GPUs.
(I do agree that they're more likely to just drop Bootcamp support.)
Potentially keeping AMD as a partner for graphics on the high end is a good point, i.e. iMac Pro/Mac Pro.
But since it's 99% likely that all the other Macs will be SoCs with no Windows driver support for any of the underlying silicon, I'd find it very surprising if Apple went to the trouble just for the high end.
That said, I think Intel Mac Pros will be a thing for a long while, even if a 256-core ARM variation came out, just to keep the pros happy until they can comfortably transition to the new stack. Apple has spent too much human capital recently on the pro team to make me think otherwise.
I don't think there is a standard BIOS for ARM like there is for x86, so multi-boot would probably require Microsoft to support Macs explicitly in Windows on ARM.
Just observing the pain of getting Linux to run on new Macs, the answer should be obvious. Not that I think this is a bad thing; there definitely is space for an operator like Apple, and it's exactly what many people need.
Yeah, this has got me debating whether I need to bite the bullet and buy a high-spec 2019 model, despite their lingering, myriad problems. I have too much stuff to worry about in my day-to-day usage to gamble on whether Apple is going to stick the landing with this transition.
I've been pretty unenthusiastic about the idea of replacing my Macbook Pro after they removed the MagSafe connector and HDMI ports. However, if the ARM laptops run cool enough to actually use on your lap, it might be worth it.
Knowing today's Apple, though, they'll make it super thin and it will still run hot.
Neither my 16” MBP nor my iPad Pro is too hot for lap use. Yes, the 5+ year old laptops you are clinging to were saddled with some occasionally hot Intel CPUs, but they’ve come a long way.
After the new machines are released, all Mac app store apps will just magically work because they were recompiled behind the scenes by Apple to run on ARM. Only independent releases (stuff not in the app store) will have to deal with the shit show, which will also push more developers to the Apple tax.
For regular Mac users (who are already using the app store for everything), there will be no difference, and nobody will care what's actually under the hood. All of their software will work just fine whether it's x64 or ARM. There's nothing to wait for because these ARM machines aren't going to be so much better that people delay their purchase. It would be like holding off on the 2.2L engine variant of a car because the new model with the 2.4L engine with 5% better acceleration is around the corner.
Of course the reality will be that some apps will break, but the majority will just work, and Apple will be working behind the scenes with the biggest fish to make sure their stuff transitions smoothly.
> After the new machines are released, all Mac app store apps will just magically work because they were recompiled behind the scenes by Apple to run on ARM.
That would be really magical; this is essentially impossible.
It's not though; when submitting a native app to the app store, at least for iOS you're sending them an intermediate version; a version that can be compiled and repackaged by Apple for a specific CPU, e.g. 32 or 64 bits. They will also grab the assets intended for that platform, e.g. regular images for small screens, 2x images for retina and 3x images for iPad screens.
Bitcode is not a magic solution, though. You can’t take a 32-bit app, for example, and run it on a 64-bit device. That kind of portability isn’t something that Bitcode can give you, notably because that is something that’s visible in C. As you’re writing C code, you can write #ifdef pointer size equals 32, and that’s something that Bitcode can’t abstract over. It’s useful for very specific, low-level kinds of enhancements, but it isn’t a panacea that makes everything magically portable.
Yes, Bitcode is not a requirement for the Mac App Store. Furthermore, Steven’s blog post is misleading: it implies that Apple can just seamlessly translate IR into a different architecture, and shows it kind of working for the simple example he has there. The problem is that this falls apart for anything that is more than mildly complicated (which, of course, is not mentioned in the blog post) because of ABI differences. Apple did this once before with Apple Watch, and there they had full control of both the chipset and the ABI on both sides of the transition, plus they had the foresight to mandate Bitcode submissions for the platform. And watchOS itself precludes doing certain fancy things anyways. So this isn’t really a proven way forward in this case.
> app store apps will just magically work because they were recompiled behind the scenes
That will be hard to do, as Apple doesn't have the source code. They could translate the apps from one machine code to another, but this is unlikely. It will be the app developers themselves who will need to recompile. But I'm sure it will be a (relatively) easy process.
Given a large proportion of premium mac sales are to people in the creative industries using specialist software distributed outside the app store, I don't think your intuition is correct on this one.
What percentage of the mac user population are they? 5%? 10%?
80% of mac users are like my mother-in-law: The Mac is a machine that does their taxes, emails, word processing, small business stuff, browsing, movie watching, and sometimes video chat if they're not doing that on their phone.
And if a company that depends on the creative industry is selling its software outside of Apple channels, I'm sure they'll have the foresight and budget to make sure their software works on the new machines.
I imagine that Apple has a pretty smart team that has thought of their market share prospects, using internal data that no one else has access to. The chances of an armchair analyst out-thinking them is pretty slim.
Does the underlying CPU platform matter anymore for most people? If they have some non-standard proprietary apps then for sure that's a big issue. But if most OS apps are recompiled, then you have iCloud apps too, the bigger hitters being onboard is key (Microsoft, Adobe). If you pretty much hit those key points then it begins to look like it'll be relatively straightforward. ARM Macs will increase Apple's profit per unit shipped as they get the CPU for free. No longer beholden to Intel's erratic road map and failure to deliver.
It's unlikely Adobe will be on board. Porting their apps to ARM, even with a virtualization layer, is going to be hell. They dragged their feet last time around, I bet they will do that 1000x now.
MS might decide that "the web is good enough" and just drop native Mac apps for the time being.
Considering that Adobe has been actively porting creative cloud programs to iPad, I'd say the opposite.
They've already been optimizing the existing code base for ARM, maybe with a heads up from Apple that this would be a good preparation for changes coming down the pipe on Mac...
Adobe is on record that those are ports: the core of the app is ported, but individual features don’t all make it over, or don’t make sense on a touch device, so you see a feature disparity.
>> Does the underlying CPU platform matter anymore for most people?
Developers. They want to run the same code on their machines as on their servers. Or in some cases a library might not be available (or behave differently) on one CPU family vs another.
In theory this could accelerate adoption of ARM servers.
I would be surprised if apple transitions all of their macs to ARM at once. Consumer-level macs (macbook air, mac mini, iMac) would go ARM now, while keeping the macbook pros, imac pro, and mac pro x86 for at least a few years. This could happen tacitly with the latter computers being sold in their current form (maybe with minor spec bumps). This allows Apple to make the transition without abandoning their creative pro users, and the makers of pro apps will be incentivized over time to port their software to access the growing ARM-only mac user base. That, or apple has already been working with (and probably paying) Adobe, Avid, Autodesk and other makers of marquee pro apps to port their code to ARM so that Apple can claim that their ARM macs will be pro ready from day one.
It seems like the du jour solution to the Osborne Effect is to create the consumer expectation of significant price increases. People still buy last-gen hardware nowadays well after the new hotness is out, because the new generation often costs 50%-100% more for a year or two.
If Osborne had said "The next versions will be even better, and only cost $5000 more!" people would have kept buying the Osborne 1.
The first generation PPC Macs were not equal to the 68K Macs they replaced. It took another generation of PPC hardware and optimisations in the OS and applications before the performance envelope expanded.
I really doubt there would be a major backlash if a company like Apple said "Due to <nonsense manufacturing related excuse>, we are able to offer the new iPhone for $XXX dollars less than we expected!"
Consumers rarely complain about lower prices than they expect.
Hell, tons of companies offer their products at an MSRP that drops massively within weeks.
I’m just real glad that my personal upgrade cycle is at the point where I won’t be looking for a new computer until at least the second year of ARM Macs, when most show-stopping bugs should be ironed out of the OS and the Adobe tools I spend most of my work time in.
I was an early adopter of PPC and OSX, I’m not gonna sit there trying to get work done via the old-cpu emulation again.
On a desktop machine, there’s no benefit in lower power dissipation and slimmer bodies
If you’re in a hot climate, then every little bit helps. Lower your power bill, due both to lower consumption on the part of your computer and to the AC coming on that much less. Run the computer’s fans less. Maybe even have a Mini that does a surprising amount of what you’d buy a Pro for. Make the iMac that much more of a flat screen.
> On a desktop machine, there’s no benefit in lower power dissipation and slimmer bodies
The obvious win for me is shipping multiple times as many cores in the same TDP. If Apple used a chiplet strategy and good interconnect to make scaling easy - imagine a 16-core Mac Mini, 64-core iMac Pro, and something obscene (256 cores?) in a Mac Pro.
Exactly. If Intel's 3ghz cpu eats 262W and emits 895 BTU/h, and Apple's passively cooled ARM laptop chip eats 130W and 450 BTU/h (completely made up numbers), how fast would Apple's ARM chip run with active cooling and 250W?
Yup, and if they reduce power consumption (and heat generation) per core, they can cram more in the same device without having to upgrade the cooling or power supply, and they'll have to throttle less if temperature limits are reached. It's always worthwhile to get more performance-per-watt.
I read that paragraph and had a completely orthogonal thought. Intel mobile CPUs compare favorably to all established competition (AMD) and the iPad with Magic Keyboard is almost a laptop anyway. Why start with a mobile device?
Intel desktop CPUs are where they have problems and you would assume that this is where Apple has the most frustration with Intel. That's where I'd start.
Also, keeping ARM and Intel around simultaneously for a few years (both as first-class citizens) forces developers to maintain portable code. If you are using Swift in the manner that they want you to use it, this is already the case. If you're using some other language/approach, then you have to either think about adopting Swift or put in more work.
The thing is, the majority of Macs sold are laptops. It doesn’t make sense for them as a business to build a strategy around what’s best for desktop machines.
Yes, if their plan was to keep both platforms forever it wouldn't make much sense. But if they have a 2-3 year transition plan, it might make more sense to start with something for which low sales is still a success.
same, I wrote that comment sitting outside a cafe in New Orleans. 80ºF, plus humidity. I can only assume that Gassée has not spent much time in the tropics, or that if he has, he is more than wealthy enough to not worry about the cost of running the AC constantly. :)
>I was an early adopter of PPC and OSX, I’m not gonna sit there trying to get work done via the old-cpu emulation again.
The lowest spec of the very first PowerPC Macs (the 6100) was faster than any 68K Mac, including emulated software.
The first Intel Mac Mini was faster than the G5 Mac Pro.
EDIT: Sorry, I misremembered; 68K emulation did impose a performance penalty versus contemporary 68K Macs. PowerPC-native performance was definitely faster than any 68K Mac, and developers were very good about releasing fat binaries.
I’m glad you did the edit. I was about to tell you a story about comparing my Mac LCII with an upgraded 68030-40 and my PPC 6100/60 running emulated software.
Yeah, you were generally standing still at best. The nice thing, though, is that as your apps (and OS) updated, you were continually getting 'free' speed boosts.
I went backwards theoretically. The 6100/60 emulated a 68K at about the speed of 68030-25. But by the time I upgraded in early 1995, the software I cared about was native.
I had been using SoftPC to run compilers for school. I bought the DOS Compatibility Card for my 6100 and that was a dream machine.
The author mentions Nokia as an example, I didn't follow the transition closely. Was everyone really waiting for the Windows Nokia phones or did Nokia just lose mindshare (as I remember it)?
Any other good examples for Osborne Effect? Wikipedia does list SEGA's transition from Mega Drive/Genesis to 32X and Saturn and I can see how this confused potential customers but I guess there must be more (better?) examples.
Symbian was the cash cow, successful in a number of markets. But not up to par with where Android and iOS were leading the industry. The prevailing wisdom seemed to be it needed some modernization or replacement.
Maemo/Meego was their previous replacement candidate, based on Linux and Qt. Personally I liked this one. They never really allowed it to be on a path to cannibalize Symbian. It was always like an experimental project.
Switching to Windows Phone killed both of those.
Less talked about is s40, the low end feature phone-ish platform that continued to sell well in emerging markets long into the Windows Phone era.
Sweetheart deals, with ex-Microsoft executives leading the company. In my opinion, this is a cautionary tale about career executives.
There was, however, an anecdote bandied about the Helsinki Nokia R&D offices: "We have, easily, the most expensive clock application ever made; it's been rewritten so many times". So maybe the inefficiency was always going to kill it.
That said, Nokia was betting on Meego replacing Symbian, though Symbian was so crazily optimised for low-end hardware.
The N9 clock app was a work of art though, super easy to use. I'm still sad that Meego didn't take off. It was the most intuitive, polished mobile OS I've ever used.
That announcement killed the little developer love that they still had.
Symbian was transitioning to better Java support, moving away from Symbian C++ via Qt and PIPS (POSIX support for Symbian), a third reboot of the Symbian IDE (the second Eclipse-based attempt), and gaining Python and web-based widgets as well.
Then came the Windows Phone announcement, telling Symbian developers that all the tooling improvements from the last couple of years would be thrown out the window.
That was the final straw for many of those developers, who decided to abandon Nokia phones as a platform and focus on Android/iOS instead.
Also, as a former employee: that deal was quite a surprise to us. Nokia used to have a strong anti-MS culture in the engineering units.
1. Apple designed chips are already faster per-core than a top-end MacBook Pro.
2. There's no reason dedicated ASIC modules for specialized tasks (like video rendering) can't be connected by Thunderbolt 3 just as external video/compute modules are now.
3. There's no reason a dedicated x86-64 dongle can't be sold for ARM-powered Macs that need to run software that hasn't been ported yet. It would be a spectacular coup if those x86-64 dongles had chips made by AMD but I digress...
Faster per watt, I agree. Per clock, they could be somewhat similar, with caveats. Per core, the clock speeds that can be achieved with Intel are much higher (even if it throttles). I think both are good enough for 90% of the population.
I don't see anyone building an external compute/accelerator platform around an ARM Mac ecosystem. It's a niche market as is, and the direction has always been to integrate functions like this onto the chip/package directly. I think the latter is certainly the opportunity Apple will choose, as they will be able to claim their new processors have a special sauce that Intel does not have, and charge an Apple premium for it.
Are you suggesting plugging in a dongle that has a CPU on it? I'm very confused if so. As the bandwidth between RAM and CPU is ~50GB/s, any external dongle would not be able to communicate with the system RAM without a huge slowdown, so the dongle would need its own RAM too. How this would communicate with a GPU is questionable again, and this is already a weird, crazy-expensive dongle.
I think this is the part that will make third-party developers give up on the Mac. Mac ports get the least love as is, mostly because Apple will break your application on such a regular basis that it's not worth the cost to port/support it. But this change adds so much more burden for the developer that I think most will wait a few years before making a commitment, unless Apple throws money at them.
Takes me back to the Apple IIe that I used to develop 6502 controllers with in assembler. It had a Z80 add on that ran CP/M and Wordstar that I used for writing the documentation.
> 3. There's no reason a dedicated x86-64 dongle can't be sold for ARM-powered Macs that need to run software that hasn't been ported yet. It would be a spectacular coup if those x86-64 dongles had chips made by AMD but I digress...
Fun suggestion, so I'll bite :-) would a USB-C dongle actually be a speed improvement compared to host dynamic binary translation? Especially since all RAM operations would need to be funneled over I/O...
1. Please don't extrapolate from a useless program like Geekbench to a general concept such as "faster"
2-3. TB3 is a single PCIe x4 link, which is not even enough for a single state-of-the-art SSD. It's certainly not fast enough to connect an entire other computer to, except for certain embarrassing workloads.
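A back-of-the-envelope sketch of why the link is the bottleneck. All figures below are rough, assumed round numbers (TB3 line rate, its PCIe payload cap, a fast NVMe SSD, and the dual-channel RAM bandwidth cited upthread), not measurements:

```python
# Rough, assumed figures for a bandwidth sanity check (not measurements).
TB3_LINE_RATE_GBITS = 40.0     # Thunderbolt 3 signalling rate, gigabits/s
TB3_PCIE_CAP_GBITS = 32.0      # PCIe tunnelling over TB3 is capped below line rate

tb3_usable_gb_per_s = TB3_PCIE_CAP_GBITS / 8   # ~4 GB/s of PCIe payload
fast_nvme_gb_per_s = 5.0       # sequential reads of a fast PCIe 4.0 NVMe SSD
local_ram_gb_per_s = 50.0      # dual-channel DDR4, the ballpark cited above

# A single fast SSD alone can saturate the link...
assert fast_nvme_gb_per_s > tb3_usable_gb_per_s
# ...and an external CPU would see host RAM roughly an order of magnitude
# slower than a local CPU does.
print(local_ram_gb_per_s / tb3_usable_gb_per_s)  # -> 12.5
```

Even if any individual figure here is off by a factor of two, the gap is wide enough that the conclusion holds: anything with its own memory traffic wants to live on the package, not at the end of a TB3 cable.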
When it's a laptop, I'm quite the fan of having as much as possible built into the machine as a single unit. That's the point of the form factor, after all.
I think the last thing the MacBooks need are more dongles.
Is that actually the point of the form factor? Seems like laptops (designed to be easily portable) should be as small and portable as possible with optional add-ons for specific use cases. Desktops can have as much as they want because they’re not designed to be moved very often.
There is likely to be yet more convergence towards iOS constraints. This seems inevitable, especially if Catalyst is central to how Apple carry out any ARM transition. It's very clear that Apple aren't catering to developers in macOS to the extent that drove OS X growth from 2004.
For a time, many developers couldn't find fault with OS X and Mac hardware. I wonder how much more correction we'll see from Apple, now that they've gone back on keyboards to a degree. I'm not counting on too much; I know a lot of people aren't as happy as they could be.
I really hope Catalyst is a dead end and only exists because they somehow started working on it before realizing SwiftUI was the way forward. I've yet to see a Catalyst app that does not feel like a quickly hacked beta.
Once you have a SwiftUI app you don't need Catalyst to make it work on macOS. You can tweak the SwiftUI code specifically for the Mac and the final user experience will be better. Catalyst only exists because UIKit does not work on macOS.
> This seems inevitable, especially if Catalyst is central to how Apple carry out any ARM transition.
That seems really unlikely to me. Catalyst is primarily a tool for porting iOS applications to macOS. Apple already has plenty of ARM-based devices that run iOS applications; there's no reason for them to muddy the waters by introducing another one and calling it a macOS system.
By my count, every new built-in app which Apple has introduced in the past two years has been a Catalyst app. It's possible I'm forgetting one, but the trend is pretty clear.
Stock apps set the example for how third party apps should use your platform. IMO, Apple is setting a pretty clear example.
Wow, those aren't Catalyst?! They certainly fooled me!
Mind, I'm not sure that's any better: Apple is still setting a pretty clear example for future Mac apps. Mac-specific UIs are out, iOS-esque paradigms are in.
I think this enchanted land ("many developers couldn't find fault with OS X and Mac hardware") isn't grounded in reality. I only started using Macs around 1999 (though I had an Apple IIc prior).
Apple hardware has always had bugs, flaws, and failures. I had a Power Mac G5 that leaked (it was liquid cooled). I had the first Intel Mac Pro, and the GPU died several times (and was like $700?). Luckily it was covered under warranty.
Laptop batteries bulged, screens delaminated, cases displayed wear because wrists get sweaty. iTunes on Windows sucked. And obviously keyboards butterflied.
Software has been up and down, sometimes secure, sometimes not. Sometimes fast, sometimes buggy.
But the reason the Mac came back was because despite these perennial flaws and issues, it was WORLDs better than the alternative.
Indeed. Been using Macs since 2006 but switched back to Windows late last year after being fed up with the declining quality of Mac software, and the lackluster hardware options.
Speaking to friends suggests that I'm far from the only one.
> For a time, many developers couldn't find fault with OS X and Mac hardware
Totally.
I bought my first Mac in 2007 and from that moment until 2013 I only had small complaints about the Mac platform.
Then my MBP 2011 suddenly died and by the time Apple started a repair program it had been collecting dust for almost 2 years. Then Yosemite came and it was the absolute worst version of macOS I've ever used.
These days I'm happy with my iMac 5K but I'm still using High Sierra. I will probably upgrade to Mojave at some point but that's it. I will _not_ upgrade to Catalina. Thank god I don't need Xcode.
I do worry; some years ago they added features to macOS that made it look like they were headed toward a touch-screen interface and convergence with their mobile products. Launchpad, mainly, which I hate and never use, and I don't think many people do. I mean, yeah, using Finder to list applications beyond what's in the Dock is crude at best, but it works fine for me. I don't use Launchpad and have the Dock hidden most of the time; Spotlight works for 99% of my use cases.
But that's just one piece of anecdata. I prefer my macOS to remain roughly how it is now.
It's a matter of how CPU performance has developed. Apple's ARM architectures only reached peak Intel/AMD performance levels around the Apple A12, so they haven't been at this level for that long. In addition, Intel has not been good at iterating on its hardware since Skylake in 2015 and doesn't look to be making great strides in the short term.
It's like the PowerPC->Intel story all over again. Intel was not at PowerPC levels that long before 2005 (PowerPC had substantial advantages from 1994 to 2002), and IBM was not showing evidence of improving the situation beyond 2005.
I think before 2018 it was unclear whether an ARM Mac would be superior in terms of performance/heat/power, but given Intel's lull and Apple's continued improvements, the best decision is now clear [if you think it is worth the transition at all, which is the most debatable item].
- They have only just finished removing 32-bit x86 app support in Catalina. Now there is space to support another instruction set, as supporting three at the same time would be more expensive
5. Development cycles are long, they were operating under the assumption Intel would deliver efficient 10nm parts in volume, and instead Intel is still on 14nm++++++++++++.
I feel like popular is an understatement—they were everywhere. They basically were the MP3 player market, and that was in turn how just about everyone listened to music. I can't say that today about the iPhone.
To me, moving the Macs to ARM or perhaps just the laptop models to ARM could be a win for Apple. Thinner, lighter, lower-power and better software integration with the iPhone/iPad could result in happier consumers and more reluctance of users to jump to Windows based machines.
The risk is that there will be enough difference between the ARM machines and the previous ones that consumers will be reluctant to buy the new machines. I know that the keyboard problems on recent laptop models caused me to buy some Windows laptops for the rest of my family.
There is a way for Apple to facilitate this move to ARM: make the ARM machines noticeably less expensive than Apple's previous Macs. I've been disappointed during the last few years by the cost of Apple's Macs; the mini, the iMac Pro, the Mac Pro, and high-end configurations of the laptops seem overpriced now. Perhaps this is a longer-term strategy for Apple. Maybe they have priced the recent machines higher so that adoption of ARM-based machines can, in comparison to previous models, look more attractive. Apple could also lower the margins on the new ARM machines, at least until there is enough buy-in by consumers to raise the margins back to be comparable with Apple's pre-ARM products. My prediction is that the new ARM machines will be priced very competitively when they are announced. In a few years, their prices will rise.
I’m rather happy about buying the recently updated 13” MBP and basically skipping ARM Macs. My last one lasted 8 years, so I expect this one to last roughly as long. By then they’ll _definitely_ have ARM figured out, and in the meantime I can continue enjoying x86 compatibility and VMWare. For the fairly small segment of the market that we devs represent, I suspect I’m not the only one here implementing a _reverse_ Osborne Effect. Get your x86 Mac while you still can!
The biggest questions many of us have around this transition include:
a) Will this change be smooth and offer native enough app support (or emulation) at launch?
b) Will the performance and price be acceptable?
c) What will the ripple effect be throughout the existing Apple dev community?
d) Is this move going to do well for Apple, or the opposite?
I'm finding it tough to speculate on this one, but I've seen a lot of Apple misfires since 2013 when it comes to price, functionality and quality (should I dare say, "the butterfly effect"?).
In short, Apple's direction hasn't been great recently. Given this, I think it more likely that the negatives will outweigh the positives.
I agree that the Mac trajectory wasn't that great during the last decade, with a lot of misfires. However, the most recent Mac picture is much more positive: nearly the whole lineup is up to date, the butterfly keyboard is out, and many questionable design decisions were rolled back.
I get that after nearly a decade of disappointment that's not enough to rebuild trust completely, and there is still more than enough to be critical of. However, the most recent positive moves do make me cautiously optimistic.
Plus: Apple has managed these transitions extremely well in the past. That's something they are (or at least were) good at.
Here they could also be helped by the fact that these transition times usually put any other extreme experiments on pause. I don’t expect Apple to switch out the CPU and also offer up a radical redesign (where they can make many wrong decisions), at least not across the product line.
They also didn’t do that for the Intel transition.
And yet the _most_ recent releases on the Mac lineup have been much improved. If that is the trajectory and not an outlier one could make the case for the positives outweighing the negatives!
Somewhat off-topic, but what a strange image to feature in the article: a composite of the Apple logo over the Sony PlayTV [1] logo, a PlayStation 3 accessory for watching television.
Meaning, today I write. Because what I say is necessary. Can’t just be silent.
When is he gonna give up the ghost? He’s not at Apple for a reason. Yet he’s obsessed with them.
No one at Apple sought him out before going in any direction. He’s been a non-entity there for two decades. Yet he can’t stop trying to tell us all how to think about everything they do.
Gruber is at least a little more attached to Apple.
But the two of them sound like two old cranks: one consistently fawning over all Apple decisions, the other fretting.
I don’t know what it’s like to devote one’s life to discussing a single corporation, even one as wildly successful and entertaining as Apple.
I always wonder if these guys ever wake up and think, “Maybe today I’ll just delete my blog, stop reading press releases and tech news about Apple, and go build that cabin or walk a trail”.
Maybe they’re terrified no one would miss them? Or maybe they’ll find that life is better and that they wasted the last few decades of their very precious and limited life?
I don’t know. Maybe they are so enmeshed it just never occurs to them. It just always makes me sad to see their posts. Maybe that says more about me in some sad way than it does about them. Don’t know.