macOS Catalina (apple.com)
477 points by css 8 days ago | 562 comments





I've been on the beta for a while. I'm sort of surprised that they actually released with the state it is currently in. Most things work, but my CPU still spikes for no reason running some OS process or another.

Also, clicking on a dropdown box in a web browser (Safari, Chrome, or Firefox) after the computer has been asleep for a while freezes the whole computer for about 10 seconds.

Maybe it's just me, but that feels like a showstopper.

The whole 32/64-bit thing doesn't seem too bad. I've had to use Pages instead of Word and can't use Adobe products anymore (I've switched to open source alternatives), but I don't use those products very much, so it wasn't a huge deal for me.


I've just checked which 32-bit apps I have (System Information > Software > Applications, if you're wondering), and it turns out that if I upgrade I lose:

- My scanner support (they have a new version, which is insanely buggy and can't even detect my scanner properly)

- All my games (literally every single one of them turns out to be 32-bit) - I don't play that much but still, would be a shame to lose all of them

- Postal label printing app (no idea if they have an update, need to check)

- App supporting my stand-alone disk array (not using it too much though)

- My niche learning apps (pretty old, from small providers, not sure if they'd ever be updated)

Much more than I expected, given that I don't have any exotic stuff like musical instruments, specialized equipment, etc. So personally I'll probably hold off upgrading as long as I can.


Apple's backward compatibility is poor, and particularly bad for games. The 32-bit apocalypse kills off a large subset of the Mac's already less-than-stellar game library; and on iOS it already broke many apps that I used, many of which will not be updated. To me it seems like Apple is going in the wrong direction - really I want them to add an x32 ABI to save pointer space for apps that don't need more than 4GB!

Stick with Windows/Steam/Consoles/etc. if you don't want a bunch of your games that you purchased to become unplayable every year (unless you add multi-boot to run the old, compatible OS, or run them in a VM, which usually works poorly.)

As I have mentioned before, it's a very annoying example of Apple shifting technical debt and maintenance work away from themselves and onto developers, and the overall maintenance burden is greatly increased.

I like how the platform moves forward, but I really wish that there were a way of getting desirable security patches and feature improvements in the OS without breaking all of my apps.


Dropping 32-bit support is game-over for gaming on macOS. The numbers already didn't add up for indie developers [1]. Even AAA game developers often ignore macOS because the ROI is just too low. And of the $5/month subscription, how much will end up going to the developers [2]?

Essentially what Apple Arcade will end up hosting is ports of run-of-the-mill mobile games. And I'd bet the advertised number of developers will diminish as soon as the agreements they secured for the launch expire. But if this is the kind of gaming you're interested in, you already have your iPhone/iPad.

[1]: https://www.gridsagegames.com/blog/2019/09/sorry-mac-users-a...

[2]: https://www.vice.com/en_us/article/43k4ww/its-hard-to-use-ap...


> Dropping 32-bit support is game-over for gaming on macOS.

No, it isn't. There were never any 32-bit x86 Macs with decent GPUs. Dropping 32-bit support only affects games that were already old enough to be commercially irrelevant or games that rely on unmaintained third-party middleware that never got a 64-bit port. Those don't add up to enough to be a major impact on the Mac gaming market. There are plenty of other factors that are much more important, such as Apple's abandonment of OpenGL and preference for Metal over Vulkan.


A huge number of indie games do not require a powerful (or even decent) CPU. Many of them have Linux and Mac ports. Those are the same kind of games whose developers do not have a lot of spare resources to stay on the endless treadmill of deprecation.

It also kills any hope of Wine-powered support for Proton in Steam, as many, many Windows games will still be 32-bit for a very long time (mind you, Metal had already put a big dent in that).

Interestingly, Linux users are indirectly affected by this: for many developers, supporting Linux was just a byproduct of supporting macOS. As the latter is being dropped, support for the former is getting harder to justify. Luckily, Proton seems to be a very viable alternative to native games.


I thought PlayOnLinux/WINE was the Linux gaming solution.

Proton is a Wine variant directly integrated with Steam.

A bunch of Aspyr titles (a sizable publisher of AAA ports) won't work with Catalina, and there are no plans for updates.

These aren't old games for the most part: https://www.macworld.co.uk/feature/mac-software/apps-wont-wo...


Those are old games for the most part. The newest thing on that list appears to be from 2015, and it's a remastered version of games released in 1999 and 2003.

They certainly weren't making much money off these particular games even before Apple announced the deadline for going 64-bit. It's unfortunate and disappointing for customers who had already bought those games, but this deprecation didn't shut Aspyr out of much in the way of future sales.


Users do not care about that. They care about when things they paid for stop working.

But that's not what drives the market for Mac gaming. What matters is what kind of return on investment developers expect to get out of porting to macOS. They don't particularly care whether the port keeps working for four years or forty, because they'll make basically all the revenue in the first several months.

Now, if users become reluctant to buy Mac games for fear that they'll stop working unacceptably soon, that could have a meaningful impact on demand for Mac games. But even if Apple made it official policy that they would break games after four or five years, that wouldn't completely kill the market for Mac games. If Apple made it cost-prohibitive to get a game ported to their platform in the first place, that would pretty much be "game-over for gaming on macOS".


If you can't play your old games it makes the entire platform unattractive to players, which in turn makes it unattractive to game developers.

But tbh Mac was hardly a gaming platform to begin with.


>it makes the entire platform unattractive to players

The platform has never been "attractive to players" to begin with.

Now we have Arcade and easy porting to iOS though, which could open a multi-billion market...


Now, if you only had a way to actually get your fair share of those billions...

Commercially irrelevant for the game maybe, but not for the platform. If I can't play my old favorite games I am less inclined to buy a new platform. Or I might move to a platform which supports those games.

> No, it isn't.

Well, let's see. My main 3 purchase locations are Steam, Humble Bundle and GOG.com. I'd guess this is fairly typical, but anyone using other platforms feel free to add yours.

Steam: The Steam client is still stuck on 32bits. I'd bet it will soon be updated to 64bit, but it just goes to show that macOS is pretty low in the priorities of Valve. And if a company with the resources of Valve doesn't care, I can't see much motivation for the individual developers either.

Humble Bundle: I've gone through the current and following Humble Bundle Monthly games. Windows only: Call of Duty WWII, Crash Bandicoot Remastered, Spyro Remastered, Sonic Mania, Planet Alpha, Override: Mech City Brawl. Windows+macOS: Battletech, The Spiral Scouts.

GOG.com: There's already a 64-bit DOSBox port, so DOS-era games will eventually be supported. Windows-era games are gone: Catalina breaks Wine emulation, and no announcement has been made about 64-bit support by the Wine team. For newer games, GOG.com will need new builds from the developers; it's doubtful we'll ever see those, especially for indie games whose developers don't have the resources to go back and port their released games. Newly released games are often Windows-only.

I don't know how you see this picture, but it looks pretty bleak to me.


The Steam client was updated to 64-bit quite a while ago, but the built-in updater seems to only fetch the 32-bit version. If you uninstall Steam and do a clean reinstall from the Steam website, you will get the 64-bit client.

Old games like Civilization 6? Civ 5 is old, I'll give you that.

Civ 5 and 6 both have 64-bit Mac versions. It's only Civ 4 that's unplayable on Catalina. That's a game from 2005.

Most indie games from the past 5+ years are made with a few middleware engines like Unity or GameMaker, and AFAIK releasing a 64-bit version is just a rebuild option.

This is true, but unfortunately, a lot of games are almost completely unmaintained, especially on macOS. Many of the smallest indie game developers don't even own Macs—they either borrow their friends' machines to do one-off builds, or they use cross-compilers and rely on fans for testing.

Even just recompiling the game might be a challenge for a lot of developers in the "long tail" of the Mac game library.


This is my sentiment as well. A tweet summed it up nicely [1]:

> the notion that every developer of every app is part of the constant, incessant update loop that Apple encourages is fundamental to the problem. Someone's super personal narrative unity game from 2013 is not getting updated!

[1] https://twitter.com/MammonMachine/status/1181327259057082368


It would be interesting to see a report from Steam on how many 32-bit-only OSX games are in the store. According to the Steam hardware survey [1], OSX represents 3% of their install base.

[1]: https://store.steampowered.com/hwsurvey


According to people in the know, the $5 a month is for Apple. Apple pays a flat-fee to the developers of games in Apple Arcade, and that's it.

... I mean sure, I love the idea of paying $60 for a game that throws away 15-20% of the performance by targeting an architecture that hasn't shipped in a Mac for more than a decade, rather than supporting an architecture that has been around for something like 15 years.

But yes, apple is in the wrong here, and should continue to support hardware that they haven't shipped in 10+ years.

I assume you're in the group that believes MS should be required to support XP forever.


As far as iOS is concerned, Apple got rid of 32 bit support in the processor itself allowing it to improve the processor.

Keeping around old code increases the security vulnerability surface. For instance, there are at least a half dozen ways of representing a string in Windows. One of the earliest widespread vulnerabilities in Windows was caused by improper handling of string encoding, where anyone could run DOS commands on a web server running IIS just by encoding the commands in the URL.

https://www.sans.org/reading-room/whitepapers/threats/unicod...


> As far as iOS is concerned, Apple got rid of 32 bit support in the processor itself allowing it to improve the processor.

That is slightly different because Apple is designing their own mobile CPU's. And indeed by dropping 32-bit ARM support they can simplify and improve their CPU designs.

OTOH, Intel isn't gonna drop 32-bit x86 support from their chips just because Apple isn't making use of it.


Yeah but maybe dropping 32-bit support is a necessary step before dropping Intel chips...

They will face some* backlash now, but if/when they switch the Mac to their own ARM chips they might achieve a painless transition.

*They announced 32-bit deprecation like a decade ago. Will legacy users be pissed off? Yes! Is it an excuse for developers that still relied on 32-bit support over the last decade? NO!


Maybe, but I remember reading at the time that it was so the OS didn't need to load two versions of every library.

Your example is from 2000/2001. The same year OSX was first publicly released.

That’s kind of the point - it’s gotten worse since then. Windows has become more bloated as they’ve added on more layers and refuse to drop backwards compatibility.

Presumably Apple is dropping x86-32 because they don't want to spend resources maintaining 32-bit support libraries, doubling their testing matrix etc. And now you want them to add a third ABI?

(Even in the Linux world, which prides itself on supporting crap used by a handful of people globally, X32 is dead, to the extent it was ever alive.)


X32 is dead exactly because, where 64bit is not needed, x86-32 is perfectly fine. I.e. the only practical reason not to go 64 is backward compatibility.

Indeed. X32 is beneficial for an application that

- Never needs more than 4 GB RAM (because it's not worth the usability downsides from having two binaries and letting the user choose which to use, just for a small performance boost)

- Is performance critical

- Has performance bound by memory and/or cache bandwidth

- Has a large share of its memory usage due to pointers

The intersection of all the above is just vanishingly small in reality. X32 was never more than a gimmick to score a few extra points in SPECcpu. And thus people rightfully ignored it.
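
For anyone unfamiliar with the pointer-space argument being debated here, a toy illustration (made up for this comment, not taken from any real program): under LP64 every pointer is 8 bytes, while under an ILP32-style ABI such as x32 it would be 4, so pointer-heavy structures roughly halve in size.

    #include <stdio.h>

    /* A made-up pointer-heavy node, the kind of structure where x32's
       smaller pointers would pay off. */
    struct tree_node {
        struct tree_node *left;
        struct tree_node *right;
        struct tree_node *parent;
        int key;
    };

    int main(void)
    {
        /* LP64: 3*8 + 4 (+4 padding) = 32 bytes.
           ILP32/x32: 3*4 + 4 = 16 bytes. */
        printf("sizeof(void *)           = %zu\n", sizeof(void *));
        printf("sizeof(struct tree_node) = %zu\n", sizeof(struct tree_node));
        return 0;
    }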


Regarding games, Apple Arcade seems to be just what the Mac needed.

I haven't evaluated the quality of available offerings yet, but they look good in the App Store previews, and I think they're all native Mac apps (not lazy ports), using Metal and everything.

Although yeah, you can't really own any of those games and will always need a subscription, and some may be removed in the future.


I play games on my Mac when I can't play them on my PC (travelling is one reason for this). I try to buy my games DRM-free on GOG, and play them either on the Mac or through Steam streaming (yes, you can load non-Steam games). My Mac game library is essentially wiped out by this.

And yes, there's this Arcade, but I have no interest in paying for something which is locked down to one platform.


If I want to play something and I already own that platform, why should I prevent myself from playing it because of its exclusivity?

Do you boycott Xbox, Playstation, Switch, DS etc. exclusives too?

What about Windows-only games that aren't on Mac or other systems? There are certainly thousands of those. Do you refuse to play those either?


Does this mean I won't be able to play 32-bit Windows games with CrossOver?

Not currently, but Codeweavers is working on a solution.

https://www.codeweavers.com/about/blogs/ken/2019/10/3/crosso...


It depends.

* 64-bit processes usually can't load 32-bit modules; only a 64-bit OS can run both (true for Linux and Windows; I'm not sure if it's inherent to x86 overall). There's a pretty good chance that only a 32-bit CrossOver could load 32-bit modules.

* Has 32bit support been stripped from the kernel, or has it only been stripped from the installed dynamic libraries? I'd wager that it's stripped from, or partially stripped from, the kernel. Even if Catalina allows 64bit processes to load 32bit modules, the kernel would have to support it.

There's a one in four chance your scenario is supported. CrossOver definitely couldn't magically voodoo a 32bit module into a 64bit one, as it's impossible to determine if a register/address is a 32bit integer or pointer; if it doesn't work today, it will probably never work.

I'm not sure if macOS supports IOMMU, but an external GPU with a VM might be your saving grace here.


Support has been at least partially stripped from the kernel. Attempting to run a 32-bit executable yields an "invalid executable type" error.
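
If you want to check ahead of time which binaries will hit that error, `file` or `lipo -info` will tell you. Purely as an illustration of what those tools are looking at, here's a rough sketch that peeks at the Mach-O magic and fat header (it ignores arm64 slices, fat64 headers, and most of the error handling that the real tools do properly):

    #include <mach-o/loader.h>
    #include <mach-o/fat.h>
    #include <libkern/OSByteOrder.h>
    #include <stdio.h>
    #include <stdint.h>

    int main(int argc, char **argv)
    {
        if (argc < 2) { fprintf(stderr, "usage: %s <binary>\n", argv[0]); return 2; }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 2; }

        uint32_t magic = 0;
        if (fread(&magic, sizeof(magic), 1, f) != 1) { fclose(f); return 2; }

        if (magic == MH_MAGIC_64 || magic == MH_CIGAM_64) {
            puts("thin 64-bit Mach-O");
        } else if (magic == MH_MAGIC || magic == MH_CIGAM) {
            puts("thin 32-bit Mach-O: won't run on Catalina");
        } else if (magic == FAT_MAGIC || magic == FAT_CIGAM) {
            /* Fat header fields are always stored big-endian. */
            uint32_t nfat = 0;
            fread(&nfat, sizeof(nfat), 1, f);
            nfat = OSSwapBigToHostInt32(nfat);
            int has64 = 0;
            for (uint32_t i = 0; i < nfat; i++) {
                struct fat_arch arch;
                if (fread(&arch, sizeof(arch), 1, f) != 1) break;
                if ((cpu_type_t)OSSwapBigToHostInt32(arch.cputype) == CPU_TYPE_X86_64)
                    has64 = 1;
            }
            puts(has64 ? "fat binary with an x86_64 slice"
                       : "fat binary without an x86_64 slice: won't run on Catalina");
        } else {
            puts("not a Mach-O binary (or a format this sketch doesn't handle)");
        }
        fclose(f);
        return 0;
    }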

Given that the kernel is open source, I wonder how hard it would be for someone to add that back in...

Building the kernel itself is non-trivial.

Ah, thank you guys for posting this.

I'll hold off too. My MacBook Pro is from 2013 and pretty old; it overheats often if I play video. This update will definitely exacerbate the problem.


Doesn't the apple watch run a 32bit-on-aarch64 abi?

Apple watch uses a variant of LLVM bitcode as deployment format.

And no, it isn't raw LLVM bitcode.


I was talking about the arch/ABI running on the OS. Looks like it's called "arm64_32" (maybe it's similar to the Linux ABI "x32"?)

Xamarin, among others, ran into a snag, but it seems they could retool and convert from armv7k to arm64_32, for example: https://github.com/xamarin/xamarin-macios/issues/4864


Yes.

Have a look at VueScan. It's 64-bit clean and supports scanners back to the late 1990s.

I use it myself and ignore the manufacturer's software altogether, which for my scanner only supported PowerPC.

https://www.hamrick.com/vuescan/supported-scanners.html


Wow, VueScan! A blast from the past. I used it 15 years ago. Glad to see it's still around, maintained by the one guy (and now him and his son according to the website). I'm definitely going to buy a license now ($99) and ditch the 32-bit manufacturer's software.

Software like VueScan and USB Overdrive (just bought a license for that too $20) deserve to be in some sort of Apple Hall of Fame.


I switched to ExactScan to support my Fujitsu S1500M:

https://www.exactscan.com


VueScan is great; it's better than most scanner apps too, insomuch as it's actually good at scanning.

> it's actually good at scanning

Careful now. Scanner manufacturers would have you believe that's nearly impossible unless you remember to make the appropriate sacrifices under a full moon on a marble altar.


I found this application the other day, and despite its look, it really is great.

It really demonstrates how incredible Windows 10’s 16-bit application support is.

Although I understand the Windows codebase is a nightmare.


And how much they tolerate it, as it's so embedded in their engineering culture, and I imagine "backwards compatibility" is drilled into you when you start working on Windows at MS.

It's not easy to get into the mindset of fixing every problem a 3rd party app creates.

Some will say Apple is offloading tech debt onto client apps, but Microsoft is allowing client apps to offload tech debt onto itself, and it's big enough to take that burden on so its users can still use the stuff they bought.

I dunno, sometimes I hate having to do it but it's all about user/dev experience and if you're optimising for your own employees you're making it worse for everyone else who makes your OS/machinery worth buying. There's got to be some compromise, coz it sounds like if you buy into Apple then you're SOL if they can't be bothered supporting your hardware.

Which, additionally, must contradict their environmental aims. It's not good for the environment if upgrading an OS means you throw away your printers and scanners and buy compatible ones.


As a developer who strongly advocates for and strives to maintain backwards compatibility whenever possible (and even when it isn't so easy...), I can say that it's unfortunately something the vast majority of (younger) developers today don't even think about, much less care for. Perhaps they just haven't experienced stability nor noticed it enough in their daily lives, because all they seem to care about is "new and shiny" and rewriting things constantly.

I have written various (Windows) utilities over the years, and some are over 20 years old now and I use them every day. They worked on Win95, they still work through to Win10. They're all tiny single binaries that require no installation, start immediately, and are extremely responsive and low on memory usage. Ironically, it's almost impossible to do that with a "modern" toolchain now, and I'm not even sure if something like that would've ever been possible with the Mac. To me, the idea that backwards compatibility is a "burden" is absurd. Constant churn is a burden. If I had to go and "fix" all my utilities every few years because a new OS broke something that was working before, I would have less time for actual new developments. Instead I can continue to use them and write other things as the needs arise, instead of wasting the effort redoing things that should've still been working. Just "leaving well enough alone" is a big part of it.


The kind of simple utilities you are talking about would have also been pretty easy to maintain/port from OPENSTEP in 1996 through to current MacOS. Apple's offered pretty good source-level compatibility for the basic stuff, and an easy migration path for tooling. Recompiling something every few years isn't a "constant churn" sort of burden.

There's a lot of stuff out there that's not maintained, but is still useful. So "recompiling every few years" is a pretty big burden.

> There's a lot of stuff out there that's not maintained, but is still useful.

I can't see how this is Apple's problem. Holding up development of a platform, and technical progression generally, for unmaintained software is not a good working model for anyone, no matter how useful it is.

> So "recompiling every few years" is a pretty big burden.

As is "supporting everything ever implemented in perpituity"...


is not a good working model for anyone, no matter how useful it is

WTF!? You've just perfectly illustrated the attitude that's making technology worse for everyone.

What are computers for? "To control and force users to consume mindlessly" might be an accurate depiction of reality today, but that's not what they were originally invented for. Computers were intended to assist people. As the early (1930s-40s) promotional material would say, "to come to the aid of mankind". The whole point of a computer is to be useful to its users, so arguing against that is just nonsense. This discussion item is full of other comments stating exactly what sort of work they use a computer for, and how they are being affected by useless changes.

Most people take it for granted just how stable a lot of other things --- also invented to help them --- they use on a daily basis are. Imagine if every few years, your toilet, sink, bathtub, light switches, power sockets, door and window handles/locks, lightbulb sockets, and home appliance controls changed in such a way that you had to completely relearn how to use them and without some functionality they had before, and all for totally BS reasons like "development of a platform and technical progression".

As is "supporting everything ever implemented in perpituity"...

Some things just don't ever need to change.


>What are computers for? "To control and force users to consume mindlessly"

Nonsense. I never suggested that. What I am suggesting is that whining about progress (especially when it has been known to have been deprecated for at least the last 10 years!) and layering technical debt on top of technical debt is stupid. I do agree that some things don't need to change, but your examples are absurd, not to mention inaccurate. The many different types of light fittings, for example, are evolving, somewhat more slowly than computing, I grant you. The same is true for locks, and light switches are being developed too. Your comment about home appliances is by far the most ludicrous. I do take issue with your notion that things in Catalina have changed to the extent that they need to be relearned. Bullshit.


> If I had to go and "fix" all my utilities every few years* because a new OS broke something that was working before, I would have less time for actual new developments.

Welcome to the life of a Mac developer.

*year


>There's got to be some compromise, coz it sounds like if you buy into Apple then you're SOL if they can't be bothered supporting your hardware.

I think part of the idea is (and how they see it): if you buy into Apple, you should have enough spare money to upgrade your hardware as needed. This sucks for those of us who are not exactly affluent, but that's part of the thing. Apple never tried to maximize affordability or minimize expenses.

(Though in some cases, they have been the more affordable of the bunch, e.g. when the iPad was announced, it took about 2 years for competitive machines to reach price parity. Or now, e.g. the newly announced MS earbuds are more expensive than airpods).

It's not a platform for long term support and maximum bang for the buck, it's a platform for user convenience ("it just works, mostly"), inter-operation ("things -phone, earbuds, speaker, watch, etc- just work together, mostly"), and cohesiveness ("things have a unified vision, mostly"), plus polish (thinking some things more through in their design -- not always though, e.g. BS MBPr keyboard).

I use "mostly" above in the sense that it's not obviously perfect (and some areas far from it). But the tradeoff is in the areas mentioned above.


This is only on 32-bit versions of Windows 10. Which I can only imagine is a very small portion of W10 installs.

fwiw, upgrading to x64 windows killed 16-bit support, I think? But as far as I know that was due to architectural limitations as much as it was an intentional choice. It's still impressive that 16-bit stuff kept working on x86 windows basically forever.

Yeah, it's not possible (in a way that works without breaking stuff) to drop from 64-bit to 16-bit the way you can with 32 to 16. That's a limitation of the CPU architecture that you just can't work around. DOSBox and virtualization, however, do allow that code to still be run if you provide the OS bits (install Windows 95 or 3.1 in them), so there are still paths forward that will usually work. If they need to talk to esoteric hardware then you might still be out of luck, but it might not be impossible to pass that through, depending on what it is.

> Yeah, it's not possible (in a way that works without breaking stuff) to drop from 64-bit to 16-bit the way you can with 32 to 16.

Yes, it is. It’s almost exactly the same. What you can’t do is run v8086 code on a 64-bit kernel without using a VM, but most of the code people care about is 16-bit protected mode code.

I maintain the messy but small amount of Linux kernel code that makes this work. It is, indeed, gross, because the x86 architecture is awful. But it works fine in practice and there is quite a good test suite these days to exercise the ugly bits in the kernel tree.


As with a lot of other things, the truth is a bit more subtle than "it's impossible" and closer to "we didn't care enough to try hard enough to make it work":

https://www.dkia.at/en/node/180

http://v86-64.sourceforge.net/


If that v86_64 thing works the way I think it does, it is not really an acceptable way to do this. If you drop the CPU down from long mode to legacy mode and get an NMI, either you are toast or you have some extremely complicated awful code to handle it. Not to mention that the CPU can’t even address all of physical memory when you do this.

Just don’t go there. Use an emulator for DOS code and use the normal kernel support (modify_ldt()) for 16-bit protected mode.
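
For the curious, here's roughly what that kernel support looks like from userspace. This is only a sketch of installing a 16-bit code segment descriptor via modify_ldt(2) on Linux; actually far-jumping into the segment, and everything Wine layers on top of this, is omitted, and the base address is just a placeholder.

    /* Sketch: put a 16-bit code segment into the LDT with modify_ldt(2).
       Field names are from <asm/ldt.h>; a real user would point base_addr
       at memory they have mapped. */
    #include <asm/ldt.h>
    #include <sys/syscall.h>
    #include <unistd.h>
    #include <string.h>
    #include <stdio.h>

    int main(void)
    {
        struct user_desc desc;
        memset(&desc, 0, sizeof(desc));
        desc.entry_number    = 0;       /* first LDT slot */
        desc.base_addr       = 0;       /* placeholder segment base */
        desc.limit           = 0xffff;  /* classic 64 KiB 16-bit limit */
        desc.seg_32bit       = 0;       /* 16-bit segment: the whole point */
        desc.contents        = MODIFY_LDT_CONTENTS_CODE;
        desc.read_exec_only  = 0;
        desc.limit_in_pages  = 0;
        desc.seg_not_present = 0;
        desc.useable         = 1;

        /* No glibc wrapper; func 0x11 writes an LDT entry. */
        if (syscall(SYS_modify_ldt, 0x11, &desc, sizeof(desc)) != 0) {
            perror("modify_ldt");
            return 1;
        }
        /* Selector = (index << 3) | table indicator (LDT = 4) | RPL 3 */
        printf("16-bit code selector: 0x%lx\n",
               (unsigned long)((desc.entry_number << 3) | 4 | 3));
        return 0;
    }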


Honestly kind of surprised they didn't just build a subtle Windows 3.1 wrapper that would launch to run 16-bit apps, combining the view-of-the-system virtualization approach of WOW64 with ABI translation ala Rosetta.

Windows NT setup programs were traditionally 16-bit because every version of NT could run 16-bit x86. The non-x86 variants (alpha, mips, etc.) had built in emulation.

So Microsoft has already got the ability to run 16-bit Windows apps in emulation. It's a shame they didn't enable this on x86-64.


Quinn is 32bit! Oh no! Best macOS Tetris-like game around.

I’ve also been wondering about Dymo label printers. I haven’t seen anything when I searched - if anyone knows, I’d like to learn!

Maybe not much help to you, but my Rollo Thermal Label Printer works fine on Catalina. Didn't need to update any drivers, it just works as it did before the update.

Same here. My Canon scanner is not going to work any longer.

And Canon is notoriously bad at updating their drivers, so my scanner will now be a brick.


VueScan to the rescue, it supports scanners from the 90s and is also a very good scanning app to boot.

VueScan works, but it is stupidly bloated compared to the macos native scan tool.

I remember how easy it was to scan directly to multi-page PDF with my Brother portable scanner. Apple's built-in capture software was just so easy and clean to use. Then Apple removed TWAIN support in Snow Leopard, and suddenly my 2-month-old 400-euro scanner was no longer usable. :(


As a long-time macOS user, I don't upgrade to a new version until x.y.5. I've just had too many issues over the years, with stability and application compatibility. Of course, for development, I install it into a VM right after release.

As a long time macOS user[1], I've always installed on public release and I've only ever experienced issues maybe once or twice. Rare enough that I can't think of any specific issues.

[1]didn't really pay attention to version numbers before 7.6, but I guess "install on public release" didn't begin until OS X...


You've been lucky. Many big vendors like Adobe take their time before fully supporting a new macOS version.

I'm not a macOS user, but if the situation is anything like on Windows, then luck has nothing to do with it. The updates by and large work just fine for almost everyone, but with such a widely deployed product you inevitably will still see hundreds or thousands of people that experience issues.

No the situation is totally unlike Windows which has legendary backwards compatibility.

It looks like waiting is not going to help in this case. It’s not like 32 bit apps were accidentally broken. When Apple kicks something to the curb they do it forever. I’m probably stuck on Mojave for good.

I only use my mac for audio production these days. I'll upgrade from 10.12.6 when I'm absolutely forced to.

Logic? :)

Logic and Numerology. More so for Numerology, if I'm being honest. I could probably get along with any DAW after the initial learning curve. Plus, I make heavy use of IAC Buses. But, I believe there's a Windows version that people use.

This year all their OSes seem to be riddled with issues at release.

- iOS 13.0 was so bad they released 13.1 in less than 5 days, but even now many things are still hit and miss (with 13.2 in beta)

- watchOS 6.0 is also still pretty bad and not yet fixed (with 6.1 in beta)

- macOS 10.15 GM seems pretty buggy

- Well, I think tvOS 13 is ok?

While the situation might be better for people who use the latest betas, it is still a horrible current user experience for all normal users just updating their devices.

Lots of cross-platform features introduced across these updates (like the new iCloud features and new Reminder apps, etc.) are also in a horrible state.

I'm not sure what their QA team is doing this year but it seems almost everything planned for this Fall would have been better off if pushed back a couple of months. Well, if it weren't for device compatibilities... (the iPhone 11/Watch 5 seemed to be more important than stable software across all their platforms and other devices)


The theory about iOS 13.0 is that Apple was forced to ship this buggy iOS version because of the Apple Watch release. The Watch shipped with watchOS 6.0 already installed, and that requires iOS 13. iOS 13.1 was still not finished, and to prevent a situation where new Watch customers couldn't use it after the purchase, they needed to ship iOS 13.0 as it was.

That's what I was hinting at in the last sentence. The Apple Watch 5 and iPhone 11 came out on the same day and needed watchOS 6 and iOS 13 respectively, so basically, those hardware releases forced the buggy *OS versions to be released across all platforms and devices.

The question is what is Apple doing in their software development? From the outside, it looks like there are glaring issues within their engineering teams.

iOS 11 was a complete disaster and it took an entire OS upgrade cycle (iOS 12) to control the most pressing issues. Apple is constantly releasing wild bugs and after getting burned multiple times now, they still don't seem to tackle this internal problem.


For me, tvOS 13 broke HDMI-CEC and rendered AirPlay audio extremely spotty.

macOS was pushed back. Not a couple of months, mind you, but at least a couple of weeks.

Comparing iOS 10~13 with macOS 10.12~15 release dates, macOS seems to always come out 7 days after iOS.

This year it has been 12 days after iOS 13.0 instead. Wouldn't really call that much of a pushback. It's less than one week behind the usual schedule.


Thanks for sharing. I would have upgraded for the sidecar capability, but if things are still dicey I’ll wait it out. And like you, I also have a super old version of Word, so will have to switch to Pages. I wonder if this OS update will result in a spike of Word upgrade purchases.

Pages is a pretty reasonable alternative to Word, for me at least. Keynote is decent too.

However, Numbers is not at all an Excel replacement and that is going to cause a lot of grief.


Keynote is amazing, in my experience. I mostly present from my computer, so I don't need to worry about PPT compatibility. I find PPT to be unusable/prehistoric by comparison (when I try to help my wife with it on her computer, which is running a current version).

Glad to know that Pages does a good job as a Word replacement. I do track changes pretty frequently and have wondered how solid the support/translation is for that feature.


The origin of Keynote is supposedly the requirement to create Apple-keynote-ready presentation software that would pass the extremely specific and persnickety Jobsian requirements for aesthetics and attention to detail in an age of Powerpoint dominance. Updates notwithstanding, Keynote shows its age — but all these years on, it still produces great-looking presentations.

The application Steve used was made by Lighthouse Design on the NeXT platform.

https://en.m.wikipedia.org/wiki/Lighthouse_Design

Some great applications. I would love to have a direct port of all of them today. Of course I also want the shelf back and I want my menus on the side like NeXT. Oh well.

Keynote was created so Steve could have something that worked and looked the same as Concurrence. He would use his old NeXT to create presentations until Keynote existed.


Curious to know how it shows its age. Does PPT surpass it in some ways?

Try opening a presentation you made a couple of years ago. It might not work. A colleague of mine ran into this; he couldn't open files created with Keynote a couple of years back.

I do recall a changeover around 2009 — was this a presentation more recent than that? I’ve had no problem with this and have used Keynote since it came out (and was a paid application, not free!)

I would think it was a little bit later, but I can't say for sure.

I believe iWork '09 was the final lineup of the original Mac codebase. After that, they basically back-ported the iOS versions to the Mac and from then on had a shared codebase, file format, and feature updates were pretty much in lockstep across Mac and iOS.

Yeah, back then there was a huge uproar about the dumb-down and loss of features.

They worked their way up, but I miss the layout of old Pages, it was brilliant as long as you accepted it to be not-Word (which many people did not)

My favorite feature of Keynote / Pages is that the equation editor is simply a native LaTeX editor. You have to wonder why Word didn't simply do this in the first place

LaTeX Math is good for people who know LaTeX but Word's Unicode Plain Text Math is arguably better otherwise.

The Microsoft equation editor also allows you to mix LaTeX and their own syntax, which is nice, since LaTeX is fairly verbose at times.
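
As a tiny illustration of the Pages/Keynote side of this (the quadratic formula, picked arbitrarily), you type the LaTeX directly into the equation editor:

    \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}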


Pages is the only word processor I know of that refuses to open documents that it created just a couple years prior (version 5 would refuse to open documents created by version 3, for example).

If Pages works for you, great: but save a copy in a different format if you want to be able to edit it in a couple years.


This is exactly the mentality this entire thread is about! Here we see it clearly demonstrated in another product. Apple says "update, upgrade or you are dead to me."

I have been stuck multiple times with users upgrading Pages two versions later on a desktop and then unable to read on their laptop or vice versa. Inexcusable!


Yeah, I was dumbfounded when I ran into this problem. I would be ashamed if I was on that team. At least include a converter.

Numbers is actually the best spreadsheet for stuff where presentation matters. For example, my friends and I use it for our D&D (and Invisible Sun) character sheets - the formula language is powerful enough to do what we need, and its ability to manage multiple tables per sheet is uniquely powerful.

And the icloud version of Numbers is pretty handy when you don't have a Mac around (but the collaboration capabilities suck compared to Sheets).


Excel does support multiple tables per sheet through the format as table (?!) feature. But yeah, still hard to make those multiple Excel tables look good with sheet-wide column and row sizes.

I believe the Office-idiomatic approach there would be to create one Excel file containing multiple sheets, and then OLE-embed a view of each sheet into a Word (or Publisher) doc. Lets each tool do what it's good at.

I think for the way most people use Excel it's adequate, and in some ways superior (e.g. I made a cute weekly chores-tracking sheet with photos on it for the fridge), and I presume that's why Apple appears to have stopped work on it.

If you're even a halfway serious user of Excel then indeed, Numbers is a complete joke. But if Google Sheets would work for you, Numbers probably would too.


Sheets is actually pretty powerful. I can do 99% of what I used to do with Excel. Numbers is way under-featured in comparison. I would categorize Numbers as “family-friendly” and Excel and Sheets as “business-friendly”. Only Excel gets “finance-friendly” and I’m not sure about that anymore. My finance partners now deliver all my budget read outs via Sheets.

I think Numbers is a great spreadsheet application. It just doesn't even try to be the kind of spreadsheet application that can be abused as a poor man's DBMS.

For simpler tasks where a spreadsheet program is obviously the right tool for the job, Numbers tends to be plenty capable and give a nicer user experience than Excel and clones. For example, a few weeks ago I discovered that Numbers is vastly better at doing time-related calculations than Excel.

Numbers fails completely when you start moving into the problem domains where a SQL database and/or scripting language are good tools to consider.


I ran into serious problems with Numbers (it just wasn't good with them, and broke some of my existing spreadsheets...), and I didn't like Pages because it makes it a serious pain to work with or save into MS .docx or rtf, or anything other than pages format. I found Libre Office to be waaaay closer to the office suite of tools I was looking for on my MBP. (and less buggy, believe it or not).

> I wonder if this OS update will result in a spike of Word upgrade purchases.

Unless you use some of the very special features of Word, or you're in a profession that requires Word docs specifically, Pages is a very good substitute. It can even open old Word docs (and new ones too).

I haven't tried Sidecar yet because the iPad is on my wife's iCloud account since she's the primary user and I'm not sure how to make it work across iCloud accounts, or if you even can. She uses it so much I can't get my hands on it!


FYI, you need a 2016 or newer MacBook Pro to use Sidecar.

A terminal command could be used to enable Sidecar on older Macs, on Catalina beta.

  defaults write com.apple.sidecar.display allowAllDevices -bool YES
I don't know whether it works on the release build.

They deactivated the command in beta 2.

For Apple, you and your 2015 Macbook Pro are just poor losers.

Now, you're free to buy a new one.


Or to use a patch (http://dev.zeppel.eu/luca/SidecarCorePatch). But the image quality is abysmal.

Ugh, is there no better way to patch that?

It's using H.265, and only Skylake and above have hardware H.265 decoding; that's why.

It didn't work for my 2015 MBP.

Sadly, I have one such Macbook Pro.

Phew! Thanks, you saved me an iPad.

Or just use Duet Display: https://duetdisplay.com/. It has worked flawlessly for me for four years.

macOS Catalina and iOS 13 are free-of-charge for supported devices.

Duet Display's iOS component costs US $9.99.


That's a pretty reasonable price

I agree, but when the alternative is $0 I’m more price sensitive.

Except if your MacBook is older than 2016. They deactivated Sidecar for MacBooks older than 2016.

That's how you know Apple doesn't give a shit about its user base, unless you pay 2500€ every 3 years for a laptop.


I mean, there was a reason for doing so: older Macs don’t support hardware H.265 decoding. It’s not like they disabled it just to spite people.

Why do I need hardware decoding in my Mac in order to render an image on my iPad?

Why is h.264 unusable?

Why is removing a feature better than letting it run a bit slow?


> Why is removing a feature better than letting it run a bit slow?

If a restaurant runs out of food, there are gonna be some customers who would eat a turd sandwich, as long as they could eat something, but most customers would flip out and demand to know who would think shipping that out of the kitchen was okay.

Now, the analogy isn't perfect, but I imagine there's a host of reasons around expectations of support and experience that Apple has no plans on addressing for those cases. Rather than wasting a lot of time clogging support with something that will just frustrate the average customer regardless, it's better to live with the few complaints about why they can't have it, since, after all, your complaint doesn't cost them a dime in potential support calls. Just a vague feeling like you're not getting everything you want for the price you're paying.

If you've ever been to Disneyland, you can see historically, the park is jam packed with people who have that same frustration, each a paying customer.

If my assumptions about their product calculus are correct, I know which of those options I'd choose, from a business perspective.


That $0 comes with a hidden price tag.

which doesn't matter if you already have an iPad.

Latency is apparently lower with Apple's implementation.

If your Mac is older than 2016, forget about Sidecar. They deactivated it for every device fabricated before 2016.

For the first two Catalina betas, you could still activate it with two commands in the Terminal, but these mother* removed the ability to do so in the third beta.

That's how I know Apple doesn't give a shit about me and my 2015 MacBook Pro unless I pay 2000€+ every 3 years.

What a fucking joke.


That’s such a bummer. I’m still using my 2014 MacBook pro because of all the keyboard nonsense, but was actually excited for this feature. Is there a technical reason or just typical Apple BS?

Skylake is the first CPU with hardware HEVC encoding.

> so will have to switch to Pages

dude, Libre Office wipes the floor with Pages, Numbers, etc. If you haven't tried open-source office programs in the last decade, they've come a long way. And I find they work better cross-platform (i.e. they save in MS Office format much more seamlessly and with fewer bugs)


Impress is still far from PowerPoint or keynote.

Does your old Word version get security updates? If not, isn't that a huge security nightmare?

I'm wondering how Sidecar compares to Duet, which I've been using for a while. Anybody have experience with the two?

I've used Media Rage to tag my MP3s forever, and there is no hope that it will be updated. The application does just what I need.

I am also about to buy Rocksmith on Steam, but it's 32-bit at the moment. I currently use it on my PlayStation but want to have it on macOS to access more songs.

It looks like you can find many games on Steam that are 32-bit only.

On our side, when OS X was introduced we trusted Steve when he said Carbon was there to stay and would be 64-bit. We took a few years, a few years ago, to do the switch...


Ha! Literally just upgraded to Mojave yesterday thinking it had been long enough to iron out the bugs.

That dropdown freeze happens on the latest Safari under Mojave as well (only on certain message boards, so it might be CSS or JS related?)

I'm on Word/Excel/Powerpoint 2016, which isn't even the latest version, and it's 64 bit. So, you might be due for an update. I've noticed complex excel spreadsheets are much faster on 2016 vs my previous version so it might be worthwhile anyhow (or run it in VM if you want to keep using it).

Now I realize that I still use Word 2008 for when I need to open a .doc or .docx document that Google drive does not want... Time to upgrade...

There are 64 bit versions of the Microsoft Office apps. Do they not work in Catalina?

Office on the Mac has been 64-bit (only) for the last 3 years, and it should work fine in Catalina.

> The whole 32/64 bit thing doesn't seem too bad

Homebrew has some broken packages, e.g. trying to run Midnight Commander dumps a nice "Bad CPU type in executable" message.


After the upgrade, every time after sleep a window pops up saying "Verifying Dropbox" and it takes 30-50 minutes before it completes verifying. I don't get it. Why does it have to re-verify the same binary over and over again? I'm really disappointed in this release overall. Also, ejecting an application freezes and asks to force eject.

Macbook Pro, 15inch, Mid 2015 2.5GHz i7 16GB RAM 500GB SSD


I had the same CPU issues. It turned out to be some bug with the iCloud sync. Constant updates also ate up my phone’s battery and storage.

Turning off iCloud capability stopped the insanity.

Have been using Apple’s betas for years now. This season was the first time I regretted it. But I like telling myself that dark mode and iPad OS multitasking made it worth it.

Some UX improvements across the board are very neat though.


I'm pretty much stuck a couple versions back now (last rmbp with NVidia graphics, mid-2014). Really wish they'd let NVidia release drivers for the last two versions again.

In the end, new desktop runs Linux, and aside from a few issues for brand new hardware, it's been a nice change of pace.


I'm not sure I understand. Did your Nvidia card not work on Mojave? What were you gaining by not being on Mojave?

It doesn't work correctly, no. Apple stopped allowing signed Nvidia drivers starting with Mojave, and the drivers from Apple for the device are buggy and cause issues.

Maybe the driver issues you experience are resolved in more recent macOS versions. I run macOS 10.14.6 on a Mid 2014 MacBook Pro with GeForce GT 750M. No issues here.

possibly... I don't actually use my laptop, and when I did my hardware refresh a couple months ago, I went linux from my older hackintosh (nvidia gtx 1080)

Thanks for the heads up! I was just about to update. Didn’t see anything about the 32/64 bit thing in the update release..

No adobe and office products? That sounds quite bad!

Edit: it seems to be only old versions that won’t be supported (office 2011).


Aren't most Adobe products 64bit?

Current versions are. But, if you bought CS6 outright when Adobe switched to a cloud subscription model then CS6 is not going to work any longer (it's 32 bit).

That's the boat I am in. I can afford their cloud subscription but out of principle I will try their main competitor first (Affinity: $50 one time purchase). I will try running it in a VM second. Only if neither of those solutions are satisfactory will I pay a monthly fee for software that I use very little.


What Mac / config are you using ?

MacBook Pro (Retina, 15-inch, Mid 2015)

16GB RAM

1TB SSD (3rd party upgrade)


> I've had to use Pages instead of Word and can't use Adobe products anymore

And this "doesn't seem too bad"? These are the industry standards for productivity. Pages on the other hand is only good for throw-away projects. Last time I checked, importing and exporting to Microsoft Office sucked. So, Catalina pretty much turns your professional machine into an overpriced netbook.

Microsoft/Adobe will of course eventually release 64bit versions of their software. But these will cost extra $$$ on top of the premium you have already paid for Apple hardware. A lot of money with doubtful productivity gains. This won't make any professional happy.


Microsoft and Adobe already have 64 bit versions of everything. I just haven't upgraded in eight years. So it's really my fault.

I was pretty sure I got 64bit warnings for both of them fairly recently. But looking at the binaries now, they're both 64bit indeed.

Still, I'm running both using a site license from my work. The upgrade cost per user in site licensing is pretty low. If I had to pay myself for a single license, I'd bet I wouldn't have upgraded for eight years either. And the Office/Adobe CC subscription costs may not be insignificant either, depending on where you live.


not sure about this, I just upgraded and all of my office apps still work

Publishers of audio products (soft synths, DAWs), games, and a fair number of domain-specific products have told their customers not to upgrade yet.

Here's a rundown of products that depend on 32-bit support or that haven't been tested in 64-bit on Catalina yet:

Audio products: https://www.sweetwater.com/sweetcare/articles/macos-10-15-ca...

Games: https://www.macgamerhq.com/opinion/32-bit-mac-games/


Yep, Native Instruments (on that list) sent out an explicit email saying not to update: https://support.native-instruments.com/hc/en-us/articles/360...

With good reason, too. A good portion of the Native Instrument software packages won't install because the installers themselves rely on 32-bit helpers. A savvy technical user can work around it by unpacking and excising the problematic portions from the installer scripts.

On the plus side, I've been running Kontakt 5 along with lots of other audio software since 10.15 beta 3 every day and most everything is working alright.

Some other observations:

• Pro Tools' QuickTime video plug-in prevents it from launching because it's a 32-bit only subprocess meant to interact with the now-defunct QuickTime framework. You can delete the plug-in from the application bundle and it will proceed and seems to be working normally.

• EastWest has a nasty crash in PLAY 6 that can be worked around temporarily by removing their internal word builder plug-in.

• Pretty much every iZotope plug-in as of September had a 32-bit non-pkg installer. The software works fine if you copy all the relevant parts from an install on another computer.

• MOTU hardware drivers for anything but the latest Pro Audio line won't work as the drivers are 32-bit at the moment (MIDI interfaces, CueMix-based audio interfaces).


Waves sent one out last month as well. Par for the course really. Audio pros are some of the most distrustful users w.r.t updates because they've been burned so many times.

Another way of looking at that might be: developers of pro audio apps are notoriously bad at preparing their apps for new OS releases.

As a former developer of pro audio apps at Native Instruments: sometimes it's just that the first version of a macOS update has bugs that suddenly cause crashes in your lower-level OS calls, or unexplained latency. If something like this happens, users mostly blame the audio devs and not Apple. So, if you don't want to ruin your reputation, you better warn, test, wait, hope and fix (maybe not in that order).

I think that's unfair to audio developers. You're asking for a real-time scenario from an obviously not-real-time OS. Dropping audio is a lot more obvious than dropping video frames. Audio also deals with /a lot/ of plugins, both hardware and software--many paid for (years ago and now abandoned). Most people I know with an audio setup are very sensitive to physical changes due to subtle problems.

I also think it's unrealistic for large applications to be 100% ready on day one. It's not like Apple had a GM ready weeks ago. Betas are known to change, and nobody really knew when Catalina would ship (many people are surprised they're shipping what they have).


Audio software is very hard to get right and it's a relatively low margin business. Asking them to track new OS releases on day one is just not realistic, especially when Apple has a pretty bad track record on stability and maturity of x.0 releases of macOS.

Could be, but when the OS updates every 12 months they can either dedicate a whole chunk of their time to updating for the new version, or make it work well on a version that's still going to be supported for 3 or 4 years so they can focus on bug fixes and updates.

Just because Apple can bump out a new OS version once a year doesn't mean these app developers have the same bandwidth to keep up the chase. They have plenty of other priorities.


Or you could see it as your OS breaking lots of software you depend on and then telling you tough luck.

As a user I only see "OS upgraded -> stuff broken". I'll blame the OS for that. All the finger pointing and shoulda/coulda/woulda is not magically going to unbreak things; rolling back the upgrade will.


And why would they care to improve when they know their user base will let them get away with it and even side with them against Apple? Reposting a comment I left elsewhere in the thread:

For the most part, it's not a technical issue at all but a cultural one : pro audio users are notoriously, almost pathologically conservative when it comes to software upgrades.

You'll find plenty of threads on forums like Gearslutz, asking for tips on how to downgrade brand new Macs to an older version of macOS that doesn't even support their hardware. Or 2019 threads asking if it's now safe to upgrade to High Sierra. They're typically 2-3 versions behind. Why ? Older is just safer, better in their worldview.

In that context, audio developers know they have customers on their side against "evil Apple that's always breaking everything for no benefit", and they get away with emails that read like Apple just unexpectedly dropped a bomb on them without notice, and it'll take them 6-12 months to get ready, like WWDC and 3-4 months of developer betas never happened.


As a MacOS audio developer who supports legacy machines to this day, there's some truth to what you say but you're glossing over important realities: Apple has a long-standing pattern of revising developer tools to throw away support for working machines, and then requiring you to use their newest developer tools for current development.

It's a technical issue. It's a bear to support older systems from newer machines. (it's a lot easier to support current stuff from dawn-of-time old systems! I keep an antique laptop to code on which allows me to support EVERYTHING all the way back to PPC Macs. Which I do support)

My choices of what I choose to buy (in Apple hardware) or even CAN buy are conditioned very much by this reality. I'll get stuff if there's a fighting chance I can build a working ecosystem on it. I'll be willing to do things like ditch Logic and switch to Reaper, and I'll be well within my rights to tell users 'this is what I can offer, and this is what I cannot'.

Because Apple is not automatically my ally. It can be my adversary, even when I'm doing its bidding (I was fairly early in porting my entire product line to 64-bit when few others bothered. Apple literally called me and offered to help me do this, so I told 'em I'd already done it three months before. I did NOT tell them that I continued to support PPC machines or maintained a time capsule dev machine as the only way to develop for a large range of cheaply, easily available hardware)

Users have every right to side with me against Apple when I'm an open source developer letting them do professional-quality audio work on computers costing only a few hundred dollars, and/or letting them continue to use known-good and predictable equipment, and Apple is locked in to a course of action requiring it to churn its userbase at whatever cost to the userbase.

I totally get Apple's motivation here, but it doesn't serve my customers.


It's amusing to read your post because Apple seem to be touted as the OS for audio people.

I wonder what macOS has that Windows doesn't (which is a lot more backward compatible) that makes audio people stick with them.


OS X's CoreAudio is still an incredibly well-designed technical architecture, designed by a team who really understood digital clocking from a hardware perspective and the importance of low-latency kernel support. It's still kind of amazing to me that it was essentially fully baked by 2002-2003. On Windows there's now WASAPI Event, which has a similar architecture, but for the longest time third-parties had to step in with a third party solution (ASIO) because the OS support wasn't there (and ASIO really only solves a subset of the problems CoreAudio solves). I'm frustrated by how little attention the driver and documentation side of things has gotten from Apple since then, but for some specialized requirements the underlying architecture is still just fantastic.

Audio people have been moving away from Apple over the past few years specifically because of show-stopping bugs introduced by new versions of the OS. This really never happens in Windows now and it's become more and more appealing to switch.

Maybe on Windows 7, because Windows 10 does whatever it wants if connected to the internet.

All mainstream OSes are moving to an unsustainable release cycle.


macOS has had a reliable low-latency audio API for years, when on Windows you had to resort to third party hacks like ASIO.

For years it has also had things like Audio MIDI Setup, which lets you set up aggregate virtual audio interfaces from physical ones, dealing with latency compensation, etc. Or MIDI over Bluetooth. All of that out of the box. It just feels like it's been designed with pro audio in mind, compared to Windows.


Aside from the other replies you’ve already received that are spot on, macOS has also been consistently good at isolating the audio ports from coil noise / interference from the other electrical signals on both laptops and desktops. While “pro” audio folks are likely to be using an external audio interface anyway, having an audio jack that doesn’t garble the sound is one of many ways that Apple hardware engineers have been attentive to audio.

That's fair. Pro Tools crashing during a session has been a meme for a while, despite it being pretty stable for the last 5-ish years.

But keep in mind these companies are usually pretty small, with a lot of sole proprietors out there, and their revenue comes from new products/sales, not from maintaining projects. So there are logistical and incentive problems in deploying fixes quickly.


Haven't those developers had access to the beta for months? What's the point in the beta if people aren't getting ready before the release?

The writing has been on the wall for 32-bit for literally over a decade, and the porting work isn't going to magically happen now just because support has been dropped.

Imagine digging up an old project, recompiling it for 64-bit, and finding out that:

1. It doesn’t compile any more, with modern tools.

2. When you get it to compile, it crashes mysteriously.

3. When you get it to run, it again crashes mysteriously when you try a 64-bit build.

Yes, there are a lot of “best practices” out there to avoid this. Any discussion of best practices is moot because people in the field have to work with actual practices and legacy code that might be full of undefined behavior, custom build systems, weird hacks, and lost tribal knowledge. And you are simply not given enough time to go through and fix it.

This is not all about the 32-to-64 bit transition, but that’s the biggest part.

Then consider the smaller shops—which might only have a couple of Apple devices altogether, and not a lot of spare developer-weeks to go through and test old plugins on new systems.


Do you have any specific examples of what would break because of this?

I hate to suggest a web search but there are a lot of examples online if you search, e.g., https://www.viva64.com/en/a/0004/

One of the big ones is that integer constants are 32-bit on LP64 / LLP64 systems unless they are large enough, so e.g.

    // Equal to 0x80000000 on ILP32
    // Undefined behavior on LP64 / LLP64
    size_t max = 1u << (sizeof(size_t) * CHAR_BIT - 1);

    // Correct version
    size_t max = (size_t)1 << (sizeof(size_t) * CHAR_BIT - 1);
This can also happen with e.g. multiplication

    #define K 1024
    #define M (1024 * K)
    #define G (1024 * M)
    #define T (1024 * G)
    // Undefined behavior, on both 32-bit and 64-bit.
    size_t max_object_size = 10 * T;
    // Correct version (#1)
    size_t max_object_size = 10995116277760;
    // Correct version (#2)
    static const size_t K = 1024;
    static const size_t M = 1024 * K;
    ... etc ...
Or if you need to align your pointers for some reason, so you do the cast correctly (with uintptr_t) and get

    void *align_ptr(void *p) {
        // Mask is 0xfffffff0 on 32-bit (correct)
        // Widens to only 0x00000000fffffff0 on 64-bit, so the pointer's
        // high 32 bits get cleared (mistake)
        return (void *)(((uintptr_t)p + 0xf) & ~0xfu);
        // Correct version
        return (void *)(((uintptr_t)p + 0xf) & ~(uintptr_t)0xf);
    }
Consider that you might do some pointer alignment e.g. to work with SIMD. This stuff isn’t so crazy and if you haven’t been targeting 64-bit, a lot of it can creep into your code base over the years. Legacy projects may not compile even remotely cleanly with warnings enabled, it’s just a fact, so even though warnings / static analysis will catch some of the errors above you’re not safe.

This is why so many languages have stricter rules about converting integers to narrower / wider types.
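
As a small illustration (a made-up checksum function, not one of the examples above): the implicit size_t to unsigned int narrowing below compiles silently in C, and is exactly the kind of thing clang's -Wshorten-64-to-32 or the broader -Wconversion exists to flag, whereas stricter languages reject it outright.

    #include <string.h>

    unsigned int checksum(const char *buf) {
        unsigned int len = strlen(buf);  // size_t silently narrowed to 32 bits
        unsigned int sum = 0;
        for (unsigned int i = 0; i < len; i++)
            sum += (unsigned char)buf[i];
        return sum;
    }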


I was hoping for some real-life examples from some games, and reasons why they'd use something like pointer arithmetic in such a way, or so often that it couldn't be easily migrated, or why they'd even do it in the first place on a desktop machine. I get the old software.

Of course, basic examples of 32- vs 64-bit issues are easy to find. I guess I should have clarified, as the OP sounded like he knew the topic well.

I’ve never dug hard into C++ gaming architectures and the patterns they use.


In my experience a far more common and much harder issue is dealing with dependencies.

A large project probably has lots of them. If you are lucky there is a 64-bit version of each one. Very often though there isn't, so you have to find something equivalent and rewrite all interactions with it.

And there might have been very good reasons for choosing those specific dependencies.

That can take ages upon ages and can be quite demotivating. You might have issue even finding out if you have any problems in your own application after you've spent months on replacing dependencies.


A Collection of Examples of 64-bit Errors in Real Programs: https://www.viva64.com/en/a/0065/

Any kind of pointer math that assumes a pointer is 4 bytes is an easy example.
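
A hedged sketch of what that looks like in practice (hypothetical helper names, not taken from any particular game): stepping through an array of pointers with a hard-coded 4-byte stride.

    #include <stddef.h>

    // Buggy: assumes a pointer is 4 bytes. Fine on ILP32,
    // reads the wrong memory on LP64/LLP64.
    void *nth_ptr_buggy(void *table, size_t n) {
        return *(void **)((char *)table + n * 4);
    }

    // Fixed: let the compiler supply the pointer size.
    void *nth_ptr_fixed(void *table, size_t n) {
        return *(void **)((char *)table + n * sizeof(void *));
    }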

Why would you write code like that, except in a well-contained module?

It doesn't matter why they decided to. They did. We all know better, we can all point fingers until we are blue in the face, but that doesn't change that people will lose access to software that they purchased.

> except in a well-contained module?

This is probably one of the best reasons. Game developers aren't entirely to blame here, middleware developers seem to be sticking to 32bit like balsamic glazing. Even if you could pry their cold dead hands off their archaic architecture, they'd probably make you pay full price for the 64bit upgrade.

As for DAWs? Think about all the VST plugins out there. Most of which probably aren't maintained and only exist as zip files on the artist's dropbox.

Dropping 32-bit is aspirationally sound. It's also grossly inconsiderate of the most obvious aspects of reality.


I think that would be reasonable if we were talking about a tighter time-scale, but we're talking about at least 13 years since it's been 100% obvious that macOS would become 64-bit.

Yeah, but 64-bit usually doesn't give your users anything, so it's a tax that is more of a drain on smaller software shops. Consider also that lots of programs, and especially games, have a spike of sales when new and then sales decline to nothing. There may not be new versions, ever. So going back and updating them is pure loss for the developers.

This is one reason why ecosystems like Java are so valuable! The 64 bit transition was so easy for it because of the common insistence on "pure Java" for portability. Combined with pointer compression 64 bit was hardly noticed.


You're right, but here we are anyway. Reality is always absurd.

To make the counterpoint: if it's not forced, apparently no amount of time is enough to convince people to change the code.

Forcing it is a net-good perhaps?


Because, among many other reasons, game developers work 60 and 80 hour workweeks, and have to deliver a project that takes 3 years in 6 months.

A lot of apps over the years have exploited the fact that kernels used to sit above the 2GB address space boundary to encode data into the high bit.
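
Roughly, the trick looked like this (illustrative names only): because valid user-space pointers stayed below the 2GB boundary, bit 31 was always zero and could be repurposed as a tag, which stops round-tripping once pointers get wider than 32 bits.

    #include <stdint.h>

    #define TAG_BIT 0x80000000u  // bit 31: "free" only while pointers stay below 2GB

    // Pack a boolean flag into a pointer, assuming the pointer fits in 31 bits.
    uint32_t pack(void *p, int flag) {
        return (uint32_t)(uintptr_t)p | (flag ? TAG_BIT : 0u);
    }

    // Unpack; on 64-bit the (uint32_t) cast above has already thrown away
    // the upper half of the address, so this no longer round-trips.
    void *unpack(uint32_t packed, int *flag) {
        *flag = (packed & TAG_BIT) != 0;
        return (void *)(uintptr_t)(packed & ~TAG_BIT);
    }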

For many audio hardware drivers, it's not simply a matter of recompiling to 64-bit.

Apple has deprecated a lot of things and seems to be trying to move everything over to AudioServerPlugins, which essentially requires a full rewrite of the drivers. The architecture isn't similar at all (AudioServerPlugins actually follow a Microsoft-style COM interface, among other things). I don't even think Apple makes any sample code for updating old Firewire drivers to run on Catalina, so it's a tough slog, and low-level CoreAudio expertise to update old drivers is getting scarce. Traffic on the CoreAudio developer mailing list is down to less than a dozen posts most months. There hasn't been a single post to the mailing list yet in October.
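
For anyone unfamiliar with what a "Microsoft-style COM interface" means in practice, here is a generic illustration of the pattern in plain C (a struct whose first member is a vtable of function pointers, plus reference counting), using invented names; the real entry points live in <CoreAudio/AudioServerPlugIn.h> and are far more numerous:

    typedef struct ExampleDriver ExampleDriver;

    typedef struct {
        // The COM-style lifecycle methods every interface carries:
        int  (*QueryInterface)(ExampleDriver *self, const void *iid, void **out);
        long (*AddRef)(ExampleDriver *self);
        long (*Release)(ExampleDriver *self);
        // ...followed by the audio-specific entry points (names invented here):
        int  (*Initialize)(ExampleDriver *self);
        int  (*StartIO)(ExampleDriver *self);
    } ExampleDriverVTable;

    struct ExampleDriver {
        const ExampleDriverVTable *vtable;  // vtable pointer must be the first member
        long refCount;
    };

    // Callers go through the vtable, e.g.:  driver->vtable->AddRef(driver);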

I strongly suspect that some interface vendors are not going to update their 32 bit drivers for Catalina. (MOTU in particular is one, which kind of marks the end of an era because they used to be champions of some of the best CoreAudio features, like Firewire clock syncing.)


> I don't even think Apple makes any sample code for updating old Firewire drivers to run on Catalina

Surely, surely it was obvious for years that Firewire is on the way out? Why are people so reactive and sluggish? People should be ready for change.


As an independent dev I would rather spend my time working on new features and projects on an OS where my older projects continue to work fine, rather than spend my time constantly playing catch-up with Apple's policies. This combined with the notarization requirement has pushed me to stop developing for Mac OS.

This isn't just an OS thing, though, so I think that's crap. This change is a hardware change that's been forecast for literally years. To dismiss it as "playing catch up with Apple's policies" is nonsense.

You have a point, but another thing to note is that the quality of Apple’s developer docs for audio drivers and CoreAudio has been decreasing. Some major behaviours/quirks of new frameworks like AVAudioEngine on OS X are not documented at all. It’s a lot harder to chase the target than it used to be. This is IMHO one of the likely reasons that MOTU chose to implement as much as possible as an on-device web interface on their newer gear.

As a former MOTU engineer who worked on the software side of the new interfaces, the built-in web server was really about cross-device and multi-user access, not avoiding CoreAudio. We still had to write CoreAudio server plug-ins for Thunderbolt and proxy the web server through the driver as well.

I hear they are planning to update their 32-bit drivers for the older MIDI and audio interfaces with the exception of the PCIe devices. (There are no existing Macs that can run Catalina and the old PCIe interfaces with the exception of the new Mac Pros, but there will be no driver so...)


MOTU just released a statement indicating that they’re only working on new drivers for the hybrid and newer devices. Firewire gear like the popular 828mk3 classic aren’t getting drivers.

>new frameworks like AVAudioEngine on OS X

AVAudioEngine is not a new framework on macOS. It was first available on macOS Yosemite which was in beta in 2013 and released in 2014. Developers have had 6+ years to develop and test using these frameworks.

This literally just boils down to developers complaining that Apple isn't letting them procrastinate any further.


That illustrates the problem though... It’s not a new framework, but the documentation is still poor, and important limitations of it on OS X are not documented at all, years later. (I picked this example specifically because it was a subject of traffic on the CoreAudio mailing list last month.)

Apple expects developers to adopt new technologies, but often never gets around to documenting them well enough to drive adoption. It’s been a real issue for low level audio work on the platform.


"forecast for years" is distinct from "totally a good idea."

The time and resources a software publisher spends on the upgrade treadmill are time and resources they can't spend on other efforts, whether or not there's advance notice. Essentially, it's a cost Apple imposes on developers for its platforms. And either that shows up in costs for users, or the software folds.

If there's a counter to upgrade treadmill criticisms, one of the few that makes sense is if there are compelling benefits with the upgrade to weigh against the cost.

At the moment, it's entirely unclear to me what benefits I'm supposed to derive from dropping 32 bit support, and moreover, I can't think of a single compelling benefit I've derived from macOS updates since Snow Leopard.


This distinction does not seem very useful. Regardless of whether it was a good idea, developers had to decide at some point whether they should prepare for the transition or cease macOS development (unless they wanted to continue supporting old versions with a dwindling userbase). This decision could have been made years ago. It seems to me that anyone complaining the change is too soon or too sudden is forgetting the sheer amount of time they had to decide their best course of action. It should be a question of whether to continue development at all, since that should include a transition to 64-bit. If not, then deprecate the project. I understand many situations will not be this black-and-white, but it seems like many of them are.

> anyone complaining the change is too soon or too sudden

Notably, that's not my complaint or my point. This isn't about the timing (which is why the distinction was important).

No matter how much time someone has to prepare for a change, there's a cost to making it. And if the benefits of the change don't outweigh those costs for the developer (and the user), then a complaint sure makes sense from those perspectives.

This does intersect the issue of timing, because in a resource-constrained situation, you inevitably have to defer some efforts until they're absolutely necessary (and simply rule out doing others). But timing is not the fundamental issue, cost-benefit tradeoffs are.

> [development] should include a transition to 64-bit.

Why? What's the benefit to ending 32 bit application support?


> This change is a hardware change that's been forecast for literally years.

Huh? What x86 processor doesn’t support 32bit code?

If they move to ARM, everything will get broken, regardless of whether it's 32-bit or 64-bit. Is breaking things twice really better than breaking them once?


> What x86 processor doesn’t support 32bit code?

That's the wrong question to be asking. The relevant hardware change is when x86 CPUs that don't support 64-bit code finally disappeared from the market and eventually from the install base.

There is only a very narrow range of use cases for which running 32-bit code on a x86-64 processor is preferable to running the same code compiled as 64-bit. For everything else, going 64-bit is a clear improvement if not an outright necessity. Even if it didn't require any ongoing maintenance, continuing to ship 32-bit versions means wasting storage space and network bandwidth, and leaving on the table the improvements enabled by going 64-bit (eg. better ObjC runtime). For the entire existence of x86-64 processors it has been clear that retaining 32-bit compatibility through the entire software stack has significant downsides and that the cost/benefit balance has been inexorably moving toward eventually dropping 32-bit support.


Counterpoint: BareBones software has continuously updated BBEdit as Apple transitioned from:

- 68K Classic MacOS

- PPC Classic MacOS

- PPC OS X

- x86 MacOS


BBEdit is not a real time audio engine. It’s a text editor. And they’ve released thirteen versions which all require a relatively substantial amount of money to upgrade.

And what’s the problem with paying for software you use?

Nothing. But it's in their best interest to keep it working since they're going to charge you for the upgrade that continues to work.

Isn’t that also a good thing - that a company has a business model that allows them to keep supporting an app for almost two decades?

What alternative would you prefer? That Apple ships the latest MacOS with a 68K emulator, a PPC emulator, all of the 32 bit libraries, classic MacOS in a VM, and all of the Carbon libraries forever?


However, BareBones did drop TextWrangler, and that's one of the things that's going to keep me on Mojave for a while.

What's the issue with using BBEdit in free mode?

Didn't know it existed - thanks!

A caveat: the notarization is not forced. Apps can still run even if they aren't notarized, it just takes an extra click.

Studio hardware can generally last 20+ years (and can cost > $5000 for one rack) - you really don't want to throw it all away every 3 macOS releases.

Apple isn’t exactly known for backwards compatibility.

I could never justify a long-term hardware life knowing that I'm signing up for a short-term software life cycle. Not only a short-term software lifecycle, but a short-term computer hardware/OS lifecycle. Want more memory or a faster processor? Not with that old OS! Btw, you have to rewrite all of your software if you want that extra memory.


The hardware isn't the problem, though. The software is. All of these vendors have had ample time to update their software to work with the hardware.

Assuming the vendor is still in business, still supports the product line, and still develops software updates for models that have been obsolete for over a decade.

Many studios and many musicians are not wealthy enough to throw out a roomful of working equipment just because it doesn't conform to industry trends.

And they shouldn't have to. Developers should support their working equipment by updating drivers. They've known about the transition to 64-bit for years and it's not something that can just be ignored because all the hardware is now 64-bit. It's not like this was done on a lark.

This was addressed earlier in this subthread. It's not just an upgrade to 64-bit — they're also changing the low-level interfaces, and the new ones are apparently not well documented. So while it probably wasn't done on a lark, it doesn't sound like it was done in a very well considered manner either.

True, but a lot of those devs ignored the 32- to 64-bit transition for so long that now they're having to do both at once. Authors that moved to 64 bit several years ago have a lot more free time right now to concentrate on supporting new APIs.

>the new ones are apparently not well documented

The low-level interfaces in question were changed almost 4 major versions back. There aren't any new interfaces or APIs that haven't existed for 6+ years already. The only people complaining are the ones that procrastinated because Apple continued to support their outdated apps.


Apple introduced new low-level interfaces four major versions back, and Apple couldn't work up the interest to document them in the intervening years, and then Apple decided to cut off the interfaces that are documented, and somehow any problems resulting from this sequence of choices Apple made are the fault of third-party developers for not jumping to new technology that Apple didn't care enough to document? That interpretation of events seems a bit obsequious to Apple.

If everyone should have been prepared because this change has been in the works for so long, then surely it's inexcusable for Apple to have left the new stuff in a poorly documented state all that time. If you do feel that Apple's lack of preparation is reasonable when they had even more warning than third parties did, then it seems unreasonable to fault developers for also not being prepared.


> Why are people so reactive and sluggish? People should be ready for change.

I'm not sure if you'd be so ready for change if it meant replacing expensive gear that is still totally functional.


Is there a reason you don't freeze the macOS version, disable all the updates (if possible), disconnect it from the internet to prevent zero-day attacks, and continue using Mojave or whatever your gear works best with?

That's one option, but at some point one will want to get a new plugin or something that will require a new macOS version.

But the point is 32-bit apps aren't functional anymore. You either keep up by thinking ahead, or you find yourself suddenly stumped. There's no option to just ignore it.

> But the point is 32-bit apps aren't functional anymore

No? How come I can use Ableton Live 10 in 32 bits to make music then?


I'm pretty sure you can't. There seem to be only 64-bit versions of Live 10: https://www.ableton.com/en/trial/

When you open Live 10 in Mojave you get this alert:

https://help.ableton.com/hc/en-us/articles/360003076340


Interesting. So it's still mixed, and has the disadvantages of a 64-bit program (no 32-bit plugin support [1]) and of a 32-bit program (no compatibility with Catalina) while pretending to be a 64-bit program. Sounds great!

[1] At least on Windows.


On macOS Catalina?

The main objective is to make music. Running macOS Catalina is optional.

It's not if you're buying a new Mac from today.

Ok but we were talking about running on macOS Catalina here - you need to be 64-bit to run there. And if you don't, you're likely to get left behind.

Or, you know, having to upgrade anyway, people might switch platforms to one which has a better track record in supporting old software.

Because firewire interfaces that were purchased a long time ago still work fine in professional settings.

Audio is interesting in that it’s both a fundamental part of people’s day to day interactions with computers and devices, but also niche.

A lot of audio developers are dealing with legacy code that has been growing over the years with a focus on pleasing a very specific user base.

Just look at how long studios held onto OS9 because of the long tail of plugins they had that were awesome, but abandoned.


I recently updated my app to be macOS 10.15 compatible. Most of the changes were just annoying and difficult to debug, but some were glaring bugs related to Safari App Extensions which were show-stoppers. I really hope they fixed them before the release...

1) Although everything is "notarized," the Safari extensions aren't installed on app startup like they're supposed to be, in most cases.

2) The following API call works, UNTIL macOS wakes from sleep mode, in which case it always returns 'false.' SFSafariExtensionManager.getStateOfSafariExtension(...)


> Although everything is "notarized," the Safari extensions aren't installed on app startup like they're supposed to be, in most cases

Unfortunately, the only way I’ve found to figure out why this happens is to sprinkle breakpoints throughout Safari…


With audio hardware it's looking like substantial rewrites to core code because of underlying OS changes. I think it's just a lot of work on a short timeline for small development shops.

“Short timeline” assumes you didn’t see 64-bit coming a couple of years ago.

Not saying it isn’t work, just that “I didn’t see it coming” isn’t much of an excuse.


I mean, most software that's currently maintained has been 64-bit ready for a decade. The issue is that there's a lot of 3rd-party ghostware that isn't, and 32-bit bridging to support that code has been popular for a while.

That software is a lot like vintage gear, there may be alternatives but they aren't the same, and that's the problem.


This is the same boat I'm in. Some of these programs were created only by one person, who can be impossible to track down and get a timely response.

Sounds like the software was effectively already dead and it was best to move off it as soon as it became clear there was only one developer and they were hard to get hold of. The problem isn't with macOS.

I would note, Chris, given your insistence on all this, that your own project Graal is still shipping on Java 8, despite that being many years old. You're now working on moving to Java 11, which is itself already obsolete.

Imagine if tomorrow nobody could download GraalVM anymore because OpenJDK 8 stopped working for some reason (yes I know it's bundled, this is just a metaphor). It could easily be said you had years to upgrade, so why so sluggish? Well, of course, there were actual features you wanted to ship during this time too, not just doing upgrade work, especially given that Java 9 and 10 maybe didn't deliver many compelling upgrades.


Hmmm I take your point, but OpenJDK 8 is maintained.

A better example is the obvious impending transition to ARM, which GraalVM is already preparing for.


Sure, but so is Mojave. It'll be some years before Apple stops shipping security updates to older releases. Until then app vendors saying "don't upgrade macOS" is no different to Java developers saying "don't run this on Java 11 because it doesn't work yet" and we've seen plenty of that.

In fact, I'm guessing the pain of losing Java 8 will be too much for many organisations after so many years of stability and 9/10/11 breaking so much (current Gradle doesn't even work on Java 13!). Maintaining 8 will be a good business for a long time.


It's specifically a macOS issue. 32-bit software runs naturally on 64-bit hardware on any platform (Windows, BSD, Linux) without any maintenance.

It's just an Apple trick to force financial turnover for owners of 32-bit software.

It's not that their software is no longer compatible; it's macOS that artificially blocks 32-bit software.

It's for the same reason macOS blocks Sidecar for devices older than 2016, even though they are capable of running Sidecar in the first place.


Your software doesn't run in isolation. It needs services of the operating system. Apple has to spend resources maintaining 32-bit software and consumers would rather they didn't do that.

Poor Apple, that's certainly too onerous of a burden to put on their shoulders.

Going to 64-bit can be a lot of work, so there is quite a bit of software, still only lightly maintained, for which there isn't a 64-bit update available. Software which has worked perfectly fine up to now.

Additionally, it has zero benefit for many applications from a performance stand point, or makes them even slower.

> or makes them even slower.

Only in misleading microbenchmarks. In the real world, the memory bandwidth saved by using 32-bit pointers in some programs that can be guaranteed to not need more than 4GB of memory (or ASLR or other features enabled by x86-64) is completely outweighed by the costs of keeping both 64-bit and 32-bit libraries on disk and in memory and in cache. That's why even on Linux the x32 ABI was never able to gain traction even among Gentoo users, and why retaining traditional 32-bit support is viewed as only a compatibility measure for closed-source code that literally can't be updated.
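
To put rough numbers on the pointer-size effect both comments are weighing (a generic example; exact sizes depend on the ABI's padding rules), a pointer-heavy node roughly doubles when pointers go from 4 to 8 bytes:

    #include <stdio.h>

    struct node {
        int value;
        struct node *next;
        struct node *prev;
    };
    // Typical sizes: ILP32: 4 + 4 + 4 = 12 bytes; LP64: 4 (+4 padding) + 8 + 8 = 24 bytes.

    int main(void) {
        printf("sizeof(struct node) = %zu\n", sizeof(struct node));
        return 0;
    }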


> it has zero benefit for many applications

They literally can't run without porting, disregarding performance. The benefit is infinite!


But that's artificial, and you know it. The only reason for forcing 64bit is to lower Apple's support cost and push its bottom line.

Fine by me - sweeping out the dust.

If a program works fine, with no issues, why should companies be forced to update it simply because Apple decrees they no longer support 32 bit applications? "Sweeping the dust" is an absurd declaration.

How was it dead when it still worked? And why move off something that fulfills your needs and works?

32 bit softwares run on 64 bit hardware. Period. It doesn't require maintenace on any platform, neither BSD, Linux or Windows.

Depreciating 32 bit software is just an Apple trick to force financial turnover.

It's the same reason why the deactivated sidecar for macbook older than 2016.


It's a very odd trick given that Catalina still supports my 7 year old macbook pro. Why not force me to upgrade sooner if it's all about the money?

Not saying I agree with Apple's decision, but surely it's just about reducing their own engineering costs.


I think sidecar uses h265 so it's probably a question of the GPU either having dedicated h265 hardware or having enough grunt to run some kind of shader.

> “Short timeline” assumes you didn’t see 64-bit coming a couple of years ago.

Or that you haven't had the resources to spend on it


If you think the price of updating for 64-bit is expensive, you should see the price of not updating and not being able to run your software at all.

Thankfully I'm not in that position, but many companies can barely keep up with their day to day stuff. There is never time for doing something for the distant (or not so distant) future.

It's not just 64-bit though. There are also some fake-security "notarization" requirements that were notoriously vague and poorly documented for months, as well as some equally vague and unspecific threats that certain non-notarized applications might or might not cease to work in January 2020. I've been developing for Macs since the late 90s and have never seen less clear developer info.

See the discussion here - and notice how confusing and contradictory the opinions about it are: https://news.ycombinator.com/item?id=21179970


For a game where 95% of the revenue is made in the first year after release, the cost:benefit ratio probably doesn't work out. Especially for games where the rights situation isn't clear or the publisher is defunct.

Audio stuff is notoriously prone to issues with updates. This has come up continuously since Classic Mac OS in the 90s. Audio guys are even sensitive to changing their physical setup.

Outside of audio, I know of companies that ignore prerelease stuff because Apple has been known to change APIs during the beta period. It's not like Apple has a GM weeks before it is released publicly.


They have both a developer beta and a public testing beta prior to release. To ignore pre-release access to a codebase that will inevitably become the public release is, in my opinion, stupid and a death sentence. You know it's coming. Sticking your fingers in your ears on the off-chance that something changes before release is really, really silly.

"Ignore" is likely an exaggeration. It sounds like many of these audio companies have tested the betas. People have reported many, many emails over the past week pleading for them not to upgrade yet. I would think that anyone with experience in pro audio would be very careful about any upgrades.

The concern seems to be more of wasted time developing on a moving target. Often people report battery issues after major releases, people are reporting weird hangs after waking from sleep. Audio stuff is real-time in an OS that cannot guarantee it. Accommodating these things is a large undertaking and I'm sure there's gambling going on about what Apple will fix and what you can rewrite before Apple decides to ship.


But no one is saying that these companies are forcing people to move to the new OS. The issue is that there's not even an option to move to the new OS for some users because their software literally can't run. It's not about chasing a moving target when the 64-bit change was announced 4+ years ago. Waiting until the very last possible moment and then complaining that 1 or 2 APIs changed leaves no one to blame but the developer that ignored the warnings and waited to fix their issues.

I'm not talking about dropping 32-bit support. I'm talking about audio glitches that come up with every single major and minor update to macOS (or any OS). Catalina seems to be especially bad in regards to general stability.

Many audio tools have to deal with third-party, paid, and often abandoned drivers and plugins. So the 32-bit transition made for some difficult decisions. Nobody is arguing this is a surprise.


Wait until macOS moves to homemade ($$$) ARM CPUs and everything you use, except what is made by big publishers, is broken, forcing you to buy a huge string of fresh ARM software to replace deprecated x64 Intel-capable software.

In every Apple technical decision, there is a financial motivation.


A lot of developers won't update older versions of their software to be compatible either, so a serious computer musician has learnt by now to stick with a system that is working.

I don't think I'll upgrade my audio Macs for a very long time past 10.14. The fact is that after many years of recording and mixing music on computers, I've got several older plugins that are simply not maintained anymore by the developers, and moving to Catalina will certainly break any projects using them. Sucks that with DAWs, unless you run a completely stock setup without third-party plugins, you eventually run the risk of things going out of date / out of support.

Same here. I learnt the hard way when, after upgrading, I tried to use Reason 7 again a few months later, at which point I couldn't downgrade the OS at all. I essentially lost the ability to use Reason without an expensive upgrade.

I'm surprised a bunch of developers don't worry about breaking their development systems as well. I guess Docker and such have taken away a lot of the pain compared to how things used to be.

Both my audio macs are on Mojave now. I don’t think I’ll have any good reason to go to Catalina soon. I’d like to try it since it’s new but... too much could go wrong and it takes too long to restore backups to be worth my while.


Judging from all the "this will stop working soon" pop-ups I've seen, I'm pretty sure most of my Mac-capable Steam games will stop working if I upgrade.

There are widespread reports of Ableton Live 9 not working in Catalina, including from Ableton itself... but I just updated to Catalina and it launches and works... so either I'm doing something wrong or they're trying to get people to pay for an upgrade?

Same for Unity
