You can complain about OpenGL all you want, and praise Metal, but it doesn't matter how much better it is. I can't rewrite all my shaders and do the API integration. The fact that the richest company can't hire 10 engineers to maintain compatibility, and is instead pushing all this work onto thousands of small independent developers, is just rude.
With Apple's history, why spend time embracing their new tech if you can't count on it sticking around? For developers who write small one-off apps for iOS it's fine, but for people who make larger long-term applications, Apple is very developer hostile.
Apple has defined their single graphics API to be Metal, kind of but not quite like mainstream Linux where this has become Gallium3D (i.e. all open source drivers implement their OpenGL and Vulkan and sometimes even Direct3D support on top of Gallium3D). I fully expect OpenGL to be dropped from many future Linux drivers too. I'm actually looking forward to the day Linux drivers will go Vulkan only, as hopefully it allows things to get less buggy.
Even when OpenGL gets fully removed from macOS, you will still be able to use OpenGL. You can use Vulkan too. You just need a library for it, like ANGLE or MoltenVK. If that had been Apple's communication, there'd be a lot less fuss about this.
By the way, OpenGL still works on macOS 10.15. Deprecated does not mean it's not working.
The thing is, in its entire lifetime OpenGL was never meant to be a GPU abstraction layer - that only happened around the GeForce 2 era (and only for Nvidia). Even SGI's implementation did a lot of work on the CPU (and SGI even had a pure CPU implementation). OpenGL was always meant to be a high-level immediate-mode graphics API (immediate here meaning as opposed to retained/scene-graph APIs like OpenInventor).
> By the way, OpenGL still works on macOS 10.15. Deprecated does not mean it's not working.
Yeah, but as can clearly be seen, you cannot rely on Apple when it comes to keeping things working.
Every OpenGL newbie has to go through the rite of passage of assembling their own SDK for dealing with fonts, textures, materials, math, meshes,...
Whereas all competing APIs provide such features out of the box in their SDKs, including the console ones.
Plus there is this myth about portability, when any relatively big OpenGL application is full of extensions and multiple code paths to deal with OEM specific features, driver bugs or hardware glitches.
Then there is the little issue that OpenGL, OpenGL ES and WebGL are only superficially alike.
> Yeah, but as can clearly be seen, you cannot rely on Apple when it comes to keeping things working.
Well, yeah, that's what deprecated means. At the moment developers have a bit longer to get with the times, though, so that's nice I guess.
This would have required a lower-level interface that different vendors could all implement, which was not the case for early graphics hardware. In the decades since, graphics hardware has evolved to look more or less the same to software (so Vulkan became a nice fit), but that wasn't always the case.
You may be thinking about OpenGL 1.x and the fixed pipeline. Modern OpenGL is perfectly capable of running virtually every game and professional app out there.
OpenGL does NOT work on macOS and hasn't worked for a very long time. Apple only allows the use of an ancient version of OpenGL, which is not really my definition of "working".
Nobody is going to ever remove OpenGL from Linux (or Windows) drivers. OpenGL/D3D powers everything there. Vulkan and Direct3D 12 are just the lowest level and are not a replacement for OpenGL nor Direct3D 11. Drivers may implement them on top of Vulkan/D3D12, but removing? Never.
Finally, Apple has never told anybody to use MoltenVK, because they don't want you to. And I would not be surprised if they end up banning such layers if they become popular.
https://xdc2019.x.org/event/5/contributions/329/attachments/..., 2nd slide
This is just one example; here is a nice gathering from the latest Reboot Develop Blue 2019
As for Windows, OpenGL ICD drivers only run in legacy Win32 mode; UWP and Win32 sandbox mode don't allow ICD drivers.
The idea is that leaving behind old APIs will benefit driver and system stability for the entire ecosystem, benefiting your users in other ways. You might disagree with that view, but Apple is not doing this out of spite alone.
I'm not sure that switching to use a library like MoltenGL for this stuff would take months either, but you're in a better position to judge that I guess.
For example, in one of the email responses someone from Apple says that Carbon was a temporary solution, but frankly that was never clear to the outside world. It seems there were even plans to make Carbon 64-bit that were scrapped at the last minute.
Same with the hardware. One day, out of the blue, Apple announced its new laptops without USB-A ports. Even 3 years later, USB-A is still one of the most used peripheral ports in the PC world (including Mac and Windows). Even Apple does not ship peripherals with USB-C cables, other than adapters. AFAIK its mice and keyboards still ship with USB-A cables for charging.
These tactics are maybe ok for the consumer world but not for professionals who need reliability above all else.
USB-A still works, you just need a compatibility layer (a dongle).
OpenGL has been deprecated for years. Why are people reacting like this is a new development?
Not even that, a cable will do. I have zero dongles, and just bought a few USB-C-to-X cables for some tens of dollars...
Assuming you work at a company willing to spend the money on official Apple dongles and not cheap, flaky clones.
Unfortunately few companies will do that and now all Apple I/O goes through cheap unreliable dongles.
The reason Apple is doing this in such a poor way is that they want to push Metal as the answer. They want to incentivize developers to move to Metal by making their lives on OpenGL more difficult, because even with libraries, those libraries must be maintained and included. They don't want to bear responsibility for this legacy. I'm explaining, not defending.
This is the direct opposite of the Linux ecosystem, where Gallium3D not only facilitates both OpenGL and Vulkan but has had attempted Direct3D implementations in the past. Yet another case where Linux takes the "everything and the kitchen sink" approach while Apple just flat out dictates a single choice to everybody. That attitude sometimes helps Apple, but sometimes it doesn't.
Even among Mesa drivers, use of Gallium3D is not mandatory. If a driver team thinks it will make their job easier, they can use it; if they think it won't, they won't. The i965 driver (the current driver for Intel GPUs) doesn't use Gallium, for example, while the new one ("Iris"), currently in the works, will. For a long time, the AMD driver was the only one using Gallium3D.
Because as a user I don't want apps forcing extra baggage like old APIs to be supported forever in the OS. Nor do I want apps that are fossilized and don't take advantage of new platform features.
If I didn't care for apps being Mac-specific and taking advantage of what the platform has to offer and moving on with the platform, I'd just as well use Windows or Linux (and vice versa).
I think it is pretty unreasonable to require developers to rewrite their programs periodically just because there is some new feature in some new API.
Also, software that gets updated just for the sake of it usually ends up worse than before.
I can’t verify this claim, but given my personal experience it makes perfect sense. I run a 980 Ti alongside a 950 in my hackintosh tower and graphical glitches on the desktop happen pretty regularly despite both cards being perfectly healthy.
The other thing is that historically, Apple has been unwilling to differentiate drivers between consumer and workstation cards, because generally speaking that concept is kind of silly. Everybody should have workstation-class stability, not just those who shell out 3-5x more cash for a workstation card. It's obvious why Nvidia would take issue with this.
The Nvidia web drivers are not great but there are so many factors that could be producing those glitches.
I've built about a dozen hackintoshes since 2010, and while I've experienced many problems, I personally have never seen graphical glitches.
I don't think it's fair to expect the same level of reliability from a Frankenstein machine you've built yourself as from a commercial product.
I think in my case specifically the graphical glitches are caused by having dual GPUs. That's a somewhat uncommon configuration, but not so strange that it shouldn't be tested for.
Then Apple needs to dig down into their pocket, maybe they’ve got some extra money there, and solve this problem.
This is only a problem for game developers who live right on the GPU hardware and need to squeeze absolutely all the performance out of it. Which is probably a small number of admittedly very important developers: Game engines and AAA studios. For the rest of developers who have more modest requirements, I don't really understand what "modern" graphics APIs provide, besides tons of extra boilerplate code and headaches.
When I was a graphics programming newbie, it was great to have the whole fixed function pipeline all set up and ready for me to experiment and learn. If I had to learn from nothing on Vulkan, I'd probably have given up before the first 1000 lines. Drawing a triangle in OpenGL vs. Vulkan: http://sol.gfxile.net/temp/hellotriangle.png
The anti-managed bias in some Vulkan circles also doesn't help, especially when the OpenGL deprecation message keeps popping up in Vulkan-related talks.
Yes, using managed languages will take some performance away from Vulkan, but guess whose fault that is when no other API is offered as an alternative; not everyone is jumping for joy at using C for graphics programming.
And here Metal, DirectX, LibGNMN, NVN, WebGL and WebGPU take a much more developer-friendly attitude.
If something with a GUI hasn’t been updated in 20 years, chances are it’s not gonna be stellar software. Both technology and UI conventions have changed a lot.
I think the reason software abruptly changes (API switches, rewrites, UX redesigns, etc) is because doing that employs more software people than incremental improvements to old stuff. I don't think this is necessarily a choice, it's just the way it works out.
It's kind of a perfect business in that way. The downside is that occasionally an important customer says "enough!" They're totally justified in doing that, but it doesn't align with growth, so it's the cost of business.
That’s how you get Windows 10, where you’ll encounter three eras of UI, going all the way back to 1995, just to change your power settings. No thanks.
If the situation demands it I would say yes. I treat it as one big refactoring job. There are lots of shims in old software libraries that may not need to be there anymore. There are also lots of vulnerabilities that a fresh set of eyes and a different perspective can illuminate. A product that began its life being developed by a lone developer may have an entire team now. The rewrite is a good opportunity to explore how it works and why certain decisions were made.
... and a good opportunity to break existing and working features along the way.
If something works, why should you fix it?
I’m asking because outside of some big ones that I don’t use, like Final Cut Pro or Pro Tools - most of the Mac apps I’ve found are very low on features or “options” (flexibility). They all seem to try and follow Apple’s model of only serving the happy path and completely ignoring anyone who needs more than that - e.g. power users.
Even normal users get annoyed by this tendency, like recently when Apple removed the “mark all as read” option from Mail in iOS.
Scrivener (Mac only for 5-6 years, later had a Windows port),
Final Cut Pro X,
Logic Pro X,
>most of the Mac apps I’ve found are very low on features or “options” (flexibility). They all seem to try and follow Apple’s model of only serving the happy path and completely ignoring anyone who needs more than that - e.g. power users.
Or you know, "do a thing and do it well", the UNIX philosophy...
Not sure what power-user needs aren't met, though (especially with the full Unix userland available as well).
Apple News, though it's still wet behind the ears and needs more features.
Sequel Pro, though if there's a Windows equivalent my IT department hasn't found it yet.
There are a bunch of tiny workhorse utilities like FileChute, EasyBatchPhoto, and Subler. Though there's probably a Windows version of Subler.
There are a number of Mac-only apps that attract people, but it's not the 1990s anymore. People don't use a platform for a "killer app." Most people choose a computer because they like the way it works. macOS works better for a lot of people than Windows or Linux. They may also be partial to the hardware for various reasons.
They all seem to try and follow Apple’s model of only serving the happy path
Apple and its users focus on getting things done. Productivity is highly valued.
completely ignoring anyone who needs more than that - e.g. power users.
"Power users" is just Nerd for "tweakers and hobbyists." Apple hasn't been interested in that demographic since 1984. If you're a latter-day "power user," good for you. Linux awaits.
The only reason I keep a Windows box around is for IE11 compatibility testing. And Linux I only use on servers. That's because the way macOS works makes sense to me. It didn't at first, coming from Windows. But now that I understand the workflow conventions, it makes sense, and I prefer it.
I don’t buy the argument that Apple wants to let you get things done when their stuff lacks things like that. How is removing a feature like “mark all as read” helping me do that?
And if you think Finder works better than Windows Explorer or other products that emulate Windows Explorer, you’re certainly in the minority.
I've used Windows and Linux (several wms), I wouldn't call their window management any more competent (if not less).
If you want a tiling WM for macOS, there are a few.
But managing windows is mostly bikeshedding, and for that something like Linux would serve better...
Finder still feels less cluttered to me than Explorer. It isn't quite as simple as Finder vs Explorer, it's the built-in assumptions. For me, Finder + Unix filesystem vs Explorer + Windows filesystem is a no-brainer.
If I have to install a third-party app to do this stuff, it doesn't really help me when I go over to some junior developer's machine to help them.
The problem is that macOS is what they call "app-centric" and they're the only ones who do it. It's not better, but Apple will try to convince everyone that it is, just like when they stuck with the single mouse button for a decade when multi-button mice were clearly the standard.
Actually, it’s document-centric - the Windows approach is “app”-centric. It has always been like that, 35 years and counting. It’s arguably a “truer” implementation of a windowed UI than Windows.
Windows is window-centric. That’s why when you do alt tab, you see all of the windows. That’s also why you classically see a separate task bar item for each window.
I’d like to hear an argument in support of your view though.
TablePlus has a Windows port.
* OmniFocus and anything from Omni Group
* Keyboard Maestro
* iStat Menus
and tons more that I have not personally used or found useful.
I think their idea is: if you're not going to make a full-on Mac app -- leveraging the extra capabilities, the native APIs, the native look, etc., and updating with the platform -- don't bother.
It's perhaps too much for its small market share (2-10% depending on region, though with much better share of the richer, actually-paying-for-software demographic), but if that wasn't the case, what would really differentiate macOS from Windows?
Just the UI and under the hood stuff, when both are running the same identical apps and codebases?
Oh come on spare us the ethics lesson. You know perfectly well that OpenGL is an archaic stone around the neck for many developers. To the point that many (all) industry players are trying their hand at superseding it (of course also for lock-in, but that’s not the only reason.)
During transitions things come and go, adoption sways, and ideas change.
Deal with it, it’s a fact of life
Also bear in mind that I currently support the application on Windows, Linux, FreeBSD and MacOS. OpenGL rendering works the same on all four platforms, and I have a unified cross-platform codebase which compiles without any trouble on all of them. If I switch to Metal, that's going to mean dropping all the other platforms which use OpenGL. Or it's going to mean having a separate MacOS-specific implementation for Metal. Dropping the other platforms is unacceptable.
MacOS is a niche. Windows is a bigger market. Depending on the application, Linux might be as well. Consider that Windows, Linux, and even FreeBSD can support beefy modern GPUs while MacOS is quite limited in the on-board GPUs on its laptops and other systems. When push comes to shove, MacOS is the platform which I will drop first. It's already got the oldest and most limited OpenGL implementation and the poorest hardware, and this makes it the worst choice for running my application in the first place.
If Apple want to retain developers like me, then they can provide a first-class current OpenGL implementation based on top of Metal. And then they can also provide a high-quality Vulkan implementation also based on top of Metal (or the other way around). Because if I do upgrade, I'll be upgrading to Vulkan on all four platforms, not to Metal.
You can't come now and whine that Apple changes UI frameworks too often.
On the GUI front I had to work with Win32, MFC, WebForms, Silverlight, XAML and WinRT.
I've been through two DirectX breaking upgrades where we had to rewrite pretty much everything that touched the API because of deprecations.
Sure, old things keep working for users, but if you stick with ancient technology it gets way too hard to hire developers.
I'm not complaining though: I love writing code and upgrading things is a breath of fresh air. Every Microsoft API was significantly better than the one preceding it, IMO.
For me, the truth is somewhere in the middle. Yes, there is more churn on Apple platforms. On the other hand, it also allows Apple and the ecosystem to move faster. It is amazing what they have been able to achieve with (just naming some random things): wide Touch ID support, Metal, or application sandboxing.
Plus you can keep using your C++ core even if the UI is Cocoa.
The same applies to Swift, you can write or rewrite parts of the app in Swift and keep part of it in C++/Rust/Objective-C or your preferred language.
The same goes for SwiftUI: you can mix it with NSViews and UIViews.
Microsoft and Linus have a specific fanatical attitude about the installed base of existing apps. Apple simply does not.
Contrast this with Windows, where browsing through system settings is like going back in time from Windows 10/8 to XP to 95 all the way to 3.11, if you click "Advanced" enough times.
Some Windows applications (especially installers) pop up dialogs that don't support anti-aliased fonts. It gives an impression of an OS with wildly inconsistent GUI and legacy garbage all over the place.
Note why we know this. We know this because those legacy dialogs are useful and no one could be bothered to rewrite them with new GUI toolkits. Probably someone would write new dialogs if the old ones were forced to die. Probably the replacement would even be more convenient. Or probably not (on both counts).
This isn't FLOSS.
You should keep a few replacements in reserve in case some minor component fails.
But at any rate, I'd have assumed that someone with "nostalgia" for old games would therefore have played those old games, and thus had that system, and thus could just have kept it around as a disk image. I have Mac OS disk images going back to the beginning; it was by far the best way to install anyway, because it was so much faster than actually going off the disc (the ease and customizability of network installs was also wonderful, and I miss that). For really old stuff you can just emulate it via something like SheepShaver, no need for a VM at all. I actually find that's the same as Windows: for all its vaunted backwards compatibility, I had the damnedest time getting some old games working even under W7, and it was much, much easier to just keep some VMs around to cover 98, 2000, and XP. Those are old enough that there is full virtual 3D support, even without hardware passthrough.
If you really wanted an old copy of 10.5 or 10.4 or whatever directly, despite having not used/kept them, and you didn't want to get it off the net, you could just ask around a Mac site or buy a DVD. A quick look on eBay shows tons for sale for $10-20; it's not as if they are some rare collector's item. And if you went on some Mac forum and asked, there'd be people like me with images, or old discs just sitting around in closets collecting dust, that you could have for a stamp. It'd be a one-time issue, because then you'd immediately image it and keep it forever, since they aren't big (looks like my Mac OS X Server 10.6.dmg is about 6.95GB).
10.5 supported x86 out of the gate.
I think you mean "obtained a license to use" ;)
While you did purchase the license, (IANAL) I don't think Apple is legally obligated to provide the original software for which you obtained a license, at least per the license agreement.
The kernel not breaking userspace is quite a different thing, particularly when comparing the macOS kernel and Linux, because Linux specifically has a stable ABI and macOS doesn't. On macOS it's libsystem that provides the stable ABI, and the kernel interfaces change underneath.
Beyond the kernel APIs and libsystem, you have the actual platform interfaces.
Carbon is a deprecated framework and has been for a long time. If apps are using Carbon then they must be updated; Apple has been warning about this since 2017 (beyond just the deprecation notice in 2012 and everything in between).
And Windows 95 apps almost certainly do not run unmodified... it's been a while and things may have changed, but as I recall, in XP (SP2, even?) they introduced the thing where you run an app in compatibility mode and actually select which version to be compatible with. This works in some cases and not in others.
Not to mention win32 support has held back Windows stability and progress for a long time.
Do you just like the way it looks and feels? Is there an essential program you use that needs El Cap? Are you using a 2008 MacBook?
I'm not being critical. I have to run Jaguar on a dedicated machine I keep in the closet for a niche reason, so I'm curious what problem you face.
Interestingly, Carbon still pops up here and there, even in programs from large companies with tons of resources like Adobe.
The giveaway is when you ask it to do something hard and instead of getting the SPOD, you get the watch icon.
It was deprecated in 2012. In 2017, with High Sierra, they announced that Carbon apps would require changes to work at all.
Archive.org records the wailing and gnashing of teeth which accompanied the relatively sudden announcement:
You can still find developer documents in Apple's archives referring to a 64-bit upgrade path for Carbon apps.
That's why they have drink-themed names to fit with Java. Porting a classic Mac app to Carbon was called "Carbonization."
Despite my own personal plights with small desktop utilities I write now and then, I don't think Apple has broken Cocoa _that much_ or that it is untenable for a small company to ship cross-platform desktop apps these days (never mind Electron, I'm talking about toolkits like Qt, Xamarin, etc.). _Especially_ if their core functionality can be cleanly detached from the UI.
I poked around a bit to figure out the detail to which the software does estimations, and it does seem like there is a very complex UI:
...but I keep wondering how much of it could be abstracted away by a cross-platform toolkit, and what kind of separation there is between the estimation/modeling code and the UI itself.
We'll never know without a good overview of the internals, but my guess (based on looking at many internal corporate apps over the years, from the pre-web days) is that this evolved organically over time and built up technical debt by literally following UI abstractions rather than isolating their core code from them.
> The basic database features in Goldenseal are based on NeoAccess, licensed from NeoLogic Systems. Unfortunately we ended up re-writing a large percentage of the code there, which explains 6 or 8 months of the delay in our shipping date.
> We have used OOP (object oriented programming) throughout Goldenseal. It's a very good programming model which makes it easy for us to add new features, without getting lost in the 5+ million lines of code that the program now contains. A lot of that is comments and spacer rows, but it still represents many programmer-years!
> We estimate that about 50% of the code is for the basic interface—screen display, file management, mouse behavior. About 25% handles data storage for the many object classes, and 25% handles posting, action dialogs, basic accounting and estimating features.
In their blog post they said they worked for three years on the Mac version... maybe the company is just one person without much time or coding experience. Maybe I am underestimating the problem, but in my day they would have just hired a student or the neighbor's nephew to hack together something useful.
Now if there is a lot of intertwined code that makes the UI less portable, that's another issue, but I currently don't see how their UI layer could be so complex as to not be easily shifted to Qt.
The cross-platform nature alone would be worth the cost, in my opinion.
It might even work better as a web site, so the customers don't have to worry about backups, sharing data, etc.
Apparently they tried Qt in 2015: https://turtlesoft.com/wp/?p=198
There weren’t many 32 bit only Intel based Macs.
This complaint says far more about the developer than Apple.
I don't understand why Apple even created the x86 OS X ABI.
When they introduced the first x86 Macs, the writing was already on the wall; in the same way that Carbon was said to have been deprecated for years, x86 was barely in the door and already on its way out. You would not create a new Carbon-based app today. Apple introduced the ABI in similar circumstances? Well, maintaining it for decades afterward comes with the territory.
Yes, as a user, I do mind removing it. For example, Apple broke Preview scanning on Samsung MFPs for the entire Mojave lifecycle, there's no indication that they are going to fix it, and as a workaround users are using Samsung Easy Document Creator, which talks directly to the scanner, avoiding Apple's scanner libraries. Yes, it is a 32-bit app.
Ah yes, I had forgotten that (the Apple Intel dev boxes were P4s).
Xeons arrived with the Mac Pro.
> The people I hear complaining about this are those who, like you, didn't move to Cocoa. Carbon was a _temporary_ transition API*. It was necessary when Mac OS X shipped in March 2001, but even though it wasn't yet formally deprecated, it was clear it would be.
At market-rate developer salary, it doesn't take a lot of man months for the port not to be worth the investment for small shops.
The reality is that Turtle couldn't make the business case to port to Cocoa at any point in 18 years, probably because most of their customers were on Windows.
That's not them being lazy, that's just them directing their resources according to market demand and engineering constraints.
And Apple is not being greedy or uncaring, they also have to direct resources according to market demand and engineering constraints.
If Turtle had more Mac customers, they'd have either ported their app, or they'd purchase a Carbon compatibility library. And if more developers were actively using Carbon, Apple would put more resources into it.
They aren't, and so the two are going to part ways.
I think that guy lives in a bubble; Microsoft still leads desktop OS market share by a huge margin :-P.
And really the main reason is that they try their hardest to not break people's applications. If Windows suddenly couldn't run the applications people wanted, everyone would migrate to Linux (and some to Mac, but Linux is free so the majority would go for the free stuff).
Now when I get an exe that needs to run in compatibility mode I don't even bother with it. I'm not compromising my computer because a developer has abandoned their software.
Yeah, the arc was that Cocoa was the future, but as late as 2007, Carbon was still widely considered a viable target for new apps.
Yes, it says that they didn't bother to waste resources doing unnecessary changes.
But it also says about Apple that they do not respect the time and resources of the developers who bother to support their platform.
There is no way to spin Apple breaking APIs and programs that people have worked on (for developers) and bought (for customers) in a positive way.
Keeping up with API changes from a decade ago doesn't qualify as unnecessary.
> There is no way to spin Apple breaking APIs and programs that people have worked on (for developers)
Here's the point. Apple isn't doing that. Apple don't visit customers with old macs to push breaking patches.
What happened is the developers made something that worked on old macs, didn't put in the effort to keep it up to date and are now complaining that Apple won't put in the efforts to support the old APIs on new systems anymore.
> and bought (for customers) in a positive way.
This is the developers' fault, I guess.
The API didn't change, it worked until Apple decided to break it in a subsequent version of macOS.
> Here's the point. Apple isn't doing that. Apple don't visit customers with old macs to push breaking patches.
They broke the API in new macOS.
> What happened is the developers made something that worked on old macs, didn't put in the effort to keep it up to date and are now complaining that Apple won't put in the efforts to support the old APIs on new systems anymore.
Yes, that is exactly the issue: Apple shouldn't have broken the API in the new macOS. The developers are perfectly in their right to complain about Apple breaking their applications and forcing them to waste time rewriting code that already worked to do the exact same thing only now in a different way because Apple doesn't care about the developers' time.
And yes, it is all on Apple - the breakage wasn't forced on Apple, it was something Apple decided to do.
> Meanwhile, our Windows version hasn't needed any work since 2000.
Microsoft's impressive backcompat is a blessing as much as it is a curse, and is also the cause of the [subjective opinion incoming] awful User Experience and complete lack of UI and UX consistency, and it remains the number one reason I don't wish to go back to Windows.
I made the Windows -> Mac transition in 2008 and recently played around with a modern Windows machine. I was pretty stunned to find the Control Panel experience... not unchanged, but still eerily similar to Windows XP.
Windows' Control Panel is a trainwreck of UX compared to OSX System Preferences. Being such a core part of the OS, I really expected to see, well, something new and better. But then I remembered how countless programs embed themselves in the control panel via DLLs, so Microsoft probably can't make major changes to the UX without breaking binary compatibility with ancient software packages.
There are no hooks, for example, to modify the Displays prefpane; if Apple decides they want to update it, they're free to do so. On the other hand, a lot of graphics card vendors still needlessly add their own little tab to the display adaptor control panel and the desktop's context menu.
They definitely might be outdated, though.
Apple doesn't want you to build an app 20 years ago, make no updates to it and continue to sell it as though nothing has changed in the time since.
Their website consists mostly of broken links, the design is dated, and it shows reviews and awards from 10 years ago.
It's clear to me that they couldn't be bothered to put any effort in at any point in the buyer's journey, so I say good riddance to them.
Happened to me recently with an app which is the interactive version of a book. From one day to the next they switched from buying chapters/the book to some stupid subscription.
If they can't make a living they need to charge more and if they can't charge more they need to find a real job/business.
It’s especially bizarre that you’re ranting about them needing to find a real business when that is exactly what they’re doing by finding a billing model which is viable long-term.
It's impossible to say whether, a week from now, a particular app will:
a) still be available
b) have switched to subscriptions, or free + IAP, or free with ads and IAP
c) have changed its IAPs, and whether your old IAPs (if any) will still work
d) have decided to sell your info to the next available bidder
This is why almost all apps are worthless and why I've essentially stopped purchasing or downloading apps. It's simply not worth the trouble to invest time to learn to use an app and investigate whether you can trust the developer.
That’s exactly what they’re doing: charging more.
The only kind of reasonable subscription is what's used for IntelliJ & co: if you stop paying after a minimum of X months, you get to keep what you paid for.
$200 is about 10 to 20-ish decent meals in most US cities (not high class, but not bottom barrel fast food either).
Do _most_ apps really not provide as much utility over a lifetime of use as ten meals?
If the users see no value in paying for it, then there are better business opportunities to spend development efforts on.
A particularly funny case was one otherwise rock-solid app which started SIGBUS'ing on iOS 13.
I never found iOS API changes any more extreme than Android or the web.
Had he used Cocoa, the rest would have been trivial (the x86 transition, 64-bit, etc.).
Or they could use whatever they like (C++, Pascal, what have you) and have their own UI/compatibility layer between OSes, like Adobe does, for example (and several others, big and small: Sublime Text is a one-man shop, and they make their own UI just fine).
The first response in the thread is not far off:
The people I hear complaining about this are those who, like you, didn't move to Cocoa. Carbon was a _temporary_ transition API. It was necessary when Mac OS X shipped in March 2001, but even though it wasn't yet formally deprecated, it was clear it would be. The Carbon UI frameworks were deprecated circa, um, 2006(?). QuickTime has been deprecated nearly as long. 64-bit systems shipped in the mid-2000s, even before the x86 transition, and it was obvious then that 32-bit would eventually go away.
Eighteen years is _forever_ in the tech industry. At the time Cocoa was introduced, the Mac itself hadn't even been around that long!
It sounds like keeping an app limping along on 30-year-old APIs, and then suddenly trying to move it forward all at once, is a bad idea. By comparison, keeping a Cocoa app up to date isn't that big a deal. I was maintaining Cocoa apps during the 64-bit, x86 and ARC transitions and had to make very few code changes. I've been out of the UI world for about 8 years, and there have definitely been significant changes in areas like view layout and document handling, but adapting to those isn't rocket science.
Yes, Microsoft is rather fanatical about compatibility. But that's part of what lost them their lead after the '90s: the amount of development resources needed to keep everything working exactly the same, and the difficulty of making forward progress without breaking any apps.
I think the problem is that their software is just a complete mess but they won't admit it. This whole thing has "sunk cost fallacy" written all over it.
You can do everything and more on a Mac.
Tablets, sure, when converted into pseudo-laptops; and unless we are talking about iPads here, European shops are increasingly replacing the Android tablets on sale with Windows 10 laptops with detachable keyboards and touch screens.
As someone that does native/web development, the only area where the web wins is typical CRUD applications; anything more resource-intensive just brings the browser to a halt, and stuff like WebGL is still hit and miss.
As for doing everything and more on a Mac, as much as I like Metal Compute Shaders, they aren't a match to CUDA tooling.
Finally, as much as I like Apple's platforms, they are out of reach for a large segment of the world population, no matter what.
Depends on the office work. A lot of stuff would be doable even on a phone, as many commonplace apps are available, if it weren't for the ergonomics (small screen, no full keyboard, etc.).
>As someone that does native/web development, the only area where Web wins are the typical CRUD applications, anything more resource intensive just brings the browser to halt, and for stuff like WebGL it still hit and miss.
As someone who is a heavy user of the other apps (NLEs, DAWs, drawing/bitmap editing) where the web is a non-starter (and I don't care for all the half-arsed attempts at web-DAWs and such), I agree.
But for business, CRUD apps are 90% of their needs, plus Word/Excel etc, for which Google Docs is a lot of the way there (and even if not, they exist in good shape natively for both Windows and Mac).
>As for doing everything and more on a Mac, as much as I like Metal Compute Shaders, they aren't a match to CUDA tooling.
Perhaps, I don't use CUDA or do 3D at all.
>Finally, as much as I like Apple's platforms, they are out of reach for a large segment of the world population, no matter what.
Sure, but that's also true for workstation-like PCs, and commercial compilers/IDEs, which you're in favor of, no? :-)
Anytime I put together a decent PC with best of breed parts, it goes to 3-4K. And commercial offerings from Dell with similar specs also go there, same for laptops, e.g. Lenovo, and the like.
And I've seen commercial compilers/IDEs priced in the $1K/$2K range, with which you can surely buy a Macbook Air or similar...
Naturally there are those that feel entitled to get a Ferrari to go down the grocery store, but that is their problem.
If one is buying at enterprise-class prices, then it is always going to be more expensive with Apple's hardware, because those compilers and IDEs are not part of Apple's offering and add to the already expensive hardware price.
And if by workstation you mean a really beefy one, Apple's alternative is only their top-end hardware.
Thus at the end of the day, when one does the math of what one is getting per buck/dollar/yen/..., it's still way over the usual budget on the PC side.
But yeah, if I can't work on a Linux machine I'd settle for a Mac.
But Carbon isn't one of them. I knew Carbon was about to be deprecated in 2005 when I was coding a crappy UI for a crappy OCR solution. Carbon was just a forward-compat GUI lib on Mac OS classic (1998?) for apps to run on Mac OS X (2000).
Likewise, Cocoa in 10.0 was buggy and incomplete. It took until about 10.4 or 10.5 for Cocoa to reach feature parity with Carbon, and many Cocoa applications were using Carbon for certain features (in fact, last time I checked, Cocoa menus were implemented in Carbon).
Bolting on Retina support to that is such a no-brainer that I hesitate to call it adding features - that was table stakes in making sure the switch to retina didn’t look like total ass.
None of this changed the fact that Carbon was deprecated. People just sat around not listening to Apple, and only in the last two years did the bigger GUI toolkits finally do the work to transition properly.
Toolkits that used HIView/HIWindow looked very out of date compared to proper Cocoa implementations (Qt, contrary to popular belief, wasn't really "native" for the longest time since they did this - it's part of why it always looked off).
There are very valid reasons to be frustrated with Apple, but the writing has been on the wall for this stuff for years now. Nobody should be complaining at this point.
Carbon NIBs even existed, although the format (XML) and concept (definitely not “freeze-dried objects”) were totally different from anything Cocoa. Xcode 4 and up couldn’t edit or view Carbon NIBs, which made that tech deprecation pretty clear.
Apple doesn’t deprecate ports, since they don’t run them.
But apparently Apple doesn't care about their users and developers.
Saying "told you so" doesn't help much, the only thing that helps is keeping things working. Anything else doesn't matter.
Though if there is anything that can be "told you so"'d that would be bothering with Apple's platforms when Apple doesn't care about you.
You don't need to ignore those. GTK+2 was released ~17 years ago and Qt4 was ~12 years ago. Apps written with them still run perfectly fine on modern Linux distros.
Same with Gtk+2. Once Gtk+2 itself is removed from the repositories - like Gtk+1 and Qt4 before it - Gtk+2 applications will also stop working. It's a similar case with Gtk+3, now that the Gtk developers are working on yet another incompatible major version, Gtk+4.
Also keep in mind that even though Gtk+2 and Qt4 were released years ago, applications are still being written and released using those versions. Lazarus applications, for example, are released to this day with Gtk+2, as the Gtk+2 backend is by far the most stable; the Gtk+3 backend is still in an alpha state due to the project's low manpower. Note that this manpower wouldn't need to be wasted if Gtk+3 were backwards compatible with Gtk+2 - all that time could instead be spent stabilizing and improving Lazarus, which shows how much developer time is wasted when libraries many others rely on break backwards compatibility.
Now, sometimes someone will suggest that applications should bundle their dependencies, but this introduces other problems, like the bundled libraries missing features and fixes, or having configuration mismatches with the libraries already installed on the system. This is a worse situation: instead of potentially addressing forward compatibility (you can't know for sure whether the developers of the systems you rely on will promise not to break your application), you are breaking current compatibility.
And your last paragraph sounds like you don't really know what you want. Do you want old applications to run exactly as is (which is solved by bundled dependencies), or do you want their developers to update them forever for decades and decades (which is solved by abandoning old toolkits)?
Finally, none of this even touches the fact that your "sane" win32 platform has incompatibilities too, and there are many, many old win32 apps that don't run in modern versions of Windows.
"...as doing otherwise will break all the applications that rely on that tech that people bought and rely on". Do not remove the important bit.
Also, I wasn't only referring to Win32, as Win32 does get updates, though they are minimal.
> Gtk2 and Qt4 are still around and up-to-date packages are available for all major distros.
For now. But as I already wrote, several distros like Debian (and thus any that depend on it) and Arch are planning to remove it (just like they did Qt3, Qt2, etc.). I already wrote that; why are you responding as if I hadn't already addressed the issue?
> And your last paragraph sounds like you don't really know what you want. Do you want old applications to run exactly as is (which is solved by bundled dependencies), or do you want their developers to update them forever for decades and decades (which is solved by abandoning old toolkits)?
It only sounds like I don't know what I want because you see "which is solved by bundled dependencies" and "which is solved by abandoning old toolkits" as the only possible solutions. I didn't bring those up because they are the only possible solutions; I brought them up to explain why they are bad solutions (something I'm not going to repeat, as I already wrote it).
Another solution, which I have repeated multiple times, is for the libraries to not break backwards compatibility. If the libraries do not break backwards compatibility then you can simply link against them dynamically, rely on them being there and provided by the OS (or at least ask for them and expect the OS to be able to provide them), and you won't need to bundle anything (the library is provided by the OS) nor worry about breakage (the application will keep working because the library won't break).
I mean, it isn't rocket science, and it isn't dreamland; it is something already happening on both Windows and Linux to an extent. On Windows it's USER32.DLL et al.; on Linux it's the C library, libX11, libGL, etc. What I'm saying is to extend this to the libraries that also provide a full UI, not just the lowest levels.
> Finally, none of this even touches the fact that your "sane" win32 platform has incompatibilities too and there are many, many old win32 apps that doesn't run in modern versions of Windows.
Yes, there are incompatibilities, but Windows still tries to be backwards compatible and for the most part it succeeds. I have a lot of older software that works perfectly fine under Windows 10, either as-is or with minor tweaks. Any incompatibilities that do exist aren't because the Windows developers didn't try; they exist despite their efforts.
On the other hand, incompatibilities with Gtk, Qt and now macOS are there because their developers explicitly and intentionally decided to break their APIs. They do not even try (Qt would have a hard time due to being a C++ API, but Gtk has no excuse).
A screenshot: https://www.turtlesoft.com/Accounting-Software.html#Chart_Of...
I want to be respectful to an indie developer, but I think it's worth considering the kind of niche he works in (guessing Windows-centric), where he probably does more high-touch sales.
I also want to guess that many of the people in that thread are from an older generation of developers; it might be worth considering the tradeoffs of the language improvements that have attracted more people to writing software, compared to the authors in that thread saying C++ is all they need.
What you can't do is expect the rest of the technology world to standstill or forever maintain backwards compatibility.
His reply post goes a bit more into his story of trying to update the app: https://lists.apple.com/archives/cocoa-dev/2019/Oct/msg00027... . While I get that the author has gone through some rough experiences, I wonder whether he could have sought outside investment/advising to get a solid rewrite done and grow the business.
Yes, Microsoft seems very keen on keeping Windows compatible even with ancient versions of the OS. New stuff usually is optional and APIs that behaved strangely in Windows 95 still behave the same way in Windows 10.
In Apple land, APIs may change their behavior whenever Apple deems it necessary. I ran into issues because of this with almost every macOS update since 10.8. And I see that even big players like Adobe keep running into compatibility issues all the time.
On the other hand, I'm spending just a few hours per week working on my project and I manage to support an app that now runs on 10.5 through 10.14 and on three different CPU architectures with a single package. So no, I don't think you need to "throw 100 programmers at it" to get a working macOS version.
In hindsight, it would have been easier to just use Qt, GTK or wxWidgets. But I learned a lot by doing this myself and wouldn't want to miss that experience.
That person may not be doing good unit testing, they might not use the best tools, and find that supporting different configurations may require a lot more manual work rather than maintaining some carefully crafted #ifdefs.
And maybe one person could do it, but one salary they can't justify based on the demand may as well be 100 programmers.
Which is why I'm just going to use an engine-only approach from now on. I can, fortunately, eschew native UI's .. since I work on creative tools and my users prefer to have the same pixel-equivalent interface on each platform rather than shifting paradigms.
I think that game engines are the future for all app development. There's not much I can't do in Unreal Engine, for example .. with the benefit that the same app truly runs everywhere.
If Apple want to continue to subvert developer minds to keep them on the platform, fine by me. The engines see this as damage and easily allow a lot of us to route around the problem.
But Google did exactly this with Flutter, i.e. using a game engine, and I found the experience to be significantly better than native development. Not only are you guaranteed the same behaviour across platforms, but you end up with a smoother, more polished app, faster.
Definitely not suitable for every use case but for many it was impressive.
Anyway, I have no problems running multiple instances of a small and light UE-based app. Most I've had running on one machine is 5 .. but I'm not seeing the limitation you're indicating.
The UIKit analog on macOS is AppKit.
Also, GC/ARC aren’t features of Cocoa, but of Objective-C itself.
I would expect that a second round of problems will come up because Swift is going to get access to APIs that Objective-C will not.