Interesting; I was wondering if they had added some magic feature to Clang to enable Objective-C in pure C++ mode, but no, it really is C++. It just directly calls the Objective-C runtime functions like sel_registerName, objc_lookUpClass, objc_msgSend, etc. The expensive lookups like sel_registerName are performed in initializers of global variables, whereas for 'real' Objective-C code the dynamic linker does essentially the same job itself. I believe this has the disadvantage that unused selectors can't be fully optimized out, as C++ doesn't allow removing global initializers even if the global in question is unused. Therefore, any program that includes this library will, at initialization time, look up all ~1100 selectors defined in the library, regardless of which ones the program actually uses. But besides that, this approach may have equal performance to native Objective-C.
Also - throwing a zip file over the fence, really? Can't you put it on GitHub?
Along similar lines, from my skim of the README, it also seems like deallocated things are not actually released until the return from the nearest autorelease block, which is not exactly convenient for Rust (and it's not properly safe either without at least an extra retain). I always imagined that when you call through Swift it's a more native memory management interface, but perhaps it's also the same Obj-C layer under the hood.
The C++ wrappers break ARC, so if you use this library you'll have to manually retain/release every object. The autorelease pool helps if you autorelease an object, but you'll still need to manage memory by hand when using this set of helpers.
In obj-c or swift, ARC would handle that for you, so this makes the memory management aspect of using Metal a bit more of a headache.
Right. Would you happen to know if there's a way to make a retained object actually deallocate on last "release," other than putting an autorelease pool around it? Would you also happen to know if the Swift bindings have this same wonky autorelease behavior, or more straightforward Swift-style refcounting?
From a quick grep, it looks like this wrapper does not automatically call autorelease, so autorelease happens only when a method implementation autoreleases its own return value (or if you call autorelease yourself).
Objective-C has a standard rule for when a method is supposed to autorelease its return value: to quote the metal-cpp readme, it's when "you create an object with a method that does not begin with `alloc`, `new`, `copy`, or `mutableCopy`". In many cases, these are convenience methods that also have equivalent "initWith" methods, e.g.
But I'm not too familiar with Metal, and… it looks like Metal doesn't have very many of these methods in the first place. Instead it has a lot of 'new' methods, which shouldn't use autorelease.
When a method does call autorelease, Swift doesn't have any way of getting around it, though if there are autoreleasing and alloc/init variants of the same method, it will prefer alloc/init. Other than that, Swift likes to use the function objc_retainAutoreleasedReturnValue [1], which may prevent the object from going on the autorelease pool (with the 'optimized return' magic), but only as a non-guaranteed optimization.
Thanks, this is helpful. The specific method that's causing me trouble right at the moment is "computeCommandEncoder"[1], which is a method off MTLCommandBuffer, and I think not in the "new" class. In the Rust bindings[2], this is just a msg_send! and from what I can tell is getting an autoreleased reference, not a retained one.
It looks like objc_retainAutoreleasedReturnValue might be exactly what I'm looking for, even if it isn't 100% guaranteed; if I'm understanding, it would be safe, and wouldn't actually leak as long as you had an autorelease pool somewhere in the chain. However, I'm not seeing a binding to it in the objc crate[3]. Sounds like maybe I should file an issue?
Also, given that such a method seems obviously useful, I wonder why it's not being called from these C++ bindings?
> The specific method that's causing me trouble right at the moment is "computeCommandEncoder"
Yeah, it looks like there's no way to avoid autorelease here.
> It looks like objc_retainAutoreleasedReturnValue might be exactly what I'm looking for, even if it isn't 100% guaranteed; if I'm understanding, it would be safe, and wouldn't actually leak as long as you had an autorelease pool somewhere in the chain.
Indeed it would be safe and wouldn't leak, but the optimization is very much not guaranteed. It's based on the autorelease implementation manually reading the instruction at its return address to see if the caller is about to call objc_retainAutoreleasedReturnValue. See the description here:
In fact – I did not know this before just now – on every arch other than x86-64 it requires a magic assembly sequence to be placed between the call to an autoreleasing method and the call to objc_retainAutoreleasedReturnValue.
It looks like swiftc implements this by just emitting LLVM inline asm blocks:
This is optimistically assuming that LLVM won't emit any instructions between the call instruction and the magic asm, which is not guaranteed, especially if compiler optimizations are off. But if it does emit extra instructions, then you just don't get the autorelease optimization: the object is added to the autorelease pool, and objc_retainAutoreleasedReturnValue simply calls objc_retain.
(…Though, on second look, it seems that swiftc and clang sometimes use a different, more robust approach to emitting the same magic instruction… but only sometimes.)
Regardless, enough stars have to align for the optimization to work that you shouldn't rely on it to avoid a (temporary) memory leak; you should only treat it as an optional micro-optimization.
That said, the C++ bindings could have implemented the same scheme using inline assembly. And so could the Rust crate (edit: well, I guess inline asm is not stable in Rust yet). It's not like the magic instructions are ABI-unstable or anything, given that clang and swiftc happily stick them in when compiling any old Objective-C or Swift code. But I'm guessing the authors of the C++ bindings either didn't want to bother with inline assembly, or considered it an unnecessary micro-optimization. Or perhaps didn't even know about it. /shrug/
If the object is not autoreleased, then a release call will deallocate it; otherwise it will sit in the nearest autorelease pool on the current stack and be deallocated when the pool is drained.
Swift and obj-c have the same ARC semantics, so I'm not sure what you mean by swift-style refcounting. It should be identical to the obj-c ARC semantics.
By "swift-style refcounting" I mean the object is reliably deallocated exactly when the last release is called. Following up on the response to comex, I believe I would get those semantics if I called objc_retainAutoreleasedReturnValue on the allocating method's return value, and it actually worked.
In general, a retained object is deallocated on last release. However ownership of some objects somewhere may have been given to an autoreleasepool, in which case “the last release” for those objects will come from the pool. To what extent this happens is implementation-defined.
Swift and ObjC implementations have levers which discourage objects being sent to the pool in common cases. It is possible to pull them from other languages but not easy.
Should be pretty trivial to create RAII Obj-C variants of shared_ptr and unique_ptr that automatically call retain and release as the reference is acquired, and when it leaves scope, respectively.
It doesn't break ARC, right? It just won't do the automatic reference counting in C++ source. You can send it back and forth over the wall with CFBridgingRetain and CFBridgingRelease, no?
Having an autorelease pool is pretty standard practice, and ARC works the same way in ObjC (and I assume Swift) apps - stuff that was autoreleased accumulates in the pool until it is drained, which tends to happen every turn of the run loop.
> It doesn't break ARC, right? It just won't do the automatic reference counting in C++ source. You can send it back and forth over the wall with CFBridgingRetain and CFBridgingRelease, no?
I guess I should say that clang's ARC doesn't apply to these c++ wrappers.
The C++ code uses the Objective-C runtime APIs to load symbols and dispatch the Metal calls. So you could pass these references to other Objective-C code and it should have the correct refcounts afterwards.
Having to manually retain/release objects is a bit of a pain, but it's workable.
Using a C++ RAII type to retain/release is also somewhat doable, but I worked in a codebase that had that kind of code and it can be frustrating to keep the refcounts correct and synchronized. Although that was back with C++11, so I'm sure things have changed that would make this easier today.
Huh, I'm surprised. I had just assumed without looking that this library provided wrapper classes where the copy constructor calls retain and the destructor calls release. Apparently not.
Yeah, kind of weird there's no smart ARC class, although I suppose you need a way to override it when you need a weak reference. I guess it would be trivial to add one?
Somewhat unrelated, but is Rust `Arc` able to match the performance of Swift reference counting? I know they do a bunch of fancy tricks to embed the reference count into the pointer high bits, it seems like Rust would probably struggle to achieve that.
Interesting. Rust does the same thing with Obj-C selectors, except each selector is looked up lazily at the first call. I wonder why they didn't do that for these bindings.
B. Microsoft does their own proprietary API with DirectX.
C. Metal is a high-level language, Vulkan is low-level. Apple wants you to use a high-level language so they have more flexibility in optimizing the low-level for you now and in the future, which Vulkan closes off more.
D. Metal runs on iOS, iPadOS, and tvOS as well, not just macOS. It's a graphics API for all Apple platforms.
E. Because of D, talent for Metal is much more likely to be found with mobile game developers than AAA studios.
F. Game engines actually support Metal - Unity, Unreal, LWJGL, the most common engines run on Metal fine. Don't use Metal as a scapegoat.
> C. Metal is a high-level language, Vulkan is low-level. Apple wants you to use a high-level language so they have more flexibility in optimizing the low-level for you now and in the future, which Vulkan closes off more.
The problem is that historically, this is what's led to tons of issues in games. Drivers make their own "optimizations", resulting in terrible performance for some task on one GPU and great performance on another. Or instead of performance differences, it's outright bugs.
Then, combine that with GPUs being fairly expensive, and fast moving when it comes to new features. Most game studios aren't going to buy 20 different GPUs and test their entire game on all of them. So games often ship with terrible performance or bugs on various different GPUs. And it's often not the game's fault: are games really supposed to detect your exact GPU model, and change code paths to work around a driver bug? That's crazy.
To combat this, GPU drivers often get updates after a game launch that fix the issues with performance/bugs for specific games at the driver level, when it detects that those games are running. So now drivers become a pile of game-specific hacks, making the problem even worse.
This is the reason Vulkan is so low level. The goal is to make sure drivers don't get horribly bloated, and games aren't working around driver-specific bugs (they still happen, just hopefully less). And when games want to use specific GPU features for extra performance, instead of drivers trying to magically improve normal operations, they can make a vendor-specific extension providing the new functionality that games can opt into when the feature is detected (e.g., raytracing). That's part of why Vulkan has so many more extensions than OpenGL does.
Vulkan may be low level, but it's intended to be so. By foregoing Vulkan support, Apple rejected any attempt at making a higher level API on top of Vulkan that would work on every system. Instead, WebGPU has to try and abstract over Vulkan/Metal/DirectX12 separately, which mostly works, but leads to a whole lot of clunk and longer development time than if Vulkan was the standard everywhere.
That's not to say macOS should never have made Metal. But by not adding Vulkan support in addition later on, they're really hurting cross-platform compatibility. Sure, now Apple can optimize their platform to the best of their ability (theoretically). But that comes at a huge overall cost to the ecosystem, for instance with games having poor support for macOS.
> Then, combine that with GPUs being fairly expensive, and fast moving when it comes to new features. Most game studios aren't going to buy 20 different GPUs and test their entire game on all of them.
This is mostly irrelevant for MacOS/iOS/iPadOS/tvOS. The number of people plugging an external GPU into a Mac is minuscule.
> for instance with games having poor support for macOS.
Games don't have poor support on MacOS because of lack of Vulkan. They have poor support because pretty much no Mac (except maybe the newest M1 macs?) have gamer level GPUs and no market for high end games. Gamers basically know if you want to game, get a PC, and game devs know not to waste money on Mac ports.
> Games don't have poor support on MacOS because of lack of Vulkan. They have poor support because pretty much no Mac (except maybe the newest M1 macs?) have gamer level GPUs and no market for high end games.
It is basically this:
Gamers don't buy Macs (at least not as their only system, or their primary one for gaming) because there are few games.
Few gamers buying Macs exclusively means there is little unique Mac demand for games (that is, demand that can't be tapped on another platform, because the potential buyer has it and will buy games on it).
Little unique Mac demand for games means few games, and thus the cycle is closed.
Breaking out of that is hard, and Apple would probably need to be the one to expend the effort, and Apple doesn't seem to want to do that.
Breaking out of that might be significantly easier if Apple entered the console space, which is something I sort of assumed they would eventually do. I'm not particularly clued in, though... does anyone have a good reason why they haven't?
Because game developers despise Apple. The only major dev who gave a damn about Mac support was Blizzard, and even then they didn't really care about porting anything other than World of Warcraft to Apple Silicon (or even Mac altogether). Valve tried extending an olive branch a few years ago with Steam Proton support on Mac, but Apple batted it away, and Valve responded by pulling SteamVR support for Mac. The rest of the AAA gaming space absolutely will not care until Vulkan is natively supported on Apple hardware, because there's no way they're going to put their games on it otherwise. Without Vulkan, you're not just missing out on Vulkan titles (of which there are many), but also DirectX ones that would be playable at near-native framerates through translation tech like DXVK. The fact that Apple doesn't have a robust, high-performance graphics API is completely gimping their chances at getting game developers to care, which is why completely broken platforms like Linux and BSD somehow have better game support on hardware that's a fraction of the price.
Oh, and there's the fact that the console space is a low-margin industry where their competitors are taking a loss on their product. The PS5 and both new Xboxes are being sold below cost just to move units, with the expectation that users will pay for digital purchases/subscriptions to compensate. The Nintendo Switch is barely staying underneath $300, to the point that it normally sells at a loss during the holidays just because of shipping prices. Apple is known to drive massive hardware margins, so making a product that's actually worth its price isn't all that attractive to them. They'd rather have their cake and eat it too, which has worked pretty well for the insane 30% cut they take of nearly every microtransaction they see. The less attention they draw to that fact, the better.
Apple do have a robust high-performance API, Metal. Arguably it has the best ergonomics of any of the low-level graphics APIs. All relevant game engines already have Metal support, too. There's even an equivalent to DXVK in MoltenVK. It's not that different from Microsoft's position.
It's possible Apple might make an M1 Apple TV with a decent GPU. Like the Macbook Air, it's likely it'd be cheaper to manufacture than competitors with similar performance. I suppose we'll see.
> Apple do have a robust high-performance API, Metal.
Metal is neither robust nor, arguably, high-performance when compared to DirectX or Vulkan.
> Arguably it has the best ergonomics of any of the low-level graphics APIs.
It's not a low-level API. By Apple's own admission, Metal is hopelessly high-level and can only achieve the ergonomics it has by abstracting away all the features that gamedevs care about and putting the keys to the kingdom in Apple's hands.
> All relevant game engines already have Metal support, too.
I like how you qualify this with "relevant", as if I'd come up with a list of all the engines without Metal support and you'd wave them away as irrelevant since they don't support Metal. Plus, Metal isn't just a compilation target. A "port" to Apple Silicon doesn't entail checking off a new box on the compiler, it's a long and arduous process of porting shaders, bytecode, 32-bit libraries, x86-based libraries, API calls and more. Since Apple provides game developers no incentive to do that work, they continue to refuse porting to their system.
> There's even an equivalent to DXVK in MoltenVK. It's not that different from Microsoft's position.
Yeah, except the main difference is that DXVK gets near-native performance, while MoltenVK is far too slow to be considered for regular use. Do you honestly think a Proton-like compatibility layer could be built on a technology that needs to translate Vulkan code into a higher-level, slower API? Keep dreaming.
> It's possible Apple might make an M1 Apple TV with a decent GPU. Like the Macbook Air, it's likely it'd be cheaper to manufacture than competitors with similar performance. I suppose we'll see.
Comedy-freaking-gold right here. Let's say that you're right, and Apple does release an Apple TV with an M1 processor in it. I'm going to give you an insane benefit of the doubt, and we'll assume that they cut down the price to $500 (using the Mac Mini as a starting point). At peak performance, the M1 can churn through a measly 2.4 teraflops. There are last-gen consoles that were faster than that, not even to compare them to the 11 teraflops that the Xbox Series X and Playstation 5 pump out. Okay, so let's put a faster GPU in there: the M1 has 8 GPU cores, so getting it to be as embarrassingly parallel as its peers would require multiplying its core count by a factor of 4 or 5. That would require a whopping 32 GPU cores, the same amount that the $2500+ Macbook Pros carry. If they did make a $500 console of that ilk, they'd be losing money on every unit shipped for the GPU alone.
I genuinely have no idea where you're getting any of this stuff. Maybe read up on game development before responding?
You exaggerate Metal's level of abstraction. It's still far lower level than OpenGL or D3D9 and allows most modern optimisations. It's supported by all popular game engines, same as with D3D12 and Vulkan. Of course there are plenty of niche ones that don't, but that's also the case for Vulkan.
In practice, games (usually through engines) overwhelmingly target the native graphics APIs. D3D12 on Windows and Xbox, GNM on PS5, Metal on iOS/macOS/tvOS, NVN on Switch, etc. In practice it doesn't matter much.
MoltenVK is not as good as DXVK right now, sure. That still has nothing to do with either Apple or Microsoft. They both offer exactly as much support.
The chips in MacBook Pros aren't the bulk of the cost. I think it's unlikely Apple would make a console, but still possible. They have all of the necessary ingredients already vertically integrated.
1. Apple requires hardware margins. I'm not even aware of a single hardware category they've sold at low or zero margin.
2. Apple constantly pushes changes to their OS and APIs, breaking backward compatibility. That's part of the reason many iOS games don't work after a few iOS updates, compared to consoles, where games are expected to work for the life span of the console.
3. Most importantly, there is zero gamer DNA at Apple. They don't understand gaming.
> Games don't have poor support on MacOS because of lack of Vulkan.
Proton/Wine/DXVK. Wine works on Mac but DXVK barely does through MoltenVK. The lack of proper Vulkan support is directly why games have poor support on MacOS. I'm almost positive if Mac OS had Vulkan support Valve would be supporting Proton on Mac OS.
MacOS had great game support for a while, but axing 32-bit libraries and refusing to adopt Vulkan led big players like Valve to drop support for MacOS in many of their programs. For a short while with Mojave, you could reliably play 80% of the PC games on Steam though.
I had many 32-bit games on macOS but they were all axed, including big titles like Diablo 2 (not the resurrected one), Warcraft 3, and a lot of indie games. This made me realize why no sensible game company would develop for Apple platforms, unless the games generate revenue continuously (which are mostly pay-to-win games).
A) Many games don't use these engines. The trend has actually been going in the direction of fewer custom engines, but a significant number of games still use their own.
B) Games might not have to deal with macOS if they're using an engine, but the engines themselves still do. This limits what engines can do to the least common denominator. And we all know Apple isn't the fastest when it comes to new standards or features. In the graphics world, that comes in the form of DirectX12 vendor-specific extensions, with Vulkan vendor-specific extensions a few months later, and then Metal support whenever Apple feels like copying it.
Someone else in this thread linked an example about Metal lacking support for a type of atomic primitive that prevented the decoupled look-back algorithm from running on Metal. That means the author couldn't use WebGPU to implement the algorithm and had to write separate Vulkan and Metal implementations (if I recall all the details correctly; you get the general idea). And it _wasn't_ clear that they couldn't use WebGPU to begin with. The WebGPU spec had to actually be clarified based on the author's investigation. Engines and other APIs wouldn't have to deal with this if Apple just supported Vulkan.
But then we're back to the various pros and cons of Metal and Vulkan. A counterpoint would be that yes, it lacks the atomic primitive, but Vulkan's low-level design means that games would be less optimized for future devices if their developers stopped supporting them.
An example of this would be how Metal runs on AMD GPUs, Intel iGPUs, and now Apple GPUs. Would a Vulkan-based implementation that was originally written for AMD GPUs be as optimized for Apple GPUs as a Metal implementation would have been? What about 5 years from now?
My point is that there are minute details that are benefits and drawbacks of both, and decoupled look-back would be one of them. I wouldn't be surprised if there are algorithms Metal can do but Vulkan can't.
My point isn't to talk about the various tradeoffs between Metal and Vulkan. Ignore that Metal exists for now. My point is that by not supporting Vulkan, Apple isn't supporting the lowest common-denominator API that everyone else supports. So that leads to people having to make a Vulkan implementation, and then also a Metal implementation if they want to support macOS. Regardless of the benefits and drawbacks of each, there isn't a way to support every platform with one API. And Apple causes a lot of strife because of that.
I don't have a problem with Windows having DirectX12 in addition to Vulkan. Or even that vendors typically add new features to DirectX12 first, with Vulkan getting them later. This is because you can still write a Vulkan program and have it run on Windows. Apple only supporting Metal breaks "write once, run on any platform". It's one thing for Apple to have Metal as a better-supported/approved API that they can make the best apps with. It's another to lack Vulkan support altogether. Because Apple aren't just trying to get the highest Metal adoption. They could easily do that with App Store restrictions or something, or just accept that UIKit using Metal is good enough adoption. Lack of Vulkan has knock-on effects on every other platform, because now engines have to do a ton more work, and likely have more bugs.
This isn't actually accurate because, if you are a game engine, you can't just have a Vulkan-only implementation. There's plenty of old computers that don't support Vulkan. PlayStation doesn't support Vulkan - better use GNM, GNMX, and PSSL. Xbox only uses DirectX, which has both high and low-level variants, no Vulkan there. iOS is Metal only again. You could use Vulkan on the Nintendo Switch, but NVN is faster and the preferred method.
You might get away with just Vulkan if you were targeting only Windows, Linux, and Android on hardware that isn't more than roughly 5-6 years old. But Linux is a small percentage and they've got Proton, so as long as you don't want to port to Android and not iOS for some reason, why not use DirectX?
"lowest common-denominator API that everyone else supports" - Outside of the PC, Vulkan isn't really that broadly-supported. Yes, you've got Android and Nintendo Switch, but those are only individual items in their respective categories and don't make sense to support alone without the rest of their categories (smartphones and consoles).
> My point is that by not supporting Vulkan, Apple isn't supporting the lowest common-denominator API that everyone else supports
> I don't have a problem with Windows having DirectX12 in addition to Vulkan
Did I miss that Microsoft is supporting Vulkan? I googled, but couldn't find that. Vulkan and Microsoft both seem to support HLSL, but that's it, as far as I can tell. What am I missing?
Many game developers don't use any of these engines, and even when using an engine, the underlying graphics API can certainly matter. There's nothing saying that Unreal, for example, has to support all the same rendering features in Vk and Dx and Metal. And even the same feature can look subtly different between different interfaces.
I'm less mad about Metal existing, and more mad about Apple deprecating/abandoning OpenGL. There was no good reason for that, especially for a company with their resources.
OpenGL is a little weird. Windows, for example, does not actually support OpenGL IIRC. It's just that Intel, AMD, and NVIDIA started including OpenGL drivers in their graphics drivers even though Microsoft still does not lift a finger to support OpenGL.
[EDIT: Actually, they do, with the OpenGL compatibility pack at https://www.microsoft.com/en-us/p/opencl-and-opengl-compatib.... It's new for Windows 10, and it only supports up to OpenGL 3.3, suspiciously around the same compatibility level as macOS...]
MacOS, well, Apple got into a huge spat with NVIDIA in ~2012. Around the same time, actually, that Apple stopped updating OpenGL (though they wouldn't formally deprecate OpenGL until 2018). Part of me wonders if NVIDIA was writing the OpenGL for Apple, but then OpenGL just got put on the shelf after the spat.
The result though is that neither Windows, nor Apple, supports OpenCL past 1.2 or OpenGL past 3.2/3.3. It's the GPU vendors that have been taking us further.
[EDIT 2: Correction, macOS supports OpenGL 4.1, Windows tops out at 3.3 without GPU drivers if you have DirectX 12]
This doesn't compare apples to apples, as the issue isn't really about operating systems but more about platforms and integration philosophy: on Windows, the people who make the graphics hardware are able to build drivers for their hardware that support OpenGL, Vulkan, etc.. On macOS, one of two things happens, depending on the device: either 1) Apple does most of the work to create the graphics hardware, and then refuses to support anything designed by Khronos; or 2) Apple buys most of the graphics hardware from a third party, but doesn't really offer a way for third-parties to develop kernel modules / drivers with the depth of access required to pull off a useful graphics driver. "Part of me wonders if NVIDIA was writing the OpenGL for Apple" <- FWIW, I went to college with two of the people who worked--at Apple--on their OpenGL stack: it wasn't third-party.
Fair enough, but I guess my point is more that, technically speaking, Windows and MacOS are actually identical in their OpenGL and OpenCL support level. The only place this really shows up though in Windows is software rendering - which lacks OpenGL because Windows lacks OpenGL whereas Windows can software-render DirectX AFAIK.
Apple didn't allow AMD to continue updating OpenGL in their MacOS drivers, though, possibly because (I'm just speculating) they decided to invent Metal to replace OpenGL around the same time as their NVIDIA spat, when they stopped updating OpenGL. Metal came out in 2014; the NVIDIA spat and OpenGL abandonment were in 2012. Just my suspicion that MacOS's OpenGL is NVIDIA-based.
MacOS ended on OpenGL 4.1, which was formally spec'd in 2010 but wasn't widely-implemented until 2011 and later (because there's implementation time after the spec is produced).
This means that around 2012, Apple would have been most likely working on OpenGL 4.2 (possibly with NVIDIA, and maybe even with NVIDIA's source code) but that's the year their relationship soured. However, depending on the source you read, there are reports their relationship was souring as early as 2009.
There's plenty of reasons for it. For one, OpenGL is arguably a crap API. For another, even with billions of dollars, resources are scarce. It's extremely hard to find graphics programmers who want to work on drivers vs. work on games etc...
I agree. It's the worst decision so far that they've done. OpenGL may not be perfect, but it worked well everywhere. And what about all the thousands of programs that use OpenGL? Even future programs may use OpenGL - it is not a legacy API.
It may be deprecated, but there’s no sign that support will be removed from macOS (since lots of apps and games still use it).
If they wanted to, the Apple Silicon transition would have been the time to remove it, but that didn’t happen. On Apple Silicon it’s enabled by an OpenGL->Metal translation layer.
I say this as an avid user of Apple hardware; I have no faith that they will really keep it around forever. They have shown very little willingness to maintain backwards compatibility with software for an extended period of time. Many games released for macOS less than a decade ago simply no longer function. Meanwhile, the Windows PC I built just a few years ago still runs software and games from 25 years ago.
This is why I mostly refuse to buy games for my Apple hardware. Until Apple makes some serious commitments (and shows evidence of following through with them), I will not view my MacBook as a viable gaming device. Right now, I'm not even convinced Rosetta 2 will last more than 6 or 7 years.
Specifically, I suspect one main reason they've kept it as long as they have is for WebGL support in Safari. Now they are investing in Angle[1], so going forward I expect that to be the primary path to WebGL compatibility.
It isn't just games, i bought an iMac back in 2009 which i used extensively for a couple of years and bought some software on it. I stopped using it as much in later years but i was still keeping it up to date with the latest OS versions, etc (until Apple stopped supporting the iMac itself). Pretty much every single macOS update had applications breaking. Some had updates, but not all programs did and some had weird things going on (e.g. i bought Pixelmator before they made a MAS version and since that was a paid update and i was fine with Pixelmator's functionality i just kept using that... until an OS update broke it and for some reason the tool palette doesn't show up at all).
And of course there were games too, e.g. NOVA which stopped working several versions ago - and isn't available anywhere else (PC or whatever) either, so basically there isn't a way to play it at all nowadays (it may not be a great game and basically was a Halo ripoff for early iPhones, but i do not see quality as a reason for it to disappear - besides, it might have been one of someone's first games, people like to have those around).
It’s not documented anywhere. OpenGL just works like it does on any other Mac, and you won’t notice it unless you find bugs in it (Wine has some oddball OpenGL checks that trigger a Metal assertion on Big Sur).
Right, but my question is: do those checks only assert on Big Sur on M1, or is the mapping later from OpenGL -> Metal just Big Sur/Monterey regardless of platform?
Just seems a bit odd to me they'd only do this on one architecture.
To be fair, I think just about everyone that has done any OpenGL programming absolutely hates it. It uses ancient paradigms that are confusing for neophytes and completely out of sync with modern programming practices. It's also essentially impossible to do multithreaded rendering.
> It's the worst decision they've made so far. OpenGL may not be perfect, but it worked well everywhere.
I agree with your sentiment. I like OpenGL quite a bit due to its ubiquity. From my perspective as a casual graphics dev, it's relatively easy to use OpenGL to get triangles on the screen, compared to Vulkan (although I know Vulkan has other benefits).
> And what about all the thousands of programs that use OpenGL?
Judging by past behavior - Apple isn't shy about breaking compatibility with tech it's no longer interested in.
> B. Microsoft does their own proprietary API with DirectX.
DX12 may be proprietary and ship only with Windows, but GPU vendors are able to provide OpenGL and Vulkan support through their drivers. As a result, DirectX is an option, not a requirement. Metal is a requirement for 3D acceleration on Apple hardware.
It's worth noting that all devices that you can buy with Windows on ARM - those running an ARM version of Windows 10/11 on a Qualcomm chip - don't have Vulkan drivers available. Just DirectX - and an OpenGL compatibility pack that runs OpenGL on top of Direct3D 12.
Those same GPUs have perfectly functional Vulkan drivers on Android. I don't know what the problem is.
Having Vulkan on Windows is more like a courtesy from Nvidia, AMD and Intel that Microsoft hasn't really blocked [1]. But it can't truly be relied on on Windows - other GPU vendors don't need to build it [2]. Presumably with that excuse, Microsoft has not allowed apps on the Windows Store that targeted a 3D API other than Direct3D - that stuff just would not work on Qualcomm devices and presumably other devices as well.
[1]: If they were committed to being assholes about it they could force change as part of the WHQL program, but it doesn't look like they are that committed. I'm not sure what they are committed to though.
[2]: I.e. what if we get Windows laptops with Mediatek chips, or Microsoft and/or Apple revive some Bootcamp mechanism for Windows on Apple Silicon?
Like you said - Microsoft is just completely passive regarding Vulkan on Windows. It's up to the GPU vendors and driver makers, as long as DirectX isn't threatened. And on Xbox, DirectX is your only option, so if you want a nice clean Windows to/from Xbox port, writing for DirectX just makes sense.
As for #2, Qualcomm (according to new reports) has an exclusivity deal for Windows on ARM that is expiring soon. But Microsoft would have to work with Apple and allow drivers to be written for the M1. Craig Federighi of Apple said they would be fine with Windows on M1 but Microsoft wasn't at the time.
Though, to be honest, even if the exclusivity deal expires, I don't see Microsoft being excited about Windows on the M1. It only serves to embarrass their own products.
> But Microsoft would have to work with Apple and allow drivers to be written for the M1.
Yeah to be fair I don't actually think that's going to happen. Someone needs to bear the burden of developing/maintaining a Windows graphics driver for Apple Silicon, which is a huge task. (Other drivers too, but that's not quite the same expense.) Apple will almost certainly not pay for that, and Microsoft would have to be extremely committed to having their OS be sideloaded on a competitor's devices to do it.
If either Apple or Microsoft does do it though, you can be pretty sure you won't be seeing any Vulkan drivers soon.
Precisely. Somehow on Windows I feel OpenGL has always been regarded as more of a developer/professional thing, and Vulkan lives in a similar space. Microsoft would shoot themselves in the foot explicitly disallowing Vulkan, but it seems they do want to have it limited somehow to more niche interests (developers, certain enthusiasts) and Direct3D as the 3D API everyone commits to.
> F. Game engines actually support Metal - Unity, Unreal, LWJGL, the most common engines run on Metal fine. Don't use Metal as a scapegoat.
With much pain. Have you seen the shader pipeline we have to deal with going from HLSL to Metal? And no, translating the entire body of shader work to MSL is not an option.
Could it make sense for Metal to be ported to other platforms?
Vulkan has never clicked for me. I can cut and paste code to get something running but I don't really understand it. When you say Metal is high level, that's very appealing to me.
> Metal is a high-level language, Vulkan is low-level.
Wikipedia says Metal is low-level. What do you think accounts for the different description?
I totally get where you're coming from, Metal is much easier to program than raw Vulkan. I think a Metal-like API over "modern" APIs could be a good thing. There are things like the Vulkan Memory Allocator that help, but among other things that makes integration of different modules harder; this is especially nice on Metal because you can just allocate your own stuff off MTLDevice.
That said, there would be challenges, and big ones, the more you tried to make it actually Metal compatible. You'd need a C++ compiler for the shaders, and certain idioms (pointers, scalar types other than 32 bits) would be difficult to translate portably.
And as adamnemecek points out, the role of a friendly cross-platform GPU layer is largely being occupied by WebGPU (wgpu in particular in Rust-land). That'll be very comfortable for people used to Metal, as there's been lots of input from Apple folks. However, WGSL is much lower level than Metal Shading Language, and there are lots of important capabilities (subgroups, etc) missing that Metal and other advanced (>DX11) APIs can do.
Probably differing definitions of "low level," levels being relative to what you are talking about. In terms of making a game it's "low level" but it's still a much "higher level" than Vulkan.
They are both low-level APIs relative to OpenGL. The difference is mostly that Metal doesn't expose GPU vendor details.
Metal has sort of been ported to other platforms, WebGPU is almost a subset of it (albeit somewhat higher level). There's even wgpu to use it outside of a browser.
Inarguably though Apple not supporting Vulkan is a huge stumbling block for games on macOS -- even if the common engines support it, a lot of game studios who aren't using something off the shelf aren't likely to support a Mac-only target.
Apple has had many years after Vulkan came out to support it alongside Metal or phase out Metal with MoltenVK, but we all know that’s not going to happen.
If Apple supported Vulkan it would pretty quickly be at parity with Linux for gaming. Proton/Wine/DXVK would actually make a new Macbook Pro a viable option for gaming.
Metal is not just about gaming. The entire graphics stack on all their platforms is drawn with Metal. Had they decided to switch from Metal to Vulkan when it was released, they would have had to rewrite not just the low-level Metal drawing but port everything (CoreAnimation etc) to it, wasting all the work they'd been putting into Metal.
With Metal they've got an API whose design goes exactly where they want it to go and is developed hand in hand with their GPUs and NPUs. Since it's the base of the graphics stack, every person-hour spent improving Metal improves non-game apps as well as games. It's the same calculus as Microsoft investing in DirectX.
Because there's no reason not to, quite literally every other GPU vendor has supported Vulkan alongside DirectX. Metal has clear and obvious shortcomings, so Apple continuing to expect developers to "just use it" won't ever get them the support that they desperately need, especially now that they're on Apple Silicon. Porting things to ARM is hard, but doable. Creating a second version of your software just to run on Apple devices, that needs to be bug-fixed and maintained separately is a suicide mission.
So yes, I am suggesting that they support two different graphics stacks. Apple has the most liquid cash out of any company in the world right now, they've got the resources to do it, especially as opposed to the developers who'd need to maintain two different versions of their software to compensate.
You want Apple to support two graphics stacks...so maybe developers with Vulkan-based graphics engines will port games to their platforms. Since obviously it's only the graphics API that makes porting a challenge between platforms.
Even with Mac and Linux support in popular engines like Unity and Unreal, Mac and Linux as platforms get far less interest from most developers than Windows. That's with essentially push-button deployment to alternate platforms. For anyone writing engines where Vulkan vs Direct3D is a material issue, support for anything but Windows is minuscule. Proton brings aftermarket support to Linux but little in the way of first-party support. Crossover has existed for years and received little interest from developers.
Even if Vulkan was fully supported on Apple's platforms there's still all the other hurdles to porting games not the least of which is a totally different CPU architecture. When the reality is developers are uninterested in first party support for non-Windows platforms even on identical hardware with an alternate OS (Linux) I don't see anything to suggest they're going to support a different OS on a different CPU architecture.
So you're really suggesting Apple support Vulkan, for no benefit to themselves, for games that will likely never appear. They should do this when they already have millions (or somewhere in that realm) of games on their platform happily running on Metal.
Linux doesn't need first-party support to run the vast majority of Windows games, and neither does MacOS. I won't deny that running x86 code on ARM will likely never be performant enough to enjoy gaming on, but that's Apple's cross to bear.
> so Apple continuing to expect developers to "just use it" won't ever get them the support that they desperately need
And in the same breath you go ahead and say this:
> Apple has the most liquid cash out of any company in the world right now
Something tells me, Apple really doesn't "desperately need" developer support via Vulkan.
Let's see, Metal is supported by MacOS, iOS, Apple TV. This is probably more devices than Vulkan supports.
> Creating a second version of your software just to run on Apple devices, that needs to be bug-fixed and maintained separately is a suicide mission.
You need to do that regardless of Vulkan. Vulkan is not some magical thing that will make your code run on every system in existence. As the linked comment correctly says, Vulkan is virtually non-existent outside of PCs.
---
Edit: It's very much Direct X vs OpenGL discussion all over again.
Microsoft stuck to Direct X and eventually provided the API all developers target. OpenGL is basically relegated to the annals of history with very bad memories, as a bad designed-by-committee API that was two steps behind and several steps sideways. See this amazing post down the memory lane: https://softwareengineering.stackexchange.com/questions/6054...
Vulkan will suffer the same fate: designed by committee, consoles and smartphones don't care about it, main API on Windows is Direct X, main API on Apple systems is Metal.
> Let's see, Metal is supported by MacOS, iOS, Apple TV. This is probably more devices than Vulkan supports.
What world do you live in? How hard is it to do a quick google search before declaring obvious falsehoods?
> You need to do that regardless of Vulkan. Vulkan is not some magical thing that will make your code run on every system in existence. As the linked comment correctly says, Vulkan is virtually non-existent outside of PCs.
How is it that you have time to write all this but not time to even bother reading the linked comment?
> Vulkan is virtually non-existent outside of PCs.
> How hard is it to do a quick google search before declaring obvious falsehoods?
Where did you find a falsehood? In my statement that Metal runs on iOS devices? Or in my jab (not a fact) that it's probably more than devices that support Vulkan?
> How is it that you have time to write all this but not time to even bother reading the linked comment?
Let me quote that comment for you and split it into byte-sized chunks
--- start quote ---
...you can't just have a Vulkan-only implementation.
There's plenty of old computers that don't support Vulkan.
PlayStation doesn't support Vulkan - better use GNM, GNMX, and PSSL.
Xbox only uses DirectX, which has both high and low-level variants, no Vulkan there.
iOS is Metal only again.
You could use Vulkan on the Nintendo Switch, but NVN is faster and the preferred method.
You might get away with just Vulkan if you were targeting only Windows, Linux, and Android on hardware that isn't more than roughly 5-6 years old.
Outside of the PC, Vulkan isn't really that broadly-supported.
--- end quote ---
Now. Who of us read the comment, and who of us read and understood the comment?
Edit:
So:
- About 3 billion consumer devices that don't have and will never have Vulkan support (Apple devices, Playstations, Xboxes)
- Another billion or so that run Windows, where the main graphics API is DirectX and Vulkan is provided not by the OS manufacturer but by third parties (GPU vendors). And a large chunk of those are also PCs that will never get Vulkan support (because the hardware doesn't support it, or the OS is out of date, etc.)
- A full third of Android devices don't support Vulkan, and never will
But yeah. Apple must support Vulkan because ... reasons?
/Shrug I would buy a new MacBook Pro if it supported Vulkan. I am not a game developer. I just want DXVK and Wine to run well for light gaming on my multithousand dollar laptop with high end gpu. Your argument is pretty much Apple shouldn't implement Vulkan because they haven't done it yet.
Yes. A billion devices don't support Vulkan, and never will. And that's just Android.
But yeah, sure tell me how you won't need to create a different version of software just because of Vulkan.
Meanwhile, Metal is supported on all Apple devices (starting with A7 chip), and given how "Apple has the most liquid cash out of any company in the world right now" (your words), they are not in a "desperate need" of developers supporting their systems.
Why would it happen if Metal is meant to support optimal performance on Apple hardware?
I say that bearing in mind we're talking about battery-powered hardware, which cannot spend any extra CPU cycles on anything, period, without being berated in the press for short battery life compared to some competitor.
Besides that, remember that if you use Vulkan, you will almost certainly end up with something that is very optimized for one platform but won't immediately run well on alternative platforms (see games with their frequent updates for new CPU and GPU architectures).
Apple runs Metal on AMD GPUs, Intel iGPUs, and their own in-house Apple GPUs. If you used Metal, you didn't really have to do much optimization to get close-to-the-best-possible performance on all three of these very-different architectures.
Vulkan has a subset of the features that Metal has.
Metal uses a device programming language that is C++ based, has pointers... and uses AIR (an LLVM IR dialect) as the bytecode.
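For a sense of what that looks like, here is a minimal MSL compute kernel sketch. This is illustrative only: it is Metal Shading Language, not plain C++, so it needs Apple's metal toolchain (which compiles it to AIR), and the kernel name and buffer layout here are made up.

```cpp
#include <metal_stdlib>
using namespace metal;

// Ordinary C++-style pointers and references, qualified with GPU
// address spaces ("device", "constant"); attributes bind arguments
// to buffer slots and thread IDs.
kernel void saxpy(device float*       y   [[buffer(0)]],
                  device const float* x   [[buffer(1)]],
                  constant float&     a   [[buffer(2)]],
                  uint                gid [[thread_position_in_grid]])
{
    y[gid] = a * x[gid] + y[gid];
}
```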
Vulkan would be a big downgrade in feature levels from that.
While it is very much possible (and done) to have a compliant OpenCL implementation over Metal or D3D12, such a thing isn't possible at all for Vulkan.
Metal is also a subset of Vulkan. As Raph Levien points out here [1], there are certain important algorithms (decoupled lookback) that can't be implemented at all on Metal.
> One controversial aspect of the original decoupled look-back algorithm is that it depends on a forward progress guarantee from the GPU
Apple GPUs aren't very amenable to implementing recursion or giving forward progress guarantees within a SIMD group by design. I'll have to check if a device-wide barrier type even exists on that TBDR...
It isn't like NVIDIA hardware (since Volta) where you have a separate instruction pointer for each (SIMT-but-not-quite) thread.
Neither is a subset, they're mostly overlapping sets. For instance Metal has features for tile-based rendering Vulkan doesn't, at least as of a couple years ago.
While this might work on your specific device, it is an undocumented interface and there is no guarantee at all for future compatibility.
If you do get a definitive answer from Apple you can share, please follow up here, as I would like to be able to cite it. I would be quite shocked if it's different than what I just said, though.
it's what is used for OpenCL on Metal (which is the impl present on M1) to provide the semantics there. AIR is a stable, forward compatible bytecode. Will ask and see what Apple says...
edit: thinking about this, OpenCL doesn't actually need those semantics either
The OpenCL 1.2 barrier() function is threadgroup scope, same as threadgroup_barrier on Metal. OpenCL 2.0 introduced a proper barrier function (work_group_barrier), which takes a memory scope parameter that can be memory_scope_device (all this is pretty similar to the Vulkan memory model, and at least some of the same people worked on both). I know of no way to reliably support those semantics on Metal.
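Concretely, the scope difference looks like this in OpenCL C. This is a sketch only: the kernel names and arguments are made up, and it needs an OpenCL runtime rather than a plain C compiler.

```c
/* OpenCL 1.2: barrier() synchronizes only the current work-group,
   the same scope as Metal's threadgroup_barrier(). */
__kernel void wg_only(__global int *data, __local int *tmp) {
    /* ...stage partial results in local memory... */
    barrier(CLK_LOCAL_MEM_FENCE);   /* work-group scope only */
    /* ...consume other work-items' results... */
}

/* OpenCL 2.0: work_group_barrier() takes an explicit memory scope.
   memory_scope_device orders the writes device-wide, which is the
   part with no reliable Metal equivalent. */
__kernel void dev_scope(__global int *data) {
    /* ... */
    work_group_barrier(CLK_GLOBAL_MEM_FENCE, memory_scope_device);
    /* ... */
}
```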
Claiming OpenCL bitcode is "forward compatible" is a pretty strong claim considering that OpenCL has been deprecated for over 3 years, and the main thing you get when searching OpenCL docs on the Apple site is an exhortation to migrate to Metal. To the extent there's a forward compatibility guarantee for AIR, I'm sure it only applies to output generated by official Apple tools, and I'm pretty sure by now there's no way to get those to output a device-scope barrier.
"Subset" is not the right word here. Vulkan has pointers now, but (as was discussed in a recent thread) there are serious limitations compared with "real" C++. At the same time, Metal has its own limitations, not least of which is that it lacks acquire/release semantics on atomics and a device-scoped barrier.
OpenCL is a little strange because older variants don't have advanced atomics (or subgroups), but do have pointers. I'd be curious to know what specific thing is available on DX12 and Metal but missing in Vulkan, especially because I'm not aware of any DX12 feature on the critical path for OpenCL that's missing from Vulkan (at least as an extension).
For OpenCL on DX12, the test suite doesn't pass yet. Every Khronos OpenCL 1.2 CTS test passes on at least one hardware driver, but there's none that pass them all. That is why CLon12 isn't submitted to Khronos's compliant products list yet.
>F. Game engines actually support Metal - Unity, Unreal, LWJGL, the most common engines run on Metal fine. Don't use Metal as a scapegoat.
Unity has a Metal target, but it's not like you can just ignore it exists after you enable it. I'd much rather write to a single API than many targets, even if Unity makes it possible to do so.
You cannot ‘just’ target Vulkan and then assume it works everywhere Vulkan is supported. There are differences between Vulkan implementations, and like OpenGL it has a lot of optional parts. Some Vulkan code can work on an AMD card but throw validation errors on an NVIDIA card.
And vulkan does not support some methods that e.g. metal has natively (and vice versa of course)
They don't control the API - not what is in it, the direction it is going, nor when it is updated. That means they can't integrate it tightly with their OSs. Not from a tech standpoint and not from a release standpoint.
Supporting it in MacOS would mean just having a team working for Khronos, implementing Vulkan on top of MacOS's actual graphics API, Metal, and releasing when they could.
It would seem much better if some other group -- one with an actual stake or interest in it -- would take this on. Not sure who that would be because there doesn't seem to be a good enough reason for any group to do it.
Most game engines probably want to go directly on top of Metal -- they already provide an abstraction layer and probably don't want to have an intermediate one.
GPU makers, even at the best of times, have never been interested, and, of course, they can look forward to zero Mac business in the near future, so that isn't going to change.
They would because it would mean more software support.
Proper support for Vulkan would not be an abstraction layer. Metal is not the language of the hardware; GPUs operate on assembly-like instructions. These are not publicly documented or AFAIK exposed on Mac OS, so only Apple would be capable of creating a proper Vulkan driver.
> They don't control the API - not what is in it, the direction it is going, nor when it is updated. That means they can't integrate it tightly with their OSs. Not from a tech standpoint and not from a release standpoint.
What if I told you I can transfer a file between my Mac and PC using software built into Mac OS from a standard controlled by Microsoft.
> Supporting it in MacOS would mean just having a team working for Khronos, implementing Vulkan on top of MacOS's actual graphics API, Metal, and releasing when they could.
This is like saying that because Mac and Safari support HTML5, it's like the dev team is working for W3C. Khronos Group is a standards organization, not generally the implementer.
> Not sure who that would be because there doesn't seem to be a good enough reason for any group to do it.
Plenty of "native" Mac games can't be bothered to utilize Metal directly and just use this instead. This generally results in less than optimal performance or efficiency.
> GPU makers, even at the best of times, have never been interested, and, of course, they can look forward to zero Mac business in the near future, so that isn't going to change.
GPU makers are plenty interested because it sells GPUs and it is exactly how its done on Windows. Back when Nvidia worked on Mac OS they implemented CUDA and OpenGL on Mac.
> They would because it would mean more software support.
Evidently, Apple doesn't think it's worth it. Who else would be in a better position to judge?
> Proper support for Vulkan would not be an abstraction layer. Metal is not the language of the hardware, GPUs operate on assembly like instructions
Sure, but the point of Metal for Apple is to have an abstraction layer so they are free to change things underneath it as needed. A first-class Vulkan implementation would be a largely redundant, and sometimes conflicting abstraction (and a conflict Apple doesn't control). That's a lot of additional complexity.
> What if I told you I can transfer a file between my Mac and PC using software built into Mac OS from a standard controlled by Microsoft.
I guess you mean SMB? That has not exactly been a straight and smooth road on MacOS. I would guess Apple's experience supporting it helped them realize the value of controlling the foundational APIs of their OS. But even more so with web APIs.
> GPU makers are plenty interested because it sells GPUs
My point is the GPU makers have not yet been inspired to implement Vulkan for MacOS, and now they face a near future where MacOS will not be a market for them at all due to Apple silicon.
How is any of this an excuse for them not to also offer support for Vulkan? If Metal is so great and offers that many advantages, then developers would choose it. Unless Metal can't stand on its own merits, I don't see a problem here.
WebGPU is essentially a mobile-phone graphics API, as it focuses support towards the lowest common denominator (both in hardware, and the underlying graphics APIs). Similarly to WebGL before, it will always be half a decade to decade behind desktop graphics. The API itself is great, but the artificially imposed limitations are really really restrictive.
WebGPU is a different animal from Vulkan/Metal. It's more like an updated OpenGL than a true low-level graphics API. It's also pretty far from being ready for prime-time if I understand correctly.
There is some sort of legal dispute between Apple and Khronos resulting in Apple not really wanting to touch Khronos IP. On a technical level, translating SPIR-V to MSL is a solved problem.
Everyone here seems focused on Metal as a graphics api for games. It’s also Apple’s answer to CUDA. Still a work in progress but each release they’re adding more (advanced) compute APIs. Metal is important for their AI plans. For that reason alone we will never see it be replaced by Vulkan.
I think the challenge here is that you need buy-in from industry pros to make this really worth it. The platform can be best-in-class from a hardware/API perspective, but if no tools exist for it, why am I going to give up my CUDA-based workflow to spend way more effort re-inventing the wheel on Mac?
I could imagine an open platform supplanting CUDA in the long term from a cost-optimization angle, especially on server workloads, but I just don't see serious work moving from one closed platform to a smaller closed platform.
I paid ~100 dollars to buy songs in Taiko no Tatsujin on iOS 4 via In-App Purchase. But around 2 years later they axed the game and I had no way to download it again. At first I could still recover it via iTunes backup, but once iOS 7 was published the backup just couldn't run anymore. It was just gone. I should still own those songs given the amount of money I paid :-/ It was a very fun game.
There's people who pay tens of thousands of dollars on Genshin Impact and the game will be removed from the store at some point too and the things they spent money on will be gone. You don't own things from the store :\