I could understand if they were deprecating it in favor of Vulkan. That would be in line with Apple's history of aggressively pushing forward new standards. But by no means do they have the clout to coerce developers into their own bespoke graphics API that doesn't work anywhere else. All they'll accomplish is further killing off the already-small presence they have in the gaming space.
In fact, Apple couldn't get its act together and keep up with current versions, and as a result titles like Elite Dangerous were being shut down on the Mac anyway. The reason: OpenGL stuck on an old version without compute shader support.
Embrace. Extend. Then extinguish.
Just Replace. Extinguish.
On PC: Maybe by number of games, but not by the number of players.
(I count Fortnite as an outlier because it's technically not built on a third-party engine)
There's LoL, Dota 2 and Overwatch using their custom engines with huge numbers of players but... What else? CS:GO?
The OP's point was that the companies that make these engines can afford to invest in supporting an additional back-end API, though. I think it's hard to argue that any of the companies that develop these engines would be unable to also add a Metal back-end. Many of them already work across a pretty wide range of back-ends anyway. Xbox One, PS4 and Switch all use completely different APIs, for example. I think most of the work is not in adding an additional backend like Metal, but in tuning the back-end for some specific piece of hardware (Nvidia vs. AMD vs. mobile GPU, etc.).
Whether companies are actually willing to invest in a Metal back-end remains to be seen, but considering many of them license their engines for commercial use, I would be surprised if the major players simply ignore Metal.
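A minimal sketch of what such a multi-API back-end abstraction looks like from the engine's side (Python used as pseudocode here; all class and function names are hypothetical, not any real engine's API):

```python
from abc import ABC, abstractmethod

class RenderBackend(ABC):
    """Minimal interface an engine might put between game code and the GPU API."""

    @abstractmethod
    def name(self) -> str: ...

    @abstractmethod
    def draw(self, mesh: str) -> str: ...

class MetalBackend(RenderBackend):
    def name(self) -> str:
        return "metal"

    def draw(self, mesh: str) -> str:
        # A real implementation would encode commands into a Metal command buffer.
        return f"metal:draw({mesh})"

class VulkanBackend(RenderBackend):
    def name(self) -> str:
        return "vulkan"

    def draw(self, mesh: str) -> str:
        # A real implementation would record into a Vulkan command buffer.
        return f"vulkan:draw({mesh})"

def select_backend(platform: str) -> RenderBackend:
    """Pick a backend per platform; game code above this line never sees the difference."""
    backends = {
        "macos": MetalBackend,
        "ios": MetalBackend,
        "linux": VulkanBackend,
        "android": VulkanBackend,
    }
    return backends[platform]()
```

Once everything is routed through an interface like this, adding Metal support is largely a matter of writing one more subclass; the per-GPU tuning mentioned above is where the real effort goes.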
I tend to agree with Jonathan Blow's comments on Twitter, that the low-level graphics API should be just that: as low-level as possible, small, focused, and not actually intended (though still allowed!) to be used directly. Engines or higher-level APIs can be built on top of it, with the option to dive down to the lowest level when needed (which will probably be a rare occasion).
DirectX will definitely not be this API because it is Windows-specific. Likewise for Metal, because it is Apple-specific. Blow appears to be of the opinion that Vulkan is also not moving in the right direction, because it is becoming too complex and trying to be too many things for too many applications at the same time.
If true, in a sense, it's not that surprising Apple is doubling down on their own API. I think they should consider making the API open-source though, and develop something like MoltenVK (but the other way around) for Windows/Linux.
OP doesn't know the diminishing profits of the AAA gaming industry.
OP would never have to justify to producers and directors why <1% of gamers should take 30% or more of the rendering architects' time.
My answer was aimed at that part. But yeah, maybe we should define what "custom" is.
more people than ever do.
What's the "etc"? Are there any other engines in that set?
Go back in time six years. What were Apple's choices?
(1) continue to live with the deficiencies of OpenGL. Remember that, over time, it had come to fail at one of its primary purposes which was to provide efficient access to GPU hardware. Further, sticking with OpenGL would be to accept the leadership of a group that had allowed its flagship standard to falter.
(2) They could marshal their resources and create the better API that the Khronos Group wouldn’t/couldn’t.
They really had no choice. Note that Vulkan wasn’t announced until after Metal was released.
The gripes in this thread should really be leveled at the Khronos Group, which fumbled its stewardship of OpenGL and, with it, the chance to lead open GPU APIs.
The entire point of Mantle was to be a proof of concept that could be reworked into a cross-platform API (which became Vulkan); there was plenty of work already being done by Khronos in 2014 (and Apple knew this). And they just went out and released Metal anyway.
I also blame Microsoft for the same thing: early parts of the DX12 docs were taken word for word out of the Mantle docs, that's how similar they are. But Microsoft at least had a couple of decades of history with a competing API, whereas Apple went out and created a brand-new one for some reason.
It started as an AMD/Frostbite proprietary API.
AMD was basically in the same position when they started Mantle (at about the same time, no less).
They chose differently. Now we have Vulkan. No thanks to Apple.
I'm not sure they should have spun around and dropped Metal for Vulkan once it became available, or slowed the pace of progress until the rest of the market caught up. That doesn't make sense.
Also, Apple is perhaps the largest GPU manufacturer in the world, with 200-250M GPUs shipped in 2017. That is 4-5X of Nvidia! Apple is also investing heavily in AI, from tools to devices to GPUs, so being able to customize may have tremendous value.
It is highly possible that Apple sees owning their interface stack as a means to keep their software-hardware warchest a couple of years ahead of the competition. In mobile this has been paying off over the last 5 years, as they have consistently crushed all others by 2-3X.
What developers will do is go out of their way to support iOS, and supporting the Mac is just a side benefit. Just like almost every printer company supports the Mac as a byproduct of wanting to support iOS.
This may sound crazy, but remember there are billions of iOS devices out there in the world, and I don't think Xboxes plus Windows game machines count in the billions.
It's true Apple hasn't won the hard-core gamer market, but they are no longer the niche player that had to cater to Windows users.
Also, looking at Apple's sales every year since the 6 came out, I doubt very seriously that Apple has sold more 6 phones than 6S, 7, 8, and X phones.
Also, if the 6 from 2015 is not a powerhouse, neither is the Samsung S8 that was introduced just last year....
Being able to run anything slightly demanding is another thing, but you can't argue there's no support.
Also, the benchmark you linked is for application load, which is heavily influenced by storage speed and load method (Android sometimes has to JIT compile) and is almost unaffected by graphics performance, other than the bus between CPU/memory and GPU.
Those should do, albeit not for gaming: most office software runs on Windows, with DirectX support. You won't be gaming on them, though.
Oh wait, you can't.
To be fair, the overwhelming majority of game shops develop on engines, and leave the engines to deal with the platforms. Unreal Engine, Unity, etc, support Metal, among others.
It's a very weird move to me. Even if the software in question is kept compatible with Apple's legacy OpenGL, those versions will be worse than their counterparts running on other platforms that can make use of shiny new OpenGL features.
It's like Apple is saying 'we don't care' to the 3D professional market. Also, doesn't Photoshop rely on OpenGL these days as well?
I heard they have a WebAssembly/WebGL version now, betting that'll get wrapped up in a WebView and we can all pretend it's a native program still.
Speaking of WebGL, that's basically OpenGL ES 2.0, but I assume the implementation in WebKit is backed by Metal? What about other browsers like Firefox?
Yes, though with Adobe's relationship with Apple they probably got all the handholding and resources to do the port to Metal.
As if my deteriorating keys on this machine were not bad enough. This wasn't a good WWDC for me. My PC is working great despite being an obscure setup with mismatched GPUs; I can't say I understand why the MBP, the graphic designer's workhorse machine, is unusable with Illustrator while that rig works just fine.
They don't? What's the matter? (Mine works great, but I'm not a heavy user, so I'm curious whether it depends on the model or I just haven't run into it so far.)
Let alone the inability to reconfigure things like N-trig pens to have hover right-click/middle-click functionality; it's been INCREDIBLY frustrating without any communication from Microsoft.
Overall it's rough. There are days where it seems better than others, but I'll randomly lose sensitivity, and multiple reboots appear to be the only pseudo-consistent means of getting it back.
That being said - It's still way more usable than Photoshop/Illustrator on my Mac.
I miss 15 years ago when I had CS2 + Intuos Pro 2 and everything just worked.
(eg. linking with the system-provided OpenSSL has been deprecated for years, but AFAIK they still ship it.)
To expand on your example, I maintain a legacy app that is stuck in 32-bit land because it relies on the QuickTime framework. QuickTime has been deprecated for seven years, and the transition to 64-bit has been in progress for over a decade, and yet my legacy app runs just fine even under the Mojave beta. There are multiple time bombs lurking in that app, and one of these days I'm going to have to rewrite it from the ground up, but I've been astonished at how long it has lasted.
Apple knows it would be bad karma to make a large number of legacy apps and games suddenly break on the Mac. They're not idiots; they have a perfectly good idea of the scale of mutiny that would ensue. So I'll eat my hat if OpenGL doesn't continue to work for at least the better part of the next decade.
It has been entirely removed from the latest SDKs (since 10.12, IIRC?), so now you have to keep older SDKs around just to build your app.
"MoltenVK is a runtime library that maps Vulkan to Apple's Metal graphics framework on iOS and macOS."
It's a subset mostly for mobile.
On iOS, dunno, you probably have a point there.
It doesn't have the features you need when you're creating more complex documents. Last time I used it, it didn't even allow different sections, which would let you switch page orientation partway through a document.
I’ve witnessed plenty of last minute builds be saved by a Unity game dev on a Mac just flipping their renderer settings.
GL should always work for the simple case, but instead you need to rewrite things to avoid bugs in its many layers. And once you have Vulkan/Metal, industry-specific wrappers are better than impossible-to-debug procedural GL junk.
Vulkan was created to get that same portability, with an API that fits modern hardware.
OpenGL 4.6 isn't anything like OpenGL 1.0/2.0 even though you can still _run_ those old OpenGL 1.0/2.0 tutorials.
You can even do most of the cool stuff of Vulkan in OpenGL via AZDO techniques (example: https://developer.nvidia.com/opengl-vulkan )
Also on Windows.
Apparently, if you want to distribute your software to a wide audience, you can only rely on OpenGL 3.0 with a minimal set of extensions. Here's an example: https://github.com/Const-me/GL3Windows#building-and-running
All the target systems had the latest Windows updates, and they all run Direct3D 11 software just fine (I mostly develop for D3D and I test on them). On some systems it works in 10.1 compatibility mode; MS calls these "feature levels". Not a big deal in practice, the majority of D3D11 stuff still works OK.
See: https://www.khronos.org/opengl/wiki/Tutorial:_OpenGL_3.1_The... for a tutorial.
My Pascal-based Nvidia GPU is showing OpenGL 4.6 on Windows 10. Nothing outdated here.
I think you’re wrong here. Two reasons.
1. If that were the case, I would be stuck with GL 3.0 regardless of the GPU. In reality, I'm only stuck with a GL version < 4.0 on HD 2000 and VMware. On my desktop PC (Maxwell at the time I wrote that demo), OpenGL 4.0 worked just fine in that very project. Even on an Intel HD 4000 laptop, OpenGL 4.0 worked just fine with the same code.
2. Please read Intel’s documentation: https://www.intel.com/content/www/us/en/support/articles/000... Specifically, please expand “2nd Generation Intel® Core™ Processors” section. As you see in that table, Intel says HD Graphics 3000/2000 only support OpenGL 3.1, which is exactly what I’m getting from the GLEW library I’m using in that project.
Also, you can see in that article that no Intel GPU supports the GL 4.6 mentioned by GP. Even the latest-generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they have supported the latest DirectX 12 for several years already.
Behavior depends on if the device supports 3.2+ compatibility mode which is optional.
You're hitting the legacy path, that's well-defined ( https://www.khronos.org/registry/OpenGL/extensions/ARB/WGL_A... ). You need to use the method I mentioned to get real post-3.0 OpenGL.
> Also, you can see in that article that no Intel GPU supports the GL 4.6 mentioned by GP. Even the latest-generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they have supported the latest DirectX 12 for several years already.
OK, so? 4.5 isn't really outdated, either. It still supports all the modern good stuff. And, as we've established at this point, it's not Windows stopping you from leveraging the full extent of the hardware you have. By contrast, macOS does stop you from using your hardware to the fullest, as it's stuck on 4.1.
For the systems I have in this house it’s not required, i.e. I’m getting the same OpenGL version that’s advertised by the GPU vendors.
> You need to use the method I mentioned to get real post-3.0 OpenGL.
Either I don’t, or the authors of that GLEW library https://www.opengl.org/sdk/libs/GLEW/ already did that. When running on modern GPUs, the code in my repository already uses real post-3.0 OpenGL just fine, including the shaders.
> ok, so? 4.5 isn't really outdated, either.
Right, but 3.1 (Intel Sandy Bridge) is. And 4.0 is outdated, too (Intel Ivy Bridge). Meanwhile, modern Direct3D works fine on these GPUs: 11.0 at feature level 10.1, and native 11.0, respectively.
Go read the extension I linked, it explains the behavior you're seeing. Also go read the tutorial I linked, it's using GLEW and shows you how to create a context.
You have a bug if your intention is to get a post-3.0 OpenGL context. Whether or not you care is up to you. You may be perfectly happy being in the compatibility bucket. I don't know. But you're not in the explicit 3.1 or later path.
> Right, but 3.1 (Intel Sandy Bridge) is.
Sandy Bridge is a 7-year-old CPU. Of course it's outdated...? And D3D 10.1 is from 2007; it's also hugely outdated. You're not getting anything more modern out of the hardware with D3D than you are with OpenGL here. I don't even know what argument you're trying to make at this point.
In the old link you offer as an example, the Intel HD 3000 and HD 4000 are bad, with bad drivers that lie about OpenGL versions (hence the need to downgrade the client), and they are fortunately obsolete. Current Intel integrated graphics have improved. And VMware is a virtual machine, not hardware; it should be expected to be terrible.
Technically that's probably true. However, if you drop support for Intel GPUs, your GL4+ software will no longer run on a huge number of older Windows laptops people are still using. For many kinds of software this is a bad tradeoff. That's exactly why all modern browsers implement WebGL on top of Direct3D, and why the overwhelming majority of multi-platform games and 3D apps use D3D when running on Windows.
> VMware is a virtual machine, not hardware; it should be expected to be terrible.
It's only terrible for OpenGL. The virtual GPU driver uses the host GPU to render, and it runs D3D11-based software just fine. I don't use it for gaming, but it's nice to be able to use a VM to reproduce and fix bugs in my software caused by outdated OSes, Windows localizations, and other environmental factors.
Intel's D3D drivers have historically been better than their OpenGL ones (which isn't saying much, since their D3D drivers are also trash), but now we're talking about the driver quality of one vendor, which has nothing to do with the API itself or with OpenGL somehow being outdated on Windows.
But ANGLE also targets desktop OpenGL (and vulkan), and as OpenGL 4.3 adoption increases I'd expect increasingly more browsers to use it for WebGL 2.0 since you don't need translation there at all. OpenGL 4.3 provides full compatibility with OpenGL ES 3.0.
You seem to be pretty confused on how OpenGL versions line up with the D3D ones, too. For reference OpenGL 3.1 is roughly equivalent to D3D 10.1. When you're complaining about only getting GL 3.1, you're also complaining about being stuck with D3D 10.1
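To make the rough line-up concrete, the correspondence being argued over can be tabulated (this pairing is approximate and my own summary of the thread, not an official mapping; the two APIs don't slice features identically):

```python
# Approximate correspondence between desktop OpenGL versions and the
# Direct3D feature level of the same hardware generation. Illustrative only.
GL_TO_D3D_FEATURE_LEVEL = {
    "3.1": "10_1",  # e.g. Intel Sandy Bridge (HD 2000/3000), as discussed above
    "3.3": "10_1",
    "4.0": "11_0",  # e.g. Intel Ivy Bridge: native D3D 11.0 hardware
    "4.5": "11_1",
    "4.6": "11_1",
}

def d3d_equivalent(gl_version: str) -> str:
    """Look up the approximate D3D feature level for a desktop GL version."""
    return GL_TO_D3D_FEATURE_LEVEL[gl_version]
```

So a machine capped at GL 3.1 is, in hardware terms, roughly a D3D feature level 10_1 machine, regardless of which API's version number looks older.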
But it's also more difficult for the lay programmer to use.
OpenGL -> Vulkan -> Metal
I can feel the performance gains already.
Granted, this may also be true for OpenGL.
You notice that Apple supported OpenGL while they were making most of their money from desktop and laptop sales; but once iOS became so profitable, they decided to go their own way and start pushing Metal.
Lock in, or at least getting people to do more iOS first development (helped by lower profits on app store sales on Android, Android fragmentation, etc), helps Apple out a lot. You get the first version of apps and games, or more polished versions of apps and games, on iOS this way.
And again, I don't see the benefit for Apple over supporting cross-platform APIs to encourage development. It seems like a net loss for everyone except some line in their budget for driver maintenance.
Providing macOS gives a developer and designer platform for iOS. That is really important for them. So Metal being available on macOS is important for that reason. But it's also important in that the Mac platform is still important, just not nearly as important as iOS.
OpenGL doesn't really have much of a future. Everyone is moving towards the next-generation frameworks. It just happens that there was a lot of uncertainty about whether OpenGL could adapt or whether there would be a successor, and during that time Apple decided to invest in developing Metal. It wasn't until a couple of years later that Vulkan was released.
In the meantime, Apple has built up quite a lot of tooling around Metal.
And it's not like it's that difficult to write cross platform apps that target the Mac. If you write against major engines, they will already have support for the different backends. If you are writing your own software, you can still target OpenGL, or you can target Vulkan and use MoltenVK to run it on macOS.
And for the next several years, people writing portable software are going to have to either just target OpenGL, for compatibility with older graphics cards, or maintain at least two backends, OpenGL and Vulkan. Given that lots of games target DirectX first, and consider any of the cross-platform frameworks a porting effort, Apple probably doesn't consider it a big loss to add one more platform that needs to be ported to.
What's going to wind up happening is more software like ANGLE (https://github.com/google/angle), MoltenVK (https://github.com/KhronosGroup/MoltenVK), and gfx-rs (https://github.com/gfx-rs/gfx and https://github.com/gfx-rs/portability, see http://gfx-rs.github.io/2018/04/09/vulkan-portability.html for details) for providing one set of abstractions on top of several different backends.
One of the reasons this has been so feasible has been the fact that the engine (MOAI) uses GL ES to manage the framebuffer itself - giving the same look and feel on all platforms the host runs. This has been, honestly, revolutionary in terms of building a single app that runs everywhere.
This now becomes more complicated because the engine has to be modified for Apple platforms to use Metal, and represents another fork/splinter in the unity of the host itself.
I wonder if their decision to use a non-standard graphics API is due to them wanting to make this style of development a lot more difficult in the future - i.e. are Apple passively antagonizing the cross-platform framework builders in order to establish an outlier condition for their platforms? The cynic in me says yes, of course this is what they are doing ..
In the AAA game space, you mean. In the casual gaming space, iOS is perhaps the most popular platform, and the new integration effort means all those games will soon be able to run on macOS as well.
They work fine for me -- both as implementation and as gameplay.
>Almost all of them are built around exploiting weaknesses in the human psyche to convince people to spend money and become addicted.
I think you confused casual gaming with Zynga or something. I was referring to smaller, non-AAA megatitles. Could be anything from a platform game, to Angry Birds, Monument Valley, Threes, Letterpress, racing games, RPGs and so on...
The point is that the kind of games that thrive on the app store tend to be exploitative and low quality. Desktop gaming isn't immune from that, but it's a dramatically better platform.
Well, platform specific APIs aren't lowest-common-denominator affairs, and get support for native platform capabilities faster (plus can be more optimized).
Also, a lot went wrong when Apple opened the App Store up to ads and in-game purchases.
But not every game is built on gambling mechanics. Fortnite seems to be doing great, and that shouldn't be a pay-to-win game.
The games that grow out of the app store ecosystem are games like Candy Crush, Clash of Clans, Clash Royale, etc. I'm not saying good games don't exist on the platform, I'm saying the platform is conducive to low quality games. Almost all of the great games on the app store did not grow out of the platform.
People take level 1 accounts into legendary arena which shows that skill dominates money.
How on earth the devs at Supercell tricked management into making a skill rather than money based game is genuinely a mystery to me.
The choice of game engine is becoming the new version of the old OpenGL-vs-DirectX question. No one sane would write their own engine from scratch.
There are plenty of great games. It helps to source them from a gaming community you trust.
I guess that explains why they are on Apple devices in the first place.
And even in cases where they are available, for example Macbook Pros, the cost difference involved in stepping up from an integrated GPU to an entry-level dedicated card is greater than the cost of buying an Xbox or PlayStation.
Having a fixed set of hardware to target is a dream.
Windows, GNU/Linux, Android; and what other OS?
That of course depends on the definition of "gaming space".
Apple was never a player in the classical desktop gaming space. They simply don't care about it; hence the way they treated OpenGL on macOS.
But Apple is arguably the biggest player in the mobile gaming space. That's what they care about. So instead of spending a large amount of money to attract a small number of AAA desktop titles to their OS, they just tap into the vast (game-)developer base that they already have on iOS and make it easy for them to deploy and sell their games on macOS too.
The move to deprecate OpenGL and OpenCL in favor of Metal makes total sense in that regard.
I would very much prefer games to use Metal on macOS (StarCraft 2 is much smoother on the same hardware).
And good luck finding a Vulkan version beyond 1.0, with extensions that work properly across all mobile devices.
You get a spectrum all the way from 1.0.0 up to 1.0.66, with Vulkan 1.1 promised for Android P devices only.
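In practice, that fragmentation means engines end up gating features on the version each device reports; a trivial helper (hypothetical, but representative of what every renderer ends up writing) looks like:

```python
def parse_version(version: str) -> tuple:
    """Turn a Vulkan-style version string like '1.0.66' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def meets_minimum(device_version: str, required: str) -> bool:
    """True if the device reports at least the required Vulkan version.

    Tuple comparison handles the lexicographic major/minor/patch ordering,
    so '1.0.66' satisfies a '1.0.3' requirement but not a '1.1.0' one.
    """
    return parse_version(device_version) >= parse_version(required)
```

The painful part is that the version check alone isn't enough on Android: the same 1.0.x version can ship with different extension sets and different bugs per device, so real engines pair this with per-device quirk lists.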
The First Moribundity was the period in the 1990s when Apple was coasting on the DTP and Photoshop advantage it had over Windows, reducing the features of its desktops and not innovating. Spindler, a smart but ineffective leader, was replaced by Gil Amelio, a star in his prior field but unable to get Apple headed in the right direction. It took Jobs' return to right the ship that time.
The emphasis on thin but less functional and less serviceable laptops, the dropping of OpenGL, the cruft piling on top of OS X to no new net benefit for users, and their coasting on the desktop market all point to this IMHO.
Apple is all about mobile phones, the Apple Watch and AirPods right now. The MacBook is barely a blip on their product radar.
My personal impression is that while pre-built PCs are mostly sold to businesses, there has been a strong growing trend over the past few years to build your own PC from parts. I really do observe that, when mobility is not an issue, people now tend to move away from laptops towards self-built PCs (or often rather: to have a good friend build one). I don't want to go into the details of what advantages these have over laptops, but I'll mention customizability with respect to requirements (e.g. very silent, very high-power, ...) and repairability.
A small (but only small) contributing factor is that laptops do not have sufficient power for VR, so you really need a stationary PC for VR.
The strongest "dampening factors" for this trend are the rising prices of GPUs, because cryptocurrency miners (in recent years, Ethereum miners in particular) hoard them (though my impression is that this will ebb somewhat as soon as attractive ASICs for Ethereum are released), and lately the rising prices of RAM. On the other hand, the release of Ryzen made building PCs with either very high performance or a very good cost-benefit ratio feasible.
A lot of early pcs were kits people built, and I remember pc gaming in the '90s and '00s was also heavily focused on custom built machines.
That is why I wrote
"(or often rather: let a good friend build one)." :-)
Seriously: in my opinion (though others might disagree), an advantage of a self-built PC over one that some company produces is that you know exactly which components are inside (in particular, you can buy components you trust), which makes you far less dependent on driver support from the manufacturer (i.e. you can find drivers on the internet yourself if necessary). I have often had bad experiences with manufacturers' driver support (except for a few well-respected names).
Also lots of cheap PC manufacturers install lots of crapware by default, while your self-built one is a very clean install.
In summary a self-built PC often "just works" far better than a pre-built one.
Just as an example: over at apple.com, the new iMac comes with 32GB of DDR4 2666MHz ECC RAM. To add 32GB (64GB total) they charge 960€, or 2,880€ to add 96GB (128GB total).
Looking on Amazon or geizhals.eu (a price-comparison site), 16GB ECC DDR4 2666MHz sticks cost 200-220€ x 2 = ~450€ for 32GB. That means you can save ~500€ on RAM alone.
Same applies to SSD storage. And graphics cards. And CPU. And basically anything you could want to upgrade.
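For what it's worth, the arithmetic checks out (using the high end of the retail range quoted above):

```python
# Prices in EUR, taken from the figures quoted above; the 220 EUR stick
# price is the high end of the quoted 200-220 EUR retail range.
apple_upgrade_to_64gb = 960          # Apple's charge to go from 32GB to 64GB
retail_16gb_stick = 220
retail_32gb = 2 * retail_16gb_stick  # two sticks from a price-comparison site

savings = apple_upgrade_to_64gb - retail_32gb  # roughly the ~500 EUR claimed
```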
Specifically, Apple is well known for its expensive pricing on better optional components, which you often have no choice but to pay, since for many models you cannot simply swap the component (RAM, SSD, etc.) yourself.
Couldn't this be explained simply by the hypothesis that people now mostly buy their PC parts on the internet instead of in a brick-and-mortar shop?
The set of experts capable of telling whether they are ordering hardware that will actually work together, instead of blowing up in some form when assembled, is quite tiny.
Anyone with a modern internet connection and a bit more patience than money (or at least a willingness to learn) can hop on Reddit or PC Part Picker and get a pretty good idea of what is out there and works together.
Compared to the days of making sure you had the right number and type of ISA, PCI, and AGP slots, assembling a PC from parts today is a breeze. Shopping online keeps costs low and places like Microcenter are great for buying in person.
I only haven't built one in a while because my current 5-year-old workhorse media/editing/gaming/everything PC shows no sign of needing a full upgrade any time soon. Sure I bought a new GPU after a few years when I got bitten by the experimental VR bug but other than that, it was an afternoon buying parts and snapping them together, an evening installing software, and 5+ years of "just working".
Cryptocurrency mining has kept the enthusiast hardware market afloat, I have a feeling...
1070 TI: https://geizhals.eu/?phist=1717563
1080 TI: https://geizhals.eu/?phist=1587606
Visual Studio is actually an awesome development platform (I’m including VSTS in this). There’s not much secret sauce in Xcode to threaten Apple’s App Store revenue — it’s just a bunch of signing keys that Apple controls and manages.
I do think Apple will make a strategic decision to diversify from core Mac OS soon. The ever-shrinking space allocated to traditional computers at the Apple Store has convinced me of that. They have more floor space allocated to the Apple Watch than they do the Mac these days.
And honestly, their desktop line looks very similar to many PC companies. They make the Mac Mini (aka the Apple NUC), the all-in-one iMac, and the Mac Pro (ok, the Mac Pro sucks and there’s no excuse for it).
Anyone doing serious workstation tasks these days is likely using a cloud-based solution. There are a few specialized exceptions (ML development stands out to me), but none big enough to build a product around. Especially when doubling the entire PCs and Laptops category wouldn’t move the needle for Apple at all.
They tend to refresh in line with corporate cycles, i.e. 18 month to 3 years, from what I see.
PCs stand still, and even laptops tend to be mostly stationary. No wonder my work laptop is now 6 years old and I feel no urge to request an upgrade.
Don't forget planned insecurity two years after release through unpatched software vulnerabilities.
https://tinyurl.com/yakmsz6b (sorry, in German)
But I expect cheaper devices to take more share, since the "high-end" won't give you many benefits over them. My 120 EUR Xiaomi already offers amazing cost-performance compared to most other Androids.
You may call me Captain Obvious now. :)
There could be a number of factors that would hurt them, like margin compression, reaching market saturation, etc. All the ailments that Business Schools will warn you about...
However, they (IMHO) have become arrogant and out of touch with their users. VISION is their problem.
They need the MacBooks and desktop Macs to be seen as very desirable, in order to project the "creative people buy Apple" halo over the rest of their products.
come on steve, reinvent death please
Microsoft could not care less about OpenGL on Windows. However, it works just fine.
You know why? As soon as you install your video card drivers, your OpenGL implementation is no longer from Microsoft. It comes from AMD, NVidia or Intel, with all needed optimizations for their hardware.
Apple insisted on not allowing this and doing the OpenGL implementation themselves (which was always crappy and outdated).
Had they allowed the GPU vendors the ability to provide their own implementation, this would have been a non issue.
(Many toolkits, like Qt and Cocos2d, also use ANGLE on Windows for OpenGL functionality)
The same is true for plenty of other “Pro” apps on Windows.
The real question is why would a GPU vendor go through the expense of creating and supporting such an implementation when Apple doesn't even make a computer with slots that you can install their video cards into?
If producing an OpenGL implementation doesn't provide a competitive advantage for selling their products, why would they bother?
Because it is a fact that Apple develops their own drivers. Also, when did you last download a driver update from NVidia for your Mac?
The article is about Vulkan, but it briefly mentions Apple's own outdated OpenGL stack.
From 2012. Which means it is still accurate, given how out of date drivers are.
There are more references, you can look it up.
> The real question is why would a GPU vendor go through the expense of creating and supporting such an implementation when Apple doesn't even make a computer with slots that you can install their video cards into?
They still have GPUs, which can be Intel, AMD or NVidia depending on year and model. Just because they are soldered on, doesn't mean they don't need drivers.
EDIT: Some more research seems to indicate that there are drivers developed by NVidia for the NVidia Quadro GPU line.
NVidia releases drivers for cards that the drivers shipping with macOS don't support. I would also guess that the NVidia drivers which ship with macOS are written by NVidia under some agreement with Apple; the same is likely true of AMD and Intel.
That's a tiny part of Apple's lineup though.
(Ironically, using OpenCL avoids this problem)
When Microsoft abandoned OpenGL for DirectX, GPU vendors produced their own OpenGL implementations because doing so provided a competitive advantage that allowed them to sell more product.
The question is, why would those GPU vendors do the same thing now that Apple is following the same path?
Apple doesn't even produce a computer with slots you can install their products into.
>EDIT: Some more research seems to indicate that there are drivers developed by NVidia for the NVidia Quadro GPU line.
Keep doing research, because NVidia provides downloadable Pascal drivers even though the last time Apple produced a computer with a PCI slot was the Cheese Grater Mac Pro which came out over a decade ago.
Making sure nothing diminishes CUDA is very much in NVidia's competitive interest.
It doesn't seem to change anything on the OpenGL stack, unfortunately.
In case you missed the news, only DirectX is supported on UWP and store apps.
To the point that Microsoft has their own port of Angle.
So unless they change their mind, say goodbye to ICD drivers on Windows as well.
I run Windows at home, but wouldn't if it went Store-only. For me, an open platform is the only thing Windows had going for it.
Quite a few apps like Adobe XD are store only.
Next Office for Windows 10 is only available via the store.
Microsoft has taken the other approach, if apps don't come to the store, the store comes to the apps.
So thanks to the outcome of Project Centennial, they are now merging the UWP and Win32 worlds into Windows containers and making the store Win32 aware as well.
Deep down session from BUILD here, https://channel9.msdn.com/Events/Build/2018/BRK2432
Legacy software, blah, blah, blah. No legacy software runs forever, and least of all on Apple platforms. Who cares.
(Of course that does not mean that the OS needs built in OpenGL support. If you can convince an old game to use some kind of OpenGL-Metal compatibility wrapper without needing access to the game's source code or support from the original developer, that's fine with me as well.)
People in FOSS friendly circles really don't get the games development culture, IP management or the contracting business related to ports.
Requiring developers to use an API locked to a particular platform feels pretty hostile to me. Doesn't matter if that API isn't perfect, or even far from it.
* Being deprecated does not mean that things will suddenly stop working. It will take a few more releases of macOS before this can be removed.
* Next to MoltenVK there is MoltenGL, which is an implementation of OpenGL ES 2.0 that runs on (edit) Metal. That indicates it's at least feasible to wrap OpenGL applications in the future if necessary.
Furthermore, Apple will drop support for all Macs that don't support Vulkan in this release of macOS. Ouch, what a waste.
: https://9to5mac.com/2018/06/04/macos-10-14-mojave-supported-... (anything from before 2012 does not support Vulkan)
You would be correct, but not on OSX.
So like DirectX?
Plenty of big 3D/CAD/etc players? In lots of creative areas, the Mac still dominates (despite stories about people moving to Windows, nobody's going anywhere, where nobody = quite a few creatives overall).
Besides, with Metal they'll target iOS as well, and that's a huge platform, and where most of the profits are for mobile.
And with this deprecation Mac is pretty much dead as a platform for professional 3D.
Graphic Designers still like Macs for the most part I guess -- and I still see them in video production a lot, but that's starting to change pretty quickly.
I think the Final Cut "Pro" X was the inflection point - the change is ongoing.
Which AFAIK they’re free to do on macOS as well; they just don’t seem to bother, since Apple was doing that work for them.
As far as I am aware Apple develops the GPU drivers for OS X (though, I think, based on code that the GPU vendor provides).
It's distributed with the Graphics Driver, but most of it exists in a user space library, not in the driver proper.
I'm not sure. NVIDIA provides updates for CUDA and an extremely limited amount of updates for their graphics stack (AFAIK none at all for integrated graphics, for example).
The difference is that OpenGL is designed to be easy for humans. glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd(); you can't beat that. The issue is that it is hard for the driver to optimize.
That's where Metal and Vulkan come into play. These are low-level APIs, sacrificing user friendliness for greater control over the hardware. They are designed for 3D engines, not for application developers.
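To make the contrast concrete, here is a toy model in C (mock functions, not real OpenGL or Vulkan calls): immediate mode crosses the API boundary once per vertex, while a buffer-style API hands the driver the whole array in a single call it can validate and batch up front.

```c
#include <assert.h>
#include <stddef.h>

/* Toy model, not a real graphics API: count how often the "driver"
 * is entered for the same vertex list under each submission style. */

static int driver_calls = 0;

/* Immediate mode: every vertex is a separate API call. */
static void mock_glVertex3f(float x, float y, float z) {
    (void)x; (void)y; (void)z;
    driver_calls++;
}

static void draw_immediate(const float *verts, size_t vertex_count) {
    for (size_t i = 0; i < vertex_count; i++)
        mock_glVertex3f(verts[3 * i], verts[3 * i + 1], verts[3 * i + 2]);
}

/* Buffer style (VBOs, Vulkan, Metal): one call submits the whole
 * array, so the driver can validate and batch the work once. */
static void draw_buffer(const float *verts, size_t vertex_count) {
    (void)verts; (void)vertex_count;
    driver_calls++;
}

static int count_calls(void (*draw)(const float *, size_t),
                       const float *verts, size_t vertex_count) {
    driver_calls = 0;
    draw(verts, vertex_count);
    return driver_calls;
}
```

With real scenes the gap is thousands of calls per frame versus a handful, which is exactly the overhead the low-level APIs were built to remove.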
Look into what it takes to write the minimum viable OpenGL program, written using non-deprecated routines, that puts a textured triangle on the screen. It sucks. On top of that, OpenGL is slow and gives you no way to create programs with smooth performance -- for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.
1990s-style OpenGL was good for the time. In 2018, OpenGL is a pile of poop.
What? I've written commercial games with opengl on osx/ios and my experience doesn't show that at all.
State-based recompilation is a known issue in many GL drivers, particularly on mobile. E.g. changing blending settings may cause shaders to get recompiled. This can take up to a second.
Some engines work around this by doing a dummy draw to an offscreen surface with all pipeline configurations that they use at init time. This (usually) guarantees that all the shaders are pre-compiled.
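The warm-up trick can be sketched in C (all names here are invented, not a real GL API): a pipeline cache that compiles a shader variant the first time a given blend state is seen, plus an init-time loop over every configuration so no compile ever lands mid-frame.

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical sketch of shader-variant pre-warming; not a real API. */

enum { BLEND_OFF, BLEND_ALPHA, BLEND_ADDITIVE, BLEND_MODE_COUNT };

static bool compiled[BLEND_MODE_COUNT];
static bool in_frame = false;
static int  compile_hitches = 0;  /* compiles triggered during a frame */

/* Returns a "pipeline" for the blend state, compiling it the first
 * time that state is seen (in a real driver this is the stall). */
static int get_pipeline(int blend_mode) {
    if (!compiled[blend_mode]) {
        compiled[blend_mode] = true;  /* the expensive compile happens here */
        if (in_frame)
            compile_hitches++;        /* mid-frame stall we want to avoid */
    }
    return blend_mode;
}

/* Init-time warm-up: a dummy draw with every configuration in use. */
static void prewarm_all_pipelines(void) {
    for (int mode = 0; mode < BLEND_MODE_COUNT; mode++)
        get_pipeline(mode);
}

/* Returns the number of compile stalls that hit this frame. */
static int render_frame(void) {
    compile_hitches = 0;
    in_frame = true;
    get_pipeline(BLEND_OFF);    /* opaque geometry */
    get_pipeline(BLEND_ALPHA);  /* transparent pass */
    in_frame = false;
    return compile_hitches;
}
```

A cold first frame pays the compile cost twice; after the warm-up pass, every frame hits the cache and stays smooth, which is the whole point of the dummy-draw trick.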
To give a ballpark figure, my Vulkan "base code" is less than 2x what my OpenGL boilerplate is for the same functionality. The big difference: the Vulkan code is easy to understand, but the GL code is not.
Comparing "Hello World" doesn't make much sense, OpenGL gets really darn complicated once you get past the basics.
This happens in other APIs too (we definitely had it happen with DX11), it's just that OpenGL is a lot more complicated than anything else due to its history, so it has proportionally more bugs.
That's fine for a "hello triangle" program, but quickly becomes ridiculous for anything approaching a serious engine. There's a reason that glDrawArrays() has been around since 1995 (and part of the core specification since 1997).
I wonder how much of this stuff is deprecated now.