Deprecation of OpenGL and OpenCL (apple.com)
668 points by WoodenChair 8 months ago | 452 comments



Ugggggh. As if graphics support on macOS weren't middling enough already. It's like they're trying to become as irrelevant as possible in that area.

I could understand if they were deprecating it in favor of Vulkan. That would be in-line with Apple's history of aggressively pushing forward new standards. But by no means do they have the clout to coerce developers into their own bespoke graphics API that doesn't work anywhere else. All they'll accomplish is further killing off the already-small presence they have in the gaming space.


Apple must truly hate gaming, or suffer from a serious case of Not Invented Here with their Metal stuff. As if any serious gaming studio would target Metal, which doesn't run on Windows.

In fact, they couldn't get their act together and keep up with current versions, and as a result titles like Elite Dangerous were being shut down anyway. Reason: OpenGL stuck on an old version without compute shader support.

https://forums.frontier.co.uk/showthread.php/424243-Importan...

https://support.frontier.co.uk/kb/faq.php?id=228


As much as Apple would like to emulate the Microsoft of the 1990's, they're just so bad at it.

Embrace. Extend. Then extinguish.


Worse still, by ignoring a ubiquitous tech in favour of their own bespoke solution they are emulating Apple of the 90s!


Apple's model is Embrace. Replace. Extinguish.


I don't think there is any embracing.

Just Replace. Extinguish.


Or just plain extinguish.


To be fair, most games today are built using Unity3D, Unreal Engine etc, which all support Metal already. Hardly anyone writes their own game engines these days, and if they do they probably have the resources to support Metal. Overall still a bummer though.


The problem is still that Apple is forcing them to invest resources for no reason other than to advance its vendor lock-in. And if you're the developer of a small high-performance 3D graphics and GPU computing library like me, it's just a giant middle finger from Apple: I will either need to drop OpenGL/OpenCL or drop Apple - there is no way that I can afford to offer both, especially since I'd need to buy Apple hardware to test things.


> most games today are built using Unity3D, Unreal Engine etc

On PC: Maybe by number of games, but not by the number of players.

(I count Fortnite as an outlier because it's technically not built on a third-party engine)


So based on number of players, what is the most used engine today if not UE nor Unity?

There's LoL, Dota 2 and Overwatch using their custom engines with huge numbers of players but... What else? CS:GO?


The Witcher 3 is using RedEngine, GTA V RAGE, the Battlefields and SW:Battlefront {1,2} are using Frostbite IIRC, the two new Tomb Raiders are on Horizon, Rainbow Six Siege and the Assassin's Creeds are on Anvil, Overwatch & SC2 have their own engines too, same for League of Legends, the CoDs are on a heavily customized id engine, Minecraft is custom, Bethesda has their own engines for Skyrim & Fallout, Path of Exile is custom too - all taken from Steam's 100 most played.


That's a nice list, quite complete. Many console exclusives also use custom engines by the way, e.g. Decima for Horizon:ZD, KillZone and Death Stranding, Naughty Dog has their own engine (don't know the name), etc.

The OP's point was that the companies that make these engines can afford to invest in supporting an additional back-end API though. I think it's hard to argue that any of the companies that develop these engines would not be able to also add a Metal back-end. Many of them already work across a pretty wide range of back-ends anyway. Xbox One, PS4 and Switch all use completely different APIs, for example. I think most of the work is not in adding an additional backend like Metal, but in tuning the back-end for some specific piece of hardware (Nvidia vs. AMD vs. mobile GPU, etc).

Whether companies are actually willing to invest in a Metal back-end remains to be seen, but considering many of them license their engine for commercial use, I would be surprised if the major players simply ignore Metal.

I tend to agree with Jonathan Blow's comments on Twitter, that the low-level graphics API should be just that: as low level as possible, small, focused, and not actually intended to be used directly (though still allowing it!). Engines or higher-level APIs can be built on top of that, with the option to dive down to the lowest level when needed (which will probably be a rare occasion).

DirectX will definitely not be this API because it is Windows specific. Likewise for Metal because it is Apple-specific. Blow appears to be of the opinion that Vulkan is also not moving in the right direction, because it is becoming too complex, and trying to be too many things for too many applications at the same time.

If true, in a sense, it's not that surprising Apple is doubling down on their own API. I think they should consider making the API open-source though, and develop something like MoltenVK (but the other way around) for Windows/Linux.


>The OP's point was that the companies that make these engines can afford to invest in supporting an additional back-end API though.

OP doesn't know about the diminishing profits of the AAA gaming industry.

OP would never have to justify to producers and directors why <1% of gamers should take 30% or more of the rendering architects' time.


Switch supports Vulkan.


Also, the Forza (custom) and Far Cry (CryEngine/DuniaEngine) series. Titanfall uses a modified Source engine IIRC.


CSGO and DOTA2 both run on different versions of the source engine.


The top 10 most played today on Steam are using UE4 (2), Source 2, Source (2), and custom engines (5: AnvilNext, RAGE, Evolution). That's a lot of variety; there's almost no reuse.


Battlefield, CoD, AC, all the sports titles, GTA. More or less everything built by one of the big three uses their own in-house engine.


Err, at least some Battlefield games, and FIFA 17 and 18, use the Frostbite engine. I don't think one can call it custom.


> what is the most used engine today if not UE nor Unity?

My answer was aimed at that part. But yeah, maybe we should define what "custom" is.


> Hardly anyone writes their own game engines these days

More people than ever do.


> To be fair, most games today are built using Unity3D, Unreal Engine etc

What's the "etc"? Are there any other engines in that set?


With a bit of luck, Godot Engine. Sort of a dark horse, but I like it and my very smart corporate-programmer brother likes it. He says it's designed like a programmer would design it: everything's a node. I know I did a game in Unity (which has become overcomplicated) and had a surprisingly easy time jumping into Godot.


Cocos2D, CryEngine, MonoGame, Ogre, Unigine, etc…


You’re not giving history its due.

Go back in time six years ago. What were Apple’s choices?

(1) continue to live with the deficiencies of OpenGL. Remember that, over time, it had come to fail at one of its primary purposes which was to provide efficient access to GPU hardware. Further, sticking with OpenGL would be to accept the leadership of a group that had allowed its flagship standard to falter.

(2) They could marshal their resources and create the better API that the Khronos Group wouldn’t/couldn’t.

They really had no choice. Note that Vulkan wasn’t announced until after Metal was released.

The gripes in this thread should really be leveled at the Khronos Group, which fumbled its stewardship of OpenGL and, with it, the chance to lead open GPU APIs.


That timetable is pretty generous to Apple. Metal, Vulkan, and DX12 are reworked versions of Mantle.

The entire point of Mantle was to be a proof of concept that could be reworked into a cross-platform API (which became Vulkan), and there was plenty of work already being done by Khronos in 2014 (which Apple knew). And they just went out and released Metal anyway.

I also blame Microsoft for the same thing; early parts of the DX12 docs were taken word for word out of the Mantle docs, that's how similar they are. But Microsoft at least had a couple decades of history with a competing API, whereas Apple went out and created a new one for some reason.


Talk about rewriting history: Mantle was never supposed to become Vulkan. That only happened because AMD was generous; otherwise Khronos would still be pondering what OpenGL Next should look like.

It started as an AMD/Frostbite proprietary API.

https://community.amd.com/community/gaming/blog/2015/05/12/o...


They had no choice, really? Oh, give me a break.

AMD was basically in the same position when they started Mantle (at about the same time, no less).

They chose differently. Now we have Vulkan. No thanks to Apple.


While I get the concern, everybody's history here is backwards. Apple released Metal two YEARS before Vulkan. Why? Because OpenGL wasn't hacking it anymore and had become too asymmetric. Vulkan copied Metal, not the other way around.

I'm not sure they should have turned around and dropped Metal for Vulkan once it became available, or slowed the pace of progress till the rest of the market caught up. Doesn't make sense.

Also, Apple is perhaps the largest GPU manufacturer in the world, with 200-250M GPUs shipped in 2017. That is 4-5X Nvidia's volume! Apple is also investing heavily in AI, from tools to devices to GPUs, so being able to customize may have tremendous value.

It is highly possible that Apple sees owning their interface stack as a means to keep their software-hardware war chest a couple years ahead of the competition. In mobile this has been paying off over the last 5 years, as they have consistently crushed all others by 2-3X.


Does it matter anymore? People are using less and less of the higher-level stuff of OpenGL. Most of the graphics code is now in the engine. OpenGL is getting very outdated; who, starting a project today, would choose it over Vulkan, DirectX or Metal? I would bet most small shops would prefer to use some sort of middle layer or engine from a third party. That pushes the problem of implementing the lower layers in Vulkan, DirectX or Metal to a small group of specialists.


Well, it matters for people who are writing an engine – or a web browser.


No, games aren't going to target Metal to support the Mac, any more than printer manufacturers are going to go out of their way to support AirPrint to make printers Mac compatible.

What developers will do is go out of their way to support iOS, and supporting the Mac is just a side benefit. Just like almost every printer company supports the Mac as a byproduct of wanting to support iOS.


Hypothesis: There are more machines in consumer hands which support Metal than DirectX.

This may sound crazy, but remember there are billions of iOS devices out there in the world, and I don't think Xboxes plus Windows game machines count in the billions.

It's true Apple hasn't won the hard-core gamer market, but they are no longer the niche player that had to cater to Windows users.


If you're counting only gaming PCs (i.e. device used mainly for demanding 3D games) you should also count only gaming Macs/iPads/iPhones. How many are there in the world?


PCs use OpenGL and DirectX. All Android devices use OpenGL ES. Older iOS devices use OpenGL.


Android runs OpenGL ES and there are a lot more of those than iOS devices.


Are there more Android devices that actually have hardware that can play high-end games decently? The average Android phone is a low-end phone - with an average selling price of $225 across all Android phones, how can they not be?


Yeah, and they still support OpenGL. The most popular iPhone by a long margin is the 6, which is not exactly a graphics powerhouse.


Based on what statistics? Who is selling all of these high end Android phones? Even Samsung is selling mostly low end phones.

Also, looking at Apple's sales every year since the 6 came out, I doubt very seriously that Apple has sold more 6 phones than 6S, 7, 8, and X phones.

Also, if the 6 from 2015 is not a powerhouse, neither is the Samsung S8 that was just introduced last year....

http://bgr.com/2017/05/23/iphone-6s-vs-galaxy-s8-speed-test-...


OpenGL is part of the platform, they all support it. The stats page doesn't even include 'not supported' [1]

Being able to run anything slightly demanding is another thing, but you can't argue there's no support.

Also, the benchmark you linked is for application load times, which are heavily influenced by storage speed and load method (Android sometimes has to JIT-compile) and almost unaffected by graphics performance, other than the bus between CPU/memory and GPU.

[1] https://developer.android.com/about/dashboards/#OpenGL


Being able to run something suboptimally doesn't turn into sales. I'm sure that the owner of a $70 Blu R1 HD is not going to be spending money on high end games.


>This may sound crazy, but remember there are billions of iOS devices out there in the world, and I don't think X boxes plus windows game machines count in the billions.

They should do, albeit not because of gaming: most office machines run Windows with DirectX support. You won't be playing on them, though.


I've yet to see anyone build a hardcore Mac gaming machine.

Oh wait, you can't.


You could make a pretty safe argument that the iMac Pro is right up there with the best gaming PCs one could buy/assemble.


It really isn't. The fastest GPU available is a Vega 64 underclocked to basically the performance of a normal Vega 56. A 1080Ti is ~50% faster. Even if you connect an external 1080Ti it's constrained by TB3 bandwidth.


You can with an external GPU.....


"But by no means to they have the clout to coerce developers into their own bespoke graphics API that doesn't work anywhere else"

To be fair, the overwhelming majority of game shops develop on engines, and leave the engines to deal with the platforms. Unreal Engine, Unity, etc, support Metal, among others.


Not everything that runs on OpenGL is a video game. There are tons of applications out there that just won't have the budget to do a rewrite (and even fewer were probably set up with the right architecture if they were depending on OpenGL in the first place).


Indeed, I immediately thought of 3D applications like Blender, Maya, etc. which use OpenGL.

It's a very weird move to me: even if the software in question is kept compatible with Apple's legacy OpenGL, those versions will be worse than their counterparts on other platforms, which can make use of shiny new OpenGL features.

It's like Apple is saying 'we don't care' to the 3D professional market. Also, doesn't Photoshop rely on OpenGL these days as well?


After discontinuing AutoCAD for Mac in 1994, people begged for 18 years to get it back, and now Apple says "eh, we didn't want that anyway."

I heard they have a WebAssembly/WebGL version now, betting that'll get wrapped up in a WebView and we can all pretend it's a native program still.

Speaking of WebGL, that's basically OpenGL ES 2.0, but I assume the implementation in WebKit is backed by Metal? What about other browsers like Firefox?


Firefox uses OpenGL to implement OpenGL ES. It also uses OpenGL for hardware accelerated compositing.


AutoCAD is a dead technology. Architects/Structural Engineers/MEP Engineers are moving to BIM platforms (Revit, ArchiCAD, etc)! Product/Automotive/Industrial design and engineering use PLM tools (Catia, SolidWorks, etc). Besides, AutoCAD didn’t/doesn’t need much graphics power at all. AFAIK it never really used OpenGL.


Translating between these two is not particularly hard. (Similar to a Vulkan backend for OpenGL.)


It is. Keep in mind you have to transform the shaders too, and make them safe so that they can't trigger undefined behaviour.


LibANGLE


> Photoshop rely on OpenGL these days as well ?

Yes, though with Adobe's relationship with Apple they probably got all the handholding and resources to do the port to Metal.


I'm not sure about that. Or maybe Adobe just doesn't care. My 2017 MacBook Pro has horseshit graphical bugs in both Illustrator and Photoshop. I'm exclusively doing all my graphics work on my Windows 10 machine now (even though Windows and my Wacom tablet do not play nice together.)


I'm in the same boat: my so-called Pro machine, the USB-C 2017 MBP, has had glitchy, completely unusable rendering in the latest version of Illustrator since October 2017. Adobe blames Apple, and presumably Apple blames Adobe, because neither of them is fixing it.

As if my deteriorating keys on this machine were not bad enough. This wasn't a good WWDC for me. My PC is working great despite being an obscure setup with mismatched GPUs; I can't say I understand why the MBP, the graphic designer's workhorse machine, is unusable with Illustrator while that rig works just fine.


> even though windows and my Wacom tablet do not play nice together.

They don't? What's the matter? (Mine works great, but I'm not a heavy user, so I'm curious whether it depends on the model or I just haven't run into it so far.)


Microsoft changed the pen behavior in one of the Creators Updates, and now the pen buttons behave strangely (randomly don't work in certain applications); for a while the pen was also registered as a finger in legacy applications... making Windows 7 the only really viable way to use a Wacom as a professional (speaking as one).

That's to say nothing of the inability to reconfigure things like N-trig pens to have hover right-click/middle-click functionality. It's been INCREDIBLY frustrating, without any communication from Microsoft.


I've got the Intuos Pro from a couple models back. Windows Ink randomly causes pressure sensitivity to drop out (especially since the Creators Update). On Windows 8 I never had trouble with the wireless adapter; now I have to run wired. Button clicks don't always register and sometimes will send the wrong input.

Overall it's rough, there are days where it seems better than others - but I'll randomly lose sensitivity and multiple reboots appears to be the only pseudo-consistent means of getting it back.

That being said - It's still way more usable than Photoshop/Illustrator on my Mac.

I miss 15 years ago when I had CS2 + Intuos Pro 2 and everything just worked.


Photoshop CC 2015 already had partial support for Metal.


True, but they didn't remove OpenGL, they simply deprecated it (e.g. don't expect any updates to it, new tooling will not be built around it, etc). That shouldn't affect legacy apps.


Yes, and deprecation doesn’t mean a lot on the Mac. Apple often deprecates stuff and still leaves it in. They remove it only when there’s something to be gained.

(e.g. linking with the system-provided OpenSSL has been deprecated for years, but AFAIK they still ship it.)


They mentioned in the State of the Union that this is the first step towards removing it.


Chicken and egg. If they remove it apps stop working, if apps don’t update they can’t remove it.


I’m not so sure they’re worried about apps breaking. They’ve certainly stuck to the “no more 32bit iOS apps” thing.


Apple can get away with that on iOS, but they're a lot more conservative with macOS.

To expand on your example, I maintain a legacy app that is stuck in 32-bit land because it relies on the QuickTime framework. QuickTime has been deprecated for seven years, and the transition to 64-bit has been in progress for over a decade, and yet my legacy app runs just fine even under the Mojave beta. There are multiple time bombs lurking in that app, and one of these days I'm going to have to rewrite it from the ground up, but I've been astonished at how long it has lasted.

Apple knows it would be bad karma to make a large number of legacy apps and games suddenly break on the Mac. They're not idiots; they have a perfectly good idea of the scale of mutiny that would ensue. So I'll eat my hat if OpenGL doesn't continue to work for at least the better part of the next decade.


They said in the Platform State of the Union that Mojave will be the last macOS that runs 32bit apps, so QuickTime.framework and your app are running out of time!


Huzzah! Thanks for the heads-up. I’m looking forward to catching up on the whole SOTU.


They specifically mentioned QuickTime in the release notes as well.


> QuickTime has been deprecated for seven years,

It has been entirely removed from the latest SDKs (since 10.12, IIRC?), so now you have to keep older SDKs around just to build your app.


True — 10.12 is my recollection as well — but I’ve been bitten so many times by compiling under a new SDK, especially with an older build target, that I do that as a matter of course anyway.


I find it hard to fathom that people think a huge software company like Apple doesn’t have awareness of the impact of its changes or people responsible for compatibility.


If there isn't one already, I'm sure someone will implement OpenGL on top of Metal when it's needed badly enough. At least they're going closer to the hardware, not further away.


Already there: https://moltengl.com/moltenvk/

"MoltenVK is a runtime library that maps Vulkan to Apple's Metal graphics framework on iOS and macOS."


But Molten is for Vulkan (i.e. not OpenGL) - and not 100% of Vulkan, AFAIK.


Both exist: MoltenGL is for OpenGL ES https://moltengl.com/moltengl/


OpenGL ES is not OpenGL though.

It's a subset mostly for mobile.


LibreOffice uses OpenGL and OpenCL extensively.


Why use Libre Office on a Mac when Pages, Numbers and Keynote are free (as in beer)? I’m going to go out on a limb and make a baseless argument that the Libre Office install base on the Mac is very low. On iOS it’s non-existent.


There is no way that Pages, Numbers and Keynote can open as wide a range of file formats as LibreOffice can. And there are way more features in LibreOffice.

On iOS, dunno, you probably have a point there.


Having used pages and word, please don't tell people to use Pages for everything.

It doesn't have the features you need when you're creating more complex documents. The last time I used it, it didn't even let you have different sections, which is what allows you to switch page rotation within a document.


Every game developer I know turns off the Metal rendering pipeline and uses the much more stable and refined OpenGL one, unless every tiny bit of performance needs to be squeezed out.

I’ve witnessed plenty of last minute builds be saved by a Unity game dev on a Mac just flipping their renderer settings.


Sounds like this might be the incentive Unity needs to fix their Metal implementation.


I don't think it's much of an incentive. According to Valve's hardware surveys, roughly 3% of Steam's market is macOS. The numbers are similar across the other distribution platforms a Unity game dev will target. It'll be hard to nudge it away from low priority with that share.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...


The other figure to look at is the amount of money spent. If it's similar to the hardware percentage then you're right, if on the other hand, macOS users spend more on games then a rethink is in order.


Do you have a source for that? Because I can't really find a figure. Genuinely curious.


True. This is the only sane reasoning they could've had.


Fortunately you can still use Vulkan on iOS and macOS through MoltenVK[1], an implementation of Vulkan on top of Metal.

[1] https://github.com/KhronosGroup/MoltenVK
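
Since MoltenVK implements the standard Vulkan API, existing Vulkan code needs no special entry point. A minimal sketch (plain Vulkan C, error handling omitted, app name made up) that should run unchanged on top of MoltenVK:

    #include <vulkan/vulkan.h>

    int main(void) {
        VkApplicationInfo app = {0};
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.pApplicationName = "demo";        /* hypothetical name */
        app.apiVersion = VK_API_VERSION_1_0;  /* MoltenVK targets Vulkan 1.0 */

        VkInstanceCreateInfo info = {0};
        info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        info.pApplicationInfo = &app;

        /* From here, device enumeration and the rest of the Vulkan API
           work as on any other platform; Metal sits underneath. */
        VkInstance instance;
        return vkCreateInstance(&info, NULL, &instance) == VK_SUCCESS ? 0 : 1;
    }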


Sure, but the whole point of both Vulkan and Metal is to bring out more performance by being lower-level. I'd assume that at least part of that benefit is lost when you use something like this.


A lot of the point of Vulkan is not having to rewrite your entire graphics stack for every OS you want to target.


But you have to rewrite it for every major hardware vendor, or else you won't get the performance you want.

GL should always work for the simple case, but instead you need to rewrite to avoid bugs in its many layers. And once you have Vulkan/Metal, industry-specific wrappers are better than the impossible-to-debug procedural GL junk.


I'm not sure I agree with the claim, but even if we take a full rewrite at face value, "every major graphics vendor" for desktop applications is NVIDIA and AMD/ATI. On mobiles, you're probably using Unity or similar middleware and therefore not thinking about bare metal (no pun intended)


Actually it is more like every major GPU family, thanks to the extension fest of Vulkan.


That's the point of OpenGL; to my understanding, Vulkan is a modernized, lower-level spiritual successor to OpenGL.


Isn't that kind of the point of OpenGL?


Yes, but OpenGL is so outdated that the people who should use it the most (game and 3D application developers) were avoiding it due to the mismatch between the API and modern hardware.

Vulkan was created to get that same portability, with an API that fits modern hardware.


That isn't remotely true. OpenGL is only outdated on macOS where Apple hasn't updated it for 8 years.

OpenGL 4.6 isn't anything like OpenGL 1.0/2.0 even though you can still _run_ those old OpenGL 1.0/2.0 tutorials.

You can even do most of the cool stuff of Vulkan in OpenGL via AZDO techniques (example: https://developer.nvidia.com/opengl-vulkan )
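
To make "AZDO" concrete: one of its core building blocks is a persistently mapped buffer (core since GL 4.4 via ARB_buffer_storage). A minimal sketch, with the buffer size as an assumed placeholder:

    // Create an immutable buffer that stays mapped for its whole lifetime,
    // so per-frame vertex uploads skip the glMapBuffer/glUnmapBuffer churn.
    GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;
    GLsizeiptr size = 4 * 1024 * 1024;  // placeholder size
    GLuint buf;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ARRAY_BUFFER, buf);
    glBufferStorage(GL_ARRAY_BUFFER, size, NULL, flags);
    void *ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, size, flags);
    // Write into ptr every frame (guarding reuse with glFenceSync), and
    // batch thousands of draws through glMultiDrawElementsIndirect.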


> OpenGL is only outdated on macOS

Also on Windows.

Apparently, if you want to distribute your software to a wide audience, you can only rely on OpenGL 3.0, with a minimal set of extensions. Here’s an example: https://github.com/Const-me/GL3Windows#building-and-running

All the target systems had the latest Windows updates, and they all run Direct3D 11 software just fine (I mostly develop for D3D and I test on them). On some systems it works in 10.1 compatibility mode; MS calls these “feature levels”. Not a big deal in practice, the majority of D3D11 stuff still works OK.


No, you're getting stuck at 3.0 because you're hitting the deprecation strategy. You need to specifically request a post-3.0 context with wglCreateContextAttribsARB, which you're not doing. Thus the system thinks you're an old legacy OpenGL app, and is giving you 3.0, as that was the last version before things were removed.

See: https://www.khronos.org/opengl/wiki/Tutorial:_OpenGL_3.1_The... for a tutorial.

My Pascal-based Nvidia GPU is showing OpenGL 4.6 on Windows 10. Nothing outdated here.
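
For reference, the core of that tutorial is just a couple of calls. A rough sketch (constants from wglext.h; assumes a temporary legacy context is already current so the extension pointer can be fetched, and that hdc is your window's device context):

    // Fetch the entry point; only valid while some GL context is current.
    typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int *);
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");

    // Explicitly ask for a 4.6 core-profile context instead of the legacy default.
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 4,
        WGL_CONTEXT_MINOR_VERSION_ARB, 6,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        0  // zero-terminated attribute list
    };
    HGLRC ctx = wglCreateContextAttribsARB(hdc, NULL, attribs);
    wglMakeCurrent(hdc, ctx);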


> No, you're getting stuck at 3.0 because you're hitting the deprecation strategy.

I think you’re wrong here. Two reasons.

1. If that were the case, I would be stuck with GL 3.0 regardless of the GPU. In reality, I’m only stuck with a GL version < 4.0 on HD2000 and VMware. On my desktop PC (Maxwell at the time I wrote that demo), OpenGL 4.0 worked just fine in that very project. Even on an Intel HD 4000 laptop, OpenGL 4.0 worked just fine with the same code.

2. Please read Intel’s documentation: https://www.intel.com/content/www/us/en/support/articles/000... Specifically, please expand “2nd Generation Intel® Core™ Processors” section. As you see in that table, Intel says HD Graphics 3000/2000 only support OpenGL 3.1, which is exactly what I’m getting from the GLEW library I’m using in that project.

Also, you can see in that article that no Intel GPU supports the GL 4.6 mentioned by the GP. Even the latest-generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they have supported the latest DirectX 12 for several years already.


> 1. If that would be the case, I would be stuck with GL3.0 regardless on the GPU. In reality, I’m only stuck with GL version < 4.0 on HD2000 and VmWare. On my desktop PC (Maxwell at the time I’ve wrote that demo) OpenGL 4.0 worked just fine in that very project. Even on Intel HD 4000 laptop, OpenGL 4.0 worked just fine with the same code.

Behavior depends on whether the device supports the optional 3.2+ compatibility profile.

You're hitting the legacy path, that's well-defined ( https://www.khronos.org/registry/OpenGL/extensions/ARB/WGL_A... ). You need to use the method I mentioned to get real post-3.0 OpenGL.

> Also, you can see in that article that no intel GPU supports GL 4.6 mentioned by GP. Even the latest generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they support the latest DirectX 12 for several years already.

Ok, so? 4.5 isn't really outdated, either. It still supports all the modern good stuff. And, as we've established at this point, it's not Windows stopping you from leveraging the full extent of the hardware you have. By contrast, macOS does stop you from using the hardware you've got to the fullest, as it's stuck on 4.1.


> depends on if the device supports 3.2+ compatibility mode which is optional.

For the systems I have in this house it’s not required, i.e. I’m getting the same OpenGL version that’s advertised by the GPU vendors.

> You need to use the method I mentioned to get real post-3.0 OpenGL.

Either I don’t, or the authors of that GLEW library https://www.opengl.org/sdk/libs/GLEW/ already did that. When running on modern GPUs, the code in my repository already uses real post-3.0 OpenGL just fine, including the shaders.

> ok, so? 4.5 isn't really outdated, either.

Right, but 3.1 (Intel Sandy Bridge) is. And 4.0 is outdated too (Intel Ivy Bridge). Meanwhile, modern Direct3D works fine on these GPUs: D3D11 at feature level 10.1, and native 11.0, respectively.


> Either I don’t, or the authors of that GLEW library https://www.opengl.org/sdk/libs/GLEW/ already did that. When running on modern GPUs, the code in my repository already uses real post-3.0 OpenGL just fine, including the shaders.

Go read the extension I linked, it explains the behavior you're seeing. Also go read the tutorial I linked, it's using GLEW and shows you how to create a context.

You have a bug if your intention is to get a post-3.0 OpenGL context. Whether or not you care is up to you. You may be perfectly happy being in the compatibility bucket. I don't know. But you're not in the explicit 3.1 or later path.

> Right, but 3.1 (Intel Sandy Bridge) is.

Sandy Bridge is a 7-year-old CPU. Of course it's outdated...? And D3D 10.1 is from 2007; it's also hugely outdated. You're not getting anything more modern out of the hardware with D3D than you are with OpenGL here. I don't even know what argument you're trying to make at this point.


Just a nitpick, OpenGL 4.0 is newer than DirectX 11.0 and on par in capabilities.


No. Both ATI and Nvidia drivers include recent OpenGL versions, so OpenGL support problems are limited to hardware that actually isn't capable.

In the old link you offer as an example, the Intel HD3000 and HD4000 are bad, with bad drivers that lie about OpenGL versions (hence the need to downgrade the client), and fortunately obsolete. Current Intel integrated graphics have improved. And VMware is a virtual machine, not hardware; it should be expected to be terrible.


> Intel HD3000 and HD4000 are bad, with bad drivers that lie about OpenGL versions

Technically that’s probably true. However, if you drop support for Intel GPUs, your GL4+ software will no longer run on the huge number of older Windows laptops people are still using. For many kinds of software this is a bad tradeoff. That’s exactly why all modern browsers implement WebGL on top of Direct3D, and why the overwhelming majority of multi-platform games and 3D apps use D3D when running on Windows.

> VMware is a virtual machine, not hardware; it should be expected to be terrible.

It’s only terrible for OpenGL. The virtual GPU driver uses the host GPU to render stuff, and it runs D3D11-based software just fine. I don’t use it for gaming, but it’s nice to be able to use a VM to reproduce and fix bugs in my software caused by an outdated OS, Windows localizations, and other environmental factors.


That's not why they do that at all. They don't need anything recent from OpenGL or Direct3D, which is why they target DX9. And DX9 specifically is targeted because it also works on XP, which D3D10 doesn't.

Intel's D3D GPU drivers have historically been better than their OpenGL ones (which isn't saying much, since their D3D drivers are also trash), but now we're talking about the driver quality of one player, which has nothing to do with the API itself or OpenGL somehow being outdated on Windows.

But ANGLE also targets desktop OpenGL (and vulkan), and as OpenGL 4.3 adoption increases I'd expect increasingly more browsers to use it for WebGL 2.0 since you don't need translation there at all. OpenGL 4.3 provides full compatibility with OpenGL ES 3.0.

You seem to be pretty confused on how OpenGL versions line up with the D3D ones, too. For reference OpenGL 3.1 is roughly equivalent to D3D 10.1. When you're complaining about only getting GL 3.1, you're also complaining about being stuck with D3D 10.1


Yeah, but that software won't run inside Windows containers, like the store, or work with the Visual Layer Engine in W10.


You know that OpenGL is an API standard, and not a piece of software, right?


Yes? What does that have to do with anything that I've said?


It's still way faster than OpenGL in some scenarios


Way faster in most scenarios actually. I'll give it that. (Any scenario I've tested anyway.)

But it's also more difficult for the lay programmer to use as well.


Just use a vulkan implementation of opengl.

OpenGL -> Vulkan -> Metal

I can feel the performance gains already.


Why is Apple investing in Metal at all then?


Same reason Microsoft invests in DirectX. Lock in software developers and consumers alike.


I can kind of understand iOS, but it’s not like there’s a thriving graphical computing market worth locking in on the Mac side. All major titles already use game engines. They’d just be locking out smaller developers who can’t invest in porting all their shaders: it’s not gonna be worth the effort.

Granted, this may also be true for OpenGL.


But iOS is a lot bigger than macOS.

You notice that Apple supported OpenGL while they were making most of their money from desktop and laptop sales; but once iOS became so profitable, they decided to go their own way and start pushing Metal.

Lock in, or at least getting people to do more iOS first development (helped by lower profits on app store sales on Android, Android fragmentation, etc), helps Apple out a lot. You get the first version of apps and games, or more polished versions of apps and games, on iOS this way.


So why lock in macOS at all?


Because the developers who would be working on OpenGL on macOS are working on Metal instead; that's where the value is for Apple.


Maybe.... why would you assume they continue developing for Macs at all? Small studios might not have the resources, and the market is tiny for many apps, e.g. indie games, modeling software, and ML (to be fair, Apple has repeatedly emphasized they don’t care about ML on the desktop by not offering Nvidia cards...).

And again, I don’t see the benefit for apple over supporting cross platform apis to encourage development. It seems like a net loss for everyone but some line in their budget on driver maintenance.


They do make some money on Macs, and Mac software, but not nearly as much as on iOS.

Providing macOS gives developers and designers a platform for iOS. That is really important for them. So Metal being available on macOS is important for that reason. But it's also important in that the Mac platform is still important, just not nearly as important as iOS.

OpenGL doesn't really have much of a future. Everyone is moving towards the next-generation frameworks. It just happens that there was a lot of uncertainty about whether OpenGL could adapt or whether there would be a successor, and during that time Apple decided to invest in developing Metal. It wasn't until a couple of years later that Vulkan was released.

In the meantime, Apple has built up quite a lot of tooling around Metal.

And it's not like it's that difficult to write cross platform apps that target the Mac. If you write against major engines, they will already have support for the different backends. If you are writing your own software, you can still target OpenGL, or you can target Vulkan and use MoltenVK to run it on macOS.

And for the next several years, people writing portable software are going to have to either just target OpenGL, for compatibility with older graphics cards, or maintain at least two backends, OpenGL and Vulkan. Given that lots of games target DirectX first, and consider any of the cross-platform frameworks a porting effort, Apple probably doesn't consider it a big loss to add one more platform that needs to be ported to.

What's going to wind up happening is more software like ANGLE (https://github.com/google/angle), MoltenVK (https://github.com/KhronosGroup/MoltenVK), and gfx-rs (https://github.com/gfx-rs/gfx and https://github.com/gfx-rs/portability, see http://gfx-rs.github.io/2018/04/09/vulkan-portability.html for details) for providing one set of abstractions on top of several different backends.


Wouldn’t they be better off attracting developers instead of locking them in?


They could easily jump ship if that weren't the case. If you're working on Metal, your skills aren't of much use at Microsoft.


It's tailored to their hardware, more modern, and written in Objective-C, which makes it much easier for Mac developers to integrate into their projects, since Objective-C interfaces nicely with Swift and most scripting languages.


Metal's shading language is C++-based, not Objective-C.


Ditto for OpenGL ES (http://moltengl.com/)


I've been using a cross-platform GUI framework/engine to do app development on all the platforms: Linux, macOS, Windows, iOS and Android - and it has been a joy to deploy one app on all of these systems.

One of the reasons this has been so feasible has been the fact that the engine (MOAI) uses GL ES to manage the framebuffer itself - giving the same look and feel on all platforms the host runs. This has been, honestly, revolutionary in terms of building a single app that runs everywhere.

This now becomes more complicated because the engine has to be modified for Apple platforms to use Metal, and represents another fork/splinter in the unity of the host itself.

I wonder if their decision to use a non-standard graphics API is due to them wanting to make this style of development a lot more difficult in the future - i.e. are Apple passively antagonizing the cross-platform framework builders in order to establish an outlier condition for their platforms? The cynic in me says yes, of course this is what they are doing...


>All they'll accomplish is further killing off the already-small presence they have in the gaming space.

In the AAA game space you mean. Else, in the casual gaming space, iOS is perhaps the most popular platform -- and the new integration effort means all those games will be able to run on macOS as well soon.


And those games are horrible. Almost all of them are built around exploiting weaknesses in the human psyche to convince people to spend money and become addicted. The biggest difference between those games and gambling is that you don't carry a slot machine in your pocket. For the most part the only exceptions to that are the games that were ported from desktop.


>And those games are horrible.

They work fine for me -- both as implementation and as gameplay.

>Almost all of them are built around exploiting weaknesses in the human psyche to convince people to spend money and become addicted.

I think you confused casual gaming with Zynga or something. I was referring to smaller, non-AAA megatitles. Could be anything from a platform game, to Angry Birds, Monument Valley, Threes, Letterpress, racing games, RPGs and so on...


I'm not saying there aren't decent games on iOS. You can find gems like Monument Valley, Florence, or, as I mentioned, the games ported from other platforms like Limbo, Terraria, and so on. But take a look at the top charts on the iOS app store and compare that to the top games on Steam. With few exceptions, the games on iOS are riddled with ads, microtransactions, and are designed to be as addictive as possible.

The point is that the kind of games that thrive on the app store tend to be exploitative and low quality. Desktop gaming isn't immune from that, but it's a dramatically better platform.


Not if Microsoft has any say. I can't count the number of Windows Updates that re-installed the previously uninstalled Candy Crush Saga, Bubble Witch 3 Saga and March of Empires (among other titles).


I played the fuck out of Angry Birds - on Android. How exactly does forcing developers to adopt a platform specific API help anyone? That was a rhetorical question BTW, don't even try to answer it. Apple are being arrogant as fuck with this.


>How exactly does forcing developers to adopt a platform specific API help anyone?

Well, platform specific APIs aren't lowest-common-denominator affairs, and get support for native platform capabilities faster (plus can be more optimized).


I understand their benefits. But why refuse to support standards as well? I don't think Apple is short on resources.


I think because then you don't give developers the extra motivation to use your platform APIs.


That just means the benefits don't outweigh the down side - lack of cross platform API, new learning curve to climb.


You're talking about a subset. A lot of casual games on iOS are very good: Cut the Rope, Angry Birds, Bad Piggies, Simple Rockets. Civilization for iPad was very good. I can't remember all the stuff I've played, but a lot of games are not the Candy Crush kind.

Also, a lot went wrong when Apple opened up to ads and in-game purchases.


PUBG Mobile is one of the best games I have played in a long time. And it doesn't cost me a penny. Nor do I have to pay to win. (Actually, I may have to upgrade my phone to play better.)

But not every game is gambling. Fortnite seems to be doing great, and that shouldn't be a pay-to-win game.


PUBG was ported from desktop. It originally started out as an ARMA 2 mod, was turned into PUBG, and only much later ported over to mobile. It's a perfect example of the kind of game that can come out of the desktop gaming community. You don't get games like PUBG, Minecraft, Starcraft, Terraria, Civ, Kerbal, and so on without desktop gaming.

The games that grow out of the app store ecosystem are games like Candy Crush, Clash of Clans, Clash Royale, etc. I'm not saying good games don't exist on the platform, I'm saying the platform is conducive to low quality games. Almost all of the great games on the app store did not grow out of the platform.


Woah now, Clash Royale is a good game that is skill driven.

People take level 1 accounts into legendary arena which shows that skill dominates money.

How on earth the devs at Supercell tricked management into making a skill rather than money based game is genuinely a mystery to me.


With that I certainly agree. But I do think the future is games built on top of game engines. Unreal seems to have a massive improvement changelog every 6 months, and if Unity didn't exist I'd doubt anyone could compete within a reasonable budget.

Game engine choices will become what OpenGL vs. DirectX was in the old days. No one sane would write their own game from scratch.


“99% of everything is shit”

There are plenty of great games. It helps to source them from a gaming community you trust.


>> built around exploiting weaknesses in the human psyche to convince people to spend money and become addicted

I guess that explains why they are on Apple devices in the first place.


I don't expect there will ever be an AAA game presence on macOS at this point, given so few of their machines offer dedicated GPUs anymore.

And even in cases where they are available, for example Macbook Pros, the cost difference involved in stepping up from an integrated GPU to an entry-level dedicated card is greater than the cost of buying an Xbox or PlayStation.


None of which support either Vulkan or OpenGL.


Xboxes and PlayStations have been getting more game ports than macOS tho, doesn't take a genius to figure out why.


Consoles have always had more game ports than other computer systems.

Having a fixed set of hardware to target is a dream.


Additionally, I don't think indie developers have loads of time on their hands to port their niche games over to a new technology. I can see Unity supporting Metal, but smaller platforms (jMonkeyEngine) will have a slower adoption rate, and in that time hopefully open-source middleware will come out to handle legacy APIs.


Indie engines like Ogre3D just need to add a new backend to their already backend-agnostic architecture.


Sure, but this makes it a right pain in the arse to bring over apps from other platforms.


Well, at least in the Rust ecosystem there is https://github.com/gfx-rs/gfx, which provides backends for Vulkan, DirectX 12, Metal and OpenGL. I'm not sure it's super relevant outside of the Rust ecosystem right now, but it's worth spreading the word about such solutions.


How is the cross-API shader support?


Surprisingly good. My tests have run properly across 4 OSes without much change. I can't verify for Apple though.


> across 4 os's

Windows, GNU/Linux, Android; and what other OS?


https://github.com/gfx-rs/portability may be relevant out side of Rust, since it's a linkable C library.


> All they'll accomplish is further killing off the already-small presence they have in the gaming space.

That of course depends on the definition of "gaming space".

Apple was never a player in the classical desktop gaming space. They simply don't care about it. Hence why they treated OpenGL on macOS the way they did.

But: Apple is arguably the biggest player in the mobile gaming space. That's what they care about. So instead of spending a large amount of money to attract a low number of AAA desktop titles to their OS they just tap into the vast (game-)developer base that they already have in iOS and make it easy for them to deploy and sell their games on macOS too [1].

The move to deprecate OpenGL and OpenCL in favor of Metal makes total sense in that regard.

[1] https://techcrunch.com/2018/06/04/apple-is-bringing-the-best...


And where is Vulkan standard? Windows games use DX. iOS is Metal. Android uses OpenGL ES, etc. Gaming consoles have proprietary APIs.

I would very much prefer games to use Metal on macOS (StarCraft 2 is much smoother on the same hardware).


In Khronos's dreams. Even on Android, where it was adopted in version 7, it is not widespread enough to appear on the developer dashboard.

https://developer.android.com/about/dashboards/

And good luck finding a Vulkan version beyond 1.0, with extensions that work properly across all mobile devices.

http://vulkan.gpuinfo.org/vulkansupport.php#android_devices

You get a spectrum all the way from 1.0.0 up to 1.0.66, with Vulkan 1.1 promised for Android P devices only.


Nintendo Switch


I believe, as of now, there are more AAA games running Metal than there are running Vulkan. Pretty much every new macOS game release runs Metal now, while a game running Vulkan on PC is considered a rarity.


Because on PC, there are working alternatives.


To me this signals that the Second Moribundity of Apple is nigh.

The First Moribundity was the period in the 1990s when Apple was coasting on the DTP and Photoshop advantage they had over Windows, reducing the features of their desktops and not innovating. Spindler, a smart but ineffective leader, was replaced by Gil Amelio, a star in his prior field but unable to get Apple headed in the right direction. It took Jobs' return to right the ship that time.

The emphasis on thin but less functional and less serviceable laptops, the dropping of OpenGL, the cruft piling on top of OS X to no new net benefit for users, and their coasting on the desktop market all point to this IMHO.


Everyone is coasting in the desktop / laptop market. Refresh cycles are like 5-10 years on PCs now, and it’s not clear tablets even have a refresh cycle. You can’t get growth on PC hardware anymore, not even if you’re Apple. So it goes into maintenance mode. Nobody even really sells desktop PCs anymore except custom gaming rigs and low-power kiosk boxes.

Apple is all about mobile phones, the Apple Watch and AirPods right now. The MacBook is barely a blip on their product radar.


"Nobody even really sells desktop PCs anymore except custom gaming rigs and low-power kiosk boxes." I believe you should embrace the novelty concept called 'work' sometime. Then you may see some 'niche' application of PCs. Mobile devices can, yes, be sold in bigger numbers, may produce higher profit, many trivial applications like browsing for news and sharing our newest and greatest experiences by tweets and photos do not require, luckily, a desktop anymore and so these paramount usages could been shifted to the only important platform of mobile devices, yet the second grade activities of design, manufacturing, academia and so still rely on the archaic concept of desktop computers and they keep alive this dying artefact. Even traitor mobile developers (all of them, the fools!) dare to use desktop computers still instead of solely relying on mobile phones. But not long, soon the last aircraft engineer or corporate accountant will sell their desks and throw away the last of the keyboards moving to the only necessary platform of mobiles so the PC can go extinct finally!


I think the GP’s point was that laptop sales have been outgrowing desktop sales at a high pace for a while now. And that in an overall stagnating market.


More that both laptops and desktops are dinosaurs compared to mobile so Apple isn’t pouring R&D into them. And it’s a smart move.


> Nobody even really sells desktop PCs anymore except custom gaming rigs and low-power kiosk boxes.

My personal impression is that while pre-built PCs are mostly sold to businesses, there has been a strong growing trend over the last few years to build your own PC from parts. I really do observe that if mobility is not an issue, people now tend to move away from laptops towards self-built PCs (or often rather: they let a good friend build one). I don't want to go into the details of what advantages these have over laptops, but will just mention customizability with respect to requirements (e.g. very silent, very high-power, ...) and repairability.

A small (but only small) contributing factor is that laptops do not have sufficient power for VR, so you really need a stationary PC for VR.

The strongest "dampening factors" with respect to this trend are the growing prices of GPUs, because lots of cryptocurrency miners (in recent years in particular Ethereum miners) hoard them (though my impression is that this will somewhat ebb away as soon as attractive ASICs for Ethereum get released), and lately the growing prices of RAM. On the other hand, the release of Ryzen made building PCs with either very high performance or a very good cost-benefit ratio feasible.


Is building your own PC really a recent trend?

A lot of early PCs were kits people built, and I remember PC gaming in the '90s and '00s was also heavily focused on custom-built machines.


Probably the last couple decades saw custom PC building decline in market share as prebuilt solutions became mainstream, but perhaps it's seeing a resurgence now that most consumers have moved from focusing on computers to focusing on smartphones as their personal tech hub.


Now that prebuilt PCs tend to be cheap and limited, anyone who wants something better needs to build the machine from parts.


Prebuilt PCs were cheap in the '90s and '00s as well. Premium brands like Alienware still exist for gamers.


But in the 90s laptops were insanely expensive. :-)


Not a recent trend but online communities act as a catalyst.


Custom built PCs are a niche. The vast majority of people just want to buy a box that works.


> Custom built PCs are a niche. The vast majority of people just want to buy a box that works.

That is why I wrote

"(or often rather: let a good friend build one)." :-)

Seriously: in my opinion (but others might disagree) an advantage of a self-built PC over one that some company produces is that you know exactly what components are inside (in particular, you can buy components you trust), which makes you far less dependent on driver support from the manufacturer (i.e. you can find drivers on the internet yourself if necessary). I have often had bad experiences with driver support from manufacturers (except for a few well-respected names).

Also lots of cheap PC manufacturers install lots of crapware by default, while your self-built one is a very clean install.

In summary a self-built PC often "just works" far better than a pre-built one.


You're forgetting the most important reason to DIY: money. Where manufacturers are happy to charge you 2x the price to double your RAM or put in a non-shitty SSD, DIYing can shave off a few hundred Euro.

Just as an example, over at apple.com, the new iMac comes with 32GB of DDR4 2666MHz ECC RAM. To add 32GB (64GB total) they are charging 960€, or 2,880€ to add 96GB (128GB total).

Looking on Amazon or geizhals.eu (a price comparison site), 16GB ECC DDR4 2666MHz sticks cost 200-220€ x 2 = ~450€ for 32GB. That means you can save 500€ on RAM alone.

Same applies to SSD storage. And graphics cards. And CPU. And basically anything you could want to upgrade.


> Just as an example, over at apple.com, the new iMac comes with 32GB of DDR4 2666MHz ECC RAM. To add 32GB (64GB total) they are charging 960€, or 2,880€ to add 96GB (128GB total).

Apple specifically is well known for its expensive pricing on better optional components (which you often have no choice but to pay, since on many models you cannot simply swap the component (RAM, SSD, etc.) for another one yourself).


Good point. I should have used e.g. Dell or HP as an example; they do the same, albeit AFAIK with slightly less absurd price-hiking.


PC gaming is growing, and that's mostly where the trend for self-built PCs comes from, along with most of the growth in the PC market. It is still dwarfed by laptop sales, though. As an anecdote, today I know hardly anyone who owns a desktop PC, while 10-15 years ago almost everyone had one.


The trend around me is to see the shops that used to sell PC parts closing down, while consumer stores fill 80% of their PC floor space with laptops, tablets and mobile devices.


> The trend around me is to see the shops that used to sell PC parts closing down

Couldn't this be simply explained by the hypothesis that people now buy their PC parts mostly on the internet instead of at a brick-and-mortar shop?


It could, but I am willing to bet that is even more niche than going to a physical shop.

The pool of experts capable of understanding whether they are ordering hardware that will actually work together, instead of blowing up in some form when assembled, is quite tiny.


I'm sure it's rarer than I'd like to think, but if I managed to figure it out as a teen in the mid 90's without much in the way of internet access or budget, it's still probably easier today.

Anyone with a modern internet connection and a bit more patience than money (or at least a willingness to learn) can hop on Reddit or PC Part Picker and get a pretty good idea of what is out there and works together.

Compared to the days of making sure you had the right number and type of ISA, PCI, and AGP slots, assembling a PC from parts today is a breeze. Shopping online keeps costs low and places like Microcenter are great for buying in person.

I only haven't built one in a while because my current 5-year-old workhorse media/editing/gaming/everything PC shows no sign of needing a full upgrade any time soon. Sure I bought a new GPU after a few years when I got bitten by the experimental VR bug but other than that, it was an afternoon buying parts and snapping them together, an evening installing software, and 5+ years of "just working".


The “enthusiast” segment of the market is small. Even then you don’t have nearly the choice in vendors you used to, so people mix and match parts from the same 5 or 6 vendors who’ve been there forever.

Cryptocurrency mining has kept the enthusiast hardware market afloat, I have a feeling...


GPU prices seem to be normalizing the past week, at least here in Europe using historical price data from Geizhals[0]. If things continue we will be back to "normal" 2017 prices very shortly.

[0]: www.geizhals.eu

Examples:

1070: https://geizhals.eu/?phist=1456548

1070 TI: https://geizhals.eu/?phist=1717563

1080: https://geizhals.eu/?phist=1449277

1080 TI: https://geizhals.eu/?phist=1587606


I disagree, I think Apple is still rather committed to the Mac even if it is no longer their #1 priority. Even if it's a small part of their profits, it's a big source of the strength of their brand (visually speaking, a Mac stands out against a PC far more than an iPhone stands out against any other smartphone), and that's something that's crucial for all of their products. Bad press about the Mac damages the brand, and if they don't continue to put out a good product (and correct their recent mistakes) they are going to keep getting bad press. I would argue that the Watch and Air Pods are more in the category of the Apple TV than they are the iPhone, because they're just a means of pushing up the average value of each iPhone customer. The Mac both does this and serves as a "halo" which is crucial even if it's less profitable. Honda certainly isn't making much money on NSX sales but that doesn't mean it's not still vital for them to produce it just for the sake of the brand (although I admit that's a bit of an extreme example).


The big signal for me that Macs are becoming second class citizens is if they ever start supporting Xcode on something that isn’t MacOS. That is the only thing keeping app developers on their platform, and if/when that changes, you know the final nail is in.


I would be surprised if Apple doesn’t throw in with Microsoft soon around mobile development. There’s a lot of symbiosis there — Apple and Microsoft don’t really compete in very many markets against each other anymore.

Visual Studio is actually an awesome development platform (I’m including VSTS in this). There’s not much secret sauce in Xcode to threaten Apple’s App Store revenue — it’s just a bunch of signing keys that Apple controls and manages.

I do think Apple will make a strategic decision to diversify from core Mac OS soon. The ever-shrinking space allocated to traditional computers at the Apple Store has convinced me of that. They have more floor space allocated to the Apple Watch than they do the Mac these days.


Sure; but users stopped buying computer hardware on specs last decade (or at least the mass market Apple is chasing did). Macs aren’t even any more expensive than top-end PC laptops. They’re just a brand choice.


They definitely aren’t usually buying them because of typical CPU/RAM specs or whatever, but I think things like battery life, trackpad, keyboard, display, and general build quality are all still very relevant aspects of hardware that people pay attention to.

The slow upgrade cycle is definitely hurting things and generally slowing everything down, but it’s not as terrible as it’s often made out to be. It’s been a few years since the 5K iMac launched and it’s still a great desktop. The Mac Pro was botched, but at least they’re fixing it. We’ve only had 2 years of weak MacBook Pro releases, and it’s very possible that the next release later this year will address both major issues: a bad keyboard and no 32GB RAM option. The keyboard will matter more for most people, but the added RAM plus 2 more CPU cores and Vega graphics will be a big win for true pro users.

I don’t think they’ve necessarily “coasted” on Macs just because the last couple years have been weak. It’s more accurate IMO to say they’ve just fucked up a couple times in the last 5 years — specifically with the Mac Pro and the butterfly keyboard devices.


Oh I agree; Apple’s brand carries a ton of heft in the laptop market because they generally get the “product” features (keyboards, battery life, screen quality, etc) right. It’s only so noticeable because their track record has been so good — we would still have plastic laptops and trackpads with mushy membrane keyboards without Apple’s industrial design incorporating metal and glass.

And honestly, their desktop line looks very similar to many PC companies. They make the Mac Mini (aka the Apple NUC), the all-in-one iMac, and the Mac Pro (ok, the Mac Pro sucks and there’s no excuse for it).

Anyone doing serious workstation tasks these days is likely using a cloud-based solution. There are a few specialized exceptions (ML development stands out to me), but none big enough to build a product around. Especially when doubling the entire PCs and Laptops category wouldn’t move the needle for Apple at all.


Approximate sales of desktop PCs worldwide (all kinds) are 200 million per year... I can't find 5-year refresh cycles on any HP, Dell, etc. desktops or laptops.

They tend to refresh in line with corporate cycles, i.e. 18 months to 3 years, from what I see.


Contrast with ~1.5 billion (!) smartphone sales per year.


Sounds like a 'we sell more shoes than airplanes so we don't need airplanes no more' kind of argument.


Such figures won't last forever once smartphones are sufficiently mature and there is no performance difference from one model to the next.


The main reason to buy a new phone these days surely is "the old one broke" - it is with us all the time, after all. The planned obsolescence - irreplaceable battery, glued screen, glass back, etc. - means phones break all the time.

PCs stand still, and even laptops tend to be mostly stationary. No wonder my work laptop is now 6 years old and I feel no urge to request an upgrade.


> The planned obsolescence - irreplaceable battery, glued screen, glass back, etc. - means phones break all the time.

Don't forget planned insecurity two years after release through unpatched software vulnerabilities.


With shops still selling un-upgradable Android devices with 5.1 on them, I seriously doubt non-technical users actually care about it.

https://tinyurl.com/yakmsz6b (sorry, in German)


People drop their smartphones more often than their desktops.

But I expect cheaper devices to take more market share, since the "high-end" won't give you many benefits over them. My 120 EUR Xiaomi already offers amazing price-performance compared to most other Androids.


Which is pretty much where we are now. At this point smartphone sales are mostly based on making performance more affordable, and lots of marketing.


The difference is that in the 90s they were close to bankruptcy whereas today they are swimming in cash. That's not going to help them much in the long run if they have lost their vision, but they are definitely too large nowadays to go away within a few years.

You may call me Captain Obvious now. :)


I think it is a good point, and they could probably survive about 2.5 years of losses without too much pain - but Apple's margins back in the 1990s were pretty good also: what hurt them was that sales dropped off a cliff, so their good margins weren't enough to save them...

There could be a number of factors that would hurt them, like margin compression, reaching market saturation, etc. All the ailments that Business Schools will warn you about...

However, they (IMHO) have become arrogant and out of touch with their users. VISION is their problem.

They need the MacBooks and desktop Macs to be seen as very desirable, in order to project the "creative people buy Apple" halo over the rest of their products.


This time around, they have way more cash and maintain a larger share of much larger markets, and keep tighter control of spending and resource use. Their decline, if you are right about the writing being on the wall, will be an extremely long and slow one when it begins (we're talking about the first derivative here, right?). Too slow, probably, to reach "moribundity" rather than an equilibrium point of mediocrity.


well, all we need is the third coming of steven christ

come on steve, reinvent death please


This is a problem that is entirely of Apple's own doing.

Microsoft could not care less about OpenGL on Windows. However, it works just fine.

You know why? As soon as you install your video card drivers, your OpenGL implementation is no longer from Microsoft. It comes from AMD, NVidia or Intel, with all needed optimizations for their hardware.

Apple insisted on not allowing this and on doing the OpenGL implementation themselves (which was always crappy and outdated).

Had they allowed the GPU vendors to provide their own implementations, this would have been a non-issue.
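
For what it's worth, you can check whose implementation you are actually running with a few lines of code. A minimal sketch (GLFW is used here purely as a convenient, assumed way to create a context; any context-creation method works):

    #include <stdio.h>
    #include <GLFW/glfw3.h>

    int main(void)
    {
        if (!glfwInit()) return 1;
        GLFWwindow *win = glfwCreateWindow(64, 64, "probe", NULL, NULL);
        if (!win) { glfwTerminate(); return 1; }
        glfwMakeContextCurrent(win);

        /* With vendor drivers installed, this reports NVIDIA/AMD/Intel; on a
           bare Windows install you get Microsoft's "GDI Generic" fallback. */
        printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));

        glfwTerminate();
        return 0;
    }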


OpenGL is very much a second class citizen on Windows. Mass-market OpenGL apps like browsers currently use ANGLE to emulate OpenGL on top of D3D. Native OpenGL is used in professional apps that can make demands on GPU and driver setups.

(Many toolkits, like Qt and Cocos2d, also use ANGLE on Windows for OpenGL functionality)


Pro 3D apps from Autodesk tend to use DirectX. Certainly 3dsmax and Revit use DirectX over OpenGL and have done for a while.

The same is true for plenty of other “Pro” apps on Windows.

https://knowledge.autodesk.com/support/3ds-max/learn-explore...


This is to a large extent because browsers can't make the same assumptions about reasonable graphics drivers being installed as games can.


What makes you think that Apple refuses to allow GPU vendors to provide an OpenGL implementation?

The real question is why would a GPU vendor go through the expense of creating and supporting such an implementation when Apple doesn't even make a computer with slots that you can install their video cards into?

If producing an OpenGL implementation doesn't provide a competitive advantage for selling their products, why would they bother?


> What makes you think that Apple refuses to allow GPU vendors to provide an OpenGL implementation?

Because it is a fact that Apple develops their own drivers. Also, when did you last download a driver update from NVidia for your Mac?

https://arstechnica.com/gadgets/2018/02/vulkan-is-coming-to-...

The article is about Vulkan, but briefly mentions Apple's own outdated OpenGL stack.

http://renderingpipeline.com/2012/04/sad-state-of-opengl-on-...

From 2012 - which means it is still accurate, given how out of date the drivers are.

There are more references, you can look it up.

> The real question is why would a GPU vendor go through the expense of creating and supporting such an implementation when Apple doesn't even make a computer with slots that you can install their video cards into?

They still have GPUs, which can be Intel, AMD or NVidia depending on year and model. Just because they are soldered on, doesn't mean they don't need drivers.

EDIT: Some more research seems to indicate that there are drivers developed by NVidia for the NVidia Quadro GPU line.


> Also, when did you last download a driver update from NVidia for your Mac?

Last week.

Nvidia releases drivers for cards that the drivers which ship with macOS don't support. I would also guess that the Nvidia drivers which ship with macOS are written by Nvidia under some agreement with Apple; the same is likely true of AMD and Intel.


Last week. Nvidia releases drivers for cards that the drivers which ship with macOS don't support.

That's a tiny part of Apple's lineup though.


Yes, but you need to switch to an older version of Xcode/developer tools if you want to program CUDA on a Mac. Specifically, I have to switch back to last December's release when I want to do any CUDA development on my 2015 MacBook (I don't think there are any later Macs that even support nVidia).


That is not a macOS specific problem. Because CUDA interacts with the compiler, you can have exactly the same problems on Linux with gcc.

(Ironically, using OpenCL avoids this problem)


Yes, once upon a time both Microsoft and Apple provided an implementation of OpenGL with their OS.

When Microsoft abandoned OpenGL for DirectX, GPU vendors produced their own OpenGL implementations because doing so provided a competitive advantage that allowed them to sell more product.

The question is, why would those GPU vendors do the same thing now that Apple is following the same path?

Apple doesn't even produce a computer with slots you can install their products into.

>EDIT: Some more research seems to indicate that there are drivers developed by NVidia for the NVidia Quadro GPU line.

Keep doing research, because NVidia provides downloadable Pascal drivers even though the last time Apple produced a computer with a PCI slot was the Cheese Grater Mac Pro, which came out over a decade ago.

https://9to5mac.com/2017/09/27/nvidia-pascal-drivers-high-si...

Making sure nothing diminishes CUDA is very much in NVidia's competitive interest.


NVIDIA has Mac drivers for their whole consumer line. For example https://images.nvidia.com/mac/pkg/387/WebDriver-387.10.10.10...


Good to know. It seems that they also provide updated CUDA drivers.

It doesn't seem to change anything on the OpenGL stack, unfortunately.


It just goes to show that NVidia thinks supporting CUDA everywhere is very much in their competitive interest, while creating and supporting an OpenGL implementation simply is not.


Yes. But name a Mac from after 2015 that you can even use an nVidia card in.


They're doubling down on eGPU support, so there's that. But not as ideal as having something actually inside your machine.


Any Mac that can use an eGPU.


Only on desktop.

In case you missed the news, only DirectX is supported in UWP and store apps.

To the point that Microsoft has their own port of ANGLE.

https://docs.microsoft.com/en-us/windows/uwp/gaming/compare-...

https://github.com/Microsoft/angle

So unless they change their mind, say goodbye to ICD drivers on Windows as well.


Is UWP/Store actually taking off? I was under the impression that it was another flop.

I run Windows at home, but wouldn't if it went Store-only. For me, an open platform is the only thing Windows had going for it.


Yes, little by little.

Quite a few apps like Adobe XD are store only.

The next Office for Windows 10 is only available via the store.

Microsoft has taken the other approach: if apps don't come to the store, the store comes to the apps.

So, thanks to the outcome of Project Centennial, they are now merging the UWP and Win32 worlds into Windows containers and making the store Win32-aware as well.

https://bramwolfs.com/2018/03/13/msix-the-platform-for-all-w...

Deep down session from BUILD here, https://channel9.msdn.com/Events/Build/2018/BRK2432


As a long-time professional game engine programmer, it is hard for me to see consternation over things like this, and avoid judging it as mainly ignorance. The amount of code in an engine that needs to touch the graphics API is tiny. A handful of methods for device init, filling buffers, uploading shaders, setting state, and saying "draw!" All of the graphics API code can easily fit in one medium-sized source file. As a proportion of the whole engine, it's very small. As a proportion of the whole game or app, it's negligible. It's also boilerplate. There are only so many ways to copy a stream of structured data into a buffer.
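
As a concrete illustration, here is a hypothetical sketch of that kind of backend seam in C; every name below is invented for the example, not taken from any real engine:

    /* One struct of function pointers per graphics API, each filled in by a
       single source file (e.g. gfx_gl.c, gfx_metal.m, gfx_d3d11.c).
       Nothing else in the engine touches the graphics API directly. */
    #include <stddef.h>
    #include <stdint.h>

    typedef struct RenderState RenderState;  /* blend/depth/cull settings */

    typedef struct GfxBackend {
        int      (*init)(void *native_window);
        uint32_t (*create_buffer)(const void *data, size_t bytes);
        uint32_t (*create_texture)(int w, int h, const void *rgba_pixels);
        uint32_t (*create_shader)(const char *vs_src, const char *fs_src);
        void     (*set_state)(const RenderState *state);
        void     (*draw)(uint32_t shader, uint32_t vertex_buf, int first, int count);
        void     (*present)(void);
    } GfxBackend;

Swap in a different struct of pointers and the rest of the engine never notices which API is underneath.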

Legacy software, blah, blah, blah. No legacy software runs forever, and least of all on Apple platforms. Who cares.


Professional game engines are not the only application for OpenGL, though. Many people would like to build cross-platform software without using major abstraction layers such as game engines. This could be research software, prototypes, small tools in general; I think there is a long list. For these people, life might get harder.


Gaming is all about legacy software, especially single player games. As a gamer I'm very happy that almost all older games still work on Windows today (either directly or using an emulator like DOSBox).

(Of course that does not mean that the OS needs built in OpenGL support. If you can convince an old game to use some kind of OpenGL-Metal compatibility wrapper without needing access to the game's source code or support from the original developer, that's fine with me as well.)


I'm one of the people happy about this move, because my competition uses OpenGL on Mac and I just use software rendering. It was easy to see this deprecation coming...


Quite true.

People in FOSS-friendly circles really don't get games development culture, IP management, or the contracting business related to ports.


OpenGL isn't pretty, but it's at least cross-platform. And my impression was that OpenGL support is mostly handled by the GPU manufacturers, so I'm not sure how much Apple gains here by deprecating OpenGL.

Requiring developers to use an API locked to a particular platform feels pretty hostile to me. Doesn't matter if that API isn't perfect, or even far from it.


Although I agree it's a terrible decision for Apple only to have Apple-specific graphics APIs, please note that:

* Being deprecated does not mean that things will suddenly stop working. It will take a few more releases of macOS before this can be removed.

* Next to MoltenVK there is MoltenGL, which is an implementation of OpenGL ES 2.0 that runs on (edit) Metal [1]. That indicates it's at least feasible to wrap OpenGL applications in the future if necessary.

Furthermore, Apple will drop support for all Macs that don't support Vulkan in this release of macOS [2]. Ouch, what a waste.

[1]: https://moltengl.com/moltengl/

[2]: https://9to5mac.com/2018/06/04/macos-10-14-mojave-supported-... (anything from before 2012 does not support Vulkan)


Did you mean Metal instead of Vulkan? :P


It seems like a clear signal that Apple is preparing to develop its own GPUs. They're already doing this on the iPhones.


Nah. The GPU on Intel chips is free, and the eGPU thing, to me, is official notification that Apple thinks GPUs should be on the outside. I bet this generation of MacBook Pros is the last to have discrete graphics...


You don't get free Intel GPUs on your ARM laptop chips...


This is the most obvious explanation I have read on HN... After I read it!


> And my impression was that OpenGL support is mostly handled by the GPU manufacturers

You would be correct, but not on OSX.


I mean, it wouldn't be that big of a problem if they adopted Vulkan, but they are pushing Metal :-/


>Requiring developers to use an API locked to a particular platform feels pretty hostile to me.

So like DirectX?


I sure hate it when Microsoft does it, but at least they have market share. Who wants to support Metal just to target the Mac? And last I checked I have the choice of OpenGL and Vulkan on Windows because these days MS doesn't control the hardware stack from top to bottom on their software platform.


>I sure hate it when Microsoft does it, but at least they have market share. Who wants to support Metal just to target the Mac?

Plenty of big 3D/CAD/etc. players? In lots of creative areas the Mac still dominates (despite stories about people moving to Windows, hardly any creatives are actually going anywhere).

Besides, with Metal they'll target iOS as well, and that's a huge platform, and where most of the profits are for mobile.


CAD on Mac is pretty much non-existent, as is any professional 3D market - the market share isn't there, the hardware support is terrible, so few major players bother with supporting Macs. All this stuff is either Windows (CAD) or Linux (3D simulation, visualization) these days.

And with this deprecation Mac is pretty much dead as a platform for professional 3D.


Creative Suite has run better on PC, and at a better price-performance ratio, for almost a decade.

Graphic Designers still like Macs for the most part I guess -- and I still see them in video production a lot, but that's starting to change pretty quickly.


> I still see them in video production a lot, but that's starting to change pretty quickly.

I think Final Cut "Pro" X was the inflection point - the change is ongoing.


Visualisation is largely done on Windows, mainly with 3dsmax. Has been for a while. Linux is used more in movie VFX.


Your view of visualization is a limited world: Windows and Max?


OpenGL is still an option on Windows, it's not deprecated.


That depends on who you ask. OpenGL is in the deprecated API section on MSDN[1]. Because of the ICD model, Microsoft can't prevent GPU vendors from adding OpenGL features, but they don't bother integrating it with modern Windows APIs. You can't create an OpenGL context on a DirectComposition surface or in a UWP app. It integrates poorly with the compositor. You can't get composition feedback, and most drivers will flicker or show artifacts when windows are resized. OpenGL apps don't get windowed-fullscreen optimizations and you can't control when they enter exclusive fullscreen mode. I don't think you can use windowed-stereoscopic or windowed-HDR either. All these issues push developers away from OpenGL and towards DirectX, which is what Microsoft wants.

[1]: https://msdn.microsoft.com/en-us/library/windows/desktop/ff8...


Deprecated means very different things when coming from Microsoft and Apple.


It’s not deprecated because it’s not even there to begin with — Windows 10 doesn’t ship OpenGL by default; GPU vendors provide their own implementations.

Which AFAIK they’re free to do on MacOS as well; they just don’t seem to bother, since Apple was doing that work for them.


> Which AFAIK they’re free to do on MacOS as well; they just don’t seem to bother, since Apple was doing that work for them

As far as I am aware Apple develops the GPU drivers for OS X (though, I think, based on code that the GPU vendor provides).


OpenGL is not a driver though, it's a graphics API


At least on Windows, the OpenGL implementation is part of the graphics driver. Why? Because by default Windows has only rudimentary OpenGL support (at least up to Vista, I think; I am not sure about Windows 7 and 8.1) or none at all (Windows 10) - the real implementation is what the GPU vendor provides as part of its graphics driver.


> At least on Windows, the OpenGL implementation is part of the graphics driver.

It's distributed with the Graphics Driver, but most of it exists in a user space library, not in the driver proper.
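
You can see the split directly: opengl32.dll exports only the ancient GL 1.1 entry points, and everything newer is resolved at runtime out of the vendor's user-space ICD. A sketch (assuming a GL context is already current):

    #include <windows.h>
    #include <GL/gl.h>

    typedef GLuint (APIENTRY *PFNGLCREATESHADER)(GLenum type);

    static PFNGLCREATESHADER pglCreateShader;

    void load_modern_gl(void)
    {
        /* The returned pointer lives in the vendor's ICD, not in opengl32.dll.
           (For GL 1.1 functions this returns NULL; those come from
           opengl32.dll itself.) */
        pglCreateShader = (PFNGLCREATESHADER)wglGetProcAddress("glCreateShader");
    }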


Which AFAIK they’re free to do on MacOS as well; they just don’t seem to bother, since Apple was doing that work for them

I'm not sure. NVIDIA provides updates for CUDA and an extremely limited number of updates for their graphics stack (AFAIK none at all for integrated graphics, for example).


Not on UWP or store apps.


Exactly like DirectX. Great API to use if you don't give a shit about portability. If you do, it's useless.


No, not like DirectX, because DirectX is optional.


OpenGL is pretty. Much prettier than these Metal and Vulkan abominations.

The difference is that OpenGL is designed to be easy for humans. glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd(); you can't beat that. The issue is that it's hard for the driver to optimize.

That's where Metal and Vulkan come into play. These are low-level APIs, sacrificing user friendliness for greater control over the hardware. They are designed for 3D engines, not for application developers.
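
For reference, the complete fixed-function triangle really is about this short (a sketch, assuming a context and viewport are already set up):

    void draw_triangle(void)
    {
        glBegin(GL_TRIANGLES);
            glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.0f);
            glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.0f);
            glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  0.5f, 0.0f);
        glEnd();
    }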


Nope, glVertex3f was deprecated years ago by OpenGL itself. That is not the way the API works any more. [1]

Look into what it takes to write the minimum viable OpenGL program, written using non-deprecated routines, that puts a textured triangle on the screen. It sucks. On top of that, OpenGL is slow and gives you no way to create programs with smooth performance -- for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.

1990s-style OpenGL was good for the time. In 2018, OpenGL is a pile of poop.

[1] https://www.khronos.org/opengl/wiki/Legacy_OpenGL
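
For a sense of scale, here is an abbreviated sketch of the modern core-profile path (untextured to keep it short, with window/context creation and error checks omitted; texturing adds UV attributes, a sampler uniform, and glGenTextures/glTexImage2D on top):

    static const char *vs_src =
        "#version 330 core\n"
        "layout(location = 0) in vec3 pos;\n"
        "void main() { gl_Position = vec4(pos, 1.0); }\n";
    static const char *fs_src =
        "#version 330 core\n"
        "out vec4 color;\n"
        "void main() { color = vec4(1.0, 0.5, 0.2, 1.0); }\n";

    static GLuint compile(GLenum type, const char *src)
    {
        GLuint s = glCreateShader(type);
        glShaderSource(s, 1, &src, NULL);
        glCompileShader(s);        /* should also check GL_COMPILE_STATUS */
        return s;
    }

    static void setup(GLuint *vao, GLuint *prog)
    {
        *prog = glCreateProgram();
        glAttachShader(*prog, compile(GL_VERTEX_SHADER, vs_src));
        glAttachShader(*prog, compile(GL_FRAGMENT_SHADER, fs_src));
        glLinkProgram(*prog);      /* should also check GL_LINK_STATUS */

        static const float verts[] = { -0.5f, -0.5f, 0.0f,
                                        0.5f, -0.5f, 0.0f,
                                        0.0f,  0.5f, 0.0f };
        GLuint vbo;
        glGenVertexArrays(1, vao);
        glBindVertexArray(*vao);
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof verts, verts, GL_STATIC_DRAW);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void *)0);
        glEnableVertexAttribArray(0);
    }

    static void draw(GLuint vao, GLuint prog)
    {
        glUseProgram(prog);
        glBindVertexArray(vao);
        glDrawArrays(GL_TRIANGLES, 0, 3);
    }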


> for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.

What? I've written commercial games with opengl on osx/ios and my experience doesn't show that at all.


Maybe the games were not very complex? Professional game programmers building games with lots of shaders are very familiar with what I am talking about. See for example this thread:

https://www.opengl.org/discussion_boards/showthread.php/1998...


> What? I've written commercial games with opengl on osx/ios and my experience doesn't show that at all.

State-based recompilation is a known issue in many GL drivers, particularly on mobile. E.g. changing blending settings may cause shaders to get recompiled. This can take up to a second.

Some engines work around this by doing a dummy draw to an offscreen surface with all pipeline configurations that they use at init time. This (usually) guarantees that all the shaders are pre-compiled.
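
A sketch of that warm-up trick (names invented; a real engine would run this once per shader program, over every state permutation it actually uses):

    typedef struct {
        GLenum src_blend, dst_blend;
        GLboolean depth_test;
    } PipelineConfig;

    /* Draw one throwaway triangle into an offscreen FBO under each state
       combination, so any state-based shader recompiles happen at load
       time instead of mid-frame. */
    void prewarm_pipelines(GLuint offscreen_fbo, GLuint program, GLuint dummy_vao,
                           const PipelineConfig *configs, int count)
    {
        glBindFramebuffer(GL_FRAMEBUFFER, offscreen_fbo);
        glUseProgram(program);
        glBindVertexArray(dummy_vao);
        glEnable(GL_BLEND);
        for (int i = 0; i < count; i++) {
            glBlendFunc(configs[i].src_blend, configs[i].dst_blend);
            if (configs[i].depth_test) glEnable(GL_DEPTH_TEST);
            else                       glDisable(GL_DEPTH_TEST);
            glDrawArrays(GL_TRIANGLES, 0, 3);  /* forces the recompile now */
        }
        glFinish();  /* make sure the driver actually did the work */
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
    }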


Also, you can handle caching of compiled shaders yourself now (glProgramBinary).
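
For example, the saving half might look like the sketch below (error handling omitted; since the blob format is driver-specific, the matching glProgramBinary load path must fall back to a full source compile when the cache goes stale, e.g. after a driver update):

    #include <stdio.h>
    #include <stdlib.h>

    void save_program_binary(GLuint prog, const char *path)
    {
        GLint len = 0;
        GLenum format = 0;
        glGetProgramiv(prog, GL_PROGRAM_BINARY_LENGTH, &len);
        void *blob = malloc((size_t)len);
        glGetProgramBinary(prog, len, NULL, &format, blob);

        FILE *f = fopen(path, "wb");
        fwrite(&format, sizeof format, 1, f);  /* store the format with the blob */
        fwrite(blob, 1, (size_t)len, f);
        fclose(f);
        free(blob);
    }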


I think the recompilations being talked about here are shaders generated by the OpenGL implementation behind your back. That is, your program never sees them as shader or program objects, because they implement some permutation of blend mode, depth test, culling type, etc.


The non-deprecated OpenGL code for a hello world triangle is still an order of magnitude less verbose than Vulkan though.


While Vulkan is a bit verbose, it's not an order-of-magnitude difference if you follow modern OpenGL best practices. If you rely on default state, use the default framebuffer, and rely on implicit synchronization, you can squeeze it down to a few hundred lines, but that's not a good foundation to build practical apps on.

To give a ballpark figure, my Vulkan "base code" is less than 2x what my OpenGL boilerplate is for the same functionality. The big difference: the Vulkan code is easy to understand, but the GL code is not.

Comparing "Hello World" doesn't make much sense, OpenGL gets really darn complicated once you get past the basics.


Vulkan code is extremely front-loaded. HelloTriangle is much longer. A complete application can be significantly shorter.


In my opinion a similar difference exists between CUDA and OpenCL. OpenCL takes more code to get something simple going. But at least it doesn't break if you upgrade your gcc or use a different GPU vendor.


To each their own, but over the last 6 months I've written a graphics engine in OpenGL + SDL. Once you truly understand modern OpenGL, you realise how beautiful it is.


You will think it's less beautiful when you ship that game on several platforms and find that it has different bugs on each platform, on each hardware version, and on each driver version. And most of these bugs you can't fix or work around, you just have to bug the vendor and hope they ship a fix in a few months, which they usually won't because your game is too small for them to care about.

This happens in other APIs too (we definitely had it happen with DX11), it's just that OpenGL is a lot more complicated than anything else due to its history, so it has proportionally more bugs.


> glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd();

That's fine for a "hello triangle" program, but quickly becomes ridiculous for anything approaching a serious engine. There's a reason that glDrawArrays() has been around since 1995 (and part of the core specification since 1997).
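
For comparison, the same triangle with 1995-era client-side vertex arrays; one call submits the whole batch instead of one function call per vertex:

    static const GLfloat verts[] = {
        -0.5f, -0.5f, 0.0f,
         0.5f, -0.5f, 0.0f,
         0.0f,  0.5f, 0.0f,
    };

    void draw_batch(void)
    {
        glEnableClientState(GL_VERTEX_ARRAY);    /* GL 1.1 client arrays */
        glVertexPointer(3, GL_FLOAT, 0, verts);  /* point at the whole array */
        glDrawArrays(GL_TRIANGLES, 0, 3);        /* one call for the batch */
        glDisableClientState(GL_VERTEX_ARRAY);
    }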


Made me want to revisit the good old NeHe tutorials for a quick browse :)

http://nehe.gamedev.net/tutorial/creating_an_opengl_window_(...

I wonder how much of this stuff is deprecated now.
