Hacker News

Wow, they are DEPRECATING OpenGL from both macOS and iOS: https://developer.apple.com/macos/whats-new/

This is in addition to last year's announcement that "macOS High Sierra is the last version of macOS to run 32bit apps without compromise"

I wonder if we will soon see a new lineage of MacBooks fitted with Apple-specific arm64 chips.

The scariest thought is if UIKit-on-macOS starts requiring Developer ID entitlements and needs to be installed via the App Store, with FairPlay DRM encryption of binaries and everything.

Edit: Also, r.i.p. my old macbook air 2011 :-/

And OpenCL too. This is terrible. I was thinking about adding GPU support to a numerical simulator I am working on, and I was planning to have nice cross-platform support with OpenCL. Whelp, that's no longer the case. My code is in C++, and I refuse to use proprietary, vendor-locked, Objective-C-only Metal. If people want GPU support, they'll just have to use Linux, which doesn't artificially constrain you with corporate frameworks.

The deprecation of OpenCL is especially crappy given how heavily Apple were pushing OpenCL back in 2014 or so with the Mac Pros.

Apple lost interest in OpenCL after learning Khronos wasn't steering it into the direction they wanted it to go.

For example, Metal compute shaders are C++14 and Khronos only adopted C++ in OpenCL after the beating they took from CUDA, which supported it since the beginning.

>My code is in C++, and I refuse to use proprietary, vendor-locked, Objective-C-only Metal. If people want GPU support, they'll just have to use Linux, which doesn't artificially constrain you with corporate frameworks.

Well, some people refuse to use non-platform-native lowest-common-denominator libs, so there's that too...

One possibility for cross-platform GPU is WebGPU. The people working on it seem to be planning a C- or C++-level standalone library; behind the scenes it will be DirectX/Metal/Vulkan.

Apple/Google/Microsoft/Mozilla and others are all participating


Does this imply Apple is planning to continue supporting OpenGL informally/externally, the way X Window System support is through XQuartz, perhaps via an external Mesa-based library?

And maybe the same informal/external support model for OpenCL?

Performance will be diminished but not extinguished.

FWIW, they've only deprecated it, meaning no new updates. They haven't removed it from the platform. Furthermore, desktop OpenGL seems like it's dead anyway, given that Vulkan has replaced it.

Aren't most people running those kinds of workloads doing so on Linux already?

Doesn't seem cost-effective at scale to run on beefy Apple machines.

People just getting into a field like to run code on their personal machines. This can be quite relevant when your code gets a 50X speedup from running on GPU.

This is sort of like saying "people only do web serving workloads on Linux, we don't need web servers to run on Apple machines" to me.

Not necessarily. Many media editing apps use OpenCL to speed up processing. I know Capture One uses OpenCL, and I think Adobe's Lightroom and Photoshop use it also. At this point even Pixelmator and less well known alternatives use OpenCL, too.

Sadly, most companies won't have any choice but to port their app to Apple's proprietary APIs. It's really a net loss for consumers because most of these devs have better things to spend their time on than Apple breaking compatibility on a whim.

"At this point even Pixelmator and less well known alternatives use OpenCL, too."

Pixelmator, at least, is based on Core Image, which Apple has probably already moved from OpenCL to Metal.

I mean, that works if you're already running decent code or already have extremely good tooling where small examples can be easily sent to other machines. Additionally, debugging code running on other machines is a huge pain in the ass vs. being able to step through it directly, locally.

Almost all of my (and my lab's) time is spent tinkering with small numerical examples before sending it off to one of the lab machines to run overnight, and using the GPU on the MBP through OpenCL is a huge advantage.

People used to buy expensive eGPUs for those workloads to keep them on macOS.

Look into either an existing engine to abstract it away or Vulkan.

This is just quibbling, but 1) you can write Metal code with Swift as well as Objective-C (and that's not even vendor-locked; I'm doing Swift in Linux right now), and 2) you can write C++ in Objective-C.

I know this isn't what you're really complaining about, though.

> you can write C++ in Objective-C

You mean Objective-C++.

The most common pattern I've seen with cross-platform stuff is a small Objective-C wrapper over MacOS APIs, which then gets called like C functions from pure C++ code.

Somehow this is directly related to the failure of desktop Linux.

The only successful variants of it actually do constrain devs with either Web or Java corporate frameworks, with a very tiny subset of native code allowed to take part in the whole game.

Yes, because GPU computing is mac/windows primarily, and in these cases, primarily for games?

I would be a lot more okay with this if Apple supported Vulkan, the more portable comparable API, rather than just the macOS/iOS-only Metal.

I also wonder what this means for WebGL and its future. Right now, WebGL works in browsers on macOS, Linux, Windows, iOS, and Android, which is incredible. There is no equivalent.

Sure, Apple has started working on WebGPU, but that’s not yet ready nor is it guaranteed to gain Linux, Windows, Android support.

Agreed it would be nice if there were official Apple support, but FYI there's a project called MoltenVK that implements the Vulkan API on top of Metal: https://arstechnica.com/gadgets/2018/02/vulkan-is-coming-to-...

Apparently Valve shipped a DOTA 2 update a few days ago that is using it:



So that's pretty promising!

My guess is browsers will just implement WebGL on top of Metal and this will be transparent to web developers.

There's precedent: Chrome and Firefox both implement WebGL on Windows with ANGLE, which translates OpenGL to Direct3D.


And it seems they are working on a more generic library : https://github.com/google/nxt-standalone

Ah, and NXT has a Metal backend! Very cool.

It's important to note that NXT isn't necessarily a replacement for ANGLE. It's an experimental replacement for WebGL as a whole, with a different API. There still needs to be a way to run WebGL programs on Mac if this deprecation leads to removal a few versions down the line.

And it's not strictly a replacement either; WebGL would not go away. The successor to WebGL is still so far off that there will probably be more versions after WebGL 2.0.

WebGL is a browser-level API that does not require native driver support the way OpenGL does. They're related only in semantics.

Mentioning this in a couple different places, but consider Molten (https://moltengl.com/) for this use case; it's a third party but very high quality reimplementation of Vulkan and OpenGL ES atop Metal. In other words, an easy way to adopt Metal without actually porting your app.


Wow, does this mean Maya, Houdini, and basically every 3D package out there will no longer run on macOS? If so that seriously sucks for 3D pros.

I'm guessing it's not the end of the world for Autodesk to add a Metal backend to Maya. Some smaller teams might very well choose to let go of the Mac, though.

If they weren't already doing that just for the performance improvement itself.

Apple will surely ease the transition. It would be way too stupid to kill partners that way.

No, because contrary to HN beliefs about 3D APIs, in the real world most business already added Metal backends to their rendering engines.

Exactly. Not sure why everyone thinks their favorite Mac apps are using OpenGL anyway. They probably moved to Metal a long time ago — it is much better.

I'm sure everyone's favorite /exclusive/ Mac apps are probably using Metal.

I'm guessing you haven't used apps like Maya, Nuke, or Houdini. They were all written in the mid-'90s on IRIX machines and later ported to Linux, Windows, and OSX. Surprisingly, 3D performance isn't always a big goal of theirs. My guess is the core features don't sell new versions, so even though they have annual releases those things don't get much attention. They'll have drawing issues and transparency sorting problems for years. Same with audio bugs.

Their Mac support was spotty and irregular until the past 5-10 years.

OpenGL has just been deprecated. It hasn’t been removed.

It will be in a year.

Won't there just be OpenGL drivers as a separate download? Is this any different than when they removed Java?

Does anyone know how this will affect Blender in the short term?

When Metal was introduced for the Mac they had the Modo devs at WWDC and they ported their code to Metal in like a week or two. Not really the apocalypse.


Metal and OpenGL are two completely different APIs, with different shading languages and probably a whole host of other differences.

I've ported my fair share of things from fixed-function to programmable shader pipelines and you'd be effing naive if you think you can do that in a couple weeks on anything more than a toy demo.

I worked with a AAA gamedev recently who had written a Vulkan renderer for their game to demo quality level in 2 weeks. It all depends on existing level of abstraction for the rendering API and shading language (and to some extent assets), and how much performance and efficiency you’re happy leaving on the table.

> demo quality level in 2 weeks

Getting pixels on the screen and shipping something to end users are two very, very different things, 90/10 rule and all that.

Vulkan also has the benefit of multiple platforms supporting it, so you're not doing all that work for a minority platform (which is what OSX is in the graphics space).

If you already had an architecture with replaceable renderers, especially with a DirectX 12 one already written, adding a Vulkan one will be a matter of just a few weeks. If you hadn't, it will be much tougher.

Well Modo isn’t a toy.

The question is, how many devs did they have working those two weeks (and what resources did Apple provide to help them)?

I develop an OpenGL-based video engine for a live media playback application, which is very nearly as simple an application of OpenGL as you can expect to find in the real world, and there's no way I could expect to port it to Metal in a week or two singlehandedly. Like others have mentioned, it's a completely different paradigm, not just a matter of changing around some function calls.

That said, I welcome this change with open arms (and secure in the knowledge that legacy code will continue to work for the foreseeable future). OpenGL is a fragmented, brittle, spaghetti-inducing pile of global state. Rewriting in Metal isn't anywhere near as small a project as Apple claims, but I'm perversely looking forward to it -- I'll be very happy to have OpenGL in my rearview mirror.

Nice marketing campaign here a couple years ago. https://developer.apple.com/opencl/

Make up your fucking minds. Developers aren't hamsters.

I now declare MacOS to be deprecated, if that's how it's going to be.

Oh this is hilarious.

You could put money on how long this page gets forgotten and left up.

Aw, crap. MacOS is deprecated!? How could you!?

Yup, you will see a deprecation notice when booting MacOS now.

Good riddance. OpenGL has been an awful API for many years now. The drivers are way too complicated, and applications don't have enough control to deliver smooth performance. All OpenGL does now is let you kind of mediocrely put things on the screen.

It would be good riddance IF there were another universal API (Vulkan) and they adopted it as a substitute.

The fact that they want to force game developers to use Metal instead is ridiculous, especially considering the extremely low macOS market share, particularly outside the US.

Sometimes at work I daydream about an alternate reality where we let go of the idea of a universal API on top of GPUs and just let vendors publish some ISA and hardware-specific libs/drivers/docs. I'm sure people would figure out nice (and less nice) abstraction libraries on their own just as well.

It's so frustrating to read what the GPU is actually capable of (for example in the Intel PRMs) and to know that there is absolutely no way to get the driver's compiler to do the right thing in a reliable way.

I mean imagine if icc was the only x86 compiler.

It's ok man, we can just keep piling on more turtles.

I do get the impression the new more-explicit APIs help somewhat with that, no?

All game engines that matter already support Metal, plus writing against platform-specific APIs is something that professional game developers have been used to doing since the Atari 2600.

What do you think about them forcing you to use Swift or Objective-C for this? Forcing you to use languages that use ref counting and object-oriented pointer tables that are traversed at runtime? How much of the gains does objc_msgSend eat up? I thought you were against such things?

How ugly would the JAI code need to be to interface with this?

I wish they had a simple Metal C API, but their new API comes with a bunch of Objective-C baggage.

Have you actually used the API? When it comes to submitting geometry and textures (some of the biggest bottlenecks), it's exactly the same as in C. You pass a pointer to your buffer of bytes and they get copied to the GPU for you.

When you're calling any library, that simple C call invariably goes somewhere else inside the library itself. For OpenGL this is because there's always hooks for introspection, or because the GPU driver wants to implement something themselves, etc. For many other libraries it's just because people don't feel like they've written "production quality code" unless it goes through a bunch of hoops and ends up in some method with five underscores in the name.

In Metal the Obj-C abstraction is part of the design and used to eliminate any other abstraction people would want to introduce. The objects you get back from the API are implemented straight in the GPU vendor code, and the debug tools can swap the methods out for extra validation, recording, etc.

Any overhead coming from objc_msgSend is minuscule compared to the gains from things like better serialization of GPU-bound tasks and not having to synchronize repeatedly with the CPU.

If you're worried about refcounting, use ARC (which you have to with Swift anyway). First, the compiler is very smart about optimizing away retain/release/autorelease calls whenever it can. Second, when those calls do have to be made, they're implemented using vtables, and never hit objc_msgSend() in the first place.

I think it is about time industry-relevant OS vendors started to move away from C. Kudos to Apple.

Pretty ballsy. It didn't work well for Microsoft when they said "use DirectX or die," and I doubt it will work well for Apple. This is especially true for OpenCL (also deprecated): nobody on big Linux server farms with GPUs is going to be using "Metal on Linux" for their code.

How exactly did it not work well? Windows is THE gaming platform for PC.

It kept a lot of commercial software off the Windows platform and left it on workstations like SGI's and DEC's. The movie houses rendering movies were using OpenGL on their render farms, and its lack of availability on Windows kept Windows off those desktops.

The key being that if you've got a technology that works on both server farms for production and workstations for development, you support that so that your OS is a viable candidate for the developer workstation. I don't see a Metal port coming to Linux in any reasonable way any time soon, and I don't see researchers giving up OpenCL or even OpenGL any time soon, so it just means that Apple is going to forego that business.

With the recent GitHub purchase it gives the oddly dissonant experience of Microsoft being the 'developer friendly' OS company and Apple being the 'developer hostile' OS company. Where, and this is important, support for cross-platform tools determines hostility or support. I would not argue against Apple being the best development environment for the Apple platform, or Windows for the Windows platform.

OpenGL is a real-time graphics API, not an offline render system used by renderfarms. I have never heard of movies or special effects rendered in OpenGL. The first major renderer was Renderman, the only game in town for years, and it has nothing to do with OpenGL.

You are correct. However, many in-studio tools are written in OpenGL. These tools are used to model objects and layout lighting, scenes, etc. by artists. They are written in OpenGL, usually on Linux. (Source: I work with several people who formerly built these tools for well-known studios like DreamWorks and Sony.)

Of course, but that’s not what we are discussing. The context for my comment was regarding offline rendering and render farms.

That's like saying roads are THE travel surface for cars.

... yes?

Not sure what you're saying is changing on the server -- people are going to go from not using OpenCL to not using Metal.

Everything is CUDA. Everything depends on the shitty unstable software designed by a hardware company (Nvidia). This sucks and I hope someone can disrupt it, but Apple has no influence in the field of GPU computing.

When did Apple ever have influence in the field of GPU computing?

And I work in data science and nobody is using their own laptops when you have AWS.

I didn't say they did. My claim is that Apple deprecating OpenCL is a straightforward and uninteresting thing; it's a company that has no influence on GPU computing getting out of the business of a technology that also has no influence on GPU computing.

I work in data science too, and who cares about laptops. Desktop computers with GPUs, SSDs, and a lot of RAM are what you need. You can thoroughly bling out the hardware and the entire computer will still cost less than your monthly AWS bill to access a GPU. (This is all getting pretty irrelevant to Apple, though, who doesn't make such computers.)

Whatever criticisms can justifiably be levelled against nvidia, having a bad software stack isn't one of them.

Most of the field of machine learning is irreproducible right now because you can't not use CUDA, yet you can't promise that it will work the same on anyone else's computer, or that it will work six months from now.

Actually, it worked well for MS - back in the day, most of the games in the industry were done in DirectX. And to be honest, DirectX/3D was the only option if you wanted to have Vulkan-like low-level access to GPU.

Apple isn't a large gaming platform and they aren't used as server farms.

So I can't imagine this is going to hurt at all.

The writing was on the wall since they stopped updating it (OpenGL) 5 years ago.

True, but it'd be a shame to lose out on the library of legacy software and games.

Edit: and XQuartz (X11) with OpenGL comes in handy once in a while too...

I mean... it looks like there's at least one commercial option for OpenGL on Metal (MoltenGL).

Also, it seems to me that the better option in the long run for legacy game support is to just run a VM for the highest degree of compatibility.

Yep, I bet the MoltenGL guys are throwing a major party right now. I wonder what kind of effort it would be to hook up Mesa to a Metal backend...?


EDIT: As a follower of https://mesamatrix.net/ I don't think it would be unreasonable to say about 3 years to get something that kind of works, five for something semi reasonable, and 8 for something at modern open gl level.

I guess John Carmack is sad today (he advocated early for OpenGL on Mac).

I am not sure. In the past Carmack stated that DirectX at some point became a much better API compared to OpenGL. He also once stated (admittedly when talking about id Software, after he left the company) that he's not really a sentimental person.

John Carmack probably knows the Apple of today is not the Apple of yesteryear.

If the Apple of today would just include a couple more buttons on their mice, John Carmack would probably be happier.

Say what you will of Microsoft, they still understood how important backwards compatibility was and didn't do the same to OpenGL back when they were pushing D3D hard.

I really hate to see such a focus on a platform lock-in API when viable alternatives (Vulkan) are available.

So that's how they weed out the still good 2011 Macs...

I can't say I'll be protesting by not buying a new Mac because I'm already devoid of any desire to do so.

I'm curious about this – does it really matter? How important is it that the Operating System has OpenGL? Can't individual apps just static link to (and ship) their own versions of OpenGL?

OpenGL is just an API; the underlying features are provided by the graphics driver. You can't ship your own OpenGL; that would mean shipping your own AMD/Nvidia/Intel driver (and the associated kernel module, etc.).

A reasonable alternative would be to implement OpenGL on top of Metal for compatibility but this is a lot of work.

What is gonna happen to games and apps that were using OpenGL as of today, then? Will they stop working in the new macOS version?

Edit: oops, just read it: "Apps built using OpenGL and OpenCL will continue to run in macOS 10.14, but these legacy technologies are deprecated in macOS 10.14. Games and graphics-intensive apps that use OpenGL should now adopt Metal. Similarly, apps that use OpenCL for computational tasks should now adopt Metal and Metal Performance Shaders." [0]


There's at least one implementation that I'm aware of in MoltenGL (https://moltengl.com/moltengl/), though that's only ES.

There is also MoltenVK https://github.com/KhronosGroup/MoltenVK for Vulkan on metal...

"Also, r.i.p. my old macbook air 2011 :-/"

macOS performance was already getting so poor on 2010/2011 MacBook Airs, so I think this is the right move. I recently downgraded my old 2GB 2010 MacBook Air back to 10.11 El Capitan and it runs much, much better than it did on High Sierra.

Careful with that. A number of security fixes make it to El Cap, but I don't think all of them do.

I guess we will be saying our last goodbye to “write once, run on all major platforms” graphics code. Does this mean that the only choices now are 1. Fork your graphics code into two separate implementations or 2. Go up the stack and use a cross-platform “game” engine like Unity? I suppose 3. Stop supporting macOS is another (sad) option.

If you're programming in Rust, you can just use `gfx-rs`'s HAL[1]. This is designed to be a Vulkan-like API that works on top of Vulkan, Metal, D3D12, or OpenGL.

If you aren't in Rust, just use Vulkan. There are high-performance wrappers for it in various stages of development, such as MoltenVK[2] for macOS and iOS, and VulkanOnD3D12[3] for UWP. Non-UWP Windows, Linux, and Android should support Vulkan natively through their GPU drivers.

[1]: https://github.com/gfx-rs/gfx

[2]: https://github.com/KhronosGroup/MoltenVK

[3]: https://github.com/Chabloom/VulkanOnD3D12

> “write once, run on all major platforms” graphics code.

This was never true for game consoles, regardless of urban myths regarding OpenGL support.

Fine. All major desktop, laptop, and mobile platforms not including game consoles. Still, more portable than any other existing alternative.

What does that mean for games like League of Legends running on OpenGL? (https://engineering.riotgames.com/news/trip-down-lol-graphic...)

Riot probably has enough resources to pay its developers to target Metal on macOS by the time OpenGL is removed (some number of years from now).

It’s the smaller-scale developers and projects that might not have such ability.

Certainly has enough resources. Their 2015 revenue estimate was $1.6 billion. 2,500 employees in 2017.

(Not intending to nitpick--just pointing out that Riot is a LOT bigger than many folks realize)

The smaller scale ones are generally using an existing engine like Unity or Unreal, which handles Metal support.

Generally, yes. Certainly not all. The likes of SFML and SDL2 and libgdx have zero support for Metal, although SDL2 did recently add support for Vulkan.

How far back did Unity/Unreal transparently handle Metal support? I suspect some developers won't have the resources to update games running on older versions of those engines.

Games made by developers who've gone out of business are probably just going to stop working in a few versions of MacOS.

Perhaps they could use some middleware like MoltenGL [0]. That way they might still be able to write against an OpenGL API (which allows for code re-use), while supporting Metal under the hood. It does seem this particular tech might be more suited for mobile platforms, unless OpenGL ES is also used these days on the PC / Mac platforms.


[0]: https://moltengl.com/moltengl/

I guess an eventual port to Metal, like WoW.

> Wow, they are DEPRECATING OpenGL from both macOS and iOS

How does this affect Qt iOS applications?

It doesn't, because they are migrating to a universal 3D backend, away from a pure OpenGL one.

Do you have more info on this? When is it expected to be done?

macOS apps as well...

OpenGL is a pain in the ass to use, Metal does what OpenGL does so much better, and with the Vulkan Metal wrapper you can still write cross-platform apps/libs. So nothing of value was lost.

What's the consensus on Metal? How does it compare to OpenGL?

You can’t compare them directly, they’re very different categories of graphics APIs. Metal belongs to the category of “modern low-level graphics API” (which also includes Vulkan and Direct3D 12), while OpenGL is an older higher-level graphics API (which also includes Direct3D 11 and previous).

The low-level graphics APIs like Metal and Vulkan allow for much better performance, but they are much harder to use and require more work from the developer (hence, they’re usually used only by game engine makers). Higher-level graphics APIs like OpenGL are less efficient and have lower peak performance, but are easier to use for individual projects and have the benefit of having existing functional code (no need to rewrite a working project).

Also, OpenGL (and its mobile and web variants OpenGL ES and WebGL) are very portable (macOS, Linux, Windows, iOS, Android, both and native and in browsers), while Metal is macOS/iOS-only.

To be fair - OpenGL was fairly horrible to use directly. It sat in an awkward middle area between "low level enough to be efficient" and "high level enough to be productive".

Maybe I'm biased - every time I looked at OpenGL code I shuddered and ran away to a higher level framework (I'm excluding shader code from this - that's concise enough for me not to mind getting that close to the metal).

So much this! I've been writing OpenGL code on a daily basis for the past 10 years, and I hate it. It works like an API designed in the 1970s. It uses none of the modern features of languages, easily allowing you to pass incorrect data to a function, and giving you almost no clue what, if anything, went wrong. Just look on StackOverflow some time at how many questions are basically, "My previously working OpenGL program is now rendering nothing but a black screen. What went wrong?" And then compare the huge number of different causes. There's no reason they couldn't have improved it along the way, keeping backwards compatibility, and just adding more informative error codes and better use of language features. But they didn't. My feeling is "Don't let the door hit you on the way out."

I don't think that is quite accurate, but you will have to ask a professional game developer for better judgement. Comparatively speaking:

Vulkan: you need to know exactly what you are doing. There is little to no hand-holding. You are trying to squeeze out every last 10% of performance in exchange for a lot more development time. You need to write a hell of a lot more code to do something you previously thought was simple.

OpenGL is higher level and should be compared to Direct3D 10, not 11. As a matter of fact, I would go as far as comparing it to Direct3D 9. And unless you are an OpenGL zealot, any sane game developer would tell you DirectX 9 already exceeded OpenGL in most ways.

Metal offers most of what Vulkan can do while being even easier than OpenGL.

Honestly, I don't understand all the backlash. OpenGL is deprecated, which simply means Apple won't be updating OpenGL anymore. Stop asking every god damn year. They have been shipping other deprecated libraries and APIs for years! OpenGL is developed in such a way that no one really cares, and it's designed by committees for maximum backward compatibility. And if you want your app on iOS, you will have to use Metal anyway.

Thank you for the explanation. I don't do any graphics programming, besides a few toy projects with OpenGL, but my understanding was that one of its benefits was portability (for a varying definition of "portability").

That's why I wasn't sure what Metal is offering instead.

Except they aren't portable at all for game consoles.

Most of the game devs I follow on Twitter expressed a positive opinion of Metal, particularly over Khronos's Vulkan, and of course everyone (that actually has to develop with it) hates OpenGL.

To my understanding, the consensus is that it's, y'know, fine, but nothing particular to recommend it over Vulkan. The main problem people seem to have with it is that it feels unnecessary, like Apple being incompatible for the sake of being incompatible.

Didn't Apple try to work with Khronos to make Vulkan, but they were taking so long that Apple just gave up, made Metal, and shipped it before Vulkan was finished?

Much better.

A modern 3D API that acknowledges the time of pure C APIs is long gone, with C++14 as the shading/compute language, providing math, 3D model, material, font, and texture support.

Whereas with OpenGL you are stuck fishing for libraries, with multiple execution paths depending on extensions and bugs, compiling and linking shaders at runtime without library support, C-style APIs, and a plethora of deprecated APIs.

Metal is a lower-level API, somewhat similar to Vulkan or Direct3D 12. OpenGL supports driver extensions that can be used to reduce overhead in a similar manner to Metal, but this is largely a moot point when it comes to OpenGL on macOS, as many such extensions are simply not available.

They also sneakily deprecated the sub-pixel font rendering, so the "Use LCD font smoothing when available" option will be gone from the preferences.

More info on this for the interested: https://twitter.com/siracusa/status/1004143205078597633

Mentioning this in a couple different places, but consider Molten (https://moltengl.com/) for this use case; it's a third party but very high quality reimplementation of Vulkan and OpenGL ES atop Metal. In other words, an easy way to adopt Metal without actually porting your app.

Will this affect Safari (and other browsers) supporting WebGL on OSX/iOS?

Doesn't Core Image use the OpenGL Shading Language, so is that going too?

That was gone last year, check WWDC 2017 Metal 2 related sessions.

When Metal was introduced, the Core frameworks were updated to use it instead, with OpenGL left for backwards-compatibility mode.

the notch...no 3.5mm aux...no OpenGL... the hits just keep on coming. Is this what passes for "invention" today? What's next, deprecating USB?

You can add the CD-ROM, the diskette drive, SCSI, ADB, FireWire, and the LaserWriter to that. How dare they!

Notch decreases function and is aesthetically displeasing.

3.5 mm = 1) there is no way a wireless medium will ever have better throughput than a wired medium over a superb DAC - ever, 2) dongle, 3) extra battery worry now for BT.

OpenGL, I will grant you that one, since both Metal and/or Vulkan are a vast improvement on OpenGL.

> Notch decreases function

I know that some people like to say this, but it's pretty clearly wrong, and it makes you look unobservant when you repeat it. It takes about five seconds and two phones to demonstrate this. The standard top section on every phone, Android and iOS alike, is a dedicated status section with little system icons with wide gaps of unused blank space between them. Every phone, every time. A notched phone just puts the camera directly between those icons instead of putting unused pixels there. It doesn't take a genius to see that un-notched phones have both larger top bezels and more pixels wasted in the status area. The notch doesn't cut into the screen. The screen extends up around the camera and puts the status icons on the horns.

> and is aesthetically displeasing

All of the tens of people I know who have notched phones say they love it and don't notice the notch. So, maybe to you, but it seems like the market is speaking.

It might benefit you to look at a 3rd phone. Because my top notification bar is always full. So no, they are not clearly wrong. Or better yet, maybe it just comes down to individual preference!? I personally never thought I'd see someone trying to justify the loss of screen space as a "win". But here we are. And also, the market is still out on it. Until the majority of phones are doing a notch, the market hasn't spoken. Apple hasn't been the leader in smartphone sales in many years now. They aren't even #2 anymore.

Notched phones have gained screen space, not lost it. Without the notch, the new screen area around the notch would not be screen area at all.
