Hacker News

The PS3 had OpenGL. Nobody used it because it was vastly slower than libgcm. The PS4 has OpenGL. Almost nobody outside of indie-land has even looked at it because there's no reason to when libgnm is right there and you still pay a significant performance vig, in an industry where you do not get the luxury of ignoring performance, for an only marginally better API. The Xbox One, on the other hand, does provide what appears (I don't have access to any of these consoles, I just know a lot of game devs who like to drink) to be a complete DX11 implementation...and it's plagued with perf issues you don't see on a PC because the assumptions of the API don't map to the actual hardware--which, I understand, the lower-level constructs of DX12 will address.

The future of non-indie game development seems very much to be heading toward the hardware: Mantle, Metal, libgnm, DX12. If you're on a console, you can afford to actually develop for the console. If you're not, you don't matter. (Indies don't care, but few people care about them, so that tail wouldn't wag the dog. Nobody cares about Steam Machines, so I don't know why you'd bring them up.)




> The PS3 had OpenGL

No. It had OpenGL ES 1.0 with Cg for shaders.

> The PS4 has OpenGL

Since when? Only if a third party company is offering it.

http://develop.scee.net/files/presentations/gceurope2013/Par...


PSGL and GLES are close enough for the discussion at hand; if Cg vs. GLSL is really going to grind your gears you've already lost. You're right about the latter, though--I misunderstood a prior discussion with a friend of mine (he was speaking in hypotheticals, I thought he was speaking in specifics). Mea culpa.

It doesn't change the abject silliness of the sibling thread though.


Yeah, but since they're not the same thing, in the end the result is the same: multiple code paths, or backends, to handle the differences.

> It doesn't change the abject silliness of the sibling thread though.

Agreed.

I do boring IT stuff. But once upon a time (long, long ago) I had some opportunities in gaming, which I blew by being more focused on the OSS aspects of the tooling than on what really mattered: having a game.

I really learned the hard way how different the game development culture is from the FOSS one. Which I should have known, given its similarities with the demoscene.

Even though I am not in the industry, I do follow it regularly.


It depends on where you're coming from though, yeah? Like, PC->PS3, (good) OpenGL/Cg to GLES/Cg wasn't a big step (I use Cg with OpenGL today). Nobody really used it anyway, though, because libgcm was just so vastly superior.


The fact that OpenGL on consoles has poor performance is the fault of the console manufacturers, not of OpenGL itself. Also, their release schedule results in outdated hardware compared with regular computers, i.e. developers are forced into a mentality of writing at a very low level to squeeze performance out of a dated spec. More frequent updates could solve this situation, and that's what Steam Machines aim to change as well. There is no valid reason why OpenGL can't be made to perform well on consoles except the lock-in mentality which plagues the console market.


> There is no valid reason why OpenGL can't be made to perform well on consoles except the lock-in mentality which plagues the console market.

Oh sure. Lock-in mentality is the reason. Never mind that Apple is doing Metal and AMD is doing Mantle because of defined, demonstrable issues with higher-level abstractions, GLES and DX11 respectively. Certainly never mind that DX12 is shaping up to provide the same sort of low-level primitives as Mantle/libgnm/Metal. No, they're just hissss lock-in hissss. Can't possibly be because, like, it's a better option, can't have that. Like, do you know how many calls you have to make to accomplish stuff in some basic OpenGL extensions? And how much overhead those calls carry--overhead you can't remove because it's in the spec, and now it never goes away?

Did you ever stop to think, for just one second amidst your "there is no reason" absolutism about something it certainly sounds like you don't know much about past laymanship...maybe they're not laymen and maybe they have a good reason for stripping away abstraction? Not even providing their own incompatible abstraction, but just getting rid of it? I mean, Snidely Whiplash wants to make his platform dev-friendly too, but instead of doing that he gives you this bleeding bare-metal API? Come on. Have you stopped to give these guys an ounce of credit?


And complaining about a "release schedule resulting in outdated hardware" fails to comprehend that "outdated hardware" happens every nine months. That is a feature, so that normal people don't have to go buy nVidia GTX9581 Platinum Edition cards every two years. Your position frankly beggars belief. The PC gaming treadmill sucks. The mobile treadmill sucks. And you want to make people jump to it on consoles? Do you know a normal person?


> Never mind that Apple is doing Metal and AMD is doing Mantle because of defined, demonstrable issues with higher-level abstractions

Bad comparison. While AMD proposed Mantle to Khronos as a base for OpenGL-next and in general aims to make it open, Apple did no such thing. Apple does Metal precisely out of lock-in mentality. Any API that locks developers to one platform is a dead end.

> "release schedule resulting in outdated hardware" fails to comprehend that that is a feature, so normal

Yeah, a "feature" that degrades games' quality because developers are constantly held back by the requirements of console ports. It's not a "feature" to stagnate hardware for such long periods of time. Consoles are closed platforms with the manufacturers in full control. That has some pluses, like more stable expectations, but the minuses outweigh them. Open platforms are the future, and the locked-in ones will be considering some changes when stronger competition gives them a kick.


> Bad comparison. While AMD proposed Mantle to Khronos as a base for OpenGL-next and in general aims to make it open, Apple did no such thing. Apple does Metal precisely out of lock-in mentality.

If you don't think Apple has invested a shit-ton of money into GLES to make it habitable I just don't think you're living in a world inhabited by the rest of us. Apple has spent tons of man-hours and money dealing with the OpenGL process. But no, Snidely fuckin' Whiplash is over here curling his moustache and hissing "yessss, yesss, everything we do by stripping out layers of abstraction and making things faster for obvious and comprehensible reasons is eeeeeeevil."

They still support GLES. They still participate in the process. But they offer a frankly better solution too, one on top of which you can build an agnostic API that leverages each platform's strengths better than OpenGL can.

> It's not a "feature" to stagnate hardware for such long periods of time.

Cool. Get normal people to shell out for a new console--you know, that thing that lives in a home theater center that you buy and stop thinking about--every two years. Don't worry, everybody else will definitely wait for the market to prove you right, they won't be over here cooking on what actually works.


Upgrades should not depend on where you place your computer. There is absolutely no logical relation between how often you want to upgrade and the type of gaming it's used for. So who said consoles have to be all locked up and controlled by Sony, MS and Co.? There is no reason to, but they do it in order to control developers. It's all about lock-in and reducing choice, not about anything else.

> If you don't think Apple has invested a shit-ton of money into GLES to make it habitable I just don't think you're living in a world inhabited by the rest of us... they offer a frankly better solution too

So, where is their proposal to make Metal into OpenGL-next? Until it surfaces, Metal will remain a lock-in attempt.


> Upgrades should not depend on where you place your computer.

They aren't computers. This is the blinkered view of people way too close to their choice of silicon that ignores what people who actually buy hardware want. Normal people don't want computers. Computers suck and are hard to use. They want predictable, turn-key solutions. I'd point to the smoking ruin that is the commercial HTPC market as evidence that, hey, maybe this actually is a thing that works...but I have this sneaking suspicion you'll just say that's definitely not a true Scotsman, no no no.

> So who said consoles have to be all locked up and controlled by Sony, MS and Co.?

Nobody, so don't put words in my mouth.

> So, where is their proposal to make Metal into OpenGL-next? Until it surfaces, Metal will remain a lock-in attempt.

If you would stop extrapolating general-purpose software practice onto hardware layers for just a tick, the notion might enter your head that Metal is designed around the behaviors of the guts of the A7. It doesn't make sense to generalize it, because generalizing it into "OpenGL-next" takes away the reason to use it! You can bleat till you're blue in the face about how terrible it is that people would trade abstractive consistency for performance, but it's what everybody's done in games--and, when you add enough zeroes to a problem, software in general--forever.

Abstractions inevitably fail. Your ideology can attempt to ignore it. It'll fail, too. And the arrogance that you display in this insistence that people have thought about this much much more than you must be acting dishonestly because they disagree with your ideological bent is simply astounding.


> Nobody, so don't put words in my mouth.

Sony and MS did. "Nobody" now is probably Valve who want to disrupt the sickening locked up consoles scene. Time will tell if they'll succeed.

> It doesn't make sense to generalize it

No, it does make sense to provide a generic cross-platform API. We aren't in the dark ages of computing anymore. Forcing developers to use hardware-specific code is backwards unless we are talking about drivers and assembly-like development.

> Abstractions inevitably fail.

Is that Apple's argument for Metal? Abstractions don't fail. They save tons of effort for developers. Otherwise why don't you propose writing programs in machine code like in the olden days? That for sure can produce the most optimized result in theory.


> We aren't in the dark ages of computing anymore. Forcing developers to use hardware-specific code is backwards unless we are talking about drivers and assembly-like development.

Do you think game development followed the web off the performance-doesn't-matter cliff and went "welp, we're just going to go write everything in Ruby"? What exactly do you think game development is if not intensely performance-critical low-level software development? Do you somehow think they aren't writing whacks of assembly and adding code paths for specific versions of drivers because of reasons X, Y, and Z?

> Otherwise why don't you propose writing programs in machine code like in the olden days?

You literally must be trolling. What do you think game developers do when they run into hot spots in their code? Bunches of AAA games (and every engine vendor, thanks much) have one or more gurus around who eat and sleep the assembly language for every deployable platform. I know a guy--socially, we're not friends, but we've talked about this--whose entire job is rendering optimization for iOS. He owns a big whack of stiff C where every allocation is carefully considered and which has more than a little assembly in there. Because that's what you do when you need to wring performance out of a system. He digs Metal because it makes him better at his job--at making the hardware optimally do what he very badly needs it to do.

Stop trying to conflate web development on future computers from beyond the moon with game development because what you know does not hold. These games are not being written in Node, they are not being written in environments with garbage collection--hell, a lot of the time they don't even use inheritance in C++ because of the cost and complexity of vtables (this is one of the reasons the standard collection libraries in C++ are a template soup, as it happens). It's not the "olden days," it's today, now. That's why these low-level APIs are coming about: because the abstractions you think don't fail don't cut it.

"Abstractions don't fail" is one of the most unintentionally funny things I've read in a very long time and I can literally think of a dozen places just off the top of my head in my very much non-gamedev job where the abstraction layers other people put in place screwed me hard. Quit while you're very far behind.


> Do you think game development followed the web off the performance-doesn't-matter cliff and went "welp, we're just going to go write everything in Ruby"?

Cross platform APIs can be native in C/C++. Ruby or Node have nothing to do with it.

> It's not the "olden days," it's today, now.

Sure, it's today now on consoles, which provide low-quality generic APIs because no one cared to provide high-quality ones. That's not a valid reason to say "fire! we now need machine code to save the gaming industry at large". It's a reason to say that console manufacturers don't care about developers beyond locking them into their platforms. Luckily this is going to change. Of course you are still free to use low-level code when it's really needed.

> I can literally think of a dozen places just off the top of my head in my very much non-gamedev job

We are talking about gaming. I can think of cases where low level development is needed too. That's irrelevant to the discussion about using cross platform APIs vs using platform locked ones.


If you are seriously going to attempt to reject the parallel between assembly code and low-level platform-specific APIs, I'm utterly done with you.

(And there's x86 and x64 assembly in UE3. For PCs, right now. But don't let that stop you from telling people how to do their jobs. It's a shame @ShitHNSays got canned.)


If you use Metal or similar, you'll have to either show users of other platforms the door or support N such APIs. Most would prefer to reduce that number, not multiply it.

All that cheering for Metal is not sincere. Talk to actual developers who are forced to support many APIs because no one cared to make one cross-platform API work well. Bottom line: we need fewer lock-in APIs promoted as the way to develop games.


> Talk to actual developers who are forced to support many APIs...

This is not how the game industry works.

There are studios specialized in porting games for specific platforms.

Usually a studio focuses on one specific platform and outsources the remaining platforms to such porting studios.

This is a common practice since the early days.


In some cases. In other cases, engine developers support multiple back ends, and for actual game developers this is simplified by using that engine. But in any case, if a game wants to be inclusive rather than exclusive, the burden of supporting multiple APIs will show up: either for the studio's own developers, as the expense of hiring contractors who do the porting, or falling to the engine developers.


Sony asked major studios if they wanted OpenGL ES 2.0 and they didn't care.

http://sandstormgames.ca/blog/tag/libgcm/

> At one point, Sony was asking developers whether they would be interested in having PSGL conform to the OpenGL ES 2.0 specs (link here). This has unfortunately never happened however, as developers seem to have mostly preferred to go with libGCM as their main graphics API of choice on PS3. This has meant that the development environment has started becoming more libGCM-centric over the years with PSGL eventually becoming a second-class citizen – in fact, new features like 3D stereo mode is not even possible unless you are using libGCM directly.

> There is no valid reason why OpenGL can't be made to perform well on consoles except the lock-in mentality which plagues the console market.

Game studios culture doesn't care about FOSS.

What matters is making the best game on the platforms the publishers are paying in advance for.


The "major studios" are mentioned here probably because most developers simply don't target PS. I.e. indie developers aren't interested in it that much. So I wouldn't take that as an indicator that developers at large don't care about cross platform APIs. If Sony would offer an open platform without barriers to enter, those answers would be very different.


Most developers don't care about the platforms with the most reach and most invested gamers. Sure. Must be why every indie developer I know would sell their eyeteeth to get even a mid level promo package on PSN.

You're taking your ideology and trying to retrofit the world to accommodate it. Doesn't work that way. In the world of what is, rather than the world of what one might like it to be, nobody really cares. Tough.


> You're taking your ideology

You are talking about Apple's spin of Metal (which is lock-in ideology). In practice it's easier for developers to use one toolkit instead of using 20 locked-in incompatible APIs.

If developers use ready engines it becomes easier for them, but that burden is shifted to engine developers then. Someone will have to deal with that major mess. There is clear pragmatic benefit in cross platform APIs for gaming, and claiming that it's just ideology is nonsense.


> In practice it's easier for developers to use one toolkit instead of using 20 locked-in incompatible APIs.

Not when that "one toolkit" is worse.

You are conflating "lock-in" with "optimal suitability for a platform". Hardware matters. Your continual handwaving can't ignore that.


> Not when that "one toolkit" is worse.

Here it's a case of "worse because it wasn't made better", not "worse because it can't be better". That's my point. So the difference in approaches to this subject, like between AMD and Apple, shows who cares about making it better and who cares about locking developers into their platform.


So, your "point", in everything that I have read from you thus far, is that, if the world only worked the way you think it should work, then OpenGL would reign supreme as king. We can just ignore the many ways in which it sucks because, hey, you're talking hypotheticals!

Right. Of course, the real world works nothing like the world that you have described, but ok, good luck with that.


There's a lot of Linux-on-the-desktop-style wishcasting in this thread.


Or rather Apple-style lock-in cheerleading.


Nobody's advocating lock-in. What is being advocated is using the right tool for the job. What has been brought up--by you--is the completely laughable notion that "abstractions don't fail" when the overwhelming majority of people involved with high-performance graphics are pretty sure that the abstractions have failed. Which is why they're going exactly the other way from your ideological wishcasting and building libraries and frameworks that are tailored to the hardware rather than using a driver to overcome the impedance mismatch at the cost of performance.

You don't know what you don't know but it isn't stopping you from talking shit. Stop.


> when the overwhelming majority of people involved with high-performance graphics are pretty sure that the abstractions have failed.

Poor abstractions. I'm not convinced that there can't be a well designed cross platform graphics API that is sufficient for the majority of cases. Prove that it's impossible or stop, because otherwise your claim that you don't advocate lock-in doesn't sound sincere.


>Prove that it's impossible.

The abominable snowman exists. Prove that it doesn't or shut up.


Anything more useful to say than trolling comments? The commenter above claimed that cross-platform graphics APIs aren't the way to go because they failed. I see no proof that it's not a possibility. They didn't even fail: they were quite useful in many cases, but with their current downsides they didn't live up to their real potential. So that can be improved by making better cross-platform APIs, instead of claiming that one has to run to hardware-specific APIs right away and there are no other options.


I'm not trolling; I'm pointing out the fault(s) in your logic. Perhaps it is a "possibility", but it doesn't exist today, and better options are available, so why would I use an inferior implementation? Because it more closely coincides with my world view?

No, I need to get software written that runs well. Now, if my requirement is to run on N different platforms (where N > 1) then I will have to look at OpenGL. If it's not a requirement then I won't waste my time. Your comments reek of ideology and completely lack practicality. You make assumptions about a complex subject which smarter people than you or I have spent years working on, and they have come to different conclusions.


> Perhaps it is a "possibility", but it doesn't exist today

So? It's not a reason not to make one or to claim that since it "failed" everyone needs to run to platform specific APIs. That was the point.


So indie developers are "most" developers? What planet do you live on? Do you know what the term "indie" comes from?


Indie means studios independent of major publishers, i.e. self-publishing ones. Of the recent games, the majority I'm actually interested in are from indie developers, and they are true works of art. The rest is mass-market junk.


That's beside the point. What you are interested in and what the majority of people are interested in do not align. And where there are more consumers, there are more developers.


Did you see some studies that say there are fewer indie studios than publisher-funded ones?


It's actually a fault of OpenGL. The main issue is that modern GPUs execute draw calls concurrently. If you've done some programming you might know that you cannot really have global mutable state together with parallel execution, yet global mutable state is the core of OpenGL. So you need to copy the state every time you launch a draw call, which is an expensive thing to do even with the hardware aids available. Definitely more expensive than just launching parallel draws, each with its own state.

OpenGL isn't the only one suffering from this; DirectX has a very similar problem up to DX11, which DX12 is out to address.

There are extensions that allow you to do the same in OpenGL, but then what would be the point of bringing in the entire OpenGL if you are only going to use an extension that is nothing like OpenGL and does the same thing the native API already does?


OpenGL 4.5 addresses some of the concurrency bottlenecks. My point was not to say that OpenGL is perfect as is, but that any potential successor can't claim OpenGL is bad because it aims to be cross-platform and therefore better alternatives all need to be platform-specific. That's false logic. A cross-platform API can be made better.


> OpenGL 4.5 addresses some of the concurrency bottlenecks.

How so? It adds DSA, which is just a different way to access global state. Global state is still the cornerstone of 4.5. The only way it addresses performance issues is through the "bindless" extension, which pretty much tosses away the entire OpenGL pipeline and leaves just a multi-draw-indirect call with everything pushed into shaders. It's too extreme for many developers, even though it's been available on non-NVidia hardware.

> My point was not to say that OpenGL is perfect as is.

Never mind me then. I only replied to your claim that it's not OpenGL's fault that it's slow on consoles.


> How so?

I mean new flush control support.


Those aren't the concurrency issues I meant. Flush control is about accumulating commands in userland and transferring them to the kernel-mode driver. The concurrency that affects performance the most is inter-draw-call concurrency, as I said above. Flush control is about GPU-CPU concurrency, which, btw, still has a long way to go to reach the level of control available in the native APIs on consoles.


Hopefully OpenGL-next won't be a minor modification of the current limited design but would create a new design from the ground up. So far they indicated that it's the goal.


Nobody can forbid you from hoping, but JFYI there have already been two major versions of OpenGL (3 and 4) in the age when all available GPUs were parallel. In fact, at the time of the OpenGL 2 design there were already parallel GPUs on the market, and everybody involved knew it was the way of the future (companies on the ARB either design their own GPUs or have access to pre-release designs).


I'm not really sure why previous major versions failed to redesign the API from the ground up to fix its major deficiencies. Was it an organizational flaw or a lack of good proposals?

I guess now they have come to a better realization of the situation, or there are now more people who actually want to improve things significantly rather than avoid such issues.


It's not a mystery to me: OpenGL is a scene description API, not a real-time graphics API. It comes from an environment where frame times under 5 seconds (not milliseconds: 12 frames per minute) are considered real-time and compatibility is much more important than performance. Apple was the only ARB member that had been pushing for OpenGL to become game-grade real-time, mainly due to the lack of any other real-time 3D graphics API on its platforms (both Mac and iOS), which is not the case any more, as Apple is pushing its own API now.


> It's not a mystery to me: OpenGL is a scene description API, not a real-time graphics API.

Rather, it originally was such an API. I'm saying: why couldn't they have drafted a new API from the ground up as one of the previous major versions? It looks like only now are they doing exactly that. So why wait so long? That's what I'm unsure of. Probably some organizational problem, or the lack of a major pusher for such a change in the past.



