Hacker News

OpenGL is pretty. Much prettier than these Metal and Vulkan abominations.

The difference is that OpenGL is designed to be easy for humans. glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd(); you can't beat that. The issue is that it is hard for the driver to optimize.
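To make the comparison concrete, here's a minimal sketch of that classic fixed-function style in C. It assumes a legacy (compatibility-profile) GL context is already current, created elsewhere with something like GLUT or SDL:

```c
/* Immediate-mode triangle: one driver call per vertex attribute.
   Assumes a legacy OpenGL context is already current. */
#include <GL/gl.h>

void draw_triangle(void)
{
    glBegin(GL_TRIANGLES);
    glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-0.5f, -0.5f, 0.0f);
    glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 0.5f, -0.5f, 0.0f);
    glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  0.5f, 0.0f);
    glEnd();
}
```

Every vertex is a separate function call into the driver, which is exactly why this is trivial to write but hard to make fast.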

That's where Metal and Vulkan come into play. These are low-level APIs that sacrifice user friendliness for greater control over the hardware. They are designed for 3D engines, not for application developers.




Nope, glVertex3f was deprecated years ago by OpenGL itself. That is not the way the API works any more. [1]

Look into what it takes to write the minimum viable OpenGL program, written using non-deprecated routines, that puts a textured triangle on the screen. It sucks. On top of that, OpenGL is slow and gives you no way to create programs with smooth performance -- for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.
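For a sense of the verbosity, here's a condensed sketch of the non-deprecated path, with error checking, the GLSL source strings, texture setup, and context/loader setup all elided (assumes a 3.3 core context and loaded function pointers, e.g. via GLAD):

```c
/* Rough shape of a core-profile triangle. vs_src/fs_src are GLSL
   strings you must write yourself; verts is a float array. */
GLuint vs = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(vs, 1, &vs_src, NULL);
glCompileShader(vs);                 /* ...then check GL_COMPILE_STATUS */
/* repeat for the fragment shader, then: */
GLuint prog = glCreateProgram();
glAttachShader(prog, vs);
glAttachShader(prog, fs);
glLinkProgram(prog);                 /* ...then check GL_LINK_STATUS */

GLuint vao, vbo;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof verts, verts, GL_STATIC_DRAW);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void *)0);
glEnableVertexAttribArray(0);

/* per frame: */
glUseProgram(prog);
glBindVertexArray(vao);
glDrawArrays(GL_TRIANGLES, 0, 3);
```

And that's before the texture: add glGenTextures, glTexImage2D, sampler uniforms, and texture coordinates as a second vertex attribute.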

1990s-style OpenGL was good for the time. In 2018, OpenGL is a pile of poop.

[1] https://www.khronos.org/opengl/wiki/Legacy_OpenGL


> for example, it will randomly recompile shaders behind your back while you are trying to have a smooth frame rate.

What? I've written commercial games with opengl on osx/ios and my experience doesn't show that at all.


Maybe the games were not very complex? Professional game programmers building games with lots of shaders are very familiar with what I am talking about. See for example this thread:

https://www.opengl.org/discussion_boards/showthread.php/1998...


> What? I've written commercial games with opengl on osx/ios and my experience doesn't show that at all.

State-based recompilation is a known issue in many GL drivers, particularly on mobile. E.g. changing blending settings may cause shaders to get recompiled. This can take up to a second.

Some engines work around this by doing a dummy draw to an offscreen surface with all pipeline configurations that they use at init time. This (usually) guarantees that all the shaders are pre-compiled.
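A sketch of that warm-up trick; the helper names (begin_offscreen_pass, apply_render_state, draw_dummy_quad) are invented stand-ins for whatever the engine provides:

```c
/* Hypothetical warm-up pass: cycle through every render-state combination
   the engine will use and issue a throwaway draw into an offscreen FBO, so
   any driver-side shader recompilation happens at load time, not mid-frame. */
void warm_up_pipelines(void)
{
    begin_offscreen_pass();                      /* bind a small FBO */
    for (int i = 0; i < num_engine_states; i++) {
        apply_render_state(&engine_states[i]);   /* blend, depth, cull, ... */
        glUseProgram(engine_states[i].program);
        draw_dummy_quad();                       /* tiny draw nobody sees */
    }
    end_offscreen_pass();
}
```

The key point is that the *combination* of program and fixed-function state is what triggers the hidden recompile, so each combination has to be exercised once.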


Also, you can handle caching of compiled shaders yourself now (glProgramBinary).
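A sketch of that caching path (GL 4.1+ or ARB_get_program_binary). Note the binary is only valid for the same driver and GPU, so a real implementation needs a fallback to compiling from source when loading fails:

```c
/* Save a linked program's driver-compiled binary so later runs
   can skip compilation. Error handling elided for brevity. */
#include <stdio.h>
#include <stdlib.h>

void save_program_binary(GLuint prog, const char *path)
{
    GLint len = 0;
    GLenum format = 0;
    glGetProgramiv(prog, GL_PROGRAM_BINARY_LENGTH, &len);
    void *buf = malloc(len);
    glGetProgramBinary(prog, len, NULL, &format, buf);

    FILE *f = fopen(path, "wb");
    fwrite(&format, sizeof format, 1, f);   /* binary format token */
    fwrite(buf, 1, len, f);                 /* opaque driver blob */
    fclose(f);
    free(buf);
}
/* Later: read format + bytes back, call
   glProgramBinary(prog, format, buf, len), then check GL_LINK_STATUS. */
```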


I think the recompilations being talked about here are shaders generated by the OpenGL implementation behind your back. That is, your program never sees them as shader or program objects, because they implement some permutation of blend mode, depth test, culling type, etc.


The non-deprecated OpenGL code for a hello world triangle is still an order of magnitude less verbose than Vulkan though.


While Vulkan is a bit verbose, it's not an order of magnitude difference if you follow modern OpenGL best practices. If you rely on default state, the default framebuffer, and implicit synchronization, you can squeeze it down to a few hundred lines, but that's not a good foundation to build practical apps on.

To give a ballpark figure, my Vulkan "base code" is less than 2x what my OpenGL boilerplate is for the same functionality. The big difference: the Vulkan code is easy to understand, but the GL code is not.

Comparing "Hello World" doesn't make much sense, OpenGL gets really darn complicated once you get past the basics.


Vulkan code is extremely front-loaded. HelloTriangle is much longer. A complete application can be significantly shorter.


In my opinion a similar difference exists between CUDA and OpenCL. OpenCL takes more code to get something simple going. But at least it doesn't break if you upgrade your gcc or use a different GPU vendor.


Each to their own, but over the last 6 months I've written a graphics engine in OpenGL + SDL. Once you truly understand modern OpenGL you realise how beautiful it is.


You will think it's less beautiful when you ship that game on several platforms and find that it has different bugs on each platform, on each hardware version, and on each driver version. And most of these bugs you can't fix or work around, you just have to bug the vendor and hope they ship a fix in a few months, which they usually won't because your game is too small for them to care about.

This happens in other APIs too (we definitely had it happen with DX11), it's just that OpenGL is a lot more complicated than anything else due to its history, so it has proportionally more bugs.


> glBegin(GL_TRIANGLES); glVertex3f(x, y, z)...; glEnd();

That's fine for a "hello triangle" program, but quickly becomes ridiculous for anything approaching a serious engine. There's a reason that glDrawArrays() has been around since 1995 (and part of the core specification since 1997).
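For reference, the array-based path submits the same three vertices in one call instead of one call per vertex (client-side arrays, as in GL 1.1; modern code would put the data in a buffer object instead):

```c
/* Same triangle, one draw call. Assumes a legacy GL context. */
static const GLfloat verts[] = {
    -0.5f, -0.5f, 0.0f,
     0.5f, -0.5f, 0.0f,
     0.0f,  0.5f, 0.0f,
};

glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, verts);
glDrawArrays(GL_TRIANGLES, 0, 3);
```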


Made me want to revisit the good old NeHe tutorials for a quick browse :)

http://nehe.gamedev.net/tutorial/creating_an_opengl_window_(...

I wonder how much of this stuff is deprecated now.




