Hacker News

Not trying to shit on anyone, but that screenshot reminds me of 1990s/early 2000s games. I'm sure a ton of work went into it, but having some better textures or highlighting the usability would go a long way.

Just based on the screenshot, I am not sure that I would even bother to dig too deeply into the library.



> Just based on the screenshot, I am not sure that I would even bother to dig too deeply into the library.

Adding any PBR materials as samples would have been the wrong choice, since those are not hardware- or graphics-API-dependent, and are always left for the implementor to implement themselves.

You don't want a graphics API to look nice at this abstraction level.

You get access to device resources, the shader API, etc.

Once you get triangles in, it's up to you to make it nice using the shaders you write - materials and a GI model of your choice.


> Adding any PBR materials as samples would have been the wrong choice, since those are not hardware or graphics api dependent, and are always for the implementor to implment by themselves.

That "bistro" image is all PBR materials, represented in glTF. It's supposed to look the same for all standards-compliant glTF renderers, and it pretty much does. I posted the same scene in another renderer above. It's a brightly sunlit scene with no environment shaders, so it looks rather blah. glTF and Vulkan can do more than that, but this is all the test example asked for.



I don’t see the API claiming to be a standards-compliant glTF renderer, and I would be very confused if it did claim something like that as a feature. That said, a ‘render glTF PBR’ sample would not be a bad thing, to show the author’s intent, how to organize things, etc.


Now look at real 1990s/early 2000s screenshots. Your memory is deceiving you.

This screenshot is far from current AAA games, but there is no way a game made for 2000-era hardware could render a scene like this.


It looks about on brand with the static renders used in a lot of Final Fantasy games of the PS1 era, albeit with higher pixel density of course, so I can see the resemblance. That said, this is definitely rendered in-engine, so while I can see how the GP's memory palace built that impression, you're definitely right that no game from that era was doing graphics like this in-engine.


Half-Life 2 had better-looking cityscapes, so early 2000s is accurate.


Half-Life 2: Lost Coast from 2005 might be a fair comparison for this droll hypothetical.

https://youtu.be/j-Iykz0gb7Q (video uploaded 2006)


Pre-rendered they were. Vaguely reminds me of Myst or FF7 quality.


For those reading who may not remember or have played the original FF7, here are links to pre-rendered backgrounds in FF7:

https://www.jmeiners.com/pre-rendered-backgrounds/img/ff7.jp...

and the image in the article about Meta's library:

https://www.khronos.org/assets/uploads/blogs/2023-july-blog-...


Not apples-to-apples because FF7 was NTSC resolution and heavily compressed.

If FF7 were rendered at the same resolution, I think it would be comparable. Here's an FF7 AI upscale mod for reference:

https://www.resetera.com/threads/a-full-high-res-ai-upscale-...


Pre-rendered? Sure. In-game? No. I think the art direction is half of the problem here. You can place lighting much better and the textures are lacking.


there is no "art direction" by Meta or the people working on this project; this is a reference scene for PBR pipelines.

https://developer.nvidia.com/orca/amazon-lumberyard-bistro


You are right, the lighting makes the original scene look worse.


Unreal and Quake engines handled light and textures far better than that.


the screenshot could/should probably be better, but that doesn't mean the library is incapable of producing higher-quality renders. I haven't dug in, but I am assuming this is basically Meta's WGPU. if so, these sorts of libraries are low-level abstractions over different platforms, on top of which you can build high-quality render pipelines that run anywhere. you could build an N64-quality rendering pipeline with little effort, or something rivaling AAA studios with a lot more knowledge and effort.

I guess, to make a poor analogy, your comment is sort of like looking at a still frame of a poorly shot movie and complaining that the codec is shit.


It does seem to be Meta's answer to WGPU.

The picture looks like they didn't have automatic tonemapping, the rendering equivalent of auto exposure control. So the picture is too dim. I brought it into a photo editor, saw that the top third of the intensity space was empty, used "Levels", and it looked much better.
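A "Levels" fix like that is just a linear remap of the occupied intensity range onto the full output range. A minimal single-channel sketch of the idea (illustrative only, not any renderer's actual tonemapping code):

```rust
/// Stretch intensities so the occupied range [0.0, max_in] fills
/// the full [0.0, 1.0] output range, like "Levels" in a photo editor.
fn levels(pixel: f32, max_in: f32) -> f32 {
    (pixel / max_in).clamp(0.0, 1.0)
}

fn main() {
    // Top third of the intensity space is empty: values only reach ~0.66.
    let dim = [0.10_f32, 0.33, 0.66];
    let fixed: Vec<f32> = dim.iter().map(|&p| levels(p, 0.66)).collect();
    println!("{:?}", fixed); // brightest pixel now maps to 1.0
}
```

A real auto-exposure pass would estimate `max_in` (or a log-average luminance) from the frame itself rather than hard-coding it.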

That's a standard glTF test scene, called "bistro". Here's the same scene, rendered with Rend3/WGPU.[1] Here's the source code for that example.[2] Rend3 is a level above WGPU; it deals with memory management and synchronization, so you just create objects, materials, transforms, and textures, then let the renderer do its thing. Rust handles the object management via RAII - delete the object, and it drops out of the scene.
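The "delete the object, and it drops out of the scene" behavior is plain Rust RAII. A self-contained sketch with hypothetical names (this is not the actual rend3 API):

```rust
use std::cell::RefCell;
use std::collections::HashSet;
use std::rc::Rc;

// Hypothetical stand-in for a renderer's scene state.
#[derive(Default)]
struct Scene {
    objects: RefCell<HashSet<u64>>,
}

// Handle returned when an object is added; owning it keeps
// the object alive in the scene.
struct ObjectHandle {
    id: u64,
    scene: Rc<Scene>,
}

impl Scene {
    fn add_object(self: &Rc<Self>, id: u64) -> ObjectHandle {
        self.objects.borrow_mut().insert(id);
        ObjectHandle { id, scene: Rc::clone(self) }
    }
}

// RAII: dropping the handle removes the object from the scene.
impl Drop for ObjectHandle {
    fn drop(&mut self) {
        self.scene.objects.borrow_mut().remove(&self.id);
    }
}

fn main() {
    let scene = Rc::new(Scene::default());
    let handle = scene.add_object(1);
    assert!(scene.objects.borrow().contains(&1));
    drop(handle); // object drops out of the scene
    assert!(scene.objects.borrow().is_empty());
    println!("ok");
}
```

A real renderer would store meshes, materials, and transforms behind the handle rather than a bare id, but the ownership pattern is the same.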

Looking at Meta's examples, there are too many platform-specific #ifdef lines. More than you need with WGPU. Probably because WGPU is usually used with something like Winit, which abstracts over different window systems.

We'll have to wait for user reports about performance. Meta didn't show any video. Here's a test video of mine using Rend3/WGPU on a town scene comparable to the "bistro" demo.[3] This is a speed run, to test dynamic texture loading and unloading while rendering. The WGPU people are still working through lock conflicts in that area. The idea in Vulkan land is that you should be able to load content while rendering is in progress. For that to be useful, all the layers above Vulkan also have to have their locking problems hammered out. Most open source game engines don't do that yet. Unreal Engine and Unity do, which is why you pay for them for your AAA title.

[1] https://raw.githubusercontent.com/BVE-Reborn/rend3/trunk/exa...

[2] https://github.com/BVE-Reborn/rend3/blob/trunk/examples/scen...

[3] https://video.hardlimit.com/w/sFPkECUxRUSxbKXRkCmjJK


Why does Meta need an answer to WGPU? How does the existence of WGPU create problems for them, and how does this new thing solve problems that anyone else has with WGPU?

This just feels like sour grapes about the fact that WGPU excluded Khronos when it was developed, so Khronos wants their own, with maybe a bit of promotion-driven development on Meta's part.


hi John! you know a lot more about this stuff than I do. is it possible they just haven't implemented a full PBR pipeline for this demo/screenshot, or do you think this (the differences in the two screenshots) is more an indication of what would likely be areas for future development?


They seem to have implemented everything that the "bistro" scene calls for. I don't know if those hanging colored lights emit light, though. Rend3/WGPU doesn't handle large numbers of light sources yet. But you wouldn't see them in daylight anyway, because this is high dynamic range rendering, and, as in real life, those lights are dim relative to the sun.

Here's the same scene in Godot.[1] This was modified a bit, and has accurate values for the lamp illumination. So they are totally washed out by the sun.

And here it is in several other renderers, with a video.[2]

The original scene was in .fbx, from Amazon's "Lumberyard" project.[3] That project started as the Crysis engine, was bought by Amazon, spun off as open source, was renamed Open 3D Engine, and is still getting GitHub changes, so it's not dead.[4]

There are many open source game engines. Most of them get stuck at "mostly works, not ready for prime time". That's where the problems get hard and fixing them stops being fun.

[1] https://github.com/godotengine/godot/issues/74965

[2] https://www.ronenbekerman.com/orca-amazon-lumberyard-bistro/...

[3] https://developer.nvidia.com/orca/amazon-lumberyard-bistro

[4] https://en.wikipedia.org/wiki/Amazon_Lumberyard


The problem is, it implies that the library isn't capable of higher quality. Maybe that's because it is a lowest common denominator. Or because the Metaverse isn't focusing on high-end graphics, since their previous releases looked poor. Or maybe this is something like VRML: designed to be fairly barebones, so people would use it for museum websites and educational tools but not for graphically intense games.


I always see this idea on HN that a library, website, or framework should market itself for mass appeal and adoption.

this is made for people building rendering engines on top of it. if you are a software engineer with the knowledge necessary to do that, the screenshot is probably not going to influence you, because you understand what this is for. if you aren't, why should they be marketing to you with eye candy?


For better or worse, these graphics API shots are not the place to show cool shaders. SIGGRAPH papers are usually the same dry, boring test scenes. It's just the culture of it.


The textures are ok, but the lighting is super flat. People are used to games making at least some attempt at global illumination, whether it's prebaked light maps, or faking it with SSAO, or anything to avoid surfaces having a totally uniform brightness across the whole scene.
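For a sense of why flat lighting stands out: even the cheapest fake, scaling the ambient term by a baked ambient-occlusion factor, breaks up uniform brightness. A toy single-channel sketch (illustrative only; names and values are hypothetical):

```rust
/// Lambert diffuse plus an ambient term scaled by a baked
/// ambient-occlusion factor (0.0 = fully occluded crevice, 1.0 = open).
fn shade(n_dot_l: f32, albedo: f32, ambient: f32, ao: f32) -> f32 {
    let diffuse = n_dot_l.max(0.0) * albedo;
    // AO darkens crevices instead of lighting every surface
    // at the same constant ambient brightness.
    diffuse + ambient * ao * albedo
}

fn main() {
    let open = shade(0.0, 0.8, 0.2, 1.0);    // surface facing away, unoccluded
    let crevice = shade(0.0, 0.8, 0.2, 0.2); // same surface inside a crevice
    println!("open = {open}, crevice = {crevice}"); // crevice is darker
}
```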


Your rendering API is not going to implement GI for you, and having it in a sample app is kind of misleading; that's not really the point. It's probably a mistake to include that as a sample scene, as it creates the impression that it's trying to be a game engine. A few material spheres and test meshes would probably be a better example.


Agreed, just pointing out why the parent commenter gets the “1990s/early 2000s game” impression from the screenshot.


Whether you like the textures has nothing to do with whether the graphics library is any good.


This is a great example of a principle I heard from CoderFoundry: "People are visual buyers. If it looks good, people assume the code is good."


Sure but it has everything to do with whether I can easily tell that it's good.


I can slap together a few high-res textures in SDL_Renderer, and maybe hack in some pretty shaders. Doesn't mean it's a good API.


The original Vulkan Demos were butt ugly too. Then 2 or 3 years later that super flashy DOOM remake went all-in on it and shut everyone up for a while.


Counter Strike 1.6 energy


Let’s go go go!


I like it. It manages to be on the healthy side of the uncanny valley, so as to feel more like an actual inhabitable world instead of a disconcerting knock-off of the real world.


The Utah teapot is never around when you need it.


You are too kind. Mid-to-late-90s game. What is up with those shadows? I have written a 3D-engine with better image quality than this like 20 years ago.

At least it is not a teapot.


> I have written a 3D-engine with better image quality than this like 20 years ago.

Well good news, it's not a 3d engine at all! It's a nice common API to cover all the existing graphics APIs.


A large part of creating a 3D engine is figuring out what capabilities can be used, with what performance, across different hardware. If this is only an abstraction, it won't solve anything.


It's not the textures, it's the lighting; it makes everything feel very flat, especially when contrasted with the RTX'd stuff we all have in memory to some degree.


Lmao, people are absolutely oblivious to the effects of lighting and it shows.

Go watch Nvidia’s demo of their lighting and scene modification/remastering tool.



