Hacker News
Unreal 5.5 is a big deal [video] (youtube.com)
122 points by RobinHirst11 19 hours ago | 64 comments





Gotta love videos with useless titles and almost no text in the description...

Unreal 5.5 changes: https://dev.epicgames.com/documentation/en-us/unreal-engine/...

Page on MegaLights, which appear to be the focus of the video: https://dev.epicgames.com/documentation/en-us/unreal-engine/...

Key quote from the above:

MegaLights is a stochastic direct lighting technique, which solves direct lighting through importance sampling of lights. It traces a fixed number of rays per pixel towards important light sources. If a light source is hit by a ray, then that light's contribution is added to the current pixel.

This approach has a few important implications:

Direct lighting is handled by a single pass in a unified manner, replacing multiple existing Deferred Renderer shadowing and shading techniques.

MegaLights not only reduces the cost of shadowing, but also reduces the cost of the shading itself.

MegaLights has a constant performance overhead, but quality may decrease as lighting complexity increases at a given pixel.
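
If it helps to see what "importance sampling of lights" means in practice, here's a rough CPU-side sketch (the names and the simple distance-squared weighting are my own illustration, not Epic's implementation): each pixel picks one light with probability proportional to its estimated contribution, traces a single shadow ray towards it, and divides by the pick probability so the one-sample estimate stays unbiased.

    #include <random>
    #include <vector>

    struct Vec3 { float x, y, z; };
    struct Light { Vec3 position; float intensity; };

    // Crude unshadowed contribution of a light at a shading point: intensity
    // falling off with squared distance. A real renderer also folds in the
    // BRDF, light size, cone angles, etc.
    float estimateContribution(const Light& l, const Vec3& p) {
        float dx = l.position.x - p.x, dy = l.position.y - p.y, dz = l.position.z - p.z;
        return l.intensity / (dx * dx + dy * dy + dz * dz + 1e-4f);
    }

    // One-sample importance-sampled estimate of direct lighting at a point.
    float sampleDirectLighting(const std::vector<Light>& lights, const Vec3& p,
                               std::mt19937& rng,
                               bool (*isVisible)(const Light&, const Vec3&)) {
        std::vector<float> weights(lights.size());
        float total = 0.0f;
        for (size_t i = 0; i < lights.size(); ++i) {
            weights[i] = estimateContribution(lights[i], p);
            total += weights[i];
        }
        if (total <= 0.0f) return 0.0f;

        // Pick one light in proportion to its estimated contribution.
        std::uniform_real_distribution<float> pick(0.0f, total);
        float r = pick(rng);
        size_t chosen = 0;
        while (chosen + 1 < lights.size() && r >= weights[chosen]) {
            r -= weights[chosen];
            ++chosen;
        }

        float pdf = weights[chosen] / total;                  // probability of this pick
        if (!isVisible(lights[chosen], p)) return 0.0f;       // the single shadow ray was blocked
        return estimateContribution(lights[chosen], p) / pdf; // unbiased one-sample estimate
    }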


With Unreal becoming the Chrome of the game engine world and Unity dying a slow death triggered by its corporate overlords, I can't help but feel it's only a matter of time until Epic becomes the bad guy.

I don't want them to, just seems like the circle of life.


Unity is still a lot more popular than Unreal, and alternatives exist and are created all the time -- others have mentioned Godot and the older CryEngine. People are working on Rust engines. Lots of commercial developers use their own tech. Game engines don't have to gravitate toward a standard; they only do when the cost/benefit works in their favor.

No it's not - I know a few people who work in the industry, and Unity's dead on new projects - everyone's moving to Unreal. Indies etc. might still use it (or other engines), but professionals have been steadily moving away from the engine for years, for all sorts of reasons, the recent licensing shakeups being one of them.

It seems this is getting downvotes, so I need to clarify that this doesn't represent any sort of personal bias - it's just that Unreal has way more (and better paid) jobs than Unity does.

Trust is earned in drops and lost in buckets.

Unity took all the buckets outside, burned them, then demanded money from all their customers. I've not seen new projects using them, because stakeholders are concerned about surprise expenses down the line.


Unity is more popular amongst hobbyist / aspirational game developers; it's not at all popular in AAA.

This is completely false. Unity is popular with AAA, as is Unreal Engine. Both are less popular than having an in-house engine.

Source: I work in the games industry.


Which AAA games are Unity? Maybe my understanding of the term is more subjective than it actually is, but Wikipedia's chosen examples - Pokémon Go, Monument Valley, Call of Duty: Mobile, Beat Saber and Cuphead - are either very badly downplaying its use or showing that AAA means something very different than I think.

Yeah our whole studio shifted from Unity to Unreal. If console is in the equation then it's even more reason to go Unreal.

Strongly feel that Epic is playing the long game to monopolize game dev. Their "free games" teaser on EGS is an attempt to gain market share.

Once they have captured both the engine and distribution market share, the anti-consumer policies will begin.

I remain highly pessimistic and suspicious of Epic's long-term strategy.


They don't have to be bad guys to be at the forefront of a growing monoculture, precisely because Unity shit the bed and the remaining competition for indie devs tapers off.

I love Godot for game dev, but it's not the full stack package, and last time I checked it wasn't great for 3D; still, it's basically the best you have short of modding Doom.

RPG Maker and Adventure Game Studio corner the market on a niche genre of pixel-art RPGs and point-and-clicks if you want to stay 2D.

It's expensive to build a new engine from scratch, especially when Unreal says it can cross-build to PC, Xbox, PS4/PS5, Switch, Mobile, etc.

The problem is that making a generic does-anything engine means it can't be optimal, and more and more these days studios are using the engine but don't understand how to optimise within it. So now you're downloading shader caches from Steam because you can pull a pre-compiled shader across the network faster than it will build at runtime, and you still expect the game to run smooth as butter.

I build shit for the web, and this is the exact trade-off you make when you, say, choose Rails for your server over a more hands-on setup. I can spin up web apps in Rails with my eyes closed, it's that easy, but the performance cost has a high floor.


Unity is not dying lol. Unreal is only becoming the "Chrome of the game engine world" in the AAA space. It's not a nice tool to use for independent projects at all, speaking as someone who works with it professionally. Most of what makes Unity great for indies is missing from Unreal (good docs being the big one).

Luckily there are still loads of games using their own engine tech out there, but the pool is shrinking.

It would be cool if companies in their position made two versions of their products: one that continues on this trajectory of {adding bloat | becoming more modern}, and another that prioritizes avoiding performance regressions by a few key metrics.

Unity is not dying.

Any evidence to back up "With Unreal becoming the Chrome of the game engine world"?

I wouldn't worry. This cool tech looks mid when the games are in motion. It's great for static renders and tech demos, but a lot of folks are just turning it off.

There are still a lot of engines. Plenty of people are starting to incorporate Godot, Unity hasn't disappeared, and companies build and leverage their own engines all the time.

The main reason Epic Games is so big is Fortnite, not Unreal Engine. They're still incredibly focused on that metaverse play.


I mean, Unreal is still Epic and while they’re sometimes underdogs, they’re pretty much always the bad guys.

It's his stick; he uses the same title for every minor UE release.

*shtick

*schtick

I jumped directly to the video and understood the deal pretty quickly! I wonder if there are super amateurs here on HN using this to create short movies.

Can someone ELI5 how MegaLights works so much better than the prior approach, and what the tradeoffs might be? (I'm not a game developer)

Reading the link below, the best I can surmise is that they shifted the prioritization from following a light source through all the different interactions it has, to the opposite: identifying, per pixel, how that pixel is illuminated via a prioritization scheme for light sources, including any nearby pixels. So it's a per-pixel approximation that results in something far more accurate than the prior 'declared brute force' approach? Each pixel gets 2-5 inbound rays that have the highest impact.

How did I do?

https://dev.epicgames.com/documentation/en-us/unreal-engine/...


In essence, MegaLights traces one shadow ray per pixel, then uses some temporal de-noising to try and smooth away the noise. This is in contrast to how games traditionally calculate shadows, which is by rendering a shadow map from the point of view of each shadow-casting light in the scene. For a small number of lights, the shadow map approach will be considerably faster. What MegaLights enables is effectively an unlimited number of shadow-casting lights for a fixed cost (because it only ever traces a fixed number of rays per pixel), but at a much higher base performance cost, and with the potential for noisy artefacts caused by the denoising.

In addition, it samples a limited number of lights each frame (sorted by likely importance based on position), and accumulates lighting over time: the only way you get a stable, nice image is when you leave your camera and world stable for a second. Move, and it blurs.
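
To make the "accumulates lighting over time" part concrete: at its core it's just an exponential moving average per pixel. A single-channel toy sketch (my own simplification, ignoring the reprojection and history rejection a real denoiser needs):

    #include <vector>

    // Blend this frame's noisy one-sample-per-pixel lighting into a history
    // buffer. With a small alpha the image converges to a clean result after
    // the camera sits still for a moment; when things move, the history is
    // stale, so you get either noise (if you reject it) or blur (if you keep it).
    void temporalAccumulate(std::vector<float>& history,
                            const std::vector<float>& currentFrame,
                            float alpha /* e.g. 0.1f */) {
        for (size_t i = 0; i < history.size(); ++i)
            history[i] = (1.0f - alpha) * history[i] + alpha * currentFrame[i];
    }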

It's actually much less accurate, but much faster.

Typical rasterized pipelines with deferred rendering:

1. For each light, set up a virtual camera at the light, and
   1a. For each mesh, rasterize depth to a texture for that light (the shadow map)
2. For each mesh, rasterize material properties and depth to a texture for the main camera
3. For each pixel on screen:
   3a. For each light in the pixel's bin: sample the shadow map several times to determine if the pixel is in the light's shadow, and if not, apply the lighting for that light
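
For anyone who hasn't written a renderer, step 3a boils down to a depth comparison against the texture rendered in step 1a. A toy CPU-side sketch with made-up names (a real engine does this in a shader, usually with several filtered samples):

    struct Vec3 { float x, y, z; };

    // Project the pixel's world position into the light's clip space, fetch the
    // depth the light rasterized at that texel, and compare. The bias hides
    // self-shadowing acne; tuning it (plus cascades and sampling patterns) is
    // the fiddly part mentioned further down.
    bool inShadow(const Vec3& worldPos, const float* shadowMap, int mapSize,
                  const float lightViewProj[16], float bias = 0.002f) {
        // worldPos transformed by the light's view-projection matrix
        // (w assumed 1, i.e. an orthographic light).
        float x = lightViewProj[0] * worldPos.x + lightViewProj[4] * worldPos.y +
                  lightViewProj[8] * worldPos.z + lightViewProj[12];
        float y = lightViewProj[1] * worldPos.x + lightViewProj[5] * worldPos.y +
                  lightViewProj[9] * worldPos.z + lightViewProj[13];
        float z = lightViewProj[2] * worldPos.x + lightViewProj[6] * worldPos.y +
                  lightViewProj[10] * worldPos.z + lightViewProj[14];

        // NDC [-1, 1] -> texel coordinates.
        int u = static_cast<int>((x * 0.5f + 0.5f) * (mapSize - 1));
        int v = static_cast<int>((y * 0.5f + 0.5f) * (mapSize - 1));
        if (u < 0 || v < 0 || u >= mapSize || v >= mapSize) return false;

        float occluderDepth = shadowMap[v * mapSize + u]; // closest surface the light saw
        return z - bias > occluderDepth;                  // something nearer blocks the light
    }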

By contrast, megalights:

1. For each mesh, rasterize material properties and depth to a texture for the main camera
2. For each pixel, pick a random light, influenced by how close it is, how bright it is, etc.
3. For each pixel, raytrace to the light. If you hit another object before the light, then you're in shadow. If not, the light is visible and affecting the point, so write the light to the direct light texture.
4. For each pixel, denoise the direct light texture by essentially blending spatially (with nearby pixels) and temporally (with pixels from the previous frame's direct light texture).
5. For each pixel, sampling the material properties and the denoised direct lighting texture, apply the lighting to the pixel and write out the final result.
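
And to show why one ray per pixel can still end up looking smooth, here's a toy single-channel version of the spatial half of step 4 (my own simplification; the real thing weights neighbours by depth/normal similarity so lighting doesn't bleed across edges, and also blends temporally with the previous frame):

    #include <algorithm>
    #include <vector>

    // Average each pixel's noisy one-sample lighting with its neighbours.
    std::vector<float> spatialDenoise(const std::vector<float>& noisy,
                                      int w, int h, int radius = 2) {
        std::vector<float> out(noisy.size());
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                float sum = 0.0f;
                int count = 0;
                for (int dy = -radius; dy <= radius; ++dy) {
                    for (int dx = -radius; dx <= radius; ++dx) {
                        int nx = std::clamp(x + dx, 0, w - 1); // clamp to image border
                        int ny = std::clamp(y + dy, 0, h - 1);
                        sum += noisy[ny * w + nx];
                        ++count;
                    }
                }
                out[y * w + x] = sum / count;
            }
        }
        return out;
    }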

The rasterized method is:

* Expensive - having to cull + rasterize every mesh per light means you can only have a few shadow-casting lights
* Expensive a second time - each pixel needs to loop over all the lights in its bin
* Expensive yet again - it's a lot of memory usage to store all those shadow map textures
* Expensive a fourth time - to get good results, you need multiple samples of the shadow map per pixel, which is a lot of texture reads
* Poor quality - shadow maps are a discrete approximation of shadows. Even with multiple samples, you'll often have poor quality and even artifacts unless you have either artists or a very complicated system to tweak shadow biases and cascades.

Megalights (stochastic light sampling), by contrast:

* Has a higher base overhead - ray tracing is expensive!
* Scales much better - with a fixed number of rays per pixel (usually 0.5, 1, or 2), the time and memory cost at a given resolution depends mostly on the complexity of the BVH used to accelerate the ray traces (though Nvidia research shows the random light sampling itself can be surprisingly slow if you're not careful). Forget having only a couple of shadow-casting lights, now you can have thousands!
* Looks much better, with much less work - no more adjusting biases or cascades or shadow sampling patterns. There's no need, since ray tracing gives a fully continuous representation of visibility between two points, unlike shadow maps. You can also integrate proper transparencies, refractions, reflections, volumetrics, emissive meshes or textures, IES light profiles, etc. with ray tracing, unlike rasterization.
* Has different failure modes - with rasterization, too many lights or too complicated lighting will kill performance, but not quality. With megalights, performance will be fine, but you'll end up with too much noise for the denoiser to handle, which will look bad.


Thank you for this wonderful comparison!

With megalights, could you pick multiple random lights per pixel to increase accuracy?


I suppose I am getting older and that's the reason, but I recently tried playing Horizon Zero Dawn LEGO and Cyberpunk and... the lighting was too much. I can't focus on what matters on the screen (can hardly find the hot spots), to the point where I tried to lower the quality settings to have fewer lighting elements and flatter textures. I noticed for those games that I really need high FPS (>60), otherwise I can't focus well enough.

It's not just you. There's a real problem right now with the push towards graphical fidelity (specifically "realism") coming at a cost to readability -- most of which is informed by careful choices in the game's art (lighting, level design, style, etc.)

The main example I can think of is the "yellow paint" discourse w.r.t. games like the Resident Evil 4 remake and Final Fantasy 7: Rebirth. Freya Holmer has a really great thread on the latter: https://x.com/FreyaHolmer/status/1756276112462627119

To be clear, I don't think the push towards realism is at odds with having a stylish and readable game. But the more dimensions you're dealing with (greater complexity in lighting models, materials etc.) the more difficult it gets to tie everything together into something cohesive.


Two Minute Papers released their video on 5.5 a few days ago, at about half the length. https://www.youtube.com/watch?v=zzPxWXEZZcc

The advanced capabilities of state-of-the-art game engines are extremely impressive in a technical sense. Unfortunately, in recent years this cutting-edge tech is rarely the enabling element in delivering games I enjoy playing. This is one reason I've changed my gaming focus to retro-gaming (of course, the DLC and other monetization-centric sins of big game studios are also a major factor).

While there are a few newer games I do enjoy, as far as I can tell, the bleeding edge features aren't needed to create what I find enjoyable about them. For me, I think the issue is that, outside rare exceptions, beyond a certain point increasing fidelity doesn't add value to game play. Admittedly, this is my own subjective assessment. Although I'm not in the game industry, I also worry that the increasing breadth and complexity of SOTA engines requires increasingly specialized knowledge and skill sets to leverage thus raising the bar out of reach of the smaller developers who often create the new games I do enjoy. It would be reassuring to hear from smaller and indie game developers who've found the new engine features in recent UE versions to be both accessible on small-team time budgets and enabling of significant new game play value.

Outside of gaming, I think newer capabilities like using SOTA engines for real-time virtual sets during film production are more creatively exciting because they can enable big budget story-telling on smaller and indie budgets.


Very impressive! I have to wonder, as someone in the game development world working on a 2D game, what the budget of a game would need to be to take full advantage of Unreal 5+.

Cities: Skylines 2 is a good accessible example (even though it's made with Unity). Via mods, editing a single property to make it look amazing takes several hours. Time is spent adding texture decals, doodads, importing new textures, and playing around with the texture geometry (e.g. adjusting the grassy area size).

It seems like this depth of detail requires an exponential time investment on the devs' part. Small wonder CS2 launched with very few building options?


There are two things going through my mind after watching this video:

1st) What an amazing piece of technology this tool is.

2nd) How many years I would probably have to spend to learn all the ins and outs of that program from scratch.


Call me when the motion clarity isn't garbage.

Tell us more?

Unreal Engine 5 relies very heavily on temporal algorithms and upscaling to deliver playable performance, which looks fine for stationary shots but often makes it look like someone smeared vaseline all over your screen whenever the camera is moving.

That's why all UE5 stuff has a 'look'.

While this looks astounding compared to where computer graphics was decades ago, it still often looks fake, both the terrain and the characters. Why is that, what is left to improve? Obviously character motion (maybe more variation so every motion captured playback is not identical), but even the cities and nature have some quality that seems fake. Maybe it's a limitation of my monitor, the fact that it can't display true daylight colors with UV? But live film or video looks reasonably real on the same monitor. Something (or several things) is still missing, the results still look Unreal.

Check out videos on YouTube of how movie studios are using unreal for virtual sets.

I think a lot of it is artistic choice.


I feel the same. Even the narrator's voice sounded robotic to me.

> results still look Unreal.

Maybe it's the NAME :)


Slightly on topic for anyone interested: Unreal Engine 5 now runs on WebGPU! (Unofficially, through my company, a third party.)

Demo here:

https://play.spacelancers.com/


Are you looking to get acquihired?

What's to stop Epic Games from building their own?

Is this a lot of effort to build and maintain? Seems like they could easily pull something out of Apple's playbook and leave you with a lot of tech debt.

Not trying to insult or belittle, just really curious about the line you're walking and your thinking about the ecosystem you're in.


What are the performance requirements for all these new fancy features? What kind of systems (console, PC+video card) can actually use them at a predictable frame rate?

A mid tier one of 2030 should do the trick, I'd think

The model or the year?

Hah! Sorry I meant the year 2030

Character meshes were taking up 1GB? I wonder what surprising (to me, a non-gamedev) things are hiding in modern 100GB+ games...

There's been a recent explosion in mesh sizes because of Nanite as well. A lot of devs who don't understand optimization will pull in a 1+ GB movie-quality mesh and just slap Nanite on it, even though they'll never need that level of fidelity.

"infinite lighting sources and infinite models" reality : 8 times higher frame rate in optimal case when using multiple light sources.

Blender has the famed Donut Tutorial, which is a blast if you're even slightly interested in learning about it.

Does Unreal have something similar?


Your First Hour in Unreal Engine 5.2: https://www.youtube.com/watch?v=peUO_55ck4o (link to course is in description)

Does anyone know how megalights works? What’s the major innovation there? Is it using neural nets to approximate stuff?

Can't wait to play all these stunning Unreal 5 games at 1080p 60fps (upscaled from 720p with DLSS and frame gen) on my $2000 RTX 5090 that draws 500W. Truly next-gen.

5090 is rumored to draw 600W. I'm afraid reality has overtaken your satire.

My hair dryer pulls 800W. Are you telling me the 5090 doesn't even support basic features like drying my hair?

It does as long as you stick your head in the right place

Even worse on the "next-gen" console front. After yet another generation of being promised at least near 4k and 60fps, we're back to 30fps with sub 1080p render resolutions.

We're gonna get to the point where people's gaming rigs won't even start because they continually trip standard circuit breakers. Gonna need to build a special service to play games in 5 years.

I still like Brotato.

I still don't have WR in Celeste, no point buying a GPU

yeah but how about the photorealistic power draw WITH game mechanics and animations based on collisions of simple cuboids & capsules?

but why does the street look like wet plastic


