
How Unreal Renders a Frame - ingve
https://interplayoflight.wordpress.com/2017/10/25/how-unreal-renders-a-frame/
======
iainmerrick
I'm a little surprised at how surprised people are here, at how much work is
done per frame!

Sure, game engines are really complex and impressive, but the same is true of
a lot of other kinds of software. If you work on websites, have you thought
about how much processing goes on when loading and rendering a page?

~~~
blattimwind
How much work the parsers, layout engine and renderer have to do in a browser
is far more opaque than the work a game engine does.

For game engines there is a lot of material out there explaining the basic
processes and data flows; there are free or open source games and game engines
around as well: Unreal engine, used in the article, is free, including full
access to the source code. (This is by far the most widely used game engine in
the entire gaming industry). And with a graphics debugger (free) you can look
fairly easily at how games (pretty much _any_ game you can run on your PC)
render things as well.

That's not to say that game engines are simple; however, it is relatively easy
to understand how they work on a basic level (i.e. what they have to
accomplish and how they do so in general terms). Specifically considering
graphics, simple renderers are not particularly hard to develop yourself as a
small side project.

With browsers, this is a lot more difficult. There are perhaps three modern
browser engines, which are mostly open source but also very large pieces of
software. There are no books or tutorials explaining how to develop browser
engines. And the debugging tools in the browser only tell you how often a "DOM
update" or "render" happened and perhaps how long it took.
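
To make the point about simple renderers concrete, here is a minimal toy
sketch in Python: one hard-coded sphere, a ray-sphere intersection test, and
flat Lambertian shading into an ASCII buffer. Everything here (scene,
resolution, light direction) is made up for illustration; a real renderer is
the same basic idea scaled up enormously.

```python
import math

W, H = 40, 20
center, radius = (0.0, 0.0, 3.0), 1.0
light = (0.577, 0.577, -0.577)  # roughly normalized direction toward the light

rows = []
for y in range(H):
    row = ""
    for x in range(W):
        # Camera ray through pixel (x, y); camera at origin looking down +z.
        dx = (x / W - 0.5) * 2.0
        dy = (0.5 - y / H) * 2.0 * (H / W) * 2.0  # crude aspect correction
        dz = 1.0
        n = math.sqrt(dx * dx + dy * dy + dz * dz)
        d = (dx / n, dy / n, dz / n)
        # Ray-sphere intersection: solve t^2 + b*t + c = 0 (direction is unit).
        oc = (-center[0], -center[1], -center[2])  # origin minus sphere center
        b = 2.0 * sum(o * di for o, di in zip(oc, d))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0:
            row += "."  # miss: background
            continue
        t = (-b - math.sqrt(disc)) / 2.0  # nearest hit
        p = tuple(t * di for di in d)
        nrm = tuple((pi - ci) / radius for pi, ci in zip(p, center))
        shade = max(0.0, sum(ni * li for ni, li in zip(nrm, light)))
        row += " .:-=+*#%@"[min(9, int(shade * 10))]
    rows.append(row)

print("\n".join(rows))
```

Swap the ASCII buffer for a pixel buffer, add more primitives and a few
bounces, and you are most of the way to a weekend raytracer.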

~~~
robbies
UE4 is not the most widely used game engine in the industry. That would be
Unity by a wide margin.

If you are restricting to AAA, I’m not even sure that holds. There are a
good number of big teams that use UE4, but I think I’d go with Frostbite due
to the sheer number of titles that EA pushes out that are now based on it;
Frostbite is certainly shipping more big games than UE4. I’d probably put
Ubisoft ahead of them too with their internal engines.

~~~
prophesi
It's difficult to measure game engine usage for AAA games since they can
afford to

a) build their own internal engine

b) pay licensing fees to not disclose what engine they're using.

For overall game usage, however, I'd definitely say Unity is most used, with
UE4 coming in second.

------
2sk21
So all of this processing happens for one frame? And this is going on at 60
Hz?
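
Yes, and at 60 Hz the entire pipeline has to fit in roughly 16.7 ms. A quick
back-of-the-envelope sketch in Python (refresh rates chosen for illustration):

```python
# Per-frame time budget at common refresh rates: everything the article
# describes (shadow maps, G-buffer, lighting, post-processing) must fit.
for hz in (30, 60, 120, 144):
    budget_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> {budget_ms:.2f} ms per frame")
```

At 120 Hz the budget shrinks to about 8.3 ms, which is part of why engines
pipeline CPU and GPU work across frames rather than doing everything serially.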

~~~
mrweasel
I'm completely fascinated that games can perform massive amounts of math and
render 120 frames per second, while some business applications fail to do
completely basic data manipulation in a reasonable time frame.

~~~
test1235
I think it comes down to requirements. If games aren't running smoothly,
people simply won't play them.

Business apps have it a lot easier - the users are more forgiving, I guess
either because they don't have a choice in the matter or any better options.

~~~
bobthedino
Also the people who buy business apps are often not the people who actually
have to use them every day. So UI/UX ends up being a much lower priority than,
say, cost.

~~~
hyperpallium
And, if the program processes data too quickly, users think it isn't working.
(Managers, perhaps, think it isn't working hard enough.)

~~~
tekromancr
Fortunately, there is a rails library for that.
[https://github.com/airblade/acts_as_enterprisey](https://github.com/airblade/acts_as_enterprisey)

------
lordnacho
This is why I don't get why the game industry has such a bad rep for
overworking people.

It's genuinely hard to do this kind of code so you'd think they take care not
to scare away the few people who can do it.

I recall at one point Goldmans was bragging to my team about how they'd hired
a game dev to do their click-to-trade currency UI. Seemed like the guy found
an easier job for more pay.

~~~
pavlov
Game industry is like Hollywood. There's an endless supply of highly talented
hopefuls from all around the world who want to get into the industry more than
anything else.

The focus of exploitation is just different -- the Harvey Weinsteins of the
game industry don't try to sleep with brilliant Romanian programmers, instead
they work them to death.

~~~
mariusmg
"the Harvey Weinsteins of the game industry don't try to sleep with brilliant
Romanian programmers, instead they work them to death."

As a Romanian programmer (although not in the game industry), I must admit
this is very eloquently put. Hats off to you, sir.

------
pault
Does anyone know of a good book or course that does a deep dive into unreal
engine source? I'm trying to level up my graphics programming skills and that
seems like it would be an amazing reference, but I don't currently have the
knowledge to navigate it myself.

~~~
pfranz
Like the other commenter said, game engines are a huge topic. The Game Engine
Architecture book has a nice overview (even though it's hefty); you would
probably want an equal-sized book on each aspect of a game engine: graphics,
networking, dynamics, audio, AI, etc. I haven't read it, but I've also heard
good things about the Real-Time Rendering book. I did look at the Game Engine
Architecture book for info about deferred rendering and it was fairly
shallow (but that's not the goal of the book). If I remember correctly, the
book actually points you to more detailed resources (papers) and books.

~~~
zfedoran
I agree, each topic in the game engine architecture book could be another book
on its own. However, it does give a good introduction and briefly touches on
most of the topics in this diagram
[https://i.imgur.com/SxydAoF.png](https://i.imgur.com/SxydAoF.png)

(diagram is from the book)

------
RoboTeddy
Sheesh. How many engineer-hours have gone into Unreal?

~~~
krige
A hell of a lot; the engine itself was in development for around three years,
IIRC.

~~~
Nition
According to the Wikipedia page at least, even just the current version
(Unreal Engine 4) has been in development for 14 years now! With 2003-2008
apparently being just Tim Sweeney.

~~~
visarga
I have high expectations from simulation. I hope they will be able to reach
99.9% realism so we can use it to recreate any situation we want, for fun
(games) and profit (pre-training robots).

<rant>Self driving cars already train on simulated roads because that allows
the creation of any scenario. Even human pilots use simulators to train,
especially for those rare situations. And since robot training is expensive
and slow in the real world, the only alternative is to do it in a sim.

Simulation allows the composition of any scene that might be very hard to
record in the wild - for example, an octopus sitting as a hat on the head of
an elephant... where would you be able to get that photo? But surely you
imagined it in 0.1 seconds, using your imagination - a powerful simulator
humans have in their heads. Such images are crucial in training AI.

I think AI will reach human level when it will be equipped with a sandbox
where it can try out its ideas and concepts, similarly to how scientists use
labs to test their theories. When AI gets its world simulator, it will be able
to learn reasoning and meaning that is grounded in verification. Just like us.
We have the world itself as our fundamental "simulator" and experiment on it
to learn physics, biology and AI. AI needs a simulator too.</rant>

~~~
pault
Cool, but what does this have to do with the comment you replied to?

------
ilaksh
I personally believe much of this will become passe within five years or so.

I have seen a few demos of real-time path tracing software running on GPUs. I
know GPUs are fast, but I wonder if there is a way to do the same math on an
ASIC or FPGA that could be even faster? The main issue with existing
implementations seems to be doing enough iterations per pixel to get a clean
picture.

Anyway I believe a lot of the tricks related to triangle meshes and lighting
approximations will be thrown out and replaced with procedural generation and
real time path tracing.
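
For context on the iteration problem: path tracing is Monte Carlo
integration, and Monte Carlo error shrinks as 1/sqrt(N), so halving the noise
costs four times the samples. A toy sketch (plain Python, with uniform random
numbers standing in for radiance samples):

```python
import random
import statistics

# Monte Carlo error falls as 1/sqrt(N): estimate a "pixel" by averaging
# N random samples, repeat many times, and measure the estimator's spread.
random.seed(0)

def estimate(n_samples):
    # Stand-in for path tracing one pixel: average n random "radiance" samples.
    return sum(random.random() for _ in range(n_samples)) / n_samples

spread = {}
for n in (16, 64, 256):
    runs = [estimate(n) for _ in range(200)]
    spread[n] = statistics.stdev(runs)
    print(f"N={n:>3}  stddev of estimate ~ {spread[n]:.4f}")
```

The printed spread roughly halves each time N quadruples, which is why
real-time path tracers lean so heavily on denoising rather than on brute-force
sample counts.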

~~~
katastic
Perhaps. Too bad we hit a GHz wall, so the only thing we can do now is add
more cores.

FYI: For 6K and 8K, both nVidia and AMD tech reps have said that "multi-GPU"
(not CORE) solutions will be required to reach those. But they also said once
they hit 16K, they'll have "real eye quality" in terms of DPI which comes with
lots of extra "realism" for free. Like, watch the new Jungle Book in 4K and
certain scenes of mountains will blow your mind and feel "real" (without any
3d glasses) and your brain is like "holy shit, I'm not watching a movie (for
this split second), this is something real." But most scenes still don't.
We're so close to photorealistic, I can taste it! (Like that GTA 5
photorealism mod on HN yesterday.)

That's why they're all moving to Vulkan. OpenGL is single-threaded (global
state, one draw thread), which makes it a pain to exploit multi-GPU setups.
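
The single-threaded point is the key one: in OpenGL all draw calls funnel
through one context, whereas Vulkan lets each thread record its own command
buffer and submit them together. A sketch of that recording model in plain
Python, with threads and a queue standing in for command buffers (no graphics
API involved; the "commands" are just strings):

```python
import queue
import threading

# Vulkan-style model: each worker thread records its own command list
# independently; one thread gathers everything and "submits" at the end.
def record(thread_id, out):
    cmds = [f"draw(batch={thread_id}, obj={i})" for i in range(3)]
    out.put((thread_id, cmds))

results = queue.Queue()
threads = [threading.Thread(target=record, args=(t, results)) for t in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# "Submit": drain the queue and order per-thread lists deterministically.
items = []
while not results.empty():
    items.append(results.get())
submitted = [cmds for _, cmds in sorted(items)]
print(sum(len(c) for c in submitted), "commands recorded across 4 threads")
```

In OpenGL the `record` step itself would have to be serialized onto the one
thread that owns the context, which is exactly the bottleneck Vulkan removes.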

------
kevindqc
> For stationary lights and dynamic props, Unreal uses per object shadows,
> meaning that it renders one shadowmap per dynamic prop per light

Isn't that a TON of shadowmaps?
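
It can be, in the worst case: the count scales with props times lights. A
rough sketch of the bookkeeping (all numbers hypothetical; the engine only
renders maps for prop/light pairs that survive relevance and frustum culling):

```python
# Per-object shadows: one shadow map per (dynamic prop, stationary light)
# pair that actually matters. Hypothetical scene numbers for illustration.
dynamic_props = 50
stationary_lights = 8
overlap_fraction = 0.2  # fraction of pairs close enough to cast a shadow

naive = dynamic_props * stationary_lights
after_culling = int(naive * overlap_fraction)
print(f"naive shadow maps: {naive}, after culling: {after_culling}")
```

Each per-object map is also small and only covers one prop, so the cost per
map is far lower than a full scene-wide shadow pass.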

------
jokoon
This is why I like minimalist 3D graphics.

