
Doom Eternal – Graphics Study - todsacerdoti
https://www.simoncoenen.com/blog/programming/graphics/DoomEternalStudy.html
======
jameskilton
And with enough processing power, you can get a computer that does all of this
1,000 times per second!

https://slayersclub.bethesda.net/en/article/48xD6yVj0VsulONXKAnr7n/doom-eternal-overclocked-at-1000-fps

I work with computers every day and I still just have no concept of how
powerful they really are.

~~~
srgpqt
Computers are powerful enough to host exactly one WordPress site serving
a few dozen users simultaneously.

(Point being, any extra computing power is wasted by software until it
performs no better than a computer from 20 years ago.)

~~~
MaxBarraclough
> any extra computing power is wasted by software until it performs no
> better than a computer from 20 years ago

The existence of Doom Eternal disproves this.

In domains where high performance is important to the product's success, such
as graphically impressive video-games, computational power isn't wasted. Bloat
is tolerated in many other domains, though. Anything made with Electron, for
instance.

------
Agentlien
A few thoughts on this from someone who happens to work in graphics
programming.

Mega textures: I always thought their insistence on this technique was weird
and was never quite convinced by it. I used to have a colleague who praised it
to the skies and swore everyone would be doing it in a few years. It's
interesting that they've moved away from it now.

Forward rendering: It's fascinating that they're going this way. I feel
like everyone is all-in on deferred rendering nowadays, except when
targeting mobile. I wonder if this is because they want to target older
systems and the Nintendo Switch. I've heard that Doom Eternal runs
surprisingly well on older systems, and my own experience optimizing for
the Switch is that it's not worth trying to get deferred rendering to
work efficiently there.
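For readers less familiar with the tradeoff: forward rendering shades
every surface with all relevant lights in a single pass, while deferred
rendering first writes surface attributes to a G-buffer and lights them
in a second pass. A toy CPU-side sketch of the two shapes (plain Python,
every name here illustrative; real renderers do this per-pixel on the
GPU):

    # Toy model of the two pipeline shapes. Illustrative only.
    def shade(albedo, normal, light):
        # Stand-in for a real BRDF: a simple Lambert term.
        n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light["dir"])))
        return [a * c * n_dot_l for a, c in zip(albedo, light["color"])]

    def forward(fragments, lights):
        # Forward: shade each fragment with every light immediately.
        # No fat intermediate buffers, so little memory bandwidth used.
        out = []
        for f in fragments:
            color = [0.0, 0.0, 0.0]
            for light in lights:
                lit = shade(f["albedo"], f["normal"], light)
                color = [a + b for a, b in zip(color, lit)]
            out.append(color)
        return out

    def deferred(fragments, lights):
        # Pass 1: write surface attributes out to a "G-buffer".
        gbuffer = [(f["albedo"], f["normal"]) for f in fragments]
        # Pass 2: light everything from the G-buffer. This decouples
        # lighting from scene complexity, but reading and writing the
        # G-buffer costs exactly the memory bandwidth a Switch-class
        # GPU is shortest on.
        out = []
        for albedo, normal in gbuffer:
            color = [0.0, 0.0, 0.0]
            for light in lights:
                lit = shade(albedo, normal, light)
                color = [a + b for a, b in zip(color, lit)]
            out.append(color)
        return out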

Shadow mapping: The use of simple 3x3 PCF makes me a bit sad that
variance shadow mapping never really took off. I remember reading about
it and implementing my own version in 2010, and it seemed like the
future.
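For context, 3x3 PCF just averages nine binary depth comparisons around
the sample point. A minimal sketch of the idea (plain Python over a 2D
depth array; the names and the bias value are illustrative):

    def pcf_3x3(shadow_map, x, y, receiver_depth, bias=0.002):
        # Percentage-closer filtering: average nine in-shadow tests
        # around (x, y) to get a soft shadow factor in [0, 1].
        h, w = len(shadow_map), len(shadow_map[0])
        lit = 0
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                sx = min(max(x + dx, 0), w - 1)  # clamp to map edges
                sy = min(max(y + dy, 0), h - 1)
                # Lit if the receiver is not behind the stored occluder.
                if receiver_depth - bias <= shadow_map[sy][sx]:
                    lit += 1
        return lit / 9.0  # 0.0 = fully shadowed, 1.0 = fully lit

Variance shadow maps instead store depth and depth squared, which makes
the map itself filterable (blurred, mipmapped) and lets the shadow
factor be bounded with Chebyshev's inequality; that filterability was
the big promise.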

Uber shaders and draw call merging: I really like this from a technical
standpoint, but I wonder how it affects artist workflow. I know a number
of artists who love using specially crafted shader graphs for
everything.
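For readers who haven't met the term: an uber shader folds many material
variants into one program that branches on per-draw or per-instance
data, which is what lets draws with different materials be merged into a
single call. A toy sketch of the idea (Python standing in for shader
code; everything here is illustrative):

    # One shading function branching on a per-instance material ID, so
    # a single merged "draw" can cover objects that would otherwise
    # each need their own specialized shader bind and draw call.
    MATERIALS = {
        0: {"kind": "opaque",   "albedo": (0.8, 0.2, 0.2)},
        1: {"kind": "emissive", "albedo": (0.2, 1.0, 0.2)},
    }

    def uber_shade(material_id, n_dot_l):
        m = MATERIALS[material_id]
        if m["kind"] == "emissive":  # the branch replacing a shader switch
            return m["albedo"]
        return tuple(c * max(0.0, n_dot_l) for c in m["albedo"])

    def merged_draw(instances):
        # One "draw call" over instances with heterogeneous materials.
        return [uber_shade(i["material_id"], i["n_dot_l"]) for i in instances]

    print(merged_draw([{"material_id": 0, "n_dot_l": 0.5},
                       {"material_id": 1, "n_dot_l": 0.0}]))

The artist-workflow tension is presumably exactly this: a bespoke shader
graph per material compiles to its own program, which defeats the
merging.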

Reflections: I wonder if this was just an optimization for memory
bandwidth, as suggested, or a compromise made to allow going fully
forward.

~~~
jiggawatts
Out of curiosity, have you played the game itself?

I played both the predecessor, Doom 2016, and Doom Eternal on the exact
same hardware, and I was blown away by the performance _and_ quality
difference.

Never before have I seen a game crank up the graphics quality this much _and_
improve performance to this extent without requiring a hardware upgrade.

To put things in perspective, I had to play Doom 2016 at 1080p on an NVIDIA
RTX 2080 Ti for acceptable framerates, but I played Doom Eternal at 4K and it
produced a silky smooth 60 fps throughout.

Whatever they're doing... it's _working!_

~~~
Bombthecat
What they're doing is: they got help from Google to optimize the shit
out of the game for Stadia (Google's online game-streaming service).

~~~
flohofwoe
Such problems are not solvable with money or manpower alone, and I don't
think the id team depends on Google engineering knowledge for GPU
optimizations ;)

------
mattigames
All that super-complicated math to produce incredibly immersive
graphics, but you still can't jump or walk on an elevator while it's
moving (I still love the game, I just find that amusing).

~~~
swivelmaster
Game physics are hard.

------
smabie
Serious question: what is the value in an in-house engine these days?
Performance? Ease of use? Profit margin?

I don't know anything about game dev, but the cost of maintaining an engine
sounds very high, and the benefits seem unclear. I imagine that Unreal is more
advanced than these homegrown engines, and the engineering cost of constantly
upgrading a homegrown engine with new algorithms and such must be staggering.

What's the value?

PS: Doom Eternal runs really well, but I'm sure that with tuning Unreal
could produce similar results, right?

~~~
bob1029
I don't write in-house game engines, but I do write in-house devops
tools, and I assume the reasons are similar. Instead of reaching for
things like Jenkins, Docker, Kubernetes, et al., we developed a unified
software solution that ties all of these concerns together in a way that
aligns perfectly with our business objectives. Our biggest concern has
always been the turnaround time of our software changes. We are at the
point of doing multiple builds per day per customer, and pushing further
into this realm requires some machine assistance.

For me, having 100% visibility of the entire vertical in code is the
most compelling aspect. I can set a breakpoint somewhere in business
logic and then step into the low level if I need to. This sort of
approach makes it really easy to spot things that are conceptually
tricky, e.g. after stepping into an internal library method you discover
that you are using ASCII rather than UTF-8 encoding, which is causing a
loss of fidelity in persisted business data. If this library method were
in some external source you have no control over, it would either be
entirely invisible to you or taunting you with its unchanging gaze. For
us, we simply fix the low-level problem, push a commit, and it's done.
Knowing that we _can_ step into the lowest-level code encourages us to
do so. F11 takes us all the way to the bottom of the rabbit hole.
Debugging is a lot easier when you have confidence that you are passing
the correct byte sequences into database/OS/framework libraries. This
also means that stack traces are very useful, since you can view the
source behind everything.
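That ASCII-vs-UTF-8 failure mode is easy to demonstrate; a four-line
illustration (Python purely for brevity; the values are made up):

    original = "Müller & Søn"                    # persisted business data
    lossy = original.encode("ascii", "replace")  # ASCII can't encode ü or ø
    print(lossy)                                 # b'M?ller & S?n' -- lost
    assert original.encode("utf-8").decode("utf-8") == original  # lossless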

There is obviously a cost associated with maintaining this solution. We
have found that it is substantially more expensive to maintain than just
using a set of existing tools; in our case, one full-time developer is
approximately what it takes to maintain all of this infrastructure.
However, there is also the angle of opportunity and efficiency. With the
custom tools we have built, we are now able to cycle our software
development process 4-5x faster than before. Our project managers can
click a few buttons in a web UI to trigger a build of our software,
followed by automatic packaging and scheduled release to selected
customer environments. This also includes the ability to review all
errors (reporting is fully automatic) combined with snapshots of
business state at the time of stack trace generation. We can click on a
commit hash and see a list of all errors which occurred on that tree up
to that point in time. All of this links back into our source control
tool. I would say that our devops process is ~95% full-auto at this
point. I have not seen anything like what I am describing offered in any
public marketplace.

Sometimes if you want something really nice, you are just going to have
to build it yourself. Many times, the cost of custom in-house tooling
will not be justifiable at face value; you have to look beyond this and
consider the higher-order consequences of your decision over longer time
frames. Our solution started out very humble and gradually grew into the
comprehensive monster that it is today. If we hadn't taken the leap away
from Jenkins three years ago in favor of implementing the build logic in
code, we would never have had the opportunity to incrementally add all
of the other amazing stuff that we did.

~~~
thr0w3345
Sorry to say, but this was completely the wrong approach, and I’m amazed
the business let you do it.

Some poor bastard is going to have to undo all your mess down the road;
I hope you buy them a beer if they ever track you down. I mean, at what
point do you make the call that writing your own in-house copy of
_DOCKER_ is a smart move? Let alone Jenkins... The ‘magic’ abilities you
describe this platform as having are utterly standard features, and
they’re yours as soon as you install e.g. GitLab...

Of course, you actually have to spend some time learning them and how to
use them properly, but that time is VASTLY less than the time it takes
to implement them badly, and minuscule compared to the amount of time
it’s going to take to migrate off your in-house pile...

~~~
bob1029
We did not write a 1:1 replacement for Docker. We made technology
choices which allowed us to sidestep that conversation entirely.

See: .NET Core self-contained deployments and the inherent advantages of
SQLite.

Just because we don't use a particular vendor's product does not mean
that we have decided to re-implement 100% of their feature set.

------
desi_ninja
Can anyone tell me how one can get multi-pass screenshots like this? I
am aware of RenderDoc. Does it allow this, or is there a different tool?

~~~
Agentlien
You should be able to do this in any competent rendering debugger: PIX,
RenderDoc, or Nvidia's graphics debuggers.

------
willis936
id Software has done excellent work with id Tech, and I’d love to
actually play it, but the Denuvo scandal leaves a bad taste in my mouth.
I want to reward good industry behavior. I’m not sure if excellent
technical advancement, competent design, and alluring art outweigh
hostile behavior. They probably do at a 50% sale.

~~~
frio
Didn't they back out Denuvo in the end? It seems fair to choose not to
purchase a game if it has DRM, but conversely, if the company responds to the
community by removing it, that seems like "good industry behaviour" worth
rewarding :).

~~~
willis936
They did, but they also added it suddenly, without notice, two weeks
after the game launched (after the return periods for day-one sales had
expired). Pivoting to save face does not nullify that bad behavior.
They’ll do it again and will only back out if they get caught.

------
arbirk
I feel that Doom Eternal is a bit slow/unoptimized for the actual
visuals you get. Love the game, though.

~~~
cgrealy
Really? My gaming rig is over 7 years old now. While it was pretty
decent for its time, it's definitely due for an upgrade.

But I was still able to run D:E @ 1440p and get a respectable framerate
with decent visual quality.

Compared to most other games, I think it's extremely well optimised.

~~~
arbirk
Yeah, it runs fine, but not great. I was also put off by this demo (RTX
3080, 4K, maxed out): https://youtu.be/A7nYy7ZucxM

~~~
zamadatix
I think the exact opposite from that demo, they chose Doom Eternal for the
first showing of FPS from the 3000 line precisely because it's the best
looking title they could get to run at 4k 120+ on the card not in spite of its
looks.

Perhaps you just don't like the art style or prefer something that's prettier
at low framerates than decent looking at high frame rates? I mean when it came
out there were articles literally written about how optimized it is for the
quality [https://www.tomshardware.com/features/doom_eternal-
graphics_...](https://www.tomshardware.com/features/doom_eternal-graphics_cpu-
performance-comparison). I'm not saying it's the best optimized game of all
time but it certainly is in the top quartile.

