
Epic’s Stunning RTX-Powered Ray-Tracing Demo Wows GDC - bcaulfield
https://blogs.nvidia.com/blog/2019/03/20/epic-rtx-ray-tracing-gdc/
======
westoncb
I am actually underwhelmed by this demo... Not because it doesn't look
great—it does—but because this is about par for the course for a next-gen GPU
demo. They always look fantastic, for definite reasons that aren't strictly
hardware improvements.

Remember nvidia's 'Dawn' demo from 2003?
[https://youtu.be/4D2meIv08rQ?t=48](https://youtu.be/4D2meIv08rQ?t=48)

They put together super high quality demos like that for every new generation
of card they released. The games that followed never looked quite as good
since there are far more demands put on one's computational resources to
generate a playable game world vs. a single scene that needs to look great.

What would be potentially very interesting is to see the difference in work
put into producing this with raytracing vs. traditional techniques. Being able
to displace the unholy nightmare hodgepodge that comprises contemporary real-
time rendering algorithms is the most exciting thing about raytracing. At
least that's potentially the case; it's not clear to me how much improvement
has been made possible by RTX.

EDIT: and, making me think this new one is part of a lineage of sorts, here's
a later Dawn demo from 2012 (I think):
[https://www.youtube.com/watch?v=bI1_quVr_3w](https://www.youtube.com/watch?v=bI1_quVr_3w)

EDIT2: rewatching, other aspects of this demo (i.e. not rendering
specifically) _are_ beyond 'par for the course', imo. But it's just generally
higher production values, indicating more time/money/attention was put into
producing the short itself, independent of the hardware backing it.

~~~
_bxg1
There's truth to this, although I find the demo easier to appreciate when I
focus specifically on the things that benefit from RTX. That said... I think
they could've picked a better scene to that end. The original stormtrooper one
was great, but an outdoor scene (fewer surfaces reflecting indirect light)
with a body of water (a flat reflection which can be easily achieved using
non-RT techniques) doesn't really play to the technology's strengths.

Side note: Geez, that linked video says _so much_ about how much gaming
culture has evolved in a decade and a half.

------
zaroth
So much for _uncanny valley_; to me that just looked fabulous.

I don’t quite comprehend the development process to build a short like this.
Everything from the surroundings and environment, to the humanoid model and
its natural movement, obviously the facial modeling and expressions, the
water effects.

The sound track is probably the only thing I feel like I have a decent mental
model of how it comes together.

I took one 3D graphics class in undergrad and it was all such absurdly
low-level basics, mostly just mathematics; it never remotely showed how
something like this could actually be put together.

~~~
stronglikedan
> So much for uncanny valley

To me, the mouth always seems to give it away. Fabulous nonetheless though.

~~~
noneeeed
Yeah, that's always been my issue too. There seems to be something about the
elasticity of the human face, and the interactions of all the small muscles,
that is especially difficult to model.

This demo does at least seem to deal better with the darkness in the mouth.
That has also always been problematic and made a lot of characters look
positively demonic :)

~~~
noir_lord
Micro expressions are a pain to animate, and yet pretty much every human we
interact with has them. Our brains are very effective at reading them as well,
since for our ancestors they were literally life and death.

Without them I think our brain rings all the alarm bells; a human face without
them is odd and possibly threatening.

------
jeswin
One of the things I hate about modern games is that while the environments get
closer to realism, humans (face, skin) still look like corpses. I prefer
cartoonish characters to being in the uncanny valley.

The woman in the demo looked somewhat real; if this is real time and
reproducible in games then it's a huge step forward.

~~~
worldsayshi
This is real time. It's a bit annoying that the article doesn't highlight that
point, since it's the most impressive one. Then again, I guess unless you
include interactive aspects there's no telling what is actually real time and
what is "baked".

~~~
JohnBooty

> there's no telling what is actually real time and what is "baked".

Yea, that distinction is... fuzzy, and a little frustrating.

Artistically, this demo is wonderful!

Technically, it's somewhere between "yawn" and game-changing, depending on how
much of the lighting and animation is pre-baked. Are the deformations of her
skin, clothing, the water, etc. being calculated in real time, or was all the
hard stuff precalculated and baked in?

The reason this matters is that the more dynamic this is, the closer we are
to creating actual emergent/interactive experiences with this kind of
technology, rather than just a way to render static cutscenes better.

~~~
worldsayshi
Well put.

------
AWildC182
Can someone explain why the motion capture for this animation looked so much
more realistic than it usually does? Character animation always looks
over-smoothed or something to me, but this is probably the first time I've
seen it without a stilted appearance.

~~~
baddox
It could just be that they spent an inordinate amount of time and money
cleaning up the motion capture data for this demo. It's my understanding that
the mo-cap in big-budget films is heavily modified by professional animators.
Most hours-long video games probably don't have the budget for that.

~~~
AWildC182
What about the source data makes it so difficult to clean up? It feels like
nothing has progressed since the days of the first ping pong ball suit even
though cameras and image processing have become infinitely more capable.

~~~
blihp
The fact that there's so much of it, and the sources are indirect and
imprecise. All motion capture with actors is from the surface of their body,
and then you have to reverse-engineer all that data into skeletal and
facial muscle positions.[1] The techniques are very cool but not perfect, and
there is error at every step of the process.[2] So then you still need tools
and people to go back through the data to clean it up / smooth it out... and
even then, it's still not going to be perfect.

[1] For some strange reason surgical procedures to implant hundreds of hyper-
accurate mo-cap sensors in the actors or otherwise hack into their nervous
system hasn't become a thing. ;-)

[2] Camera lenses aren't perfect, dot placement (for marker based systems)
isn't exact, tracking software slips since the markers aren't infinitely small
and can move around a bit etc.
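To make the "clean it up / smooth it out" step concrete, here's a toy sketch
(not any studio's actual pipeline; the function name and data are invented for
illustration) of averaging out jitter in captured marker positions. Real
cleanup uses far more sophisticated filtering plus manual animator passes.

```python
def smooth_markers(frames, window=5):
    """frames: list of (x, y, z) marker positions, one per capture frame.
    Returns a list of the same length with high-frequency jitter averaged out
    by a centered moving average (clamped at the clip boundaries)."""
    half = window // 2
    smoothed = []
    for i in range(len(frames)):
        lo = max(0, i - half)
        hi = min(len(frames), i + half + 1)
        n = hi - lo
        # Average each axis over the window around frame i.
        smoothed.append(tuple(sum(f[axis] for f in frames[lo:hi]) / n
                              for axis in range(3)))
    return smoothed

# A jittery marker drifting along x:
noisy = [(0.0, 0, 0), (1.2, 0, 0), (1.8, 0, 0), (3.1, 0, 0), (3.9, 0, 0)]
clean = smooth_markers(noisy, window=3)
```

The trade-off blihp describes falls out directly: a wider window hides more
sensor noise but also flattens the fast, subtle motion (micro expressions)
that made the performance look alive in the first place.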

~~~
danmaz74
This looks like an area where machine learning could really help... Have you
got any idea if there is progress here?

~~~
blihp
I agree that it's a likely application which would reduce the manual effort
involved. Someone is probably doing something with it (is there anything
someone isn't trying to apply machine learning to these days?), but I'm not
familiar with anything specific.

~~~
jzl
[https://schedule.gdconf.com/session/a-new-era-of-performance-capture-with-machine-learning/860568](https://schedule.gdconf.com/session/a-new-era-of-performance-capture-with-machine-learning/860568)

------
ripsawridge
"Tell us how you really feel," asked no one, ever, but it's a free country
so--

Yet another way to draw us away from actual connection with a painfully and
joyously real world outside.

Better to long to just once, enumerate and follow the subsurface scatter of
light on the face of your beloved than to coo and preen over mere mechanism.

A dying culture turns away from what's out there. Rolls over and wishes to
slumber in dreams where everything is controlled and safe. Hence, the
computer.

I finally figured out what I hate about CGI the most: the animated figures
convey no dismay at the weight of their bodies. It's how that dismay is coped
with that injects grace into movement. The princess here looks nice enough
until she moves. Then it becomes clear there is no soul within the frame that
first suffers the pain of an ungainly body coping with gravity, then masters
itself, and resolves to move as best it/she can anyway.

That is beauty: the acceptance of limits. The carrying on despite them. You
can see this in real people when they walk.

Until the entire inner world is modeled, these constructions have less life
than a good puppet. A puppet at least telegraphs the spark of life from the
hand and soul that control its strings. That spark is constant, microsecond
communication and feedback.

Stop trying to create simulacrums. Live the life you've been given.

[Returns to computer job. Resolves nonetheless to walk under the moon
tonight].

------
mikepurvis
Alicia Vikander credited at the end, presumably for the mocap and vocal
performance? It's cool how much cross-over there is these days between
high performers in different disciplines of media. How many Hollywood A-list
stars would have been down to do mocap for a 90-second gaming tech demo video
even ten years ago?

~~~
sorenjan
Also has music by Ludwig Göransson, who won an Oscar for the music in Black
Panther.

The video is made by a Swedish production company (Goodbye Kansas), and it's
inspired by the art of John Bauer, one of the most recognized artists in
Sweden, so maybe that helped get the A-listers interested.

------
kowdermeister
Another awesome RTX demo from Crytek:

[https://www.youtube.com/watch?v=1nqhkDm2_Tw](https://www.youtube.com/watch?v=1nqhkDm2_Tw)

~~~
sorenjan
That's rendered on an AMD Vega card, RTX is Nvidia's Raytracing tech.

[https://www.cryengine.com/news/crytek-releases-neon-noir-a-real-time-ray-tracing-demonstration-for-cryengine](https://www.cryengine.com/news/crytek-releases-neon-noir-a-real-time-ray-tracing-demonstration-for-cryengine)

------
JabavuAdams
And then we're going to use these awesome visuals to run around burning the
forest, murdering the creatures, tea-bagging the photo-real corpses of our
fallen foes, all while unable to have a good in-character conversation with an
NPC, and being forced to listen to teenage voices hurling racial and sexist
slurs.

Plus ça change.

------
sigi45
The first time I read about RTX and raytracing I thought "holy shit, so
that's how it will happen, and I will see it happening".

The transition has started, and now it's just a question of a little bit of time.

Really looking forward to everything happening now on that level :).

------
kraig
This looks great.

In a few short years I'll be able to get a VR headset and install a computer
inside one of these games so I can work remotely from basically any virtual
world. Then I'll only need to unplug and face reality to get the Postmates.

------
m0zg
Somehow the mouth of the model still looks "weird". Maybe because humans are
hardwired to pay more attention to it?

------
shmerl
DXR and Nvidia sounds way too locked-in. There should be more collaborative
effort to do that through Vulkan and on all GPUs.

~~~
skrowl
DXR works with AMD cards as well - [https://www.dsogaming.com/news/amd-states-that-all-of-its-dx12-gpus-support-real-time-ray-tracing-via-microsofts-dxr-fallback-layer/](https://www.dsogaming.com/news/amd-states-that-all-of-its-dx12-gpus-support-real-time-ray-tracing-via-microsofts-dxr-fallback-layer/)

~~~
shmerl
It's still MS only, so a bad option.

~~~
esyir
Gaming on non-MS PC platforms is a pretty small niche anyway, though. I can
see the appeal, but my guess is that the ratio of work to newly available
market base here isn't all that great.

~~~
shmerl
So, good effort would be to help it grow, instead of perpetuating MS lock-in.

------
aresant
Do they release these as executables ?

~~~
mikepurvis
Historically, similar demos have been released as downloads, so you can run
them locally and compare your FPS with everyone else on your favourite
overclocker forum:

[https://www.geforce.com/games-applications/pc-applications/a-new-dawn](https://www.geforce.com/games-applications/pc-applications/a-new-dawn)

I'd expect this one to be released eventually as well.

------
craftyguy
How is this article praising nvidia products on nvidia's website _not_ spam?

------
sagebird
The smoke that comes from the fairies is a bit deterministic for my tastes --
it looks like it's being driven by a trigonometric function and doesn't have
much turbulence or interaction with the local environment/surfaces.

~~~
eps
No wireless. Less space than Nomad. Lame.

:)

~~~
darkpuma
cmdrtaco is wrongly maligned, the ipod would not become massively successful
until further revisions were released.

~~~
0-_-0
Did they have wireless and more space than a Nomad?

~~~
darkpuma
Connectivity (USB, rather than firewire) and more storage were certainly among
the facets improved in subsequent versions. USB isn't wireless, but with USB
rather than just firewire, the lack of wireless is less of a problem for mass
market adoption, isn't it? I never had a nomad, so I couldn't tell you how
much storage they had, but I do know that my first ipod had a lot more than a
measly 5GB; it was a 5th generation with 30GB.

ipod sales were basically fuck-all until Gen 4/5, meaning cmdrtaco was
basically right for three or so years:
[https://en.wikipedia.org/wiki/File:Ipod_sales_per_quarter.svg](https://en.wikipedia.org/wiki/File:Ipod_sales_per_quarter.svg)

I don't think it's a coincidence that Gen 4 and 5 were the first to fully
support USB (gen 4 also supported firewire, gen 5 did not).

The iphone took 6 quarters to breach 5 million sales per quarter _(remember
that as it was originally released, the iphone was an impressive touch screen
web browser with subpar data and no appstore!)_ The ipod took _14_ quarters to
breach 5 million sales per quarter. After that, both took off. My point here
is that the ipod was much slower off the line than the iphone. Simply put, it
was lame (until it wasn't).

------
izzydata
Maybe I'm crazy, but I don't feel the need for such realism in video games.
Ray tracing is an awesome technology for things like pre-rendered video, but
real time raytracing that takes up a huge portion of the graphics card
computational power seems like a waste. I'd rather see more frames per second
at higher resolutions.

~~~
Crespyl
From discussions I've seen elsewhere and my own limited knowledge, I think a
lot of the potential benefit in the near future is likely for the
artists/designers more than the end users.

Being able to quickly place lights and design materials for a scene with
realistic ray-traced rendering gets believable results in a simpler fashion
than painstakingly recreating all the details of natural lighting with
existing conventional methods.

It requires a lot more of the end-user hardware, but overall gives the artists
more room to focus on things other than making sure the lighting and
reflections aren't distractingly wrong.

~~~
izzydata
That sounds like a good application of this technology. It would work if
Nvidia had some kind of split product line for professionals and gamers, kind
of like what they already did with the Quadro cards.

~~~
mattnewport
You're missing the point: for realtime rendering it's not enough for the
content creators to have this technology to get the benefits; it has to be
used on the target end-user hardware too. Getting things to look good with
traditional rasterization-based pipelines involves a lot of special-case
hacks and tricks and workarounds for situations where they break down or
don't compose. That is a lot of work for the artists to accommodate the
limitations of the target platform.

The promise of raytracing is that you can get great results without a lot of
tricks and special casing which frees up the content creators to focus on the
visuals they want to create without worrying so much about technical
implementation details. That only really works if the target platform supports
performant ray tracing as well as the content creation tools.
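The "no special casing" point can be seen in a toy sketch (an assumed
single-sphere scene, not Epic's or NVIDIA's code): in a ray tracer, one
intersection routine answers both "what does the camera see?" and "is this
point in shadow?", whereas a rasterizer needs a separate technique (shadow
maps, screen-space hacks, ...) for each such effect.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Distance t along a ray (normalized direction) to a sphere,
    or None on a miss. Standard quadratic ray/sphere test."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

sphere = ((0, 0, 5), 1.0)

# Primary visibility: camera ray from the origin looking down +z.
t = hit_sphere((0, 0, 0), (0, 0, 1), *sphere)

# Shadow query: the exact same routine, just a different ray
# (from a surface point toward a light above the sphere).
in_shadow = hit_sphere((0, -3, 5), (0, 1, 0), *sphere) is not None
```

Reflections, ambient occlusion, and soft shadows are all just more calls to
the same query with different rays, which is why the content pipeline gets
simpler once the end-user hardware can afford those rays in real time.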

------
devilmoon
Am I missing something? This doesn't really seem all that great compared to
what we have been shown with ray tracing so far. The fact that it's not even
real time but pre-rendered makes it even less wow-ing for me.

~~~
013a
Yeah, I could use some clarification on whether this is real-time or not. If
it's real-time, then it's among the most impressive tech demos ever
showcased. If it's pre-rendered, then how is it different from the films
Pixar and Dreamworks put out every year? It's visually stunning, but
ultimately you can make computers do anything if you're willing to sit around
for a 2-week render process.

~~~
ska
The text suggests it is realtime; beyond that, though, it's hard to see what
novelty would lead you to showcase this as a demo if it were offline.

