
Adopting Lessons from Offline Ray-Tracing to Real-Time Ray-Tracing [pdf] - jeffreyrogers
http://advances.realtimerendering.com/s2018/Pharr%20-%20Advances%20in%20RTR%20-%20Real-time%20Ray%20Tracing.pdf
======
edoo
I've seen a couple of white papers on modern rendering techniques that were
actually developed in the '50s or '60s, and one day someone happened to notice
that modern hardware could easily do them in real time. I wonder how many
gems there are out there from the past awaiting rediscovery and reuse. Ray
tracing may not really fit that, but it is cool to see hardware opening up more
and more real-time use cases. The images in that paper make me feel like we
are right around the corner from a neat trick to use the low-resolution ray
tracing to modify traditional lighting towards realism.

------
pasta
Transparent objects like glass are also a problem for path tracing.
Bidirectional path tracing can improve this, but it's hard to predict which
rays from a light source will be visible to the camera.

It's amazing we can now have photorealistic images in seconds. But I think it
will take some more years before studios can render the latest Avatar in
real time.

~~~
0-_-0
An example of a hard-to-render situation (not just for path tracing) is
visualizing caustics on the other side of glass, since you can't connect paths
directly to light sources or the camera (S[SD]*S-type paths in path-tracing
notation). If a path begins or ends with a diffuse reflection, then it's
_relatively_ simple to form it by starting at the other end.
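That S[SD]*S classification is just Heckbert's path-notation regex. A toy sketch of the idea (this is only an illustration of the notation, not anything a renderer actually does):

```python
import re

# Heckbert-style path notation: L = light, E = eye/camera,
# S = specular bounce, D = diffuse bounce.
# A caustic seen through glass is specular at BOTH ends of the chain,
# so neither endpoint can be connected directly to a light or the camera.
HARD_CAUSTIC = re.compile(r"LS[SD]*SE")

def is_hard_caustic(path: str) -> bool:
    """True if the light-to-eye path matches the L S[SD]*S E pattern."""
    return HARD_CAUSTIC.fullmatch(path) is not None

print(is_hard_caustic("LSSE"))   # True  -- glass directly between light and eye
print(is_hard_caustic("LSDSE"))  # True  -- classic caustic behind glass
print(is_hard_caustic("LDSE"))   # False -- starts diffuse: easy to connect from the light
```

Paths like "LDSE" fall outside the pattern precisely because the diffuse endpoint gives you somewhere to connect a shadow ray.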

------
VikingCoder
I've always been interested in knowing a "Real Time Pixar" value... The
question is roughly, "What Pixar movie could we now convincingly render in
real time?"

* Given, say, a $2,000 budget for a new PC (for CPU / GPU / Motherboard / Power Supply / Drives)

* Same resolution as the original movie

* Same frame rate as the original movie (24 fps?)

* You can use whatever rendering pipeline you want. It doesn't need to be RenderMan. And we explicitly encourage cheating, using the modern cheating techniques, if they exist.

* It does not have to be identical in output. You can cheat. But the differences need to be very minimal, even when you do a frame-by-frame still image comparison. Maybe blades of grass, or strands of hair, can be in different places, etc. But basically the same quality. The point being that if you had handed this image to the original team who made the movie, they wouldn't have had a problem using it instead of what they did produce.

~~~
dragontamer
Modern techniques seem to be more about ease-of-use for the artists rather
than getting things to run faster.

Physically based rendering is mostly about ensuring that your models look the
same across many different lighting conditions. IIRC, something like Toy Story
"cheated" by having the artists change the lighting settings between scenes.

Wreck-It Ralph, meanwhile, is the famous Disney movie for using a single
"uber-shader" that worked in all situations across the movie.

[https://disney-animation.s3.amazonaws.com/library/s2012_pbs_...](https://disney-animation.s3.amazonaws.com/library/s2012_pbs_disney_brdf_notes_v2.pdf)

The "Principled" BSDF was mostly about making things easier for the artist to
control, i.e. making settings run from 0 to 1, somewhat arbitrarily, and other
such "user interface" issues.
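One concrete example of that UI-first design: the Disney notes square the artist-facing roughness before using it, so the slider feels perceptually even. A minimal sketch (function and variable names here are made up for illustration):

```python
def perceptual_to_alpha(roughness: float) -> float:
    """Map an artist-facing roughness slider in [0, 1] to a microfacet
    alpha parameter. Squaring, as in the Disney BRDF notes, spreads the
    visually interesting low-roughness range over more of the slider."""
    r = min(max(roughness, 0.0), 1.0)  # clamp to the artist's 0..1 range
    return r * r

# Halfway along the slider maps to only a quarter of the raw alpha range:
print(perceptual_to_alpha(0.5))  # 0.25
```

The artist never sees alpha at all; they only ever touch a 0-to-1 knob that behaves predictably.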

-------------

Because the easiest way to "cheat" your way to real-time playback is to
encode the frames to H.264 and then hit the playback button. :-)

One thing I learned is that a lot of video-game models have baked-in shadows
(especially games with a fixed 2D perspective, like fighting games). The sun
is assumed to come from a certain direction, so any shadows it casts are
"baked into" the texture itself, with no need to calculate them at runtime.

Other shadows remain dynamic (e.g. a lamp in the background may still cast a
dynamic, runtime shadow), and the combination is enough to trick most people
into thinking they have fully dynamic lighting.
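A minimal sketch of that split: a sun-occlusion factor baked into the texture at author time, plus one dynamic Lambert term for the movable lamp (all names and numbers here are invented for illustration, and real engines do this per pixel in a shader):

```python
def shade(albedo, baked_sun_shadow, normal, lamp_dir, lamp_intensity):
    """Combine a precomputed sun-shadow term with one dynamic lamp.

    albedo           -- (r, g, b) surface color from the texture
    baked_sun_shadow -- 0..1 occlusion factor painted into the texture,
                        valid only for the fixed, assumed sun direction
    normal, lamp_dir -- unit vectors for the dynamic Lambert term
    """
    # Static part: the sun never moves, so its shadow costs nothing at runtime.
    sun = baked_sun_shadow

    # Dynamic part: a simple N.L Lambert term for the movable lamp.
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, lamp_dir)))
    lamp = lamp_intensity * n_dot_l

    return tuple(c * (sun + lamp) for c in albedo)

# Surface facing straight up, lamp directly overhead at half intensity:
print(shade((0.8, 0.6, 0.4), 0.7, (0.0, 1.0, 0.0), (0.0, 1.0, 0.0), 0.5))
```

The trick is that `baked_sun_shadow` only works because the sun direction is fixed; the moment the "sun" needs to move, the baked term is wrong and you are back to computing it dynamically.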

~~~
corysama
Fun fact: the Wreck-It Ralph paper is widely cited as a starting point for
many physically based shading systems used in modern game engines.

[https://cdn2.unrealengine.com/Resources/files/2013SiggraphPr...](https://cdn2.unrealengine.com/Resources/files/2013SiggraphPresentationsNotes-26915738.pdf)

