Adopting Lessons from Offline Ray-Tracing to Real-Time Ray-Tracing [pdf] (realtimerendering.com)
67 points by jeffreyrogers 75 days ago | 13 comments

I've seen a couple of white papers on modern rendering techniques that were actually developed in the '50s or '60s, where one day someone noticed that modern hardware could easily do them in real time. I wonder how many gems there are out there from the past awaiting rediscovery and reuse. Ray tracing may not really fit that, but it is cool to see hardware opening up more and more real-time use cases. The images in that paper make me feel like we are right around the corner from a neat trick that uses low-resolution ray tracing to nudge traditional lighting toward realism.

Transparent objects like glass are also a problem for path tracing. Bidirectional path tracing can improve this, but it's hard to predict which rays from a light source will be visible to the camera.
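The hard part is the "connection" step: after tracing a short subpath from the light and one from the camera, you try to join their endpoints, and the join only contributes if that segment is unoccluded. A toy 2D sketch of just that visibility test, with an illustrative scene (one occluding wall segment, made-up coordinates):

```python
# Toy sketch of the BDPT connection step: join the endpoint of a light
# subpath to the endpoint of a camera subpath, and keep the contribution
# only if nothing blocks the connecting segment.

def segments_intersect(p1, p2, q1, q2):
    """2D segment intersection test via orientation (cross-product) signs."""
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    d1 = cross(q1, q2, p1)
    d2 = cross(q1, q2, p2)
    d3 = cross(p1, p2, q1)
    d4 = cross(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def connect(light_vertex, camera_vertex, occluder):
    """True if the two subpath endpoints can see each other."""
    return not segments_intersect(light_vertex, camera_vertex, *occluder)

wall = ((0.0, 1.0), (2.0, 1.0))   # occluding segment at y = 1
light_v = (1.0, 2.0)              # endpoint of the light subpath
cam_v_blocked = (1.0, 0.0)        # directly below the wall
cam_v_visible = (3.0, 0.0)        # off to the side of the wall

print(connect(light_v, cam_v_blocked, wall))  # False: wall blocks it
print(connect(light_v, cam_v_visible, wall))  # True
```

The point of the toy is the asymmetry the comment describes: you can trace a light subpath cheaply, but until you run this visibility test you have no idea whether it will ever be seen by the camera.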

It's amazing we can now have photorealistic images in seconds. But I think it will take a few more years before studios can render the latest Avatar in real time.

An example of a hard-to-render situation (not just for path tracing) is visualizing caustics on the other side of glass, since you can't connect paths directly to light sources or the camera (S[SD]*S-type paths in path-tracing notation). If a path begins or ends with a diffuse reflection, then it's _relatively_ simple to form it by starting at the other end.

I've always been interested in knowing a "Real Time Pixar" value... The question is roughly, "What Pixar movie could we now convincingly render in real time?"

* Given, say, a $2,000 budget for a new PC (for CPU / GPU / Motherboard / Power Supply / Drives)

* Same resolution as the original movie

* Same frame rate as the original movie (24 fps?)

* You can use whatever rendering pipeline you want. It doesn't need to be RenderMan. And we explicitly encourage cheating, using modern cheating techniques where they exist.

* It does not have to be identical in output. You can cheat, but the differences need to be very minimal, even in a frame-by-frame still-image comparison. Blades of grass or strands of hair can be in different places, etc., but it must be basically the same quality. The point being: if you had handed this image to the team who made the original movie, they wouldn't have had a problem using it instead of what they actually produced.
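One hedged way to make "very minimal differences" concrete in a frame-by-frame comparison is a per-frame error metric such as PSNR; any pass/fail threshold you pick is arbitrary, and the tiny 4-pixel frames below are purely illustrative:

```python
import math

def psnr(frame_a, frame_b, max_value=255.0):
    """Peak signal-to-noise ratio between two equal-length pixel buffers."""
    assert len(frame_a) == len(frame_b)
    mse = sum((a - b) ** 2 for a, b in zip(frame_a, frame_b)) / len(frame_a)
    if mse == 0:
        return float("inf")   # identical frames
    return 10.0 * math.log10(max_value ** 2 / mse)

reference = [10, 200, 30, 128]
rendered  = [11, 199, 30, 129]   # tiny per-pixel differences
print(round(psnr(reference, rendered), 1))   # ~49.4 dB: visually identical
```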

Modern techniques seem to be more about ease-of-use for the artists rather than getting things to run faster.

Physically based rendering is mostly about ensuring that your models look the same across many different lighting conditions. IIRC, something like Toy Story "cheated" by having the artists change the lighting settings between scenes.

Wreck-It Ralph, meanwhile, is the famous Disney movie known for using a single "uber-shader" that worked in all situations across the film.


The "Principled" BSDF was mostly about making things easier for the artist to control, i.e. arbitrarily putting settings between 0 and 1, and other such "user interface" issues.
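A concrete example of that 0-to-1 philosophy: the principled parameterization keeps the artist-facing slider in [0, 1] and remaps it internally, e.g. squaring roughness before it feeds the microfacet distribution so the slider responds more perceptually uniformly. A minimal sketch (the clamping helper is illustrative):

```python
def remap_roughness(artist_roughness):
    """Map an artist-facing [0,1] roughness slider to microfacet alpha.

    Squaring the slider (as in Disney's principled BRDF) makes the
    visual response more uniform: small slider moves near 0 no longer
    cause huge jumps in highlight size.
    """
    r = min(max(artist_roughness, 0.0), 1.0)   # clamp the UI value to [0,1]
    return r * r

print(remap_roughness(0.5))   # 0.25
print(remap_roughness(1.5))   # out-of-range input clamps to 1.0
```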


Because the easiest way to "cheat" at runtime is to encode the footage to H.264 and hit the playback button. :-)

One thing I learned is that a lot of video-game models have baked-in shadows (especially games with a fixed 2D perspective, like fighting games). The sun is assumed to come from a certain direction, so any shadows it casts are "baked into" the texture itself, with no need to calculate them at runtime.

Other shadows remain dynamic (e.g. a lamp in the background may still cast a dynamic, runtime shadow), and the combination is enough to trick most people into thinking the lighting is fully dynamic.
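Baking amounts to evaluating the sun's contribution once, offline, and folding it into the texture. A toy Lambert-only sketch with a fixed sun direction (all names and values illustrative; real bakes also account for occlusion and bounce light):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

SUN_DIR = normalize((1.0, 2.0, 0.5))   # fixed direction, so it can be baked

def bake_texel(albedo, normal):
    """Precompute the sun's Lambert term into a texture texel, offline."""
    n_dot_l = max(dot(normalize(normal), SUN_DIR), 0.0)
    return tuple(c * n_dot_l for c in albedo)

# Offline: every texel gets the sun factor multiplied in once.
texel = bake_texel((0.8, 0.2, 0.2), (0.0, 1.0, 0.0))

# At runtime, only dynamic lights (the lamp) are evaluated; the sun
# costs nothing because its contribution is already stored in `texel`.
print(tuple(round(c, 3) for c in texel))   # -> (0.698, 0.175, 0.175)
```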

Fun fact: the Wreck-It Ralph paper is widely cited as a starting point for many physically based shading systems used in modern game engines.


Pretty sure we passed that threshold for Toy Story a long time ago, but there's lots of enthusiastic discussion online regarding this topic: https://www.google.ca/search?q=toy+story+realtime

A good comparison is looking at the kinds of things being rendered in real time at 4K/30fps on modern game consoles: even games that are a few years old look pretty comparable to 90s-era CG, and the areas that don't are constrained to things like facial animation, where the limitations are more about assets/artwork than technology.




> where the limitations are more about assets/artwork than technology.

As an aside, it seems to me there is an argument that depending on artwork here is itself a product of inadequate technology: we don't yet have a general model good enough that you could just provide a simple description (perhaps with iterative refinement) and a script, with the technology doing what we currently demand of an artist.

Monsters University was the first Pixar movie that used global illumination, which means that modern games with VXAO will be better in some respects than any pre-MU Pixar movie, though the level of detail will obviously be a great deal lower.

Monsters University was not the first Pixar movie to use global illumination, but it was the first Pixar movie to use path tracing (combined with rasterization for primary visibility). Earlier movies such as Up and Toy Story 3 used point-based global illumination.

Why are you asking about Pixar movies in a thread about ray-tracing? Most of them were not ray-traced. For example Toy Story was not ray traced. Monsters University was the first one.

I think OP was using it as a benchmark: what used to take 24 hours per processor can now be done in real time.

Also, I believe they used the Blue Moon Rendering Tools (BMRT) to ray trace Buzz's visor. BMRT was developed by Larry Gritz, who later joined Pixar. At some point they came up with a way to "farm out" specific areas of rat tracing from RenderMan.

Presumably that point was somehow related to Ratatouille?
