
Pixar's Chris Horne Sheds New Light on Monsters University - randall
http://thisanimatedlife.blogspot.com/2013/05/pixars-chris-horne-sheds-new-light-on.html?m=1
======
jarrett
For a bit of background: The article talks a lot about global illumination.
Here's what that means.

First, you have to understand the cheaper alternative, which is called local
illumination. With local illumination, for each pixel, you figure out what
object you're looking at, and where on that object. You take into account the
normal (direction of the surface at that point) and the optical properties of
the object at that point. You also take into account the position, intensity,
color, etc. of any light sources in the scene. Optionally, you may also take
into account any shadow casting. That's it.
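A minimal sketch of that per-point computation in Python (the vector helpers, the light-list format, and the Phong specular term are illustrative assumptions for the example, not any particular renderer's API):

```python
import math

def normalize(v):
    """Scale a 3-vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade_local(point, normal, view_dir, lights, albedo, shininess=32.0):
    """Local illumination: shade one surface point using only its normal,
    its material, and the light sources -- no other geometry is consulted."""
    color = 0.0
    n = normalize(normal)
    for light_pos, intensity in lights:
        l = normalize(tuple(lp - p for lp, p in zip(light_pos, point)))
        # Diffuse (Lambert) term: proportional to the cosine of the angle
        # between the surface normal and the light direction.
        diffuse = max(dot(n, l), 0.0) * albedo
        # Specular (Phong) term: reflect l about n, compare with view dir.
        r = tuple(2.0 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
        specular = max(dot(r, normalize(view_dir)), 0.0) ** shininess
        color += intensity * (diffuse + specular)
    return color
```

Note that nothing in the loop ever asks about other surfaces in the scene; that is exactly what makes it "local."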

What's missing from that list? It's a big one: You're not taking into account
the way _other objects_ in the scene affect that little point. In the real
world, light bounces all around. Each little point is affected by pretty much
each other little point. All the points are interdependent.

But with local illumination, you ignore the way other surfaces contribute to
the point's illumination. You're just looking at that one point and the light
sources. That's why it's called local.

Global illumination, by contrast, does take into account the interplay between
different points in the scene. Its main purpose is to simulate light bouncing
between polygons.

As you can imagine, managing the complexity of all those interactions is a
tall order. We have quite a few algorithms for this; all are approximations.
It's worth noting that some of these approximations can converge towards a
provably physically correct result if you let them run long enough.
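That convergence can be shown in miniature with a Monte Carlo estimate of the light arriving at one point from a made-up sky (the sky function and uniform hemisphere sampling here are assumptions for the example, not Pixar's algorithm); for this sky the exact answer is 2π/3 ≈ 2.094, and the estimate approaches it as the sample count grows:

```python
import math
import random

def sample_hemisphere(rng):
    """Uniformly sample a direction on the unit hemisphere around +z."""
    z = rng.random()                   # cos(theta), uniform in [0, 1]
    phi = 2.0 * math.pi * rng.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def sky_radiance(direction):
    """Toy environment: radiance grows with elevation (brightest overhead)."""
    return max(direction[2], 0.0)

def estimate_irradiance(num_samples, seed=0):
    """Monte Carlo estimate of E = integral of L(w) * cos(theta) dw over the
    hemisphere.  With uniform sampling the pdf is 1/(2*pi), so each sample
    contributes L * cos(theta) * 2*pi.  For this sky the exact value is the
    integral of cos^2(theta) over the hemisphere, which is 2*pi/3."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(num_samples):
        w = sample_hemisphere(rng)
        total += sky_radiance(w) * w[2] * 2.0 * math.pi
    return total / num_samples
```

The error shrinks like 1/sqrt(N), which is the "converges if you let it run long enough" property in its simplest form.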

In any case, running global illumination often causes a major increase in
rendering time. So it's understandable that Pixar, which has to render a huge
number of frames at huge resolutions, did not traditionally use it much.

~~~
primigenus
I wonder what would happen if you offloaded the rendering of a movie like
Monsters University to something like Google Compute Engine. Would it cost an
arm and a leg? Or would it solve a lot of scaling/cost/time challenges?

~~~
wisty
I think Pixar will do low-resolution runs all through the day (so people can
get immediate feedback on what they are doing), and a high resolution run
overnight (so they can see the current product). If you're at close to 100%
utilisation, and have a large cluster, renting hardware is very cost
inefficient.

------
masklinn
> ray tracing is a relatively advanced CG lighting technique

Not really correct. Ray tracing is more like the simplest CG lighting
technique you could imagine, but it's so incredibly computationally expensive
that people have mostly been waiting for the hardware to be good enough for
the last 50 years.

And in the meantime, they've been using an incredible pile of hacks and
tricks to try and approach levels of visual quality and complexity that are
trivial on a raytracer, except said pile of hacks could actually be computed
before the heat death of the universe.

------
aaronbrethorst

        I was surprised that ray tracing in Pixar
        was historically a clunky, haphazard process.
        I always thought of it as this smooth, polished
        machine like something you would see at an Apple store.
    

Life inside the sausage factory never quite looks like what outsiders would
expect.

~~~
KaiserPro
A lot of the tools in pixar are wonderfully clunky.

They went through a phase of hiring hip young things fresh out of MIT to write
tools. Instead of nice friendly tools that play well together, they got a lot
of domain specific languages.

~~~
yardie
It's been this way in the visual effects industry for decades. Most of the
tools are used through a POSIX layer with Perl to keep things tidy. Going into
the VFX forums even a few years ago, I was amazed at illustrators knowing
enough Perl and Python to get the job done, because, "Designers can't code
good!"

Even the commercial stuff looks like some graduate student's thesis work: a
basic Java GUI and 50+ command-line arguments.

------
berkut
This is wrong actually - they've been using ray tracing for ages (all ambient
occlusion in PRMan is done with raytracing, by sending out occlusion rays in a
hemisphere around the shading point).
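A sketch of that hemisphere-sampling idea (the toy sphere occluders and the fixed +z normal are assumptions for the example; this is not PRMan's implementation):

```python
import math
import random

def ray_hits_sphere(origin, direction, center, radius):
    """Return True if the ray origin + t*direction (t > 0) hits the sphere."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return False
    root = math.sqrt(disc)
    return (-b - root) > 1e-6 or (-b + root) > 1e-6

def ambient_occlusion(point, occluders, num_rays=256, seed=0):
    """Fraction of the hemisphere above `point` (normal assumed +z) that is
    NOT blocked by any occluder: 1.0 = fully open, 0.0 = fully occluded."""
    rng = random.Random(seed)
    unblocked = 0
    for _ in range(num_rays):
        # Uniformly sample a direction on the hemisphere around +z.
        z = rng.random()
        phi = 2.0 * math.pi * rng.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        d = (r * math.cos(phi), r * math.sin(phi), z)
        if not any(ray_hits_sphere(point, d, c, rad) for c, rad in occluders):
            unblocked += 1
    return unblocked / num_rays
```

The occlusion rays never fetch color, only hit/miss, which is why AO was a practical early use of ray tracing long before full global illumination was.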

What's new with MU is that they're using both physically plausible shading
(where the shading is based on physically-based BRDF lighting algorithms,
which gives much more realistic results) and global illumination path tracing
for the entire light transport equation.
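The light transport equation referenced here is Kajiya's rendering equation, which expresses outgoing radiance at a point as emission plus the BRDF-weighted integral of incoming radiance over the hemisphere:

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\,
    (\omega_i \cdot n)\, \mathrm{d}\omega_i
```

Path tracing estimates that integral with random samples, recursing on each bounce; physically based BRDFs are the f_r term.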

~~~
ubercow13
They explain the extent to which they used to use raytracing in the article -
is it misrepresented? It's in a direct quote so I doubt it.

~~~
berkut
Yes, because the article (and the person in it) doesn't seem to understand
the difference between path tracing (ray tracing with global illumination -
multiple bounces even off diffuse surfaces) and plain ray tracing (sending
rays around a scene and bouncing them off specular reflective/refractive
surfaces), which Pixar has been doing for years. It's been possible to write
raytraced shaders in PRMan for over 12 years now.

~~~
splt
Cars was in production about 10-12 years ago. Maybe you're thinking Cars 2?

~~~
berkut
No, the guy in the article was wrong - it's been possible to do in PRMan since
at least 2000, but it was very slow (they didn't have any decent acceleration
structures for the ray intersection), so it generally wasn't used that much.
But it was possible.

For PRMan 13 (which Pixar used for Cars in 2006), they added semi-decent
acceleration structures which sped up raytracing a bit. But you still had to
use custom shaders to cast rays.

With PRMan 17, ray tracing is now a first-class citizen in PRMan, and it can
also trace rays from the camera instead of doing the traditional (pre 17)
REYES rasterization of the surface and then shading that surface for
reflection based on ray tracing.

------
joshuak
Actually, there's some incorrect information here: Pixar started using ray
tracing in films as far back as A Bug's Life. I can't find a picture online,
but you can see it in the scene with the glass bottle. That was done by
integration with a separate renderer, but since then RenderMan has added
support for GI and other ray tracing features.

I'm sure this update is significant, and it sounds like a ground-up reworking
of the engine, perhaps replacing REYES? But it's not at all accurate to say
that Pixar is moving to ray tracing. Pixar has been in that neighborhood for
more than a decade.

~~~
gdubs
Yep. A "Ray Server" technique, where PRMan farmed out trace() calls to BMRT,
was used in "A Bug's Life" for the glass bottles at the grasshopper's
hideout. All other reflections and refractions were done using
'standard-issue' environment maps.

Source: Apodaca & Gritz, Advanced RenderMan, Morgan Kaufmann, 2000.

------
mratzloff
I was wondering what Toy Story would look like if they re-rendered it with
today's technology. Well, it turns out they already did! It was part of the
theatrical re-releases of Toy Story and Toy Story 2 in 3D.

[http://www.bigscreenanimation.com/2008/09/toy-story-re-
relea...](http://www.bigscreenanimation.com/2008/09/toy-story-re-releases-
being-re-rendered.html)

Were these re-rendered versions released in 2D on Blu-ray?

~~~
baby
I wonder if you really see a difference.

~~~
mackwic
You won't see it, but for sure you will perceive it! ;)

~~~
devindotcom
Well put!

------
newgre
An article about a new dimension in rendering quality, and they're
demonstrating that quality with two images the size of a stamp...WAT?

~~~
uptown
Larger versions:

[http://bluraymedia.ign.com/bluray/image/article/104/1044870/...](http://bluraymedia.ign.com/bluray/image/article/104/1044870/monsters-
inc-20091112044224899-000.jpg)

[http://pixartimes.com/wp-content/uploads/2012/08/Monsters-
Un...](http://pixartimes.com/wp-content/uploads/2012/08/Monsters-University-
Research-1.png)

~~~
kyberias
Wow! There's actually a huge difference! Not that I'm really surprised but
still.

~~~
ginko
You also have to consider that Monsters Inc. was one of the earlier Pixar
movies.

The contrast to Brave isn't that extreme.

------
mtgx
From what I gathered from John Carmack, ray tracing is done much more
efficiently with voxels than with triangles, so hopefully this will push game
engine companies to incorporate voxels sooner into their engines, too.

~~~
angersock
"Let's add voxels!!!111one" is one of the most irritating refrains of gamers.

There are a lot of very nice mathematical properties of triangles, and the
trend towards graphical fidelity has really only served to balloon budgets in
the gaming space. We don't need voxels, and honestly neither do we need ray
tracing.

It's not even new tech--games going back to _Outcast_, some Build engine
games, Novalogic stuff, and so on have used voxels. Ray tracing has been used
in a handful of nifty tech demos, but otherwise nobody cares.

Ray tracing and voxel tech is of marginal utility for games, and things have
moved on--it'd be like switching the US construction industry over to metric;
too late to make a difference and too minor to matter.

EDIT: In spite of all this, voxel cone tracing looks sexy as hell though.

~~~
criley
Why do people assume that voxels and polygons don't play well together?

There are voxel engines now that store world data using voxels while what the
player actually sees is polygons based on that voxel data.

I'm reminded of one guy's side project[1] that accomplishes exactly that.

[1][http://procworld.blogspot.com/2012/12/videos-of-caves-and-
bu...](http://procworld.blogspot.com/2012/12/videos-of-caves-and-
buildings.html)
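A minimal sketch of that voxels-in, polygons-out idea (naive per-face extraction; real engines use greedy meshing or marching cubes, and the data layout here is an assumption for the example):

```python
def voxels_to_quads(voxels):
    """Convert a set of filled voxel coordinates into a list of boundary
    quads: one quad per voxel face that is not shared with a filled
    neighbour.  The voxel set is the world data; the quads are what the
    GPU would actually draw."""
    neighbours = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                  (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    quads = []
    for (x, y, z) in voxels:
        for (dx, dy, dz) in neighbours:
            if (x + dx, y + dy, z + dz) not in voxels:
                # Record the exposed face as (voxel position, outward normal).
                quads.append(((x, y, z), (dx, dy, dz)))
    return quads
```

A lone voxel yields 6 faces; two adjacent voxels yield 10, because the shared face is culled. That culling is why storing voxels and rendering polygons coexist happily.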

~~~
IanChiles
There's an awesome library that goes by the name of Polyvox that does exactly
this. It's worth a look if you're interested in voxel-y things.

------
glut
What sort of raytracing are they using?

Are they going all the way to an unbiased global light transport algorithm
(like LuxRender) or just using basic raytracing (like PovRay)?

Are they using an existing renderer? If not, are they releasing their own like
they did with RenderMan?

Are they rendering with CPUs or with GPUs?

How much time per frame does it take them with how many cores of what sort?

~~~
joshuak
Interesting questions. I don't have the answers, but I can say that I can't
imagine that Pixar would ever 'go all the way' to an 'unbiased' GI renderer.

Our entire function in the filmmaking business is to tell a story visually,
and for that you need complete control and directability of the image. This
is the opposite of the goal of unbiased renderers. Nevertheless, more tools
in the toolbox are always good.

~~~
berkut
Unbiased actually isn't slow - Arnold has proved this - what's slow is using
hundreds of bounces per path and bi-directional path tracing (like Maxwell,
Indigo, LuxRender) which takes a lot longer.

Biased generally means it's interpolated with a point or irradiance cache.

------
gfodor
Wow, I had always assumed Pixar was doing GI/ray tracing by now. Looking
forward to seeing MU to check out the graphics porn.

~~~
KaiserPro
Ironically, RenderMan has had global illumination and ray tracing for over
ten years.

Having said that, what RenderMan offers and what Pixar Animation actually
does/uses are rather orthogonal. For example, Pixar made heavy use of
subsurface scattering in The Incredibles, something that was at the time
rather time-intensive.

The title is a bit misleading, as they've been using ray tracing for years.
[1]

[1] <http://graphics.pixar.com/library/RayTracingCars/paper.pdf>

------
RivieraKid
Btw, there's a pretty impressive real-time ray tracer in development:
<http://raytracey.blogspot.cz/>

------
pygoscelis
Here's a paper from last summer on how Pixar is doing the computations for
global illumination more efficiently with a "multiresolution radiosity caching
method." <http://graphics.pixar.com/library/RadiosityCaching/paper.pdf>

I don't know much about graphics but maybe some of you will find it
illuminating (no pun intended....)

------
ChuckMcM
I would love them to re-generate Brave with ray tracing; I'm guessing,
though, that it would take too many resources even for a Blu-ray edition.

~~~
cscheid
I think the worst part is the amount of man-hours it would take, not compute
hours (but maybe that's what you meant?). If the change is as deep as it
sounds like, the entire tooling support changed, so all scenes would have to
be re-lit by artists, not-entirely-from-scratch but worse than you think.

REYES-rendered scenes that fake global illumination are pretty arcanely
hacked together. Just making a legacy scene ray-traced would make it look
_worse_, not better.
not better.

~~~
ChuckMcM
Absolutely, which is why it won't happen. You'd have to actually place lights,
remove the fakes, and render check every scene. Basically the only part of the
movie you _wouldn't_ have to do is come up with dialog, sound, and geometry.
(not to mention I have never met a movie person who, given a chance to reshoot
a scene says "Yeah, it was perfect when I shot it, the only changes here are
mechanical." :-)

------
planckscnst
I'm guessing the disco ball scene is Pixar flexing their muscles with the new
tech. It's pretty impressive.

------
38leinad
Does anybody know how other companies like Universal's animation department
(Ice Age, Despicable Me, ...) stand technology-wise? From the visuals, I
always assumed Pixar was setting the standard, but now, knowing that they're
just starting to use unified ray tracing, the gap might not be that big...

~~~
Ricapar
Why does it matter that much what technology is in the backend? You lead the
industry with results, not with the means to get to those results.

If I can make my webapp better (however you define "better") than my
competitors' using PHP and MySQL while they're making theirs using Ruby on
Rails, MongoDB, etc., does the tech stack in the background matter, aside
from making a nice article?

~~~
berkut
Yes, it makes a lot of difference.

There's the obvious render time, but actually render time isn't that important
- studios are happy to wait up to 30 hours for a 4k frame on the farm if
that's what it takes for a shot. But they don't want artists waiting around,
so they want very quick iterations and previews of what the artists are doing,
as it's the artists who cost money.

This is why global illumination has taken off over the last 5/6 years (thanks
largely to SPI and Blue Sky showing it could be done): although the render
times are slower, it means lighting the scene (with physically-based shading)
is much quicker, and you don't need as many hacks as you did with PRMan (light
maps, shadow maps, reflection maps, point caches, etc.). You can literally
model scenes with lights as they are in the real world.

On top of this, there's how easy it is to do very complex shots and change
just bits of it - tools like Katana allow hugely complex scenes to be managed
and rendered very efficiently, with very little work from artists. Studios who
don't have similar tools often duplicate and waste a lot of time doing things
that should be easy.

For example, Weta on Iron Man 3 wasted a lot of time doing all the different
suits, as they didn't have a decent asset-based pipeline that would have
allowed them to re-use a lot of shaders and assets for each suit.

------
asperous
Now that Pixar is only ray tracing, can real time move to scanline? I'd really
like to see a lot more smooth shapes and complicated geometry in games if
possible.

~~~
wtracy
That's pretty much what games already use. The only difference is polygon
count--doubling the polygons tends to halve the framerate.

That said, much of the magic comes from a toolchain that lets artists work
with curved surfaces (NURBS or similar) and that converts those surfaces to
polygons at the last minute. We finally started getting hardware support for
that sort of thing with DirectX 11 and OpenGL 4.

------
jtchang
I suppose the holy grail is when ray tracing can be done in real time. At that
point things would look so real we might as well call it quits from real life.

------
kstenerud
_sigh_ WHY has he locked the font size down on his blog such that CMD + and -
only change the text area width and image sizes?

~~~
cheald
It works just fine in Chrome and Firefox here. Are you perhaps using an old
and/or awful browser? He's using a px unit on his font size, but Firefox and
Chrome have done full-page scaling for ages now.

~~~
kstenerud
I'm using the latest chrome for mac. I tested on a different blogspot blog and
it works perfectly there.

