

Six Myths About Ray Tracing - orangeduck
http://theorangeduck.com/page/six-myths-about-ray-tracing

======
trimbo
Hi, ex-visual effects guy here. You didn't really convince me that these are
"myths", and you get a few things wrong in the process. I'll hit on a few
highlights.

"Ray Tracing, which has major issues with memory and cache"

Memory is impossible to make generalizations about with rendering but there
are situations where ray tracing can use _far less_ memory than scanline.

Consider if you have a NURBS surface you'd like to render. With a raytracer,
you can solve the ray to surface directly. With a scanline renderer, you must
tessellate that surface. Or consider a simple sphere. The memory used to
define a sphere can be 4 floats -- xyz position and radius. Solving a
ray-sphere intersection is as simple as it gets. With a scanline
renderer...it's a lot more. And say you have a velocity and want to motion
blur it. Raytracer: 7 floats, stochastic sample the rays in time. Scanline
renderer...it's even more RAM.
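To make the sphere point concrete, here's a minimal sketch (my own, not from the article or any particular renderer) of why "4 floats" is enough for a raytracer: the whole intersection reduces to solving one quadratic in the ray parameter t.

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the nearest positive hit distance t along the ray, or None.

    The sphere is fully described by 4 floats: center (x, y, z) + radius.
    We solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    sq = math.sqrt(disc)
    # Prefer the nearer root, but skip hits behind the ray origin.
    for t in ((-b - sq) / (2.0 * a), (-b + sq) / (2.0 * a)):
        if t > 1e-9:
            return t
    return None
```

A scanline renderer, by contrast, would have to dice that same sphere into many triangles before it could draw anything, which is where the memory difference comes from.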

RenderMan is not a pure scanline renderer anymore. It has had raytracing for a
long time.

"Almost all animated movies are rendered using RenderMan"

No, all of Pixar's are. That's not "almost all". Dreamworks uses a proprietary
renderer. Blue Sky uses a pure raytracer for their movies (Ice Age 1, 2, 3,
Robots, Horton Hears a Who). I'm sure others use Mental Ray.

"Without going into the technical details..."

If you're interested in the graphics space, you should really learn a lot more
about how and when raytracing is useful and efficient. I've used pure
raytracers and pure scanline renderers in production, and I by far preferred
the pure raytracer. A compromise between the two is best in general.

~~~
erichocean
Sony is using Arnold for their animated films (a path tracer).

~~~
trimbo
Cool, thanks. I wanted to give them a shout-out but haven't followed the
business enough lately to know what they use.

------
agumonkey
Pixar added raytracing to RenderMan; PRMan is still REYES-based, but it has
raytracing, IIRC.

btw, why no mention of other rendering approaches? Stochastic MLT, frameless
rendering? Raytracing seems a bit old-fashioned to me.

------
voxx
you can stop posting this now. it wasn't popular the first time, just let it
die.

~~~
orangeduck
As the submission hit the front page the site went down and I had to kill the
link to prepare the server for a second (I couldn't even SSH in). Spamming
wasn't my intention.

~~~
its_so_on
I can't see the article, so this is not a response.

My favorite FACT (not myth) about raytracing:

 _the real world is ray-traced._

~~~
coderdude
I could be off the mark here but I'm fairly sure the world isn't ray traced.
Ray tracing involves projecting light from the "eye" towards the scene,
whereas in real life the light is being projected from the scene towards the
eye.

By the way I was able to read the article. One thing I'd like to point out is
that ray tracing lends itself better to parallelization because you can
calculate each pixel's color independently of the others. That could be a
major plus in the future that helps it win over scanline rendering.
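That per-pixel independence can be sketched in a few lines (a toy example of mine, not real renderer code): every pixel is shaded from its own coordinates alone, so the image maps cleanly onto a worker pool.

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 4, 4  # tiny toy framebuffer


def shade(pixel):
    # Stand-in for tracing a primary ray. The key property: the result
    # depends only on this pixel's coordinates, never on its neighbors.
    x, y = pixel
    return (x + y) % 256  # hypothetical grayscale value


def render_parallel():
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    # Threads shown for simplicity; a real tracer would use processes,
    # SIMD, or a GPU, but the embarrassingly-parallel shape is the same.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(shade, pixels))
```

Because there are no inter-pixel dependencies, the parallel result is identical to a sequential loop over the same pixels.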

~~~
buff-a
>Ray tracing involves projecting light from the "eye" towards the scene,
whereas in real life the light is being projected from the scene towards the
eye.

I can't write a decent reply to this without having terrifying flashbacks of
late night cramming for third year physics exams. But basically, raytracing's
idea that "if it didn't hit your eye, who cares" is somewhat backed up by
theory =)

~~~
toemetoch
You mean the law of reversibility?

~~~
buff-a
I was thinking more along the lines that without the eye, the idea of light
being projected in any direction at all has no meaning =) And then we could
get started on the concepts of time and ordering.

But since I'm no longer a third year physicist contemplating the nature of
existence, I just simulate light statistically. And sometimes that means doing
things like "Photon Mapping", which is precisely raytracing, but the kind of
raytracing you get when you know how to optimize an algorithm implemented as a
computer program.

[http://en.wikipedia.org/wiki/Double-slit_experiment#Delayed_...](http://en.wikipedia.org/wiki/Double-slit_experiment#Delayed_choice_and_quantum_eraser_variations)

[http://en.wikipedia.org/wiki/Interpretations_of_quantum_mech...](http://en.wikipedia.org/wiki/Interpretations_of_quantum_mechanics)

