
Ray Tracing Is No New Thing - fallingbinary
https://www.bytecellar.com/2018/08/31/ray-tracing-is-no-new-thing/
======
jdmoreira
Sure, but real-time raytracing at home is quite a new thing and that's what
the hype is about. Right?

~~~
georgeecollins
Doom was real time ray traced.

~~~
delinka
definitely not.
[https://doomwiki.org/wiki/Doom_rendering_engine#Rendering](https://doomwiki.org/wiki/Doom_rendering_engine#Rendering)

~~~
georgeecollins
You are right-- sorry

------
vvilliam0
Obviously. Good article but clickbait title. "A History of Ray Tracing" would
have done the job.

~~~
blakespot
What prompted me to write the article was running across several tech news
pieces declaring ray tracing to be a new technique in their coverage of the
RTX hardware. My purpose in writing it was to clarify that ray tracing is not
new, but real-time ray tracing (generally) is.

------
rollulus
I'd say that it is rather obvious that ray tracing is not a new thing, since
it simulates how light physically behaves.

I see 3D rendering as a spectrum: rasterization requires little computation
but has little to do with physics; ray tracing requires a lot of computation
and has everything to do with physics. Somewhere in between are hybrid
methods: rasterization with ray tracing components added, or ray tracing with
approximations.

For instance, pure rasterization cannot do shadows. They are approximated by
rendering the scene from the viewpoint of a light and testing the rasterized
result for occlusions that cast shadows. And the other way around: real-time
ray tracing cannot compute all indirect lighting paths; only a subset is
considered, at the cost of e.g. increased variance.
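In ray tracing, shadows fall out of the same intersection test the primary rays already use: cast a "shadow ray" from the surface point toward the light and check whether any occluder lies in between. A minimal sketch of that idea (toy sphere scene, illustrative names, Python for brevity):

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Nearest positive hit distance t along a unit-length ray, or None.
    Solves |origin + t*direction - center|^2 = radius^2 for t (a = 1)."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def in_shadow(point, light_pos, occluders):
    """Cast a shadow ray from a surface point toward the light; the point
    is shadowed if any occluder sphere sits between it and the light."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    for center, radius in occluders:
        t = ray_sphere_t(point, direction, center, radius)
        if t is not None and t < dist:
            return True
    return False
```

A point directly below a sphere that sits between it and the light is reported as shadowed; move the point out from under the sphere and the shadow ray reaches the light unobstructed.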

~~~
mbel
> For instance, pure rasterization cannot do shadows.

Well... the technique that you describe (shadow mapping) is actually _pure_
rasterization; it just requires more than one rasterization pass. This also
ignores the fact that there are other techniques for getting shadows in
rasterizing renderers (stencil volumes and other approaches that are
considered rather historical today).

I get your point that rasterization doesn't support shadows "naturally" the
way ray tracing does, but in my opinion your wording and example are rather
unfortunate. The same goes for reflections. I would say SSS (subsurface
scattering) or caustics are probably better examples, since they really are
only done with techniques based on ray tracing.
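The two-pass depth test at the heart of shadow mapping is easy to sketch: pass one rasterizes depth as seen from the light, pass two compares each shaded point's depth against that map. A toy version, assuming a directional light shining straight down (so light-space coordinates are just snapped x, y; names and scene are illustrative):

```python
SHADOW_MAP_SIZE = 8
BIAS = 1e-3  # depth bias to avoid self-shadowing ("shadow acne")

def texel(x, y):
    # Map world x, y in [0, 1) onto shadow-map texel coordinates.
    return (int(x * SHADOW_MAP_SIZE), int(y * SHADOW_MAP_SIZE))

def build_shadow_map(points):
    """Pass 1: for each texel keep the depth of the surface closest to
    the light (largest z, since the light shines down from above)."""
    smap = {}
    for (x, y, z) in points:
        key = texel(x, y)
        if key not in smap or z > smap[key]:
            smap[key] = z
    return smap

def is_lit(point, smap):
    """Pass 2: a point is lit iff nothing in the shadow map is closer
    to the light over the same texel (within the bias)."""
    x, y, z = point
    closest = smap.get(texel(x, y))
    return closest is None or z >= closest - BIAS
```

Both passes are plain rasterization; no rays are ever cast, which is the point being made above.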

~~~
swerner
SSS has been done with point clouds for a long time. Only recently have films
switched to ray tracing for SSS.

Games these days implement SSS in screen space using rasterisation; no ray
tracing there either.

------
bitL
Can someone from AMD/NVidia please jump in and explain the difference between
Radeon Rays (2016?) and RTX (2018)? Thanks!

~~~
ksec
And if someone from the game development community could explain how hard it
is to do RTX or DirectX RT on current games? Would companies now have to do
two sets of graphics design and code paths, one that does it with DirectX RT
and the other doing it with all the rasterisation techniques such as shadow
mapping, etc.?

Basically I am interested in the cost of such a DirectX RT implementation,
and what sort of time frame we could see these on the market in.

~~~
pjmlp
Naturally multiple code paths are required, as always.

~~~
ksec
Yes, but it means another code path to test and optimise, and another set of
graphics design assets for RT. As if current AAA title budgets are not large
enough. Instead of making quality games cheaper, we are making graphics-
intensive games even more expensive to build, and I don't think that is
healthy at all.

------
arayh
Gamers have long been clamoring about how they can't wait until in-game
graphics match those of pre-rendered cinematics (that said, a lot of
cinematics these days are no longer pre-rendered). Ray tracing is such an
expensive operation. According to this Quora answer, it took 29 hours to
render a single frame of "Monsters University": [https://www.quora.com/How-
long-does-it-take-to-render-a-Pixa...](https://www.quora.com/How-long-does-
it-take-to-render-a-Pixar-film)

We're probably nowhere close to getting real-time Pixar-quality rendering in
our games right now, but we've definitely made leaps and bounds over the last
few decades.

~~~
learc83
That's because it's a moving goalpost. Pixar keeps increasing the quality to
take advantage of faster hardware and better algorithms.

We'll never get real-time rendering at today's Pixar quality until their
rendering is good enough that they stop meaningfully improving it. What we
can have is yesterday's Pixar-quality rendering in-game.

~~~
MisterTea
That should be the new goalpost: can it render a Pixar film in real time?
E.g., do we have a Toy Story-capable card yet?

(Note: I know offline rendering and the real-time raster rendering we use on
GPUs are completely different methods. But there is a point where the raster
trickery can catch up and match the offline stuff.)

~~~
gmueckl
Today's GPUs are probably much closer to what Renderman was doing back when
Toy Story was made than you think. They used an algorithm called REYES, which
has nothing to do with ray tracing and in fact can only barely be made to
combine with ray tracing at all [1]. It was completely thrown out of
Renderman only in the last couple of years for that reason.

REYES really is an early take on rasterization with tessellation, designed
for hardware with extreme memory constraints. Although the actual
tessellation algorithm works differently from GPU hardware tessellation, the
basic idea of tessellating dynamically to the required level of detail for
the current frame carried over into the hardware.

[1]
[https://en.m.wikipedia.org/wiki/Reyes_rendering](https://en.m.wikipedia.org/wiki/Reyes_rendering)
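The adaptive idea described above, splitting geometry until each piece covers no more than about a pixel on screen, can be sketched in a few lines. A toy 1D version, assuming a fixed parameter-to-screen scale (names are illustrative, not actual REYES terminology):

```python
def dice(u0, u1, screen_scale, max_px=1.0):
    """Recursively split the parameter interval [u0, u1] until each
    sub-span projects to at most max_px pixels on screen, mimicking
    REYES-style dicing to the level of detail the frame needs."""
    width_px = (u1 - u0) * screen_scale
    if width_px <= max_px:
        return [(u0, u1)]
    mid = 0.5 * (u0 + u1)
    return (dice(u0, mid, screen_scale, max_px) +
            dice(mid, u1, screen_scale, max_px))
```

A span covering 8 pixels gets diced into 8 pixel-sized pieces, while the same span seen from far away (covering one pixel) is left whole, which is the frame-dependent level-of-detail behavior that carried over into GPU tessellation.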

------
bni
I'm an Amiga fan, but in what way is the Juggler demo relevant here at all?
Ray tracing and storing the 2D result was surely done much earlier on
workstations from SGI, Sun, etc.

~~~
blakespot
I wrote this article. The Juggler was the first time I'd seen ray tracing on
a computer, and watching these pre-rendered animations on the Amiga was one
of the thrills of the system: it surpassed all consumer micros of the day
graphically, as far as on-screen colors. The Amiga's Hold-And-Modify (HAM)
mode could render the full 4096-color palette on-screen and was very well
suited to displaying ray-traced scenes, with their realistic coloring and
shading. It could do so at a resolution of 320x400 (4:3 aspect) and at a
sufficient framerate, given the flexibility and power of the Amiga's blitter
and memory architecture. As such, it seemed worth a mention here.

------
Coffeewine
The article concludes with a succinct TL;DR, but it's worth a read if you're
at all interested.

 _So, ray tracing. It’s a rendering technique that has been around for over 45
years. It’s nothing new. Finally seeing the benefits of this technology
enhance the environments in our games and VR worlds — in real time — thanks to
a new API and dedicated consumer hardware, that’s the New Thing._

~~~
geforce
We learned POV-Ray in high school computer class, rendering on our faithful
Sun Microsystems workstations. Good times.

