
Investigating ray tracing, the next big thing in gaming graphics - evo_9
http://www.extremetech.com/gaming/135788-investigating-ray-tracing-the-next-big-thing-in-gaming-graphics
======
jpxxx
Oh ray tracing, the nuclear fusion of computer graphics. It's coming! It's
amazing! Meanwhile the industry still keeps burning coal.

For applications that deal in perfect lighting, yes it's probably a useful
technique that would benefit from hardware support. But where is the real-
world benefit for the rest of us? For most consumers, lighting provides about
2% of a game's value and 0% of everything else.

~~~
norswap
I, for one, believe it would considerably simplify the implementation of
graphics engines. To make rasterization look good you have to do a lot of
optimizations and go through a very long pipeline. You don't have this
problem with ray-tracing. Plus the same base technique (ray-tracing) can be
used for a lot of problems in graphics: visibility checking, lighting,
reflections, diffraction, motion blur, ...

~~~
jpxxx
I've heard as much too. It'd be interesting to see today's best raster effort
put against the same game assets in a ray-traced engine with a real-world game
going on.

Anything can look good when you've got 10 minutes and 500 processors. I want
to see what happens in 16ms. :)
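For scale, here's a quick back-of-envelope calculation (my own illustrative
numbers, not figures from the article or thread) of the ray budget a 16ms
frame implies at 1080p:

```python
# Back-of-envelope: how many rays fit in a ~16 ms frame?
# All figures below are illustrative assumptions, not benchmarks.
pixels = 1920 * 1080       # one 1080p frame
samples_per_pixel = 4      # modest anti-aliasing
rays_per_sample = 3        # primary ray plus two bounces
fps = 60                   # ~16.6 ms per frame

rays_per_frame = pixels * samples_per_pixel * rays_per_sample
rays_per_second = rays_per_frame * fps
print(f"{rays_per_second:,} rays/s")  # ~1.5 billion rays per second
```

Even with these modest settings you need on the order of a billion ray
evaluations per second, which is why the "10 minutes and 500 processors"
demos don't translate directly to real-time.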

~~~
zxcdw
Since you're concerned with frametimes, let's also add in the asset
production pipeline and effort. As has been mentioned, cutting-edge
rasterizing consists of a lengthy pipeline for every single effect
imaginable (sometimes reaching into the graphics assets too, mind you),
while raytracing does almost all of that by default (soft shadows and
global illumination are problematic, for example; path tracing fixes this
but is some two orders of magnitude slower).

The real problem with raytracing is that its asymptotic computational
complexity is far worse when you count scene objects. To compensate for
this you either lower the resolution or lower the lifetime of a ray (how
many bounces it's allowed to make).
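A small sketch of why that ray-lifetime cap matters (my own illustration,
not from the thread): in a classic Whitted-style tracer, every hit can
spawn both a reflection and a refraction ray, so the worst-case ray count
per pixel grows geometrically with the bounce limit:

```python
def rays_per_primary(max_depth, branching=2):
    """Worst-case rays traced for one primary ray when each hit can
    spawn `branching` secondary rays (e.g. reflection + refraction),
    up to a bounce cap of `max_depth`. Geometric series:
    1 + b + b^2 + ... + b^max_depth."""
    return sum(branching ** d for d in range(max_depth + 1))

for depth in (1, 2, 4, 8):
    print(depth, rays_per_primary(depth))
# depth 1 -> 3 rays, depth 2 -> 7, depth 4 -> 31, depth 8 -> 511
```

Capping the bounce depth turns an exponential blow-up into a fixed, small
per-pixel cost, at the price of losing deep reflections and refractions.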

~~~
jpxxx
Very interesting. Thank you for the additional detail!

------
blackhole
Ray-tracing as it currently stands will never dominate computer graphics,
because it is essentially brute force. Progress is being made on
reconstructing a scene from very noisy, low-sampling-rate output [1],
which is a much better indication of the future direction of raytracing,
where a series of advanced prediction and reconstruction algorithms are
used to efficiently render scenes, rather than simply throwing more
hardware at the problem.

[1]
<http://www.tml.tkk.fi/~samuli/publications/lehtinen2012siggraph_paper.pdf>

~~~
nitrogen
I've only looked at the images in the PDF, and not the text, but could a
similar technique be applied to image demosaicing?

------
corysama
One of the best real-time raytracers around is Brigade 2:
<http://www.youtube.com/watch?v=QF2c1uHiYyY>
<http://www.youtube.com/watch?v=Qdw1HvzKt1M>

It could really use a third GTX 580 to run the reconstruction filter that
blackhole linked, but it's getting better at a steady rate.

Meanwhile, the voxel cone tracing technique mentioned in the article
(<http://www.youtube.com/watch?v=fAsg_xNzhcQ>) is a great hybrid. Rasterizers
are great at primary visibility, but are terrible when the algorithm needs
non-local information. By supplementing the rasterizer with a fast-but-blurry
ray-tracer, you get most of the benefit of both techniques.

------
varelse
Increased use of ray tracing effects in games? Sign me up!

Fully ray-traced game engines? Oh no not again. Damn you Nintendo for starting
this nonsense with those faked Project Reality demos...

Graphics engines have more in common with street magic than simulation in that
the best ones use just enough physics and mathematics to get the job done and
not a bit more. Wake me up when John Carmack says ray-tracing is the next big
thing.

That said, ray-tracing has all sorts of unconventional uses outside of
strict computer graphics, e.g.:

<http://www.tomsarazac.com/tom/Bicycles/headlight-tracer.html>

~~~
Lerc
John Carmack has said a number of times that it is not the next thing, but
the thing after next (or the one after that).

The benefits of ray tracing come in precisely because it is more like
simulation than street magic. Once you get to sufficient complexity, it is too
hard to maintain an illusion. The truth is more consistent than an elaborate
lie.

We are not yet close to that level of complexity. We will be eventually.
Anyone putting a date on that time before it arrives is almost certainly going
to be wrong.

------
Xcelerate
This is exciting. Indeed, it has been "just a few years away" for a few
decades now, but at some point I'm sure it will show up.

I'm sure I'm not the only one on HN who has written a ray tracer, and once
you have, it's easy to extend it to something like Metropolis light
transport or photon mapping. I remember how amazed I was, when my code
first worked (without error), that such a small amount of programming
could achieve such realistic light simulations!
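As a taste of how little code the core of a ray tracer actually needs,
here is a minimal sketch of the standard ray-sphere intersection test (the
quadratic every toy tracer starts from; my own illustration, not anyone's
engine from this thread):

```python
import math

def hit_sphere(center, radius, origin, direction):
    """Return the nearest positive ray parameter t where the ray
    origin + t*direction hits the sphere, or None on a miss.
    Solves |origin + t*direction - center|^2 = radius^2 for t."""
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t > 0 else None

# Ray from the origin along +z toward a unit sphere centered at z = 5:
t = hit_sphere((0, 0, 5), 1.0, (0, 0, 0), (0, 0, 1))
print(t)  # 4.0 -- the near surface, one radius short of the center
```

Loop that over every pixel, bounce at each hit, and you already have the
skeleton that the fancier global-illumination methods build on.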

A website known as OMPF used to be about the only forum online for people
interested in ray tracing and global illumination. For some reason it
seems to have gone down :(

------
DanielBMarkham
I would like to see somebody take consumer gear and matrix it to the point
where it supports real-time fully photorealistic ray-tracing.

Yes, it would be crazy expensive. Yes, it would be impractical. But it
would show that a) it is possible to create without inventing anything
new, and b) it costs $X. Leave Moore to figure out the rest.

The industry could use a benchmark to tell us just how far away this is and a
model of how to get there without creating a billion-core CPU.

Also, it seems like this would be something that would be amenable to an
FPGA system. If so, it might be a lot closer than we realize. Don't know.

~~~
ChuckMcM
Uh, you mean like this ? <http://www.youtube.com/watch?v=blfxI1cVOzU>

Intel was pushing their Larrabee architecture as the answer for a while.

~~~
DanielBMarkham
Actually no. I've looked at a couple of these videos, and all they show is
that current technology can be pushed to somewhat approximate realtime ray-
tracing. Here's another example of that:
<http://www.youtube.com/watch?v=zbokPe4_-mY&feature=related>

So I'll spec it out. HD video, 30fps, editable with in-world physics.

Don't see anything off-the-shelf that's close to that today. My question
was whether it's possible to hack something together, and if so, what the
configuration/cost would look like.

Perhaps the answer right now is "no". That's cool. Just wondering.

~~~
ChuckMcM
It is certainly possible to hack something together; the national labs
have done this for a while, using off-the-shelf parts to create a 'video
wall' that lets them visualize their simulations. Applying that technique
to the linked example suggests you could use somewhere between 12 and 16
'copies' of their setup, each driving a tile of your HD display.

'In-world physics' is well handled by things like the PhysX engine that
nVidia sells.

------
kybernetikos
I find it strange that people talk about ray tracing as if it is _the
solution_ when in fact it's not even the best of the approximations to the
rendering equation that we have. For example, caustics can't be adequately
represented with ray tracing.

Photon mapping, radiosity, etc can give much closer to photorealistic results.

