
Introduction to Nvidia RTX and DirectX Raytracing - mstich
https://devblogs.nvidia.com/introduction-nvidia-rtx-directx-raytracing
======
hyperpallium
Raytracing is great, but are we really close enough to get reasonable quality
in realtime?

My feeling is that, like VR and strong AI, we are still a ways off.

OTOH, even if raytracing just gives very low-quality, but acceptable results,
it could be a great way to sell more powerful GPUs. The thing is, for low-
quality to be accepted, it must be better at _something_ else. (Like,
smartphones were less powerful than PCs, but portable and convenient).

What does raytracing have going for it, even without throwing Pixar-scale
resources at it?

E.g., without the need to fine-tune layers of hacks, it might be better in
dynamic, non-designed environments; or it could let graphics be put together
with less (expensive) expertise and time.

~~~
fwilliams
Having first class ray tracing primitives on the GPU gives us an easy way to
implement a wide variety of effects, even with a relatively small number of
rays. While we are certainly far from getting real time performance in
generating images comparable to those made by offline rendering techniques,
being able to trace rays into the scene efficiently does allow for new degrees
of realism in real time renderers.

For example, diffuse indirect radiance tends to be low frequency in
space, so we can construct good approximations of it using only a few samples
(see for example, voxel cone tracing). Tracing rays is an easy way to sample
the geometry of the scene and construct a diffuse GI approximation.
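A minimal sketch of that idea in Python (the `trace_radiance` callback is a hypothetical stand-in for an actual ray cast into the scene): because the samples are cosine-weighted, the cosine and pdf terms cancel and the Monte Carlo estimator is just the average of the sampled radiance.

```python
import math
import random

def cosine_sample_hemisphere(u1, u2):
    # Map two uniform samples in [0,1) to a cosine-weighted
    # unit direction about the +Z axis (the surface normal).
    r = math.sqrt(u1)
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(max(0.0, 1.0 - u1)))

def estimate_diffuse_gi(trace_radiance, n_samples=16):
    # Monte Carlo estimate of incoming diffuse radiance at a point.
    # With cosine-weighted sampling, cos(theta)/pdf cancels, so the
    # estimator reduces to the mean of the sampled radiance values.
    total = 0.0
    for _ in range(n_samples):
        d = cosine_sample_hemisphere(random.random(), random.random())
        total += trace_radiance(d)
    return total / n_samples
```

Because the diffuse signal is smooth, even a small `n_samples` gives a usable estimate; the expensive part is what `trace_radiance` has to do per ray, which is exactly what a hardware ray tracing primitive accelerates.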

Proper reflections and refractions are also not really possible with
rasterization pipelines. Game engines do a lot of hacks to approximate
reflections and don't really do proper refractions at all. With a ray tracing
primitive, these effects become possible. For a good example of what a modern
rasterization engine does see
[http://www.adriancourreges.com/blog/2016/09/09/doom-2016-graphics-study/](http://www.adriancourreges.com/blog/2016/09/09/doom-2016-graphics-study/).

Rasterization is very good at generating the visible geometric information in
the scene, which serves a purpose, but for general-purpose lighting we need to
be able to sample what's going on in non-visible parts of the scene.

I think having GPU support for ray tracing is exciting as a complement to
existing rasterization techniques: You can generate the geometric information
and attributes of your visible objects with rasterization and use ray tracing
to sample your scene and get accurate light transport effects like reflections
and global illumination.

~~~
hyperpallium
It doesn't hurt to have raytracing primitives with efficient implementation.

I see that GI and ambient occlusion are generally low frequency in space, but
don't reflections and refractions require higher fidelity?

Uh, I'm thinking of mirror-like reflections, but there are softer reflections,
like off a matte red wall (which seems GI-like).

~~~
fwilliams
Caustic reflections are very high frequency in space and for non-trivial
geometry likely won't work well in real time (unless you fake it somehow).

That said, just rendering mirror surfaces or refractions doesn't require more
than an extra ray per interaction (one in the reflection direction and one in
the refraction direction). You can add a texture and get bumpy reflections or
refractions as well.
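That one extra ray per interaction really is just a direction computation. A sketch in Python (plain vector math on 3-tuples; no actual tracer is assumed, and the unit-vector conventions are my own):

```python
import math

def reflect(d, n):
    # Mirror reflection of incident direction d about unit normal n.
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2.0 * dot * b for a, b in zip(d, n))

def refract(d, n, eta):
    # Snell's law. d: unit incident direction, n: unit normal facing the
    # incident side, eta: ratio of refractive indices n1/n2.
    # Returns None on total internal reflection.
    cos_i = -sum(a * b for a, b in zip(d, n))
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: no transmitted ray
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * a + (eta * cos_i - cos_t) * b for a, b in zip(d, n))
```

At each hit you spawn at most one ray in each of these two directions, which is why mirror and glass surfaces are cheap compared to caustics.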

The difference between these easy effects and the high spatial frequency
caustic effects is indirect versus direct lighting. It's very easy to ray
trace a a simple transparent model but a bright spot on the floor (e.g. [0])
below a reflective objects requires a lot of ray samples to capture directly.

[0]
[http://news.povray.org/povray.binaries.images/attachment/%3C3d0337be%40news.povray.org%3E/Caustic-Ring.jpg](http://news.povray.org/povray.binaries.images/attachment/%3C3d0337be%40news.povray.org%3E/Caustic-Ring.jpg)

------
tapirl
What are the differences to Intel Larrabee?

~~~
thechao
LRB & DX Raytracing are completely different beasts. LRB was a many-core
device with some GPU features (the addsets and fmaddXYZ instructions;
texture units at the turnstiles). DX Raytracing is a high-level software API
for driving specific classes of compute and/or fragment shaders onto plain-
old GPUs. I believe the intention is to have dedicated raytracing HW support
in the future; I don't think dedicated HW is currently available.

------
mishurov
It's a typical raytracing / pathtracing pipeline: ray intersection accelerated
via a BVH or k-d tree, plus evaluation of BxDFs at the intersection points.
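The core of that acceleration is a cheap ray-vs-bounding-box test repeated down the tree. A minimal slab-test sketch in Python (my own illustration, assuming nonzero direction components and leaving out the actual tree traversal and triangle tests):

```python
def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    # Slab test: intersect the ray with each pair of axis-aligned
    # planes and keep the overlapping parameter interval [t_near, t_far].
    # inv_dir holds 1/d per axis, precomputed once per ray.
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t0 = (lo - o) * inv
        t1 = (hi - o) * inv
        if t0 > t1:
            t0, t1 = t1, t0
        t_near = max(t_near, t0)
        t_far = min(t_far, t1)
        if t_near > t_far:
            return False  # slabs don't overlap: ray misses the box
    return True
```

A BVH traversal just runs this test at each node and only descends into children whose boxes are hit, which is what turns per-ray intersection from O(n) over all primitives into roughly O(log n).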

It would be great if they made an API for the ray intersection routines
available outside Microsoft's DirectX; it could be very useful in
post-production renderers.

------
smaili
Link to the Microsoft announcement --
[https://news.ycombinator.com/item?id=16620423](https://news.ycombinator.com/item?id=16620423)

