
Interactive real-time raytracing in WebGL - sxp
http://www.jonathan-olson.com/tesserace/tests/3d.html
======
yuriks
For anyone who suffers from terribly long load times/freezing when compiling
the shader, this is caused by the ANGLE wrapper Firefox and Chrome use to
convert the WebGL calls to Direct3D 9. You can get better results by disabling
it: [http://www.geeks3d.com/20130611/webgl-how-to-enable-native-o...](http://www.geeks3d.com/20130611/webgl-how-to-enable-native-opengl-in-your-browser-windows/)

I think the rationale for using ANGLE is to avoid driver bugs, but in my
experience it causes far more damage than the problems it solves. Practically
any page with a non-trivial shader freezes Firefox for up to a minute, or
sometimes even longer, on my machine.

~~~
hollerith
Since I am generally skeptical about "web technologies", when I read that
WebGL had landed in Firefox, I disabled it[1].

>Practically any page with a non-trivial shader freezes Firefox for up to a
minute or sometimes even more on my machine.

That makes me glad I disabled it.

[1]: In about:config, I set webgl.disabled to true.

~~~
flohofwoe
You should check whether your drivers and browsers are up to date. Most WebGL
pages work just fine these days, although some (e.g. many Shadertoy shaders)
tend to use expensive shaders that can be slow on older integrated GPUs from
Intel, or on mobile GPUs generally.

------
hiddentao
Great work. I love seeing how the renderer quickly draws the basic details and
then, ever more slowly over time, fills in the finer points. I know that's
pretty much how ray tracing works, but it's still cool to see.

------
3rd3
I’m learning about rasterization-based computer graphics at the moment, and it
somehow bothers me that all of it might be obsolete in a couple of years when
we can do path tracing in real time. It’s such a fast-moving field.

~~~
dualogy
Don't think raytracing will kill off rasterization for the crucial "first
bounce" step anytime soon, if we're talking about increasingly detailed, ever-
content-richer realtime 60FPS game renderers. Usually, as soon as RT gets semi-
real-time for simplistic scenes at a rather-too-low resolution (without AA),
consumers move on to double the previous standard resolution (quadrupling the
per-pixel workload), such as the move from FHD to retina/hiDPI, and we're back
to square one. Especially if, say, you find you need 4 split-screens,
stereoscopic, at 4k, with full shading fidelity.
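
As a back-of-the-envelope check on that resolution arithmetic (the FHD and 4K figures here are my own example numbers, not from the comment):

```javascript
// Per-second primary-ray counts at common resolutions and 60 FPS.
// Doubling each linear dimension quadruples the per-pixel workload.
function raysPerSecond(width, height, fps, samplesPerPixel) {
  return width * height * fps * samplesPerPixel;
}

const fhd = raysPerSecond(1920, 1080, 60, 1); // 124,416,000 rays/s at 1 spp
const uhd = raysPerSecond(3840, 2160, 60, 1); // exactly 4x the FHD figure
console.log(uhd / fhd); // 4
```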

Rasterization is just too cool of a hack to avoid the tracing of primary rays.
In fact, it's all the hardware and software hacks that evolved _around
rasterization_ over two decades, combined, that make it so, not rasterization
by itself per se. Various GPU- and CPU-based culling methods, early and
hierarchical Z-buffering: today's realtime rendering pipeline is an amazing
combination of cool techniques that keeps getting better. Raytracers say "pff,
with real raytracing we wouldn't need all those dirty hacks", _but_ for one,
most of them come with the GPU or are well-known and intuitive to implement,
_plus_, to get anywhere near real-time, they need to implement another set of
even more (and less comprehensible/intuitive/hardware-accelerated) hacks of
their own.

All shading stages are also designed around rasterization and matured over the
last decade. It's good fun to write some pixel-shader-based raytracer but that
alone doesn't make it the better fit for current/next-gen gamedev at all.

The oft-anticipated "hybrid" approach, however, is finally arriving. Once you
have some "depth-aware screen-space" (whether voxelized or N-layers), you can
shoot specialized and rather simple rays to create diverse outstanding effects
in post-process, from much smoother water surfaces to of course "screen-space
raytraced reflections" (SSRR).
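
The screen-space idea can be sketched outside the GPU: a minimal ray march over a depth buffer, in plain JS with a hypothetical `depth[y][x]` array standing in for the depth texture (the real effect runs per-pixel in a fragment shader):

```javascript
// Step a screen-space ray until its depth passes behind the stored depth --
// the core of "screen-space raytraced reflections" (SSRR).
// depth[y][x]: linear depth per pixel (assumed layout for this sketch).
function marchScreenSpaceRay(depth, x, y, z, dx, dy, dz, maxSteps) {
  for (let i = 0; i < maxSteps; i++) {
    x += dx; y += dy; z += dz;
    const px = Math.round(x), py = Math.round(y);
    if (py < 0 || py >= depth.length || px < 0 || px >= depth[0].length) {
      return null; // ray left the screen: no hit information available
    }
    if (z >= depth[py][px]) {
      return { x: px, y: py }; // ray passed behind a surface: reflection hit
    }
  }
  return null; // ran out of steps without hitting anything
}
```

Real implementations add depth-thickness tests and binary-search refinement, but the marching loop is the whole trick.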

~~~
3rd3
When we are able to trace multiple rays with multiple bounces per pixel in
real-time, would the first bounce really carry that much weight compared to
the subsequent bounces that it would justify a separate rendering technique?

Besides that, what would you say to ilaksh’s comment?
[https://news.ycombinator.com/item?id=8060197](https://news.ycombinator.com/item?id=8060197)

~~~
dualogy
He's referring to demos that have been around for years, and I've been
following the space too. His argument doesn't really refute any of my points.
Knowing how rasterizers work is never going to hurt you, quite the opposite.
If you're serious about computer graphics, you'll really want to have
internalized both raytracing basics and rasterization basics. If I didn't know
how triangle meshes are fed into the GPU and processed into the "final fully
lit & shaded output pixel" for any one of my 94 favourite games, I'd feel
pretty uneasy. Rasterization will always be orders of magnitude faster by
necessity: that means when raytracing can finally render ca. 2004 scene
complexity (think GTA:SA -- hint, it still can't, even at quarter-res),
rasterization can render ca. 2016 (or 2018, or whenever) scene complexity at
full res. Guess what the folks at Rockstar, Ubisoft, and Crytek wanna do?
Throw more details, more content variation, more procedurally-generated or
-perturbed geometry at the screen at 60+ FPS. Sure, Brigade can reach
almost/barely 30FPS at some low resolution in a small, limited, preprocessed
scene, and then there's no more room for any physics, any AI, any animated
crowds, etc. at all. They're doing important work, and one day the big payoff
will come, but for the next 10 years, knowing how our rasterizers work will
not be wasted at all.

------
bhouston
Amazing work. The most complex path-tracing shaders I've seen in WebGL so far.

My project, [https://Clara.io](https://Clara.io), also features interactive
ray-tracing but we do it server side using V-Ray. Example here:

[https://clara.io/view/a82f8a80-8a5e-4e61-8faf-3de9eb4313c2/r...](https://clara.io/view/a82f8a80-8a5e-4e61-8faf-3de9eb4313c2/render)

------
DiThi
In Kubuntu 14.04, Intel HD4400, Firefox:

    Error: WebGL: drawArrays: incomplete framebuffer

Chrome:

    GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glDrawArrays: framebuffer incomplete (clear)
    GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glDrawArrays: framebuffer incomplete (check) (repeated many times)

~~~
muyyatin
Sorry about that! I had someone else report that it fails across all browsers
with the HD4400.

~~~
flohofwoe
It works well on OSX 10.10 with an HD4000 in Firefox, Chrome and even in
Safari. The "framebuffer not complete" warning looks like you're using an
extension which isn't supported everywhere, probably OES_texture_float or
WEBGL_depth_texture?

------
IvanK_net
I made a path-traced online game 2 years ago.
[http://pog.ivank.net/](http://pog.ivank.net/)

------
ilaksh
Why does the HN title say ray tracing? It's path tracing, which is different.
Is this another one of those moderators editing things?

~~~
dualogy
Raytracing is a _very_ broad family of algorithms, path tracing _is_ certainly
"a raytracing algorithm".

~~~
ilaksh
Seems like they are similar but different.

~~~
huuu
Raytracing is the art of tracing a ray.

Pathtracing is the art of tracing paths of rays using raytracing.

~~~
ilaksh
Is it an art or a specific technique used in 3d computer graphics rendering?

~~~
huuu
Both ;)

------
hitlin37
On my Ubuntu 12.04 with on-board Intel graphics + Chrome browser: Error: GLSL
link error: for vertex shader: <fragment source>

~~~
luke-stanley
I got "Error: GLSL link error: for vertex shader" also. With Xubuntu / Ubuntu
12.10 running an Asus VivoBook S200E.

------
shurcooL
I'm looking forward to trying this on my iOS device once iOS 8 comes out.

------
spingsprong
It crashes my browser.

~~~
muyyatin
Sorry about that! Do you mind if I ask what browser and/or graphics card
caused this?

~~~
spingsprong
Firefox 30.0, Radeon HD4870

------
grimmfang
Incredibly beautiful. Thank you for sharing.

------
acomjean
Long, long time to load (over a minute), but it worked well once loaded. You
can move through the floor...

~~~
bowmessage
I don't think the author put effort into control logic like preventing motion
from clipping through objects. It's more of a showcase of how quickly his
raytracing algorithm is able to render images. That was the real takeaway for
me: having written a raytracer in C++, it's amazing to see this running so
quickly, and in the browser too.

~~~
daeken
This is a damn impressive demo, but it's not really "in the browser". The
browser is the shell, but absolutely all of the heavy lifting is being done on
the GPU. No matter what you're using to push shaders over to the GPU, you're
really looking at the same performance (so long as that's all that's happening
-- it's obviously easy to do horrible things and slow down WebGL or D3D or
whatever you happen to be using for a graphics API).

~~~
muyyatin
Author here! It's very true that the GPU does the heavy lifting, although I'm
not sure exactly where an "in the browser" line could be drawn (conceptually
any page uses the browser as an execution engine and library).

It's a bit trickier due to restrictions in WebGL (which is OpenGL ES based)
compared to the desktop (e.g. the lack of bitwise operators makes it a pain to
get randomness that doesn't bias the result), but it's basically the same.
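
For illustration, here's the classic bitwise-free shader hash ported to plain JS; it needs only sin, a dot product, and fract, all of which GLSL ES provides. This is a well-known workaround, not necessarily the exact blend of functions the author settled on:

```javascript
// fract() as in GLSL: the positive fractional part of x.
function fract(x) {
  return x - Math.floor(x);
}

// The ubiquitous one-liner hash from GLSL folklore:
// fract(sin(dot(co, vec2(12.9898, 78.233))) * 43758.5453)
// Deterministic per (u, v), output in [0, 1), no bitwise ops required.
function hash2(u, v) {
  return fract(Math.sin(u * 12.9898 + v * 78.233) * 43758.5453);
}
```

In a shader, `(u, v)` is usually the fragment coordinate plus a per-frame offset, so each pixel and frame gets a different pseudorandom value.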

~~~
cdi
Could you please list the literature/papers you found especially useful while
making this renderer? I plan to do the same thing for educational purposes.

~~~
muyyatin
I did some similar implementations years ago (but in C/OCaml, nothing on the
GPU), so I didn't refer to many things for this.

Fresnel reflectance is based off of [http://mathinfo.univ-reims.fr/IMG/pdf/Combined_rendering_of_...](http://mathinfo.univ-reims.fr/IMG/pdf/Combined_rendering_of_polarization_and_fluorescence_effects_-_Wilkie.pdf),
although I'm not doing any spectral or polarization-dependent code right now
(I wanted to leave that open, and it allows accurate metal simulation).
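
For the dielectric case, Schlick's approximation is the usual shortcut for Fresnel reflectance; the Wilkie paper linked above derives the full polarization-aware equations, so treat this as a simplified stand-in rather than the demo's actual code:

```javascript
// Schlick's approximation of Fresnel reflectance for a dielectric interface.
// cosTheta: cosine of the angle between the incident ray and the normal.
// n1, n2: refractive indices on the incident and transmitted sides.
function schlickReflectance(cosTheta, n1, n2) {
  const r0 = ((n1 - n2) / (n1 + n2)) ** 2; // reflectance at normal incidence
  return r0 + (1 - r0) * (1 - cosTheta) ** 5;
}
```

For air-to-glass (n = 1.5) this gives about 4% reflectance head-on, rising toward 100% at grazing angles, which matches intuition about glossy highlights on glass.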

I tinkered with a blend of some pseudorandomness functions until getting
something that worked.

[https://www.siggraph.org/education/materials/HyperGraph/rayt...](https://www.siggraph.org/education/materials/HyperGraph/raytrace/rtinter3.htm)
was used for ray-box intersection.
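
That page presents the slab method; a minimal JS version might look like this (`invDir` is the precomputed component-wise reciprocal of the ray direction, a common trick that turns the per-axis divisions into multiplications):

```javascript
// Slab-method ray/AABB intersection: clip the ray's parametric interval
// against each pair of axis-aligned planes and keep the overlap.
// origin, invDir: 3-element arrays; boxMin/boxMax: AABB corners.
// Returns the entry distance t, or null if the ray misses the box.
function intersectBox(origin, invDir, boxMin, boxMax) {
  let tMin = -Infinity, tMax = Infinity;
  for (let axis = 0; axis < 3; axis++) {
    const t1 = (boxMin[axis] - origin[axis]) * invDir[axis];
    const t2 = (boxMax[axis] - origin[axis]) * invDir[axis];
    tMin = Math.max(tMin, Math.min(t1, t2)); // latest entry across slabs
    tMax = Math.min(tMax, Math.max(t1, t2)); // earliest exit across slabs
  }
  return tMax >= Math.max(tMin, 0) ? tMin : null;
}
```

IEEE infinities from `1 / 0` handle axis-parallel rays for free, which is why the reciprocal form is popular in GPU code too.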

The distance function experiment for the drinking glass is based off of the
concept of
[http://www.iquilezles.org/www/articles/distfunctions/distfun...](http://www.iquilezles.org/www/articles/distfunctions/distfunctions.htm)
(like raymarching, it includes normal computation).
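
The distance-function approach boils down to sphere tracing plus a gradient-based normal; a minimal JS sketch, using a sphere SDF as the stand-in shape:

```javascript
// Signed distance from point p to a sphere of the given radius at the origin.
function sdfSphere(p, radius) {
  return Math.hypot(p[0], p[1], p[2]) - radius;
}

// Sphere tracing: step along the ray by the SDF value (nothing can be closer
// than that), until within epsilon of the surface. Returns hit distance t.
function raymarch(origin, dir, sdf, maxSteps = 64, epsilon = 1e-4) {
  let t = 0;
  for (let i = 0; i < maxSteps; i++) {
    const p = [origin[0] + dir[0] * t, origin[1] + dir[1] * t, origin[2] + dir[2] * t];
    const d = sdf(p);
    if (d < epsilon) return t; // close enough: treat as a surface hit
    t += d;                    // safe step: guaranteed not to overshoot
  }
  return null; // no surface within maxSteps
}

// Normal = normalized gradient of the SDF, via central differences.
function sdfNormal(p, sdf, h = 1e-4) {
  const n = [
    sdf([p[0] + h, p[1], p[2]]) - sdf([p[0] - h, p[1], p[2]]),
    sdf([p[0], p[1] + h, p[2]]) - sdf([p[0], p[1] - h, p[2]]),
    sdf([p[0], p[1], p[2] + h]) - sdf([p[0], p[1], p[2] - h]),
  ];
  const len = Math.hypot(n[0], n[1], n[2]);
  return [n[0] / len, n[1] / len, n[2] / len];
}
```

Swapping in a more elaborate SDF (unions, subtractions, rounded edges) is how shapes like the drinking glass are built without any triangle mesh.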

I consulted [http://madebyevan.com/webgl-path-tracing/](http://madebyevan.com/webgl-path-tracing/)
to see the best way to do accumulation (and made some realism fixes in
[https://github.com/jonathanolson/webgl-path-tracing](https://github.com/jonathanolson/webgl-path-tracing),
see [https://github.com/evanw/webgl-path-tracing/pull/1](https://github.com/evanw/webgl-path-tracing/pull/1)
for details).
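
The accumulation itself is just a running average: each new frame's sample is blended in with weight 1/(n+1). A tiny JS sketch of that arithmetic (the WebGL version performs the same blend per pixel between two framebuffer textures):

```javascript
// Fold a new sample into a running average without storing all samples.
// With weight 1/(sampleCount + 1), the result is exactly the mean of all
// sampleCount + 1 samples seen so far -- noise shrinks as frames accumulate.
function accumulate(average, newSample, sampleCount) {
  const w = 1 / (sampleCount + 1);
  return average * (1 - w) + newSample * w;
}
```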

Many other things can be pulled from online or from books like
[http://www.amazon.com/Game-Engine-Design-Interactive-Technol...](http://www.amazon.com/Game-Engine-Design-Interactive-Technology/dp/0122290631).

Please let me know if you have any questions (see my email at
[http://jonathan-olson.com/about](http://jonathan-olson.com/about)), and
please feel free to use my code however you like (things I wrote are MIT, but
I use CC BY and CC BY-NC HDR images).

~~~
cdi
Thanks!

------
arcameron
Excellent work!

