
PowerVR GR6500: Ray tracing is the future and the future is now - filipncs
http://blog.imgtec.com/powervr-developers/powervr-gr6500-ray-tracing
======
bhouston
Interactive raytracing is the future, both on mobile and desktop. My project
[http://clara.io](http://clara.io), an online 3D modeler + renderer, recently
gained the ability to do interactive ray-traced 3D embeds using V-Ray (one of
the best and most accurate renderers in the world).

It works best on Chrome, Firefox and Safari:

[https://plus.google.com/u/0/+BenHouston3D/posts/DYq2RKJENC5](https://plus.google.com/u/0/+BenHouston3D/posts/DYq2RKJENC5)

Here is another example:

[https://twitter.com/exocortexcom/status/443538733661704192](https://twitter.com/exocortexcom/status/443538733661704192)

I can only see raytracing becoming more popular.

~~~
s-macke
That's really great. Did you use emscripten or is the renderer programmed in
Javascript?

~~~
bhouston
It is cloud streaming; we do not want to be limited by the CPU/memory
capabilities of the client. V-Ray is an amazing renderer, but for complex
scenes it can require 32+ GB of RAM and use up to 40 cores, or even clusters
of machines. We want to virtualize that infrastructure for our users.

------
modeless
John Carmack is a fan: "I am very happy with the advent of the PVR Wizard ray
tracing tech. RTRT HW from people with a clue!"
[https://twitter.com/ID_AA_Carmack/status/446021820290842624](https://twitter.com/ID_AA_Carmack/status/446021820290842624)

------
fidotron
It's best to remember that ray tracing isn't the be-all and end-all of
graphics, since it's far harder for artists to control the results than with
the other hacked-up ways. One consequence of this is their example comparison,
where I think it's entirely subjective to say the ray-traced output is better.
Neither is particularly great.

The big win is around indirect lighting, but the development of that in
standard rasterisers in the last decade has exceeded even my wildly optimistic
expectations.

New options = good, but there's no such thing as a graphics silver bullet.

~~~
tinco
I'm sorry man, but that sounds crazy. Raytraced scenes are much easier for
artists to control, because things actually behave the way you would
intuitively expect (if the ray count is high enough).

If raytraced scenes were hard to control, why would every single animated
movie be raytraced?

Raytracing really is the be-all and end-all of graphics. The more rays you can
render, the more realistic your scene will be, up to 100% realism.

Of course, we can come pretty far with the hacks as well, and it's hard to say
whether hardware raytracing can come close to the quality that hardware shader
hacks can achieve on traditional GPUs in real time.

~~~
chrisseaton
"If raytraced scenes were hard to control, why would every single animated
movie be ray traced?"

I'm not entirely sure that's actually true.

In this paper
([http://graphics.pixar.com/library/RayTracingCars/paper.pdf](http://graphics.pixar.com/library/RayTracingCars/paper.pdf))
Pixar talk about how the first movie they ever tested ray tracing on was
Cars. Before that they were using scanline rendering.

~~~
berkut
That's Pixar - they're behind the curve, and PRMan 17 and 18 (the commercial
renderer they sell) have been pretty poor at full raytracing (Monte Carlo
integration) due to poor acceleration structures and the overhead of their RSL
shading language.

PRMan 19 looks like it's going to fix these issues to some degree.

Other renderers like Arnold and V-Ray (full raytracers) have been used by
other studios for the last 5 years.

~~~
pjmlp
Funny to see Pixar falling behind. I remember being amazed at what RenderMan
could do on NeXT stations.

------
higherpurpose
What's the difference between ray tracing and path tracing? And could we have
a path tracing chip?

This demo was pretty impressive. I think they said they used 4 Titans at 720p.

[https://www.youtube.com/watch?v=aKqxonOrl4Q](https://www.youtube.com/watch?v=aKqxonOrl4Q)

~~~
InclinedPlane
Ray tracing works backwards from the way actual optics works. It sends out a
ray for each pixel, finds what that ray falls on, then does the math to figure
out what it should look like. In a simple example, you have a few light
sources and you trace a ray back to an object (polygonal surface), then you
shoot off other rays from there toward the light sources (or you shoot off
potential reverse reflection rays). If something is blocking a light source
you adjust the amount of shadow being rendered, or you bounce off that surface
again toward the light sources, and so on. This makes it possible to render
shadows, reflections, and so on with decent fidelity.
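The shoot-a-ray-then-cast-shadow-rays scheme described above can be sketched
in a few lines. This is a hypothetical minimal example (sphere primitives,
one point light, names my own), not any particular renderer's code:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Distance t along the ray to the nearest sphere hit, or None on a miss.

    Assumes `direction` is normalized, so the quadratic's leading
    coefficient is 1."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None  # small epsilon avoids self-intersection

def shade(hit_point, light_pos, spheres):
    """Cast a shadow ray from the hit point toward the light.

    Returns 0.0 if any sphere blocks the light (full shadow), 1.0 otherwise."""
    to_light = tuple(l - p for l, p in zip(light_pos, hit_point))
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = tuple(x / dist for x in to_light)
    for center, radius in spheres:
        t = intersect_sphere(hit_point, direction, center, radius)
        if t is not None and t < dist:
            return 0.0  # occluder between the point and the light
    return 1.0
```

A real renderer would loop this over every pixel and recurse for reflections;
the core "trace back to the light sources" step is just the `shade` call.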

In contrast to that technique you have "radiosity", where the illumination at
each point on a surface is calculated based on its environment. Radiosity is
better at rendering diffuse light, ray tracing is better at rendering
reflected light.

The underlying difficulty is that illumination is a maddeningly combinatorial
problem. In principle every part of every object in a scene contributes
illumination to every part of every other object. The light falling on a desk
lamp from the LEDs of a clock also illuminates the desk itself, and so on.
Radiosity and ray tracing attempt to solve those problems by making
simplifying assumptions and performing a subset of the calculations necessary
to illuminate and render a scene completely faithfully. In principle a more
proper ray tracing algorithm would send out a huge number of rays in every
direction for every image field ray, then follow each of those through their
evolution, sending out yet more rays from each surface they fall on, and so
on. But that's far too computationally intensive (the number of rays explodes
combinatorially with each bounce).

Path tracing is similar to these techniques but makes different simplifying
assumptions. Most importantly, instead of deterministically calculating
everything in a specific way, it uses a Monte Carlo method: it statistically
samples different "paths" and uses that data to estimate the resulting
illumination. Path tracing ends up as a compromise between ray tracing and
radiosity, able to render both reflective and diffuse light well, though it
has its own shortcomings.
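The statistical-sampling idea can be shown in its simplest form: the
cosine-weighted hemisphere integral a diffuse surface has to evaluate at every
shading point (its exact value is pi for unit incoming radiance). Rather than
computing it deterministically, a path tracer averages random samples divided
by their probability density. A hypothetical sketch, not tied to any renderer:

```python
import math
import random

def estimate_diffuse_irradiance(n_samples, seed=0):
    """Monte Carlo estimate of the hemisphere integral of cos(theta).

    The exact value is pi; each sample stands in for one randomly chosen
    incoming light direction, which is how a path tracer samples "paths"."""
    rng = random.Random(seed)
    pdf = 1.0 / (2.0 * math.pi)  # density of uniform hemisphere sampling
    total = 0.0
    for _ in range(n_samples):
        # Uniform cos(theta) in [0, 1) corresponds to a uniformly random
        # direction on the hemisphere (solid-angle measure).
        cos_theta = rng.random()
        total += cos_theta / pdf  # standard MC estimator: f(x) / pdf(x)
    return total / n_samples
```

With many samples the estimate converges to pi; with only a few per pixel you
get the familiar path-tracing noise.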

~~~
gfodor
This sounds somewhat wrong. Ray tracing is just another term for path tracing,
as paths are made up of traced rays. A naive camera-to-light Whitted ray
tracer is still a sampling path tracer, but one biased towards a specific
subset of paths, namely those which follow specular bounces from the eye to
the light or those which have one diffuse bounce.

Much of the progress in photorealistic rendering has come from improving the
techniques used to sample paths. Bi-directional path tracing samples paths by
tracing them from both the light and the eye, and uses a technique called
multiple importance sampling to weight them appropriately. This allows the
algorithm to pick up light paths that would otherwise be hard to reach, such
as those which form caustic reflections.
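The weighting step can be made concrete. The balance heuristic is the standard
multiple importance sampling weight; the pdf values below are made-up numbers
purely for illustration:

```python
def balance_heuristic(pdf_used, pdf_other):
    """MIS weight for a sample drawn from the strategy with density pdf_used,
    when a second strategy with density pdf_other could also have produced
    the same sample."""
    return pdf_used / (pdf_used + pdf_other)

# A direction that light sampling produces often (pdf 0.9) but BSDF sampling
# rarely does (pdf 0.1) draws most of its weight from the light strategy:
w_light = balance_heuristic(0.9, 0.1)
w_bsdf = balance_heuristic(0.1, 0.9)
# The two weights sum to 1, so combining both strategies never double-counts
# any path's contribution:
assert abs(w_light + w_bsdf - 1.0) < 1e-12
```

This is why a bidirectional tracer can add eye-path and light-path estimators
together without the result brightening incorrectly.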

One of the more recent developments on this front is the "specular manifold",
which lets a path sampler "walk" through path space and capture even more
partially-specular paths. (See Wenzel 2013.) This technique allows efficient
sampling of the main light transport paths in scenes where a specular
transparent surface surrounds a light source, e.g. a light bulb casing.

Edit: for this specific hardware, it sounds like they are using a hybrid
approach so may very well be doing a basic eye-to-light ray tracing algorithm
and then using raster for diffuse surface shading.

~~~
InclinedPlane
I thought I was clear that path tracing and ray tracing shared similarities.
But ultimately they differ in several key ways.

For example, ray tracing tends to be deterministic, and it starts at the view
frame and moves toward the scene and then light sources. Path tracing has no
such constraint. Modern path tracing techniques use a combination of ray-
tracing-like paths as well as paths beginning from light sources (gathering
rays and shooting rays, in the parlance of the field).

In principle you could consider path tracing to be some sort of specialized
subset of ray tracing (or vice versa for that matter) but in practice it is
different enough in its specifics and implications that it makes more sense to
treat it as merely related.

------
zurn
Interesting to see if they can sell this, as ~everyone is invested in doing
OpenGL. Maybe if Apple put it in all their iOS devices...

------
kayoone
I don't know much about the differences, but the sample images (like the
yacht) don't look particularly impressive or real to me.

~~~
girvo
Mobile. They can do that on mobile. :)

~~~
Pacabel
Today's mobile devices, especially the higher-end ones, are quite powerful.
They offer more processing power, memory and storage than laptops did just a
few years ago, and desktops just a few years before that. So I don't think
that "doing it on mobile" is really that powerful of an argument any longer.

~~~
girvo
Real-time raytracing isn't something we've had on mobile. Which is kind of the
point of this article...

------
PhasmaFelis
"... the potential that these technologies have to revolutionize the user
experience from mobile and console gaming to virtual and augmented reality."

I've been interested in ray-tracing since the early '90s, and I'm glad it's
finally coming to real-time, but this isn't going to "revolutionize" shit.
It's going to make 3D games and VR slightly prettier than they were. It's not
going to enable new styles of gameplay or new modes of interaction. We will
never again see anything like the enormous forward leaps in realtime graphics
that happened during the '90s.

I'm also a bit put off by their comparison showing that PowerVR has better
reflections, shadows, and transparency than a raster engine with reflections
and some shadows turned off and a very poor choice of glass-transparency
filter.

On another look, I don't even know what they're going for with the shadows.
The rasterized image has "NO SHADOWS" printed right between the shadows of a
building and a telephone wire, and their hybrid render has the light from the
diner windows casting shadows across outside pavement in _broad daylight_.
Bwuh?

------
zokier
So what are the API and libraries like? Is there any sort of built-in OGL
fallback, or do devs need to write two completely different renderers? Are
there standards relating to this; are any other vendors going to implement the
same API? Is this new hardware compatible with the older Caustic2 cards?

------
r4pha
It never fails to amaze me how powerful raytracing is. Last year I took an
"Image Synthesis" class and did a quick presentation about a post I'd seen
here on HN about a raytracer in 1337 bytes [0]. It is amazing how such a
small program can generate an image with depth of field, shadows and texture.

0:
[http://fabiensanglard.net/rayTracing_back_of_business_card/i...](http://fabiensanglard.net/rayTracing_back_of_business_card/index.php)

~~~
touristtam
Thanks I completely forgot about that one. :)

------
plaes
Open source drivers are the future.. but they're nowhere now :(

~~~
zanny
I see headlines like this and I'm like "are they still in the driver dark
ages? Yep? Call me when they stop acting like a 2 year old and play ball in
Mesa"

------
stcredzero
_high-performance ray tracing, graphics and compute in a power envelope
suitable for mobile and embedded use cases._

If true, color me impressed!

------
bond
This brings back some memories. Used to do some experiments using POV Ray back
in the day.

I remember how slow the process was, it could take several hours/days to
generate an image full of reflections, but in the end the results were usually
stunning...

Link: [http://www.povray.org/](http://www.povray.org/)

~~~
beagle3
Oh, the memories ...

Before it was renamed "povray", it was called "dkbtrace". I printed the entire
dkbtrace source and studied it to see how a real ray tracer worked; and
through it, I learned how efficient vtbls work (it was 1990 C, and though C++
was already starting to become visible and popular, it was still "that new
language that may or may not become popular" - dkbtrace implemented all the
OO internally).

Thanks, D.K.B.

------
ohwp
I think the examples used are not very impressive. And I think path tracing
(which is a kind of ray tracing) is more interesting for the future.

A nice blog about real time path tracing is:
[http://raytracey.blogspot.nl/](http://raytracey.blogspot.nl/)

------
ilaksh
I think the sample frames demonstrate that hybrid ray tracing is less
realistic than pure path tracing. I hope that someone figures out how to make
stuff like the Brigade 3 demos based just on feeding geometry and textures to
hardware.

------
eigenbom
I'd wager that the current rasterising pipeline is more flexible than one
based on raytracing - capable of generating a range of styles, not just
photorealistic ones. And therefore having an extra dedicated chip for
raytracing seems uneconomical.

The comparison examples given in the article were slightly ridiculous. How
does a non-reflective car represent 'traditional' rendering? Look at any
great AAA game and you'll see reflections, refractions, radiosity, etc., that
are all pretty amazing. I don't think general demand for alternative
rendering hardware will be there for quite a while.

~~~
dualogy
> I'd wager that the current rasterising pipeline is more flexible than one
> based on raytracing - capable of generating a range of styles, not just
> photorealistic ones

Nope. I'd say the "raytracing pipeline" really does replace only the
rasterization step, and once the triangle to draw is "found", you do whatever
shading/lighting/fx/fragshader you want to draw it with. Wouldn't this be the
most sensible approach?

Rasterization in layman's terms really is just "figure out which triangle, if
any, is 'hit' at this pixel" - a historically much faster and very neat hack
to avoid tracing a ray.
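That "which triangle does this ray hit" question is exactly what a ray tracer
answers per pixel; one common way to answer it is the Möller-Trumbore
intersection test. A minimal sketch (plain tuple math to stay
dependency-free), offered as an illustration rather than any vendor's actual
implementation:

```python
def ray_triangle(orig, dirn, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore: distance t along the ray to the triangle, or None."""
    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def cross(a, b): return (a[1]*b[2]-a[2]*b[1],
                             a[2]*b[0]-a[0]*b[2],
                             a[0]*b[1]-a[1]*b[0])
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)  # triangle edge vectors
    h = cross(dirn, e2)
    det = dot(e1, h)
    if abs(det) < eps:
        return None  # ray is parallel to the triangle's plane
    f = 1.0 / det
    s = sub(orig, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None  # outside the triangle (first barycentric coordinate)
    q = cross(s, e1)
    v = f * dot(dirn, q)
    if v < 0.0 or u + v > 1.0:
        return None  # outside the triangle (second barycentric coordinate)
    t = f * dot(e2, q)
    return t if t > eps else None  # hit must be in front of the ray origin
```

A rasterizer inverts this loop: instead of testing one ray against every
triangle, it projects each triangle and marks every pixel it covers.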

But you don't get cheap soft shadows / ambient occlusion / reflection and
refraction, and you'd need to do occlusion culling separately in current-gen
"complex" scenes to avoid a draw call for all kinds of hidden objects. That's
where, as it becomes more feasible, raytracing also becomes much more
attractive. Potentially also reducing geometry-LODing headaches etc.

------
mey
Two thoughts: I would love to see a video of a dynamic environment (i.e.
actors/objects moving in a scene). Also, how long before cryptocurrency miners
use these to gain a step-function advantage over current GPUs?

~~~
duaneb
> Also, how long before cryptocurrency miners use these to gain a
> step-function advantage over current GPUs?

I can't imagine these ever being cheaper than ASICs.

------
Kiro
I thought Wolfenstein 3D was an example of ray tracing, but reading this
article and Wikipedia it seems to be all about lighting effects. What am I
misunderstanding?

~~~
faddotio
I think you're thinking of raycasting.

------
Geee
This is incredible! A mobile GPU doing 300 million rays per second without
using any shading GFLOPS? This is the GPU that is going to be in the next
iPhone, right? Brigade 3 does 750 million rays per second on an Nvidia GTX 580
at full power. I just wonder what this thing could do when scaled up.

------
jbverschoor
The biggest problem here is content. Producing content is the most expensive
aspect of a game.

Improvements in tooling and reuse are the only way we can actually make
proper use of better rendering.

~~~
zurn
Wouldn't ray tracing make content production easier, needing fewer tricks to
fake the behaviour of light and shadow?

------
herokusaki
Would be fun if this relegated traditional GPUs to being used only for mining
cryptocurrency.

------
touristtam
Is this not from the same guys?
[https://www.youtube.com/watch?v=rfgz90Y93c0](https://www.youtube.com/watch?v=rfgz90Y93c0)
(Google tech talk)

------
mjcohenw
How about FP64?

~~~
jjoonathan
They'd have to either sacrifice yield or sacrifice FP32 units.

Also, if they put FP64 support in their gaming cards they wouldn't be able to
charge scientists and engineers extra.

~~~
duaneb
> Also, if they put FP64 support in their gaming cards they wouldn't be able
> to charge scientists and engineers extra.

Bingo. Not many entertainment problems demand such precision. Why not charge
more for it?

~~~
jjoonathan
Well, if AMD had taken the need to get their GPGPU offering out the door
seriously or if they had later taken the need for CUDA compatibility seriously
then Nvidia would have had to compete.

Alas, we're talking about a company that doesn't even think it's important for
their installer to reliably replace existing drivers. The result: I get to pay
through the nose for my predecessor's CUDA lessons. Yay.

------
bobsgame
I for one welcome our new PowerVR overlords.

------
leeoniya
"life-like reflections"[1]?

allow me to disagree. the reflection is too green. probably not the hardware's
fault, or is it?

[1] [http://blog.imgtec.com/wp-content/uploads/2014/03/5_-PowerVR-Ray-Tracing-hybrid-rendering-1.jpg](http://blog.imgtec.com/wp-content/uploads/2014/03/5_-PowerVR-Ray-Tracing-hybrid-rendering-1.jpg)

~~~
leeoniya
hmm, looks better now on desktop than on phone

