
Next-Gen Lighting Is Pushing the Limits of Realism - RaSoJo
http://kotaku.com/next-gen-lighting-is-pushing-the-limits-of-realism-1625324795?
======
berkut
Should be "Next-Gen Game Lighting".

This is basically just physically-based shading, which has been done for the
past 4-5 years in VFX. Essentially it's energy-conserving materials with the
correct Fresnel effect based on the surface's IOR, which takes things to the
next level (for games at least). Doing this properly for layered surfaces
(e.g. a diffuse wood layer with a clearcoat varnish layer) gives very nice
looking results.

Some complex materials can have up to 3 spectral BSDF lobes for reflection,
which can only really be done with path tracing.
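As a rough illustration of the Fresnel/IOR point (a toy sketch using Schlick's approximation, not how any particular engine implements it):

```python
def schlick_fresnel(cos_theta, ior=1.5):
    """Schlick's approximation of Fresnel reflectance for a dielectric.

    cos_theta: cosine of the angle between the view direction and the
               surface normal.
    ior:       index of refraction (1.5 is typical for glass/varnish).
    """
    # Reflectance at normal incidence, derived from the IOR
    f0 = ((ior - 1.0) / (ior + 1.0)) ** 2
    # Reflectance climbs towards 1.0 at grazing angles
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Head-on, a glass-like surface reflects only ~4% of the light;
# at a grazing angle it reflects almost everything.
print(schlick_fresnel(1.0))  # ~0.04
print(schlick_fresnel(0.0))  # ~1.0
```

An energy-conserving material then dims the diffuse term by whatever fraction the Fresnel term reflects, so the total outgoing light never exceeds the incoming light.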

For VFX, people are starting to push into spectral rendering now, and trying
to optimise volumetric rendering for effects like SSS, which are needed for
ultimate realism.

~~~
unwind
OK, I'll don the Captain Jargon outfit and give that a go:

\- VFX is just "visual effects",
[http://en.wikipedia.org/wiki/Visual_effects](http://en.wikipedia.org/wiki/Visual_effects).

\- Fresnel here refers to the Fresnel equations, which describe how much light
a surface reflects versus refracts depending on the viewing angle,
[http://en.wikipedia.org/wiki/Fresnel_equations](http://en.wikipedia.org/wiki/Fresnel_equations).

\- IOR is of course "index of refraction",
[http://en.wikipedia.org/wiki/Refractive_index](http://en.wikipedia.org/wiki/Refractive_index).

\- BSDF references "bi-directional scattering distribution functions",
[http://en.wikipedia.org/wiki/Bidirectional_scattering_distri...](http://en.wikipedia.org/wiki/Bidirectional_scattering_distribution_function).

\- Path tracing is a Monte Carlo method of rendering images with global
illumination,
[http://en.wikipedia.org/wiki/Path_tracing](http://en.wikipedia.org/wiki/Path_tracing).

\- SSS means "sub-surface scattering",
[http://en.wikipedia.org/wiki/Subsurface_scattering](http://en.wikipedia.org/wiki/Subsurface_scattering).
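To make the path-tracing entry concrete, here's a hedged toy sketch of the Monte Carlo idea (textbook material, not any particular renderer): estimate a hemisphere integral by averaging random samples, which is exactly the shape of the integrals a path tracer solves.

```python
import math
import random

def estimate_hemisphere_integral(n_samples=200_000, seed=1):
    """Monte Carlo estimate of the integral of cos(theta) over a hemisphere.

    The analytic answer is pi. A path tracer estimates integrals of this
    shape (with the full BSDF in place of a plain cosine) by averaging
    random direction samples.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # Uniform solid-angle sampling of the hemisphere gives a
        # cos(theta) that is itself uniform in [0, 1).
        cos_theta = rng.random()
        total += cos_theta
    # Uniform hemisphere sampling has pdf 1/(2*pi), so weight each
    # sample by 2*pi and take the mean.
    return (2.0 * math.pi) * total / n_samples

print(estimate_hemisphere_integral())  # close to pi
```

The noise the Brigade demo further down the thread is fighting is just the variance of this estimator at low sample counts.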

~~~
berkut
+1 - sorry, I do this stuff all day, so am used to the slang :)

Fresnel also affects reflection - it's the reason you often don't (depending
on the material) see much of a reflection head-on on a shiny surface, but do
at a glancing angle (glass or car paint, for example).

~~~
Kiro
What do you work with?

~~~
berkut
Renderers and shaders for one of the biggest VFX companies in the world :)

------
mischanix
"Next-gen" is a bit of a misleading description: the GPU (GTX 670) that this
scene is rendered on in real time has about 30% more power than the GPU in the
PS4, although it was released 2 years ago. The CPU used to bake the lightmaps
has more than twice the power of the PS4's CPU.

~~~
sp332
It's not next-gen hardware, it's next-gen lighting. It doesn't matter how
powerful your GPU is if no one has written a good lighting algorithm for it.

------
doorhammer
Something of an aside: I'm most stoked about this kind of thing because of the
oculus rift. Even just playing through HL2 with my DK2 is an amazing
experience.

What I'm really interested in with things like the Rift is foveated rendering,
in order to get much higher quality graphics out of less computing power.
Basically, using high precision fast eye-tracking to only render the portion
of the screen you're looking directly at, since your eye can't resolve detail to
any great degree outside of a fairly small area in the center of where you're
looking.

Foveated imaging in general:
[http://en.wikipedia.org/wiki/Foveated_imaging](http://en.wikipedia.org/wiki/Foveated_imaging)
MS Research paper on foveated rendering:
[http://research.microsoft.com/pubs/176610/foveated_final15.p...](http://research.microsoft.com/pubs/176610/foveated_final15.pdf)
MS trial on foveated rendering:
[http://research.microsoft.com/pubs/176610/userstudy07.pdf](http://research.microsoft.com/pubs/176610/userstudy07.pdf)
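A hedged sketch of the core idea (the thresholds below are made-up illustrative numbers, not the MS paper's): pick a shading rate for each pixel from its angular distance to the gaze point.

```python
def shading_rate(px, py, gaze_x, gaze_y, pixels_per_degree=20.0):
    """Choose a resolution factor from eccentricity (toy thresholds).

    Returns 1 for full resolution near the gaze point, then coarser
    factors (one sample per 2x2 or 4x4 pixel block) further out, since
    the eye can't resolve detail there anyway.
    """
    # Eccentricity: angular distance from the gaze point, in degrees
    dist_px = ((px - gaze_x) ** 2 + (py - gaze_y) ** 2) ** 0.5
    ecc_deg = dist_px / pixels_per_degree
    if ecc_deg < 5.0:    # inner region: full detail
        return 1
    if ecc_deg < 15.0:   # middle region: a quarter of the shading work
        return 2
    return 4             # periphery: 1/16th of the shading work

# Looking at the centre of a 1920x1080 screen:
print(shading_rate(960, 540, 960, 540))   # 1: foveal region
print(shading_rate(1400, 540, 960, 540))  # 4: periphery
```

Almost all of the saving comes from the periphery covering most of the screen area while getting a small fraction of the samples.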

~~~
intruder
For foveation to go undetected, ideally the central region should update
within 5 ms. That's unreachable even with top hardware right now.

The MS paper achieved some nice results using a 120 Hz screen and a very fast
tracker, but the Rift is far from that.

It's interesting, but comes with all kinds of issues aside from delay (e.g.
the periphery degradation strategy). I've been trying to adapt shadow mapping
to foveation and latency is killing it. Especially since shadows are
high-contrast information, the quality popping and self-shadow flickers are
really distracting.

~~~
scott_karana
Would dual GPUs help the problem?

You could have one low-resolution GPU (such as an onboard Intel HD) and
another focusing _solely_ on rendering the foveated patch.

5ms per 320x240 section might not be implausible?

~~~
evo
The problem isn't solely in rendering fast enough - it's end-to-end latency.
Keep in mind that if your eye is being tracked by a 120Hz camera, that's 1/120
=~ 8.3ms of latency from inter-frame time alone. Add to that all sorts of
communication-bus overhead, processing the eye-tracking imagery to work out
gaze direction, and potential buffering in the display adapter, and 5ms is not
really possible, yet.
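To put rough numbers on that budget (the non-camera figures below are illustrative guesses, not measurements):

```python
def end_to_end_latency_ms(camera_hz=120.0, processing_ms=2.0,
                          render_ms=4.0, display_buffer_ms=3.0):
    """Sum a toy latency budget for gaze-contingent rendering.

    camera_hz: eye-tracker frame rate; worst case you wait a whole frame
    before you even see the eye move. The other terms are made-up but
    plausible stand-ins for image processing, rendering, and display
    buffering.
    """
    camera_frame_ms = 1000.0 / camera_hz  # ~8.3 ms at 120 Hz
    return camera_frame_ms + processing_ms + render_ms + display_buffer_ms

# The eye-tracker frame alone already blows a 5 ms budget:
print(1000.0 / 120.0)            # ~8.33 ms
print(end_to_end_latency_ms())   # ~17.3 ms total in this sketch
```

Which is why the bottleneck is the pipeline as a whole, not raw GPU throughput.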

That said, I don't think there's theoretical limitations that prevent
extremely low latency systems from happening, it's just not something we've
kept in mind when designing modern consumer hardware.

~~~
doorhammer
Yeah, I'm not an expert in this area, but my casual reading seems to make it
out to be similar to the problems you have with a lot of VR stuff, where the
hardware just hasn't needed to be ultra-low-latency before now, but that there
aren't necessarily hard limits that would prevent it.

If they ever get it ironed out, it seems like the kind of thing that could
offer pretty huge advantages for gaming, if just one person is looking at the
screen (I have no idea if anyone's considering tracking multiple sets of eyes,
or whether that introduces novel problems that wouldn't be covered by solving
the single-viewer case).

------
kristiandupont
Interesting choice with the Mies van der Rohe Pavilion, because as stunning as
it is in real life, its simple geometry actually makes for a bit of a boring
tech demo. Still looks very impressive though.

~~~
ibuildthings
I think this is a bit of a legacy (like the Stanford Bunny). One of the
seminal works in lighting was photon mapping by Henrik Jensen, and he used
Ludwig Mies van der Rohe's structure as an example (
[http://graphics.ucsd.edu/~henrik/animations/jensen-the_light...](http://graphics.ucsd.edu/~henrik/animations/jensen-the_light_of_mies_small.mpg)
). Keep in mind this was done 14 years ago! It was a genuine wow moment for me
back then!

------
halfcat
This reminds me of playing with POV-Ray years ago. I was so amazed at what
could be done on a low-powered PC - granted, it took a long time to render,
but it was still somehow magical, like the first time you realize you can
program a computer. POV-Ray is free and can create some cool, realistic-looking
images[1]. This one is my favorite[2].

[1][http://hof.povray.org](http://hof.povray.org)
[2][http://hof.povray.org/images/ChristmasBaubles.jpg](http://hof.povray.org/images/ChristmasBaubles.jpg)

------
adrusi
Lighting does a lot to improve a scene, but unfortunately I'm almost certain
that it's pre-rendered lighting, which isn't all that interesting since any
dynamic objects present ruin the effect.

~~~
ZaneA
Not that this is any sort of proof but if you watch the last few videos you
can definitely see the shading change on the tree leaves that move around in
the wind.

~~~
berkut
Trees are static though - they only deform - they don't transform around the
scene itself.

------
imaginenore
These videos reminded me of the architectural renders of "The Third & The
Seventh" by Alex Roman

[http://vimeo.com/7809605](http://vimeo.com/7809605)

~~~
1ris
That's very impressive. I wonder what is rendered and what is actually filmed.

~~~
Zecc
There is a compositing breakdown:

[http://vimeo.com/8200251](http://vimeo.com/8200251)

You'll probably find there is a lot more rendered than you expected.

------
hyperion2010
The one thing that is still hard is the radiosity. The way I spot CG that's
rendered in real time is that there simply aren't enough radiosity bounces, so
objects without direct lighting in the scene end up being too dark. Baking
lightmaps takes a long time, and realtime radiosity methods are still
CPU-bound.
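A toy sketch of why the bounce count matters (a made-up two-patch scene, not a real radiosity solver): iterate the gather step B = E + rho * F * B and watch a patch with no direct light brighten only as bounces accumulate.

```python
def radiosity_bounces(n_bounces, emission=(1.0, 0.0),
                      reflectance=(0.5, 0.5), form_factor=0.4):
    """Gather-style radiosity between two patches facing each other.

    Patch 0 emits light; patch 1 receives no direct light and only
    brightens through bounces. form_factor is the fraction of one
    patch's light reaching the other (a toy value).
    """
    b = list(emission)  # bounce 0: direct emission only
    for _ in range(n_bounces):
        b = [emission[i] + reflectance[i] * form_factor * b[1 - i]
             for i in range(2)]
    return b

# With zero bounces the unlit patch is pure black -- the "too dark"
# giveaway; each extra bounce adds a shrinking amount of indirect light.
print(radiosity_bounces(0))  # [1.0, 0.0]
print(radiosity_bounces(1))  # [1.0, 0.2]
print(radiosity_bounces(8))  # near converged: patch 1 -> ~0.208
```

Cutting the iteration off early is cheap, which is exactly the trade-off real-time GI makes.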

~~~
tripzilch
Sooner or later you will be making this statement about a piece of actual RL
video footage that just happens to have a bad exposure/gamma/colour balance :)

------
azinman2
Drool... Very impressive, especially in a video game engine. I wish I knew
more about the 3D world and could do stuff like this....

~~~
adventured
If you're interested in it, you can work directly with Unreal Engine 4 at an
extremely low cost ($19 per month for access to everything: source code,
tools, etc.).

I ran a gaming startup in the days of Quake. It has really never been easier
or cheaper to acquire access to incredible tools, learning material, and
technology to get into the 3D space (whether for gaming or whatever).

Easy enough to at least download the Unreal 4 engine and begin messing around
with it, to see exactly how deep your curiosity goes. You'd find out quickly
if it's something you're really interested in.

Every time I see the Unreal 4 engine on display it tempts me to get back into
it all (I'd go toward VR now).

------
danmaz74
This is incredible. It's only a pity that many straight lines are just too
straight.

~~~
coldtea
Do you see many crooked lines in real-life architecture when the wall/surface
is supposed to be straight?

~~~
danmaz74
In real life architecture the wall/surface is never _perfectly_ straight. This
is one of the most common signs that give away 3d renderings.

~~~
coldtea
> _the wall /surface is never perfectly straight_

Where did you get the idea that a wall is not perfectly straight? You mean
some marginal differences in curvature that the human eye cannot distinguish
from a straight line anyway?

That would be trivial to mimic in a 3D rendering anyway - literally just a
modelling command away.
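For illustration, the sort of perturbation that's "a modelling command away" (a hedged sketch; real DCC tools expose this as noise/displacement modifiers rather than a script):

```python
import random

def jitter_wall_vertices(vertices, max_offset=0.002, seed=42):
    """Nudge each vertex of a 'perfectly straight' wall by a tiny amount.

    vertices:   list of (x, y, z) tuples, in metres.
    max_offset: maximum displacement per axis (2 mm here -- roughly the
    marginal, sub-perceptual scale discussed above).
    """
    rng = random.Random(seed)
    return [(x + rng.uniform(-max_offset, max_offset),
             y + rng.uniform(-max_offset, max_offset),
             z + rng.uniform(-max_offset, max_offset))
            for (x, y, z) in vertices]

# Three vertices along a wall edge, each perturbed by at most 2 mm
wall = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(jitter_wall_vertices(wall))
```

The normals then get the same treatment from the lighting, which is where any visible "imperfection" would actually come from.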

I'll have to agree with TFA that it's the lighting and textures that give away
3D renderings.

~~~
danmaz74
In my experience the human eye can spot the lack of that kind of imperfection
very well. Not quantify it - just notice that something is wrong.

Regarding walls (and edges in general) not being perfectly straight, apart
from what I learned from my engineering studies, I've done my fair share of
home repairs :)

------
sirmarksalot
If I were an architect, I would be using this to pitch to clients. I know
virtual fly-throughs are a thing, but the existing demos I've seen just don't
give the same sense of presence as this does.

~~~
GrantS
Indeed, architects have definitely been using the Unreal Engine for quite a
while, even before it started looking this good. Here's an article from 2007:

[http://www.businessweek.com/stories/2007-12-21/unreal-
archit...](http://www.businessweek.com/stories/2007-12-21/unreal-
architecturebusinessweek-business-news-stock-market-and-financial-advice)

~~~
talmand
I once worked for a company creating pool designs and entire backyards for
pool and landscaping companies in Unreal Engine 2.5.

------
huuu
Of course the images look very nice, but I'm not very impressed. A lot of
artists create photorealistic images. For example, the Ikea catalogue[1] has a
lot of CG images.

These are just pre-rendered textures and light maps combined with real time
lighting and reflections in Unreal Engine 4.

But I agree that UE4 does a very very good job at realistic real time
lighting! Also take a look at the blog of Paul Mader:
[http://paulmader.blogspot.nl/](http://paulmader.blogspot.nl/)

[1]
[http://www.cgsociety.org/index.php/CGSFeatures/CGSFeatureSpe...](http://www.cgsociety.org/index.php/CGSFeatures/CGSFeatureSpecial/building_3d_with_ikea)

~~~
3rd3
What about the changing angle of light incidence in this fast-forward scene?

[https://www.youtube.com/watch?v=rOkJ1-vnh-s](https://www.youtube.com/watch?v=rOkJ1-vnh-s)

It somehow looks like it would be dynamic diffuse indirect lighting. Wouldn’t
that need to be pre-rendered for each position of the sun then? Or it’s just
really good use of shadow maps, ambient occlusion and screen-space specular
reflection:

[https://docs.unrealengine.com/latest/INT/Engine/Rendering/Po...](https://docs.unrealengine.com/latest/INT/Engine/Rendering/PostProcessEffects/ScreenSpaceReflection/index.html)

Edit: This particular scene uses light propagation volumes, in the other
scenes it’s UE’s pre-computed (hence static) Lightmass GI, as parent comment
said.

[https://docs.unrealengine.com/latest/INT/Engine/Rendering/Li...](https://docs.unrealengine.com/latest/INT/Engine/Rendering/LightingAndShadows/LightPropagationVolumes/index.html)

[https://docs.unrealengine.com/latest/INT/Engine/Rendering/Li...](https://docs.unrealengine.com/latest/INT/Engine/Rendering/LightingAndShadows/Lightmass/index.html)

~~~
huuu
What UE4 can do is very impressive. But when it comes to real time photo
realism I think path tracing is even more impressive. For example this Brigade
3.0 demo:
[https://www.youtube.com/watch?v=BpT6MkCeP7Y](https://www.youtube.com/watch?v=BpT6MkCeP7Y)

~~~
1ris
I'm not that impressed; the noise is really annoying and the detail level is
not that high. There are no interesting surfaces and no demo of heavy
refraction or reflection. I'd like to see how a glass statue would look with
this renderer.

Why don't they add a very cheap real-time renderer that produces fast,
high-resolution images and then layer in the low-resolution output from the
path tracer? I guess that would make the noise a lot less distracting, as
there would be no black pixels anymore.

~~~
3rd3
I can't find it ATM, but I think I saw a technique that did exactly that using
something like joint bilateral upsampling: [http://research.microsoft.com/en-
us/um/people/kopf/jbu/](http://research.microsoft.com/en-
us/um/people/kopf/jbu/)

~~~
3rd3
[https://www.youtube.com/watch?feature=player_detailpage&v=ZZ...](https://www.youtube.com/watch?feature=player_detailpage&v=ZZAD7ysok6w#t=245)

The filter_indirect in this demo does exactly that.

------
adam-a
It's worth noting that these scenes take around 10 minutes to render a frame,
so it's still a long way from real time.

Detailed in the thread -
[https://forums.unrealengine.com/showthread.php?28163-ArchViz...](https://forums.unrealengine.com/showthread.php?28163-ArchViz-
Lighting&p=118443&viewfull=1#post118443)

Still very pretty though!

~~~
akavel
Hm, from another post[1]:

> _is that video was rendered real time?_

> Yep (well, the lightmaps are pre rendered but it take like 10 min. In
> engine, it runs at 50-60fps on a gtx670)

So, if I understand correctly, that's 10 minutes for some lightmap
pre-rendering, but then the scenes themselves render in real time? I'm totally
not into rendering, so sorry if that's what you meant, or if it's something
obvious I'm just missing.

[1]:
[https://forums.unrealengine.com/showthread.php?28163-ArchViz...](https://forums.unrealengine.com/showthread.php?28163-ArchViz-
Lighting&p=121658&viewfull=1#post121658)

~~~
ShinyCyril
To add to that, the prerendering is done as part of the preparation of assets
- if someone were to download and run this project then the assets are ready
to go!

------
eli_gottlieb
That's nice. How many game or film studios will be driven bankrupt by the
_yet-again_ increased cost of content production?

But yes, Blood Soaked Shoot-em-Up 93x-treme is going to look very realistic.
It will be almost as if I was really invading Iraq!

~~~
ejr
I see another avenue for this: "reality simulation" - not virtual reality -
will, I think, be something to explore. Especially with something like Oculus
down the line, there would be a market for it. Why surround yourself with your
humdrum home when you can live in a much better one with better surroundings?
Watch TV virtually. Read a book virtually, or even take a nap on your virtual
sofa in front of a magnificent view while on your "real" bed, surrounded by
boring walls you don't see.

~~~
eli_gottlieb
That's just... ew. No. That's just _wrong_. The average person's living
conditions shouldn't be so poor in the first place that he feels the need to
step into an Oculus to get a _nice sofa_.

~~~
ejr
Living conditions don't need to be poor. Think of it as a hyper-realistic
Second Life. There's a reason Sims are popular too.

~~~
eli_gottlieb
I'm pretty sure that on an absolute scale, Second Life and The Sims are not
actually very popular at all.

