
Experimenting with Light Fields - dEnigma
https://www.blog.google/products/google-vr/experimenting-light-fields/
======
legitster
I just played their demo. Even on a VR rig, this is one of the more resource
intensive experiences.

Definitely very cool; cooler than it sounds. You get to experience a summation
of all the subtle touches missing from VR today - like how a reflection on
something changes as you rotate your head.

------
daenz
My understanding is that there is an insane amount of data stored in the light
field captures that must be streamed seamlessly to the headset. Can someone
shed some light (heh) on the technical issues surrounding light fields?

~~~
erikpukinskis
Light fields are large, but not insane, because they are highly compressible.
Think of them as a matrix of images, each representing a micro lens at a point
in the field. Most patches of the image seen through each lens are roughly
similar to a patch in an ordinary non-light-field JPEG, just skewed to a
slightly different place.

This isn’t 100% true... new imagery can be revealed by looking around a
corner, by shifting reflections, etc. But each additional lens requires adding
less and less to your compression table.

So really, the light field is an index into a slightly expanded compression
space more than it’s a whole new set of data.
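The point about per-lens redundancy can be sketched in a few lines. This is a
toy illustration (not Google's or any real codec): fake lens views are made by
shifting one reference image, then each view is stored as a residual against a
motion-compensated reference, which comes out all zeros here because the toy
shifts are exact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake a 4x4 grid of 8x8 lens views: each view is the reference image
# cyclically shifted, mimicking parallax between adjacent micro lenses.
# (Real parallax isn't a cyclic shift; this just makes the point.)
reference = rng.integers(0, 256, size=(8, 8)).astype(np.int16)
views = [np.roll(reference, shift=(i, j), axis=(0, 1))
         for i in range(4) for j in range(4)]

# Naive storage: every view in full.
naive_bytes = sum(v.size for v in views)  # 16 views * 64 samples = 1024

def best_residual(view, ref):
    """Crude 'motion search': try small shifts of the reference and keep
    the residual with the smallest total magnitude."""
    candidates = [view - np.roll(ref, (di, dj), axis=(0, 1))
                  for di in range(4) for dj in range(4)]
    return min(candidates, key=lambda r: np.abs(r).sum())

# Delta storage: reference + per-view residual. Because the toy views are
# exact shifts of the reference, every residual is entirely zero.
residuals = [best_residual(v, reference) for v in views]
nonzero = sum(int(np.count_nonzero(r)) for r in residuals)
```

In a real capture the residuals are small but not zero (occlusions, specular
shifts), which is exactly the "adding less and less to your compression table"
behavior described above.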

As for the complexity of rendering... it’s just a JPEG with a gnarly
projection equation. There’s nothing fundamentally more complicated about
that... although it’s complex math.
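The "gnarly projection equation" can be made concrete with the classic
two-plane parameterization L(u, v, s, t): each ray is indexed by where it
crosses a camera plane and a focal plane, so rendering a novel view is
per-pixel index math plus a table lookup, not geometry reconstruction. A
minimal sketch, with a made-up radiance function so the lookup is checkable
(this is illustrative, not the demo's renderer):

```python
import numpy as np

N = 8  # resolution along each of the four light field axes

# Fill the 4D field with a simple known function of its indices so that
# a lookup result can be verified by hand.
u_, v_, s_, t_ = np.meshgrid(*[np.arange(N)] * 4, indexing="ij")
field = u_ + 10 * v_ + 100 * s_ + 1000 * t_

def sample_ray(eye, ray_dir, field):
    """Intersect a ray with the u-v plane (z=0) and the s-t plane (z=1),
    then look up the nearest stored radiance sample."""
    ex, ey, ez = eye
    dx, dy, dz = ray_dir
    t0 = (0.0 - ez) / dz          # parameter where the ray hits z=0
    t1 = (1.0 - ez) / dz          # parameter where the ray hits z=1
    u, v = ex + t0 * dx, ey + t0 * dy
    s, t = ex + t1 * dx, ey + t1 * dy
    idx = tuple(int(round(np.clip(c, 0, N - 1))) for c in (u, v, s, t))
    return field[idx]

# A ray from an eye behind the planes, pointing straight ahead, crosses
# both planes at (2, 3), so it reads field[2, 3, 2, 3].
radiance = sample_ray(eye=(2.0, 3.0, -1.0), ray_dir=(0.0, 0.0, 1.0), field=field)
```

A real renderer would interpolate between the nearest samples instead of
rounding, and do this once per output pixel, but the core operation is the
same lookup.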

As with anything it takes time for code to mature, but in the limit there’s
nothing fundamentally more complicated about it than playing compressed video.

There’s probably some efficiency to be gained with custom compression
algorithms and custom hardware, although I don’t know enough about existing
video compression hardware to say for sure how far the existing chips will get
us.

------
ucaetano
Which comes just in time for the Lytro acquisition rumors:

[https://www.androidcentral.com/google-reportedly-buying-lytro-no-more-40-million](https://www.androidcentral.com/google-reportedly-buying-lytro-no-more-40-million)

------
jrrohrssen
I have to say this was a really cool demo, although the subject's eyes
following me as I moved was a bit freaky.

------
Cthulhu_
Is there a video that can show the result anywhere?

~~~
queltos
You can check this one out:
[https://www.youtube.com/watch?v=a-JX3ZPi720](https://www.youtube.com/watch?v=a-JX3ZPi720)

But as with all things VR, to "get it" you really have to experience it in
VR. In particular, how different this feels from 360° photos (stereoscopic or
not) or photogrammetry isn't obvious if you haven't seen it yourself.

------
mishurov
As I understand it, this is a sort of geometric-optics approximation;
otherwise it would be an electromagnetic or quantum field. It seems aimed
more at entertainment than at research purposes.

~~~
sp332
The author is Paul Debevec. Give him a little credit.
[https://en.m.wikipedia.org/wiki/Paul_Debevec](https://en.m.wikipedia.org/wiki/Paul_Debevec)

It's a better approximation than you can make with a normal camera, so its
applications are even broader. A normal camera captures the position of
focused light rays on a plane; this captures the position and direction of
light rays as they move through a volume. Personally I would not call such a
versatile instrument useless.

