
Holographic optics for thin and lightweight virtual reality - onurcel
https://research.fb.com/blog/2020/06/holographic-optics-for-thin-and-lightweight-virtual-reality/
======
rsweeney21
For analysis of new AR/VR technology I highly recommend Karl Guttag's blog:
[https://www.kguttag.com/](https://www.kguttag.com/)

He's an engineer who understands the limitations of physics, especially when
it comes to optics and light. He is very good at parsing marketing hype vs.
reality. (He called BS on Magic Leap years before it launched.)

His posts are exceptionally well researched and explained, even if you don't
have a background in physics or optics.

He recently did an analysis of the Apple Glass leaks. I expect he'll post his
thoughts on this new technology from FB soon.

~~~
oh_sigh
I'd be careful about statements about 'limitations of physics'. Yes, there are
actual physical limitations, but we frequently have beliefs about the
limitations of physics that are not actually limitations of physics.

~~~
lrhegeba
To strengthen your argument, here's an example where a believed limitation was
circumvented (and resulted in a Nobel Prize): see
[https://en.wikipedia.org/wiki/STED_microscopy](https://en.wikipedia.org/wiki/STED_microscopy)
and the Abbe limit.

I suppose the parent was downvoted because some thought he was questioning the
"hard" laws of physics. But perhaps he could correct this by traveling back in
time and changing his wording ;-)

------
dougmwne
I'll say this from a consumer/product point of view rather than a technical or
engineering one, but WOW! Assuming they could pull off a finished,
high-performance product (big if), this could be the form factor that brings VR
to dominance. The silly-looking, awkward-to-wear headset is a big hindrance to
adoption, because it absolutely matters whether you look silly or cool when it
comes to consumer tech.

~~~
Valgrim
The fact that these glasses are clearly opaque reminds me of Marty's reaction
when he sees Doc's brown futuristic glasses in "Back to the Future". They do
look a bit silly, but at the same time they look useful, and not heavy and
encumbering like most VR headsets.

~~~
buu700
To be fair, they wouldn't look weird if you just held a cane while wearing
them. They could probably even fit some electronics into a cane form factor.

~~~
gumby
Would make for some excellent ad hoc light saber games too!

------
lukevp
I've been curious about something in AR for a while, and I can't seem to find
the right terms to query this. Why can't ambient light be used for
illumination, and an LCD + polarizer to darken each pixel? If you can
approximate the illumination of each point of light going through the LCD (for
example, with an outward-facing camera above each lens capturing a full-color
image and interpolating it down to where it would be overlayed on top of the
LCD) you would know how much you would need to darken each sub-pixel's color
to compensate for the light coming in. Then it would be super low-power since
there would be no backlight. Also, if you wanted "Transparent" mode you could
make it fairly clear. I'm sure there's some reason this is untenable but I'm
not really sure what it is - perhaps the inability to be accurate with the
compensation?
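For what it's worth, the compensation you're describing can be sketched in a few lines (Python/NumPy; the function name and units are made up for illustration). It also makes the core limitation visible: a polarizer/LCD stack can only attenuate light, so any target color brighter than the ambient light behind that subpixel is unreachable.

```python
import numpy as np

def lcd_transmittance(incoming, target):
    """Per-subpixel transmittance for a subtract-only (LCD + polarizer) display.

    incoming: estimated ambient light reaching each RGB subpixel (linear units,
              e.g. interpolated from an outward-facing camera)
    target:   desired light at the eye, same units

    An LCD can only attenuate, so transmittance = target / incoming, clipped
    to [0, 1]. Clipping at 1 means the requested color was unreachable.
    """
    incoming = np.asarray(incoming, dtype=float)
    target = np.asarray(target, dtype=float)
    t = np.divide(target, incoming, out=np.zeros_like(target),
                  where=incoming > 0)
    return np.clip(t, 0.0, 1.0)

# Bright background, asking for a dimmer gray: works, just pass 20% through.
print(lcd_transmittance([1000, 1000, 1000], [200, 200, 200]))  # [0.2 0.2 0.2]
# Dim background, asking for something brighter: clips at fully open.
print(lcd_transmittance([50, 50, 50], [200, 200, 200]))        # [1. 1. 1.]
```

The second case is presumably where the idea breaks down in practice: anything you want to appear brighter than the real scene behind it simply can't be drawn.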

~~~
ahupp
If you can only subtract light the contrast of what you're viewing against the
background will always be poor, almost by definition. Also doesn't handle
color.
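A rough back-of-envelope of the contrast point (illustrative numbers, not measured panel specs): the display's "white" is just the background dimmed by the open-state transmittance, so perceived brightness tracks the scene rather than anything the display controls.

```python
# Subtract-only display: brightest pixel = background through an open shutter.
ambient = 10_000                  # bright outdoor background (assumed units)
t_open, t_closed = 0.30, 0.003    # assumed LCD transmittance, open vs. closed

brightest = ambient * t_open      # "white" is capped by the scene behind it
darkest = ambient * t_closed
print(brightest / darkest)        # on-panel contrast ratio: 100.0

# Indoors the same panel's "white" collapses along with the scene:
print(300 * t_open)               # 90.0 -- dimmer than a typical phone screen
```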

~~~
lukevp
Why could it not handle color if it had R/G/B subpixels and adapted the
intensity of each pixel based on the color of the light passing through that
area of the screen?

The intensity of light is quite high outside, could an LCD not darken it
enough to have sufficient contrast if you were OK with it only working
outside?

~~~
numpad0
LCDs can't darken, aren't that transparent in the first place, and can't focus
without microlenses; you also won't be able to look through microlenses. If
you still think it's viable, you can prototype the optics you described using
old Nokia phone LCDs or some later Game Boy models that have easily removable
backlights.

~~~
anmorgan
Are you being specific about the word "darken"? Because LCDs are designed to
either allow or block light, so technically an LCD can "darken" incoming
light. But I think inconsistent lighting from the environment would be the
biggest challenge.

The best example of a transparent display I've seen in production is:
[https://glas.johnsoncontrols.com/](https://glas.johnsoncontrols.com/)

But the material on that seems darker and I believe they use an edge light to
provide higher contrast.

I think the distance from the eye would also be difficult if you aren't doing
some type of projection, like what is used in the article.

~~~
numpad0
Darken as in response to "can't the LCD darken enough" in the parent.

Anyway, any LCD can be a transparent LCD if you peel off the backlight layers,
if you want to experiment with it. Search for "DIY transparent LCD" or "LCD
side panel casemod".

------
tmabraham
I don't know too much about optics and VR/AR, but can this technology be
adapted for AR applications (a la Google/Apple Glass)? Personally, I think AR
applications are much more interesting than VR (especially in the short
term)...

~~~
PaulHoule
That is what Hololens 2, Magic Leap, and some other products are. These are
"cool" but have not set the world on fire in terms of product-market fit.

If you are building an AR system there is always an awkward balance between
"letting the environment shine through" and "having projected items be bright
enough to be visible". If you put something black in front of the holograms at
least now you have just one problem instead of two problems.

~~~
tmabraham
So you think that instead of something black, if something clear was kept in
front, it would technically be an AR system? Of course, then the question is
how to make sure the projected items are bright enough with the environment
shining through.

~~~
PaulHoule
Yep. Also that the grooves in the hologram don't screw up the light that
passes in through the front. For instance you might get "lens flare" effects
which could get obnoxious.

It is a lot of details to work out, which is why current AR headsets are still
at the bridesmaid and not the bride phase.

~~~
tmabraham
Oh okay, very interesting. I didn't realize that AR headsets were significantly
behind VR headsets. But the question is how far behind, because clearly Apple
is able to somehow pull it off, right?

~~~
PaulHoule
Well, to "pull it off" involves multiple levels of development.

VR headsets are workable, but somewhat expensive, and content is lacking.

AR headsets are very expensive, have poor image quality, and even less
content. It seems every defense contractor and electronics conglomerate got
patents for holographic waveguides in the 1990s when the F-35 was under
development; that headset is not so bad, but it costs $250,000, and the
original version was heavy enough to break your neck when the ejection seat
fires.

Apple may be working on an AR headset, it may be a big hit in the end, but I
will believe in product-market fit when I see it.

------
praveen9920
Are there any AR/VR technologies addressing software developers? I would very
much like to replace my shitty monitor with AR/VR glasses for development.

I know this may not seem like the "intended use case", but the developer
experience could use some innovation for a change. It would also be one way to
bring these technologies closer to developers.

~~~
pdehaan
Not for coding. The resolution just isn't there yet.

Roughly, per-eye resolution is in the same ballpark as HD displays, but
stretched over a 90+ degree field of view. Fonts need to be very large to be
legible. You can create a theater sized virtual monitor, but it's just taxing
to use. Aliasing artifacts make it worse.

At least for text-focused tasks, I'd take virtually any display built in the
past 40 years over a modern VR headset.
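The arithmetic behind that claim, roughly (the headset numbers are ballpark assumptions, not any specific device's spec):

```python
# Angular resolution of a hypothetical headset vs. normal visual acuity.
h_pixels = 1920   # per-eye horizontal resolution, roughly "HD" (assumed)
fov_deg = 100     # horizontal field of view in degrees (assumed)

ppd = h_pixels / fov_deg
print(ppd)        # 19.2 pixels per degree

# 20/20 acuity resolves about 1 arcminute, i.e. ~60 pixels per degree, so
# text must be rendered roughly this many times larger than on a desktop
# monitor to stay comfortably legible:
print(60 / ppd)   # 3.125
```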

~~~
moron4hire
> Not for coding. The resolution just isn't there yet.

Spoken like someone who hasn't programmed in VR yet.

I don't think you'll be programming any operating systems in VR anytime soon,
but there is still a lot of programming, specifically object scripting, that
could be done in VR. A number of people--including myself--have built demos
that prove out the concept.

One reason is that text legibility is not strictly about display resolution.
Motion within the view improves legibility significantly. Yes, the fonts
render to very large pixels. But the specific pixels they render to are
constantly changing, and your brain fuses those images over time. I'm not able
to find the paper right now, but the US Navy did a study showing that pilots'
visual acuity improved when they were in a dynamic scenario. The study had
pilots take a visual acuity test, identifying letters in view from within a
flight simulator. One group had full use of the simulator in motion; the other
group was told the simulator's motion system was broken, but still sat in it
to perform the same test rendered on the same screen.
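A toy model of that fusion effect (Python/NumPy; this is my own illustration, not the Navy study): point-sample a step edge on a pixel grid whose alignment jitters frame to frame, the way head micro-motion shifts where scene content lands on the display. A single frame snaps the edge to a pixel boundary, but the average over jittered frames encodes its sub-pixel position.

```python
import numpy as np

rng = np.random.default_rng(0)
edge = 3.3  # true edge position in pixel units (sub-pixel detail)

def sample(offset, n=8):
    """Point-sample a step edge on an integer grid shifted by `offset`."""
    x = np.arange(n) + offset
    return (x >= edge).astype(float)

single = sample(0.0)  # one static frame: the edge snaps to a pixel boundary
frames = [sample(rng.uniform(0.0, 1.0)) for _ in range(2000)]
fused = np.mean(frames, axis=0)  # stand-in for what temporal fusion recovers

# The static frame carries no fractional position (pixel 3 is just 0 or 1),
# but the fused frame does: the mean at pixel 3 approaches 1 - 0.3 = 0.7.
print(single[3], fused[3])
```

Real perception is obviously more complicated, but this is the basic reason jittered low-resolution samples can carry more than one frame's worth of detail.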

And as you said, larger fonts are easier to read. There is a lot of spatial
resolution in VR that is not used very often. You're used to thinking about
organizing your code on a 2D display, but you have an entire 3D environment
around you. That environment could be a zoomable interface where code editors
are linked to live objects. Use individual editors for individual code units.
Organize them in a tree structure linked to the object. Tree organizers are a
lot easier to navigate in 3D than on a 2D screen, especially if you eliminate
window scrolling.

Window scrolling was created to account for the limited spatial resolution of
2D displays. But in the process, you lose spatial memory of where things are
located. Things like windows, tabs, and desktop workspaces were invented to
try to wrangle that problem, but they are not as good as a real, spatial
filing system.

Think about it. You probably know exactly where your favorite book is on your
bookshelf. You could probably walk over to it and pick it off the shelf
without even opening your eyes. But there is very little chance you could pick
out any particular file you want in a 2D GUI system that way, specifically
because of the absence of spatial relationships.

So a combination of "text legibility is not as bad as you think it is" and
"code could be a lot more organized than it is on 2D displays" means that
programming in VR is a lot better than you're making it out to be.

------
gavinray
If anyone here is in the know with the AR/VR scene, could they shed some light
on why AR hasn't taken off yet?

VR is cool, but it seems like a much more useful concept to me to have actual
reality with enhanced information.

Imagine wearing glasses and looking at a plate of food then having it estimate
+ track calories and macros, or paint GPS direction arrows on surfaces
realtime, or put people's names you've met before above their head so you can
avoid awkwardly admitting you've forgotten it.

Is it technical limitations, or cost?

---

Edit: Many people replied with really informative answers to this already. I
genuinely appreciate your time and insight, thank you :)

~~~
fossuser
It's in progress, currently limited by both hardware and cost.

There used to be a good blog post from Michael Abrash when he was at Valve
that also talked about two main issues. Latency, and drawing black
effectively.

Latency is critical since low latency is a requirement for things looking real
(since humans have fast visual systems), but that's ultimately a hardware
problem that should get solved in time.
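A quick sense of why latency dominates (simple kinematics, my numbers, not from the post): at ordinary head-turn speeds, every millisecond of motion-to-photon latency slides a supposedly world-locked overlay across the real scene.

```python
# Angular misregistration of a world-locked AR overlay due to latency.
head_speed_dps = 100.0  # deg/s; a brisk but ordinary head turn (assumed)

for latency_ms in (5, 20, 50):
    error_deg = head_speed_dps * latency_ms / 1000.0
    print(f"{latency_ms} ms -> {error_deg:.1f} deg of misregistration")
# 5 ms -> 0.5 deg, 20 ms -> 2.0 deg, 50 ms -> 5.0 deg
```

Even a couple of degrees of slip is glaring when the overlay is supposed to be pinned to a real object, which is why AR budgets are so much tighter than ordinary display pipelines.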

Drawing black is harder because AR uses ambient light and putting a black line
on the screen in front of your face doesn't work for focus.

Unfortunately it looks like Valve killed their blog, but the way back machine
has it:
[https://web.archive.org/web/20200503055607/http://blogs.valv...](https://web.archive.org/web/20200503055607/http://blogs.valvesoftware.com/abrash/latency-the-sine-qua-non-of-ar-and-vr/)

My bet is that Apple will pull it off Apple watch style with front facing
Lidar:
[https://www.youtube.com/watch?v=r5J_6oMMG7Y](https://www.youtube.com/watch?v=r5J_6oMMG7Y)

Probably at first they will be mostly for notifications and interacting with
apps in a window in your visual field, getting most of the power from the
phone. Things like looking at food for calories and names, etc. will come
later when a front facing camera is acceptable and there's existing UI in
place.

I think this is probably the next platform after mobile devices, looking at
little glass displays is a lot worse than having a UI in your visual field (if
it can be done well).

[Edit]: A more recent blog post from Abrash on this topic
[https://www.oculus.com/blog/inventing-the-future/](https://www.oculus.com/blog/inventing-the-future/)

~~~
Chris2048
Does AR require ambient light? couldn't you use a VR approach where everything
is drawn in, and reality is derived from cameras?

~~~
fossuser
I'd argue that's not really AR then (though no need to dispute definitions
[0]); the blog post talks about that too. Whatever that is, it's not really a
satisfying approach and wouldn't be the next platform.

You want to be able to use the full power of human vision when looking at the
world, not literally be looking at some subset in a display right next to your
face all the time.

[0]:
[https://www.lesswrong.com/posts/7X2j8HAkWdmMoS8PE/disputing-...](https://www.lesswrong.com/posts/7X2j8HAkWdmMoS8PE/disputing-definitions)

~~~
Chris2048
TBH though, your link seems to describe situations where people leverage the
confusion to win arguments. But point taken (thanks for the link btw;
interesting article, I added my own comment).

It seems like it might be a useful distinction, though; I always took AR to be
a distinction of interface: reality plus augmentation. But it might also be a
distinction of technology: augmenting normal vision versus "virtual" AR, or AR
in VR.

That said, I don't understand your comment about using "the full power of
human vision"; if VR headsets improve to the point that VR environments are as
detailed (with respect to human perception) as reality, then virtualized AR
shouldn't differ either.

TBH my own concern is how hard VR is to use while it blocks you from your
surroundings: noticing when people approach, handling the
headset/controllers/keyboard, etc. I can't replace my monitors with VR because
I cannot see my keyboard in VR, or see my coffee mug, or notice when people
approach so I don't jump every time someone taps my shoulder. VR needs to be
partially augmented with my true surroundings just to operate within a normal
space.

------
stanlarroque
Impressive, but I doubt even FRL can overcome these design limitations.
Digilens has tried to do this since forever, and when you introduce more
colors you will face the same kinds of problems seen in Hololens 2.

~~~
piercebot
Would you mind elaborating on (or provide a link to) some of the problems you
mentioned Hololens 2 having? Also, I've never heard of Digilens; do you have a
preferred source for learning more about that device and what its limitations
are?

Thanks!

~~~
stanlarroque
This article does a good job:
[https://www.kguttag.com/2019/12/18/hololens-2-not-a-pretty-p...](https://www.kguttag.com/2019/12/18/hololens-2-not-a-pretty-picture/)

Digilens is a company that's been around for 15 years doing all kinds of
eyewear stuff.

------
mojomark
I remember reading Maimone's original Pinlight display concept [1]. It blew my
mind.

1\.
[https://www.researchgate.net/publication/266659406_Pinlight_...](https://www.researchgate.net/publication/266659406_Pinlight_Displays_Wide_Field_of_View_Augmented_Reality_Eyeglasses_using_Defocused_Point_Light_Sources/stats)

------
sebringj
"The First" series on Hulu reminded me of this; the characters use similar
glasses technology quite frequently, coupled nicely with voice and gestures. I
was watching it yesterday, then saw this and thought it could be just around
the corner. Maybe Apple is doing something similar?

------
jayd16
What has a better chance of miniaturization: a full-lens screen like this, or
a projector in something like the Hololens? The projector has the benefit of
supporting passive pass-through AR with a semi-reflective mirror.

I suppose you could incorporate both.

------
disposekinetics
We're getting closer and closer to the Virtual Light glasses from William
Gibson's novel of the same name.

------
kmonsen
I think Oculus is pretty cool, but I don't really see the synergy with
Facebook here.

~~~
shafyy
VR and AR are the next computing platform and Facebook wants to own the
hardware.

~~~
Chris2048
surely they are just the VDU, not the platform?

~~~
moron4hire
That's what most of us independent developers in the industry had hoped would
happen, but unfortunately it's not happening. Facebook locks down the Oculus
app store harder than Apple does theirs. They spend a lot of development effort
on
Oculus-exclusive social integration features, like avatars, lipsyncing of
avatars to user speech, and a bunch of other stuff that _nobody ever asked for
and nobody is using_ , yet they still keep pushing it. And they're spending a
lot of investment dollars to buy up independent studios and lock them into
platform exclusives.

Facebook _definitely_ takes a platform approach with Oculus. The Oculus Quest
is a great device at a great price. It has done the most to make VR a
mainstream accessible thing. And yet because of Facebook's behavior and what
we know about Facebook as a company in general, we really, really must not let
it become the dominant system. Hopefully, other companies will improve their
user experiences to match (because Oculus certainly doesn't have a hardware
advantage, everyone is pretty much running the exact, same hardware profile
with only minor differences).

~~~
Chris2048
I don't know much about Oculus. Is it not possible to use alt app stores?

TBH, this kind of thing happens; look at GOG vs. Steam vs. <shall not be
named>. This strategy doesn't always pay off, though: consider the Nintendo
console lineup (curated, small, but higher quality) compared to the Sony
PlayStation. In the end Sony won that one, and my own belief is that less
curation is the reason; it led to some indie (not developed by Sony) gems.

~~~
moron4hire
There is one project called SideQuest that--if you enable developer options on
your Go or Quest, which requires a developer account with Oculus--uses Android
side-loading from a PC to get apps onto the Quest. When you side-load an app
on the Go, you also have to have a key file embedded in it that was generated
from the device's own serial number, so SideQuest has to have you generate an
OSIG for your device, upload it to your profile, then hack your OSIG into any
APKs you download from their store. Quest doesn't have the OSIG requirement,
but you still need the developer account to enable developer options.

And you're still stuck using Oculus' APIs. There are no open source APIs for
accessing the device sensor data or rendering. Regular Android apps (minus
Google Play Services) can run on the devices as tiny windows, and then you can
get some super-hacky input as touch events on the app view, but it's really
not usable. I think it even forces software rendering, because I've seen some
really bad performance out of it (I found out about this after having an app
configured incorrectly before uploading it on my own for my own development).
It's basically there as a fallback for Android's default Settings view, which
you use to configure the developer settings like enabling USB debugging.

------
aj7
Coming soon. The Magic Leap auction.
[https://www.theverge.com/2020/4/22/21231236/magic-leap-ar-he...](https://www.theverge.com/2020/4/22/21231236/magic-leap-ar-headset-layoffs-coronavirus-enterprise-business-shift)

