
How Magic Leap Works – From Field of View to GPU - rawnlq
http://gpuofthebrain.com/blog/2016/7/22/how-magic-leap-will-work
======
vernie
I can't wait to see their next cooked demo video. Just another day in the
office at Magic Leap...

~~~
jemeshsu
What I learned from Magic Leap and Theranos is that there's no need for an MVP. You
just need an MHP, a Maximally Hyped Product, to get started. And then spend the
funding trying to get an actual product out.

~~~
erikpukinskis
It's because they convinced Google to give them a half billion dollars. That
is seen as verification of their claims.

------
IshKebab
I understand it all except how they supposedly cancel background light to make
black. It would only be theoretically possible if ambient light were coherent
- which it isn't - and even then it would only be possible if everything were
_perfectly_ still.

~~~
anotheryou
Same here. But I also wondered how they get the fiber projection from the side
to show up evenly across the whole screen/glass.

I'm always amazed at how even the output of LCD backlight diffusers is (are
they asymmetrical, given that the light comes from just one side?), but they can
afford to be messier than per-pixel projection.

edit: probably it just doesn't work. Maybe they could add a simple single-segment
LCD for variable dimming.

They were very careful to show nothing black on white. The tip of the
mountain briefly touches the white thing on the table here:
[https://youtu.be/GmdXJy_IdNw?t=63](https://youtu.be/GmdXJy_IdNw?t=63)

~~~
IshKebab
LCD backlight diffusers have a clever way of compensating for uneven light
distribution. Basically the diffuse etching is a grid of circles, and the
circles are larger where there is less light, like printer dithering.
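The circle-size trick can be sketched numerically. Here's a toy model (my own numbers and falloff, not any real panel's) where the etched dots grow as the edge light falls off, so the scattered output stays even:

```python
def dot_radius(distance_from_edge, max_distance, r_min=0.05, r_max=0.5):
    """Diffuser dot radius as a function of distance from the light-source edge.

    Light from an edge-lit strip falls off with distance, so the etched
    dots grow larger (scattering more light) the farther they are from
    the LEDs. The falloff curve here is a toy model, not real optics.
    """
    # Normalized intensity: brightest at the edge, dimmer farther away
    intensity = 1.0 / (1.0 + distance_from_edge / max_distance)
    # Larger dots where intensity is lower, so scattered light stays even
    return r_min + (r_max - r_min) * (1.0 - intensity)

# Dots near the LEDs stay small; dots at the far edge grow large
for d in (0, 50, 100):
    print(f"d={d:3d} mm  radius={dot_radius(d, 100):.3f} mm")
```
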

------
iamleppert
Isn't a 0.6 micron pixel size at the diffraction limit? How on earth do you
get accurate XY attitude control from a piezoelectric element at that
scale? What about vibrations, timing/sync issues, etc.? Anyone care to
speculate?

~~~
Lorkki
It's not like it's the only thing that sounds a bit too hopeful. Driving
content to a light-field display at a decent frame rate and a resolution "much
higher than [4375 x 2300]" is orders of magnitude beyond any mobile GPU now or
within anyone's manufacturing plans.
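For a rough sense of scale, here's the raw pixel bandwidth implied by those figures, under my own assumptions (60 fps, 24-bit color, ~6 focal planes per eye; none of these numbers come from the article):

```python
# Back-of-the-envelope pixel throughput for the resolution quoted above.
width, height = 4375, 2300          # "much higher than" this, per the quote
fps = 60                            # assumed refresh rate
focal_planes = 6                    # assumed depth planes for a light field
eyes = 2

pixels_per_frame = width * height
pixels_per_second = pixels_per_frame * fps * focal_planes * eyes
gbytes_per_second = pixels_per_second * 3 / 1e9   # 3 bytes/pixel (RGB)

print(f"{pixels_per_frame / 1e6:.1f} Mpix per frame per plane")
print(f"{gbytes_per_second:.1f} GB/s raw pixel bandwidth")
```

Even before any rendering work, just moving that many pixels dwarfs what mobile silicon of the time could do.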

Speculation is all that the article is, and it sounds rather like it's
intended to feed the hype.

~~~
IshKebab
Unless you only draw to part of the display. Given that it is AR, maybe they
rely on most of the display being blank.

Still, I agree that sounds infeasible.

~~~
dharma1
Driving the display is just part of it - they also need to do super accurate
SLAM, object recognition/machine learning inference, and, if the visuals are
interactive and not pre-rendered, all the lighting, texturing, rigging,
physics, etc. of the rendered models - all on the same CPU/GPU.

It's just about doable on desktop VR setups with the latest desktop GPUs.

~~~
sapphireblue
With their budget they can afford to make a custom SLAM ASIC, push more cores
into their SoC, and create a special GPU. Such a capital-heavy ASIC approach
can give a 10-100x performance boost.

------
hyperpallium
I'd had the idea of moving processing off the eyegear and into your pocket,
connected by cable, but using a fibre optic cable is even better.

The scanning display also enables different resolutions for different parts of
the field of view (the retina center "fovea" is high resolution; peripheral
vision much lower). Eye-tracking allows them to tell where your fovea is
looking.
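A foveated shading scheme like that could be sketched as follows; the radius and falloff values are purely illustrative, not anything Magic Leap has published:

```python
import math

def shading_rate(px, py, gaze_x, gaze_y, fovea_radius=200.0):
    """Fraction of full resolution to render at a given pixel.

    Full resolution inside the foveal region around the tracked gaze
    point, falling off toward the periphery. The radius and falloff
    here are illustrative values only.
    """
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius:
        return 1.0
    # Halve the resolution for each fovea-radius step beyond the center,
    # with a floor so the far periphery still gets drawn
    return max(0.125, 0.5 ** ((dist - fovea_radius) / fovea_radius))

# Resolution falls off with distance from the gaze point
for d in (0, 200, 400, 800):
    print(d, shading_rate(d, 0, 0, 0))
```
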

But I fear we're still not ready for always-on camera surveillance. And that
seems necessary for the light-cancellation behind solid blacks, and of
course for object recognition. But these features are severable, and the
others will still work.

Pokemon Go helps prepare the ground for all this.

~~~
aedron
VR backpacks are already here (and are probably the future):
[http://www.theverge.com/2016/6/7/11874762/vr-backpack-pc-hands-on-computex](http://www.theverge.com/2016/6/7/11874762/vr-backpack-pc-hands-on-computex)

You can pack a lot of computing power into a backpack these days.

------
jaytaylor
I'm unfamiliar with a few of the acronyms used in the article:

    
    
    SLAM
    DOE

Definitions would be appreciated :)

~~~
rawnlq
Example of SLAM:
[https://www.youtube.com/watch?v=C6-xwSOOdqQ](https://www.youtube.com/watch?v=C6-xwSOOdqQ)

~~~
bhouston
Alignment errors build up over time -- there is a good 1.5 ft misalignment
visible here:
[https://youtu.be/C6-xwSOOdqQ?t=216](https://youtu.be/C6-xwSOOdqQ?t=216)

~~~
rawnlq
Alignment and scale drift is usually solved via loop closure. There's a clearer
example in LSD-SLAM:
[https://www.youtube.com/watch?v=GnuQzP3gty4&feature=youtu.be...](https://www.youtube.com/watch?v=GnuQzP3gty4&feature=youtu.be&t=80).

By the way it seems like the researcher behind DSO and LSD-SLAM is now at
oculus research!
[http://vision.in.tum.de/members/engelj](http://vision.in.tum.de/members/engelj)
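The simplest possible picture of what loop closure does: once the loop is detected, spread the accumulated drift back along the trajectory. Real systems like LSD-SLAM optimize a full pose graph; this linear correction is just a toy stand-in:

```python
def correct_drift(poses, loop_error):
    """Distribute accumulated translational drift after a loop closure.

    `poses` is a list of (x, y) estimates along the trajectory; the last
    pose should coincide with the first but is off by `loop_error`.
    Spreading the error linearly along the path is the crudest possible
    stand-in for a real pose-graph optimization.
    """
    n = len(poses) - 1
    ex, ey = loop_error
    return [(x - ex * i / n, y - ey * i / n) for i, (x, y) in enumerate(poses)]

# A square loop whose endpoint drifted (0.4, -0.2) away from the start:
path = [(0, 0), (1, 0), (1, 1), (0, 1), (0.4, -0.2)]
corrected = correct_drift(path, (0.4, -0.2))
print(corrected[-1])   # end snaps back onto the start: (0.0, 0.0)
```
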

------
eltoozero
Nice summary of what's probably going on over there.

The tech demos are quite impressive, and if we assume ML isn't misrepresenting
the results, it'll be really interesting to see what kind of hardware
materializes.

------
kimburgess
For anyone interested in diving a little deeper into the workings of their
imaging engine, here's a really well put together paper by Brian Schowengerdt
et al:

[http://wenku.baidu.com/view/c40a8242e45c3b3567ec8b68.html](http://wenku.baidu.com/view/c40a8242e45c3b3567ec8b68.html)

------
aedron
Sounds cool, and at least 10 years out. So why are they hyping it now, what
with hiring Neal Stephenson and so on?

~~~
swalsh
In their last interview they talked about releasing in about a year. Their
hope is to have 80% of the country wearing it by 2020. They want to be the
next Apple. I think they mentioned that they are currently debugging the
production line.

------
KaiserPro
It's worth looking at the first video on the Magic Leap website:
[https://www.magicleap.com/#/home](https://www.magicleap.com/#/home)

It's a whale jumping out of the floor. Nothing too hard. What is hard is the
motion tracking and the water sim.

The water sim is just about possible on the GPU, with some cheating. However,
I suspect that's a Houdini[1] special.

So you're looking at carting around a dual top-end nVidia GPU rig in your
pocket, just for the particle sim, let alone the motion tracking/SLAM/dynamic
lighting.

Sure, people will say you can use the cloud! Try adding 25-70ms of latency to
your vision and see how immersive that feels.

[1][https://www.sidefx.com/](https://www.sidefx.com/)

~~~
andygates
Alternatively, all the physics work is pre-generated and it's just displaying
stuff. The floor is flat and there are no interactions with other objects.
That black sure is impressive though.

------
austindeadhead
IMO, instead of a GPU, much of the rendering could be done on a "Vision
Processing Unit" like the Movidius device: cheap and power-efficient.
[http://www.movidius.com/](http://www.movidius.com/)

------
justifier
ha! somewhere in my comment history is a question about fiber optic projection

i remembered seeing a youtube video showcasing it but was unable to find it
again after the initial viewing

i was told it was impossible and assumed my memory was flawed

but the picture in the section Fiber Scanning Display is the same tech

i remember that butterfly image

~~~
filoeleven
The inspiration for their display was pulled from someone's ass.

Seriously. There was an article discussing it a while back. There is a new
kind of colonoscopy camera that uses a fiber optic cable with a rapidly-
spinning business end to build up images rather than a camera with a
conventional lens. You can imagine the impetus for shrinking the probe. They
"just" reversed the light flow and now they have a fiber optic projector!

------
jsemrau
It's funny how Pokemon Go took away a lot of AR marketing Buzz from Magic
Leap.

~~~
StavrosK
Pokemon Go's AR is a gimmick, though, and doesn't work anywhere near well
enough to claim it's meant to be serious. It's just so you can tell your
friends "it shows you Pokemon in the room!".

~~~
drewstiff
Is mapping a game over real life not a form of AR? The normal game screen in
Pokemon Go takes "the real world" (streets, locations) and augments it to make
it part of the game.

~~~
milesokeefe
I suppose it could be considered AR. But where do you draw the line? Is Google
Maps with certain businesses outlined AR because it's an augmented version of
the real world?

Personally I limit what I consider AR to be graphics superimposed on a real
world view. With the maps in Pokemon GO, the real world and game are distinct,
almost like VR.

------
grabcocque
How Magic Leap works: from vapor to snake oil.

~~~
milesokeefe
If that does turn out to be true, it will be very impressive. Has there ever
been a vaporware project that raised this much money?

------
amelius
> It sounds like the exact technology that Apple wishes it had.

Except... people just don't want to wear glasses. Especially not clunky ones.

~~~
obj-g
People don't want to wear glasses _all the time_ - but I don't think this is
meant to be worn 24/7 like Google Glass or something. To me, your comment is
akin to saying sure, people want to play games, but they don't want to hold a
controller. And look at VR - people are definitely willing to wear
goggles/visors/whatever for limited periods of time.

~~~
amelius
I honestly believe people don't like glasses (in general; I'm not speaking of
the geek crowd). Also, VR isn't a big thing yet.

Please be cautious with analogies, because often there's no logic in them. In
this case, we can dumb it down further: people like to eat, but they don't
want to hold a spoon. With all respect, the analogy just makes no sense. I was
talking about glasses specifically.

~~~
soylentcola
What about sunglasses? They've always been popular/fashionable and they're
also glasses that you wear in specific situations. They're also glasses that,
underneath any styling or branding, still exist for functional reasons rather
than solely for looks.

I think of headsets more like headphones. No, you don't want to wear
headphones all day long, but when people are getting something out of them,
they seem willing to wear them for extended periods of time.

Even if the glasses thing is a hurdle to this becoming ubiquitous consumer
electronics, I think the engineering and tech being developed still has quite
a bit of promise and I'll be glad to see if they can make some real progress
and show something off beyond teasers.

------
aaron695
Until "AI" is good enough to create exciting environments, nothing spectacular
will be coming from Magic Leap (as per their demo videos) any time soon.

The hardware is not the hard part; the artwork/design/integrating it with
society is.

~~~
epistasis
Clearly it's entirely different from Vive and Rift, but I think they have been
working just as hard on the content side of things as on the hardware itself.
Most of what I know about Magic Leap comes from content people joining the
team; the rest comes from patent filings.

~~~
aaron695
A team of hundreds can create a nice movie.

Then they'd have to make that 3D. Then integrate it with the world around you.

It's easy to hire people and make a 2D YouTube video of what it might do.

~~~
milesokeefe
Movies are one of the last things we'll see adapted to AR. Even adapting the
medium to VR is a difficult challenge. Disney has a lab that's taking a crack
at it though.

