
Nvidia CEO: We'll be solving VR immersion problems for the next 20 years - dmmalam
http://www.trustedreviews.com/news/nvidia-ceo-good-vr-20-years-away
======
vessenes
The actual quote is excellent (and already on this page). He's right -- the
list of things to be done is obvious, and will take time.

And, I'd like to imagine he has some enjoyment at the thought -- if you saw
his keynote showing off a precomputed(ish) raytraced walkthrough of their new
office space, he was really enjoying himself and the possibilities.

What Jen-Hsun has is an incredibly good window into the silicon side of the
problem, and he knows how many chip cycles he's got to address the amount of
computation needed for physics, visual realism, and presumably a latency budget
to allow wireless support. Plus he can estimate what screen tech will be
doing.

It's a prodigious amount of R&D, and it is a bit breathtaking. Finally having
a reason to do it, and a taste of how appealing it can be, is pretty cool.

~~~
_yosefk
You mean how many chip cycles he _needs_, right? How many he's _got_ we all
know: maybe 2. Then CMOS scaling is over. (Not that things will completely
stop evolving, but they'll surely slow down.)

~~~
dogma1138
Well, TSVs (through-silicon vias) have evolved greatly, which means 3D dies
could arrive soon. 3D transistors have already extended the limit of how small
parts can get before physics hits you in the face with a baseball bat. On top
of that, memristors and graphene could open a whole new world for chip
designers. If we have only 2 cycles' worth of process left, we are doomed :)

~~~
Certhas
> 3D transistors already expanded the limit of how small parts can get before
> physics hits you in the face with a baseball bat.

Yes, but we are just starting 10nm now. Go two more nodes (if I understand
correctly, that's what's meant here by cycles) and you are at 5nm.

[https://en.wikipedia.org/wiki/5_nanometer](https://en.wikipedia.org/wiki/5_nanometer)

To put that into perspective: "An Australian team announced that they
fabricated a single functional transistor out of 7 atoms that measured 4 nm in
length."

Maybe we can make our way down to single atom transistors eventually, but who
knows how long it will take to make that commercially viable.

~~~
hexane360
We already have a single-atom transistor ([https://en.wikipedia.org/wiki/Single-atom_transistor](https://en.wikipedia.org/wiki/Single-atom_transistor)), but you're right that it's far from commercially viable.

------
chris_va
The really big, really difficult one will be enabling dynamic focal depth. The
lack of focal planes makes it difficult to imagine VR providing a lifelike
experience.

Some volumetric/panoptic displays are getting closer, along with eye tracking,
but this seems like one of those things you cannot fix in software. Just
adding higher pixel resolution, improving "physics", or creating a more
beautiful environment will not solve this. So, 20 years is probably right.

~~~
hodwik
But 'all' you have to do is track the pupil and pull focus to the object
they're looking at. We're already rendering depth of field live in most games;
there's no reason we couldn't shift the focal plane according to where the
viewer's pupils are pointed.

Alternatively, they could use a head-based approximation, where the direction
the head is pointing chooses the focal plane: basically, you render a fairly
wide depth of field around whatever the head is pointed at.

Anyway, you can get a pretty lifelike experience with just some basic
atmospheric perspective. Look at photography with very wide depth of field
[1], so everything is in focus simultaneously. You sorta just get used to it,
and accept it.

[1]
[http://cdn.shutterbug.com/images/0814picture03.jpg](http://cdn.shutterbug.com/images/0814picture03.jpg)
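The gaze-contingent focus idea sketches out roughly like this: sample scene depth where the eye tracker says the user is looking, smooth it, and drive the depth-of-field blur from that distance. A toy Python illustration (all function names and the depth-buffer representation are invented for the sketch, not any engine's API):

```python
def focal_distance_from_gaze(depth_buffer, gaze_uv):
    """Sample view-space depth (metres) at the gaze point.

    depth_buffer: 2D list of depths, depth_buffer[row][col].
    gaze_uv: (u, v) in [0, 1]^2 from the eye tracker.
    """
    h, w = len(depth_buffer), len(depth_buffer[0])
    x = min(int(gaze_uv[0] * w), w - 1)
    y = min(int(gaze_uv[1] * h), h - 1)
    return depth_buffer[y][x]

def circle_of_confusion(depth, focal_dist, aperture=0.2):
    """Blur radius grows with distance from the focal plane (thin-lens-ish)."""
    return aperture * abs(depth - focal_dist) / max(depth, 1e-6)

def smooth_focus(prev, target, alpha=0.2):
    """Low-pass the focal distance so tracker noise doesn't make focus flicker."""
    return prev + alpha * (target - prev)

depth = [[1.0, 2.0], [3.0, 4.0]]
f = focal_distance_from_gaze(depth, (0.9, 0.9))   # user looks bottom-right
print(f, circle_of_confusion(1.0, f))
```

The hard part, per the replies below, isn't this math - it's getting gaze samples accurate and fresh enough that the focal plane lands on the right object.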

~~~
modeless
As someone who works on eye tracking for VR headsets, this is much easier said
than done. Doing it well requires levels of accuracy, latency, and reliability
that are not available in any eye tracker on the market today.

~~~
rjvs
What about SMI already tracking pupils at 250Hz?
[https://www.youtube.com/watch?v=Qq09BTmjzRs](https://www.youtube.com/watch?v=Qq09BTmjzRs)
That tech is still expensive, and I don't know how accurate or reliable it is,
but if it's good enough for foveated rendering, I would have expected it to be
good enough for focus as well. Care to elaborate as to why it wouldn't work,
or is it just beyond your horizon of "today"?

~~~
modeless
It's not accurate or reliable enough (in addition to being very far from a
consumer price point). Reliability in particular is difficult to assess and
usually lacking in eye-tracking systems. Also, just because the cameras are
running at 250 Hz doesn't mean the latency is 1/250 of a second, or even close
to that.
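To make the sample-rate vs latency distinction concrete: end-to-end latency stacks exposure, readout, processing, and the wait for the next render frame; the 4 ms frame period is only one slice of it. A back-of-envelope sketch (every stage time below is invented for illustration, not measured from any real tracker):

```python
frame_period_ms = 1000 / 250  # 250 Hz tracker: a new sample every 4 ms

# Invented, but plausible-shaped, latency stages in milliseconds.
stages_ms = {
    "sensor_exposure": 2.0,    # the camera integrates light
    "readout_transfer": 2.0,   # pixels move off the sensor and over the bus
    "pupil_detection": 3.0,    # image processing finds the pupil
    "render_pickup": 5.0,      # waiting for the renderer to consume the sample
}

end_to_end_ms = sum(stages_ms.values())
print(frame_period_ms, end_to_end_ms)  # samples arrive every 4 ms, but each
                                       # one describes the eye ~12 ms ago
```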

------
rl3
Eventually it'll make more sense to just use the human brain as a renderer.

Perfect photorealism is fairly worthless without a solid method of interaction
and the ability to literally suspend disbelief.

Direct neural interfaces will be capable of everything, not just rendering.

That said, I can't help but think of a scenario where we have functional
neural interfaces but haven't yet mastered the human brain's grammar, and thus
resort to "driving" the brain with traditionally-rendered images rather than
feeding it the information required to create a scene from scratch.

In such a scenario, the requisite quality of the traditionally rendered images
might be interesting. Would photorealism be required, or simply a low-quality
rendering to convey general structure? Presumably the brain would enhance the
latter case to the point of photorealism—or at least to the dream equivalent.

~~~
richmarr
Agreed. Considering the amount of processing and information loss that happens
in the optic nerve (in people with healthy vision), you could save yourself a
lot of processing and get a higher-fidelity result (potentially
indistinguishable from reality).

~~~
ricksplat
I expect it would be some kind of merge of the two (without severing the
optic nerve, that is) - neurons work by accumulation, so you would effectively
have the two realities summed - which would be neat for augmented reality. To
fully immerse you could close your eyes or put on dark glasses. A friend told
me this is how visual hallucinations on LSD work ...

------
lewisl9029
I just came across this today (VR Backpacks): [http://www.anandtech.com/show/10370/hp-and-msi-demonstrate-backpack-pcs-for-vr-gaming](http://www.anandtech.com/show/10370/hp-and-msi-demonstrate-backpack-pcs-for-vr-gaming)

At first I thought the idea was pretty ridiculous, but now that I've given it
a bit more thought, I'm beginning to think having to carry around a backpack
like this might not be such a bad compromise between the traditional
stationary wired VR setup and the holy grail of fully wireless VR.

~~~
jamesrcole
Which makes me think, eventually technology will probably enable the processor
to be in the headset itself. Then possibly an implant.

~~~
pavlov
The Microsoft HoloLens integrates the computer running Windows into a fairly
light AR headset.

------
mtgx
I think many have misinterpreted what he was saying, especially if they
haven't read the actual article. Nvidia's CEO is saying that VR won't be a
"completely solved problem" for at least another 20 years. And that's very
much true and has already been known by VR enthusiasts.

Carmack was even saying stuff like you need a 12k or 16k display to have the
"ideal" resolution in VR, years ago. Well, we're not going to get those kinds
of displays in VR headsets anytime soon, and even if we do, that's orders of
magnitude more performance that's needed compared to what we're using in
today's PCs.
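A rough pixel-throughput comparison shows why. The numbers below are illustrative: a first-generation Vive-class panel versus a hypothetical "16k"-per-eye display, both at 90 Hz, counting raw pixels only:

```python
def pixels_per_second(width, height, hz, eyes=2):
    """Raw pixel throughput a renderer must fill, ignoring everything else."""
    return width * height * hz * eyes

vive_class = pixels_per_second(1080, 1200, 90)    # 1080x1200 per eye @ 90 Hz
ideal_16k = pixels_per_second(15360, 8640, 90)    # 16K UHD per eye @ 90 Hz

print(ideal_16k / vive_class)  # ~100x: two orders of magnitude in pixels
                               # alone, before any per-pixel shading gains
```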

On top of that, you also need "virtual reality" to be as close as possible to
actual reality in terms of graphics fidelity, so that's a few orders of
magnitude more in terms of performance needed. If you want "Matrix-style VR",
then yeah, that won't be achieved for at least another 20 years. The headsets
will also have to become "glasses-like" to increase adoption and make them
more convenient, which is going to take at least another decade, too.

All of that said, while I do think the 1440p resolution is _way_ too low, and
probably even 4k won't be the "optimal" resolution for VR, I do think VR will
start becoming "mainstream" within the next 5 years. I think many took his
comments to mean that VR won't be mainstream for another 20 years, but that's
not what he was saying.

------
ricksplat
How many years from John Logie Baird's first invention did it take to get TV
right? Mainstream? Colour? Picture & sound quality? Ditching the cumbersome
CRT ... the best part of a hundred years, but people were ready to use it long
before that, and to pay big wodges of cash for it too. I'd say there'll be a
big market for sub-optimal VR :)

------
martinsb
I wonder why Huang did not mention what is, in my opinion, the main issue with
VR: the fact that many people (including myself) cannot use those headsets for
longer than, say, 10 minutes without getting dizzy and sick. Or are these the
consequences of the issues he mentioned, like "the physical worlds do not
behave according to the laws of physics"?

~~~
Kiro
Is this really true with the newer iterations? I've been to VR events and not
a single person became dizzy or sick afaik.

~~~
empath75
I don't think it really is. I've had the Oculus for a couple of weeks, and the
only thing that's come close to making me sick was a wreck while playing
Project Cars. I'm fine even with Minecrift and Ethan Carter without comfort
controls. There are a few sorts of movements that make me a tiny bit queasy
(like moving backwards and then forwards quickly), but they're pretty easy to
avoid.

~~~
pessimizer
So what you're saying is that for you, it's true.

~~~
Kiro
How else would we measure this if not by people's experiences?

------
kriro
I still think interaction is the major shortcoming, and probably will be for
the foreseeable future. Most of this seems to be a technology-psychology
hybrid. I want to grab a virtual coffee mug with sufficient feedback so I
don't grab through it, feel the weight of it, etc. Another part is finding good
interaction paradigms, which I suspect will fall out of better technology
being available and in more hands (easy-to-prototype interactions). Widespread
first-gen VR headsets and primitive interaction means (controllers) will
probably be good enough to drive that.

------
_greim_
Can't remember where I read this, but the idea is that the brain compresses a
vast amount of data down to a small amount of meaningful information. VR's
current approach is to meticulously spoof that vast datafeed in such a way
that the result gets compressed down to something that seems meaningful. Which
is a lot of work. Why not go in post-compression and merely spoof the
meaningful bits? VR's job could be a lot easier.

I don't know how it would play out, but speculating, you might render a low-
poly monochromatic "skeleton" scene and then simply suggest to the brain ways
to flesh it out with feeling and texture. Show the user photos of war-torn
Europe in 1947 beforehand, for example. Even if it means direct neural
interfaces, considering that we're 20 years out using the current approach,
maybe that isn't so far-fetched.

~~~
captainmuon
I always wondered if you could use adversarial neural networks to find images
that the brain recognizes as something else, just like you can fool NNs. You
could use this to find low-poly representations of your objects that are
indistinguishable from the real ones. Probably you'd have to calibrate this
for every user though.
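For reference, the usual way to fool a differentiable model is a gradient-sign perturbation (FGSM). A minimal self-contained sketch against a toy logistic classifier - the weights and inputs are invented, and whether anything similar transfers to a brain is pure speculation:

```python
import math

# Toy logistic classifier: p(class 1) = sigmoid(w . x + b)
w = [2.0, -1.0]
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def fgsm(x, eps):
    """Fast Gradient Sign Method: step each input dimension in the
    direction that increases the loss for the model's current decision."""
    p = predict(x)
    y = 1 if p >= 0.5 else 0
    grad = [(p - y) * wi for wi in w]          # d(cross-entropy)/dx
    sign = lambda g: (g > 0) - (g < 0)
    return [xi + eps * sign(gi) for xi, gi in zip(x, grad)]

x = [1.0, 0.0]                 # confidently class 1: sigmoid(2) ~ 0.88
x_adv = fgsm(x, eps=1.5)       # a fixed-size nudge per dimension
print(predict(x), predict(x_adv))  # the second drops below 0.5
```

Finding a low-poly scene the brain "misreads" as a rich one would need a differentiable stand-in for the viewer's perception, which is where the per-user calibration problem comes in.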

~~~
usrusr
You might need to reset the user occasionally during training, otherwise that
adversarial neural network would hunt a moving target.

------
golemotron
I suspect that both VR immersion and self-driving cars will always be "just
around the corner." We want to believe in them, but in both cases there are
obstacles that may be insurmountable or will at least force us to scale back
our hopes. In the former (VR) they are technical/biological and in the latter
(self-driving cars) they are technical/environmental.

Our visual systems are too tightly integrated with our motion systems and other
sensory systems to be handled independently, just as traffic lanes are too
exposed to other influences (unpredictable people and things) to allow
vehicular autonomy.

In both cases we want to see the thing we want to do by itself but there are
deep interdependencies.

~~~
milesokeefe
The difference is that your brain fills in the gaps when there are very minor
discrepancies between sensory systems. That's why presence exists with current
VR platforms.

------
Negative1
Not interested in mobile unless it is for autos? That is a pretty big
declaration from one of (if not the) world's biggest GPU manufacturers. Does
this spell the end of the Shield, or is that not considered mobile?

~~~
m_mueller
It seems apparent to me that the strengths of desktop chip manufacturers don't
translate well to mobile. Both Intel and Nvidia have had tremendous problems
competing against ARM / PowerVR. While Nvidia has done a better job, I
imagine they are still losing money on that front. Maybe this is the time for
them to acknowledge that and instead concentrate on markets where local data-
parallel computations are actually useful _and_ currently feasible (due to
power requirements). Maybe mobile might get there as well (real-time on-chip
image recognition), but it's still years off, I think.

~~~
CountSessine
_Both Intel and Nvidia have had tremendous problems competing against ARM /
PowerVR. While Nvidia has done a better job_

Has Nvidia had _any_ success at all in mobile? At least Intel has managed to
get some Atom chips in some obscure Dell tablets and Asus mobile phones.

I'm very curious about this - Nvidia's whole mobile business looks like a flop
and Huang's circumspect dismissal of it sounds like sour grapes to me.

~~~
m_mueller
Well, at least they have some performance benchmarks to show [1] (I wasn't
able to find a comprehensive sustained performance-per-watt comparison,
though, which would be much more interesting), and at least they sold more
than half a billion dollars' worth of Tegras in 2015. Looking at [2], that's
substantially more than Intel sold when its mobile business crashed and
burned in 2014 [3]. However, it's not as much as Intel sold in 2013, and the
last quarter of 2015 wasn't good for Nvidia either. Maybe it's fair to say
that Nvidia is just learning the same lesson with a bit of delay, but at least
they have a very strong pivot going after the car market, with rapidly
increasing compute requirements.

[1] [http://wccftech.com/nvidia-tegra-x1-benchmarks-apples-a8x/](http://wccftech.com/nvidia-tegra-x1-benchmarks-apples-a8x/)

[2] [http://marketrealist.com/2015/03/why-nvidia-continues-to-focus-on-tegra/](http://marketrealist.com/2015/03/why-nvidia-continues-to-focus-on-tegra/)

[3] [http://www.extremetech.com/computing/227816-how-intel-lost-the-mobile-market-part-2-the-rise-and-neglect-of-atom](http://www.extremetech.com/computing/227816-how-intel-lost-the-mobile-market-part-2-the-rise-and-neglect-of-atom)

------
kordless
In 20 years we might be able to implement something that works with our
visualization systems to produce images at will...at least for those of you
without Aphantasia.

~~~
manmal
This would likely be indistinguishable from schizophrenia, no? Intrusive
images and sounds - no thanks.

~~~
anchpop
They're not intrusive if you pay to have them

~~~
VLM
I pay for internet access and also have to install an adblocker.

------
hathym
I imagine that in 20 years they will invent something that plugs directly into
the optic nerve, or your brain, to simulate virtual reality.

------
Raphmedia
Can anyone paste me the list of shortcomings that VR has to solve according to
him?

~~~
milesokeefe
• VR displays are a little too cumbersome. [They have] to be much more
elegant, being connected by a wire has to be solved.

• The resolution has to be a lot higher.

• The physical worlds do not behave according to the laws of physics.

• The environment you’re in isn’t beautiful enough.

~~~
Raphmedia
That's what was in the article... I was expecting a longer list somewhere.

Those are not game-killers. If points like that were game-killers, we wouldn't
have had the Atari or the NES.

------
TrevorJ
Is there a video of this talk available?

------
clarkrinker
Just got our Vive today. Only 1/3 people threw up.

------
LoSboccacc
So, what about the need for a huge open space to appreciate virtual
environments in freedom?

~~~
milesokeefe
It shouldn't need to be /too/ huge with smart redirected-walking algorithms.
But it's still probably too big for the average person to have available to
themselves privately.

This is where I feel most people in the VR space agree that "VR arcades" will
become a lucrative industry.
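Redirected walking mostly comes down to applying gains the vestibular system can't detect: while the head turns, the virtual scene rotates slightly more or less than the head did, steering the user along a curve inside the tracked space. A toy sketch of the core idea (the gain value is illustrative; the perception literature suggests modest gains, a few tens of percent, tend to go unnoticed):

```python
def redirected_yaw(virtual_yaw, real_yaw_delta, gain=1.1):
    """Apply a rotation gain: the virtual world turns `gain` times as far
    as the head actually did. With gain > 1 the user physically under-
    rotates, so repeated turns can steer them back toward the room centre."""
    return virtual_yaw + real_yaw_delta * gain

# A real 90-degree head turn reads as 99 virtual degrees; to face a virtual
# target 90 degrees away, the user only turns about 81.8 real degrees.
v = redirected_yaw(0.0, 90.0, gain=1.1)
print(v)
```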

------
naringas
what about AR? e.g. a hololens?

~~~
mtgx
30 years. If you're talking about mapping everything at the same scale,
graphical fidelity, resolution, price, and field of view as in VR, then AR
will always be about 10 years behind VR, in my opinion. If you're going to
make it much more limited than VR (like, say, only showing a chess board in
front of you, or some text on top of things, but not much else), and make it
cost several times more, like the HoloLens does right now, then yeah, I guess
you could make AR _seem_ almost as usable as VR, but it will never be a 1:1
comparison that way.

If you're doing 1:1 comparisons, then it should be about 10 years behind VR
because you have to transpose the same world that you do in VR but over the
real world.

------
moloch
The actual quote was:

“First of all, VR displays are a little too cumbersome. It has to be much more
elegant, being connected by a wire has to be solved. The resolution has to be
a lot higher. The physical worlds do not behave according to the laws of
physics. The environment you’re in isn’t beautiful enough. We’re going to be
solving this problem for the next 20 years.”

"Solving over 20 years" / "20 years from being solved" are very different
things.

~~~
tbolt
Once I read the quote I probably had the same reaction as you. Bad journalism.

------
vegabook
Nvidia of course has a stock price incentive to pitch a very long period of
further development ahead.

------
alberthartman
Some of this may be misdirection. VR is getting tremendous attention and will
be very competitive. Look how he backed away from mobile. Complaining about
wires and resolution? They're getting new revs every 12-18 months now. Look
for hi-def wireless HMDs in 20-36 months, not 20 years. No physics? Look
again. I think he wants to avoid competitive low-margin businesses when better
choices are available.

~~~
milesokeefe
>Look for hi-def wireless HMDs in 20-36 months, not 20 years.

How do you suppose that will happen? On-device computing hardware or a
wireless tether?

The latter appears to only be maybe possible with microwave radio, but
requires constant line of sight which is a huge issue.

The former may be possible, depending on your definition of hi-def. IMO I
doubt it, though; much of what makes VR immersive is having an HMD
lightweight/comfortable enough that you forget it's on. That's hard enough
when the only hardware in the HMD is essentially screens and an IMU. I can't
imagine the added weight of CPU/GPU/batteries/etc. not making a significant
enough impact on comfort to make it not worth it, in the next 20-36 months.

For reference, my standard of hi-def VR right now, not in 20-36 months, is
Vive-level resolution/FOV/tracking accuracy and latency; I wouldn't put GearVR
in that group, or any platform without submillimeter positional tracking, for
that matter.

------
greenspot
_the piggy trough is so full of cash, cream and credit, that they ignore
credible assessment(s)_

about the VR industry from the top comment of that page

Edit: Thanks for the downvote

------
dclowd9901
Ha, 20 years. I'd guess more around 100, and graphics card companies, unless
they pivot, won't really be a part of it. If we're talking about true VR, it's
the replacement of reality, e.g. piping experience directly into your neurons.
For instance, how are we going to augment smell or touch with a headset and a
video card?

Sensory augmentation will probably start with dream-augmenting tech (headsets
that are able to give you "good" dreams by learning your neural firing
patterns); then, as delivery systems, nanotech, processing and brain knowledge
get better, we'll see it evolve into waking-experience augmentation.

~~~
bradhe
> If we're talking about true VR, it's the replacement of reality, e.g. piping
> experience directly into your neurons

Yeah that's not what anyone here is talking about.

~~~
dclowd9901
I'm sorry, what was your definition of virtual reality? I sort of read it as
"a reality that is virtually reality itself."

My point was that those problems might be on Nvidia's horizon in the next 20
years, but it's not even scratching the surface of what VR needs to be.

------
mozumder
And then you have the fundamental design problems of VR, where it blocks out
your environmental awareness, causing people to crash into walls, leading to
liability issues: [https://imgur.com/umYTJP1](https://imgur.com/umYTJP1)

The first generation of VR also faced the same liability issues; the Atari
Jaguar VR, specifically, was cancelled because of this.

VR will never make it as a consumer product.

Ever.

~~~
agildehaus
I own a Vive and have not once crashed into a wall. The software knows where
your physical boundaries are and warns you when you're too close by fading in
a virtual wall (sort of looks like a blue mesh).

Take your uninformed nonsense elsewhere.
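The chaperone behaviour described above is, at its core, a distance-to-boundary fade. A toy sketch of the idea (the function name, rectangular bounds, and fade distance are all invented for illustration, not Valve's actual API):

```python
def chaperone_alpha(pos, bounds, fade_start=0.5):
    """Opacity of the virtual 'wall' grid as the user nears a boundary.

    pos: (x, z) user position in metres.
    bounds: (xmin, xmax, zmin, zmax) of the rectangular play space.
    fade_start: distance (m) at which the grid begins to fade in.
    """
    x, z = pos
    xmin, xmax, zmin, zmax = bounds
    d = min(x - xmin, xmax - x, z - zmin, zmax - z)  # distance to nearest wall
    if d >= fade_start:
        return 0.0                         # far from the walls: grid invisible
    return 1.0 - max(d, 0.0) / fade_start  # fully opaque at (or past) the wall

print(chaperone_alpha((1.0, 1.0), (0, 2, 0, 2)))   # centre of a 2x2 m space
print(chaperone_alpha((0.1, 1.0), (0, 2, 0, 2)))   # 10 cm from a wall
```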

~~~
mozumder
I posted a video of people running into walls. And there are lots of other
videos of the same.

Are you implying it doesn't happen?

Also, how do you make something like Call of Duty when you're limited in your
movement by walls?

~~~
shinymark
> Also, how do you make something like Call of Duty when you're limited in
> your movement by walls?

You don't. Many existing game genres don't map well to VR in a direct port of
existing mechanics.

From a game developer's perspective that is one thing that makes it exciting.
It's a blue ocean of new design possibilities. I'm an optimistic person, and I
believe we'll find new designs that are as compelling as the most popular
games on 2D screens. Different games for different platforms.

~~~
sidthekid
Platforms like Virtuix and Infinadeck are interesting designs for this
problem.

