Virtual Reality Fails Its Way to Success (nytimes.com)
16 points by boh on Nov 14, 2014 | 18 comments



I think most of us have known "real" VR is inevitable ever since the idea clicked, around 1990 or so. So far it's lacked the urgency that could have made it a "moon landing"-level quest. Maybe it lacks a killer app.

I think what Palmer has done is organize a vote, where the verdict was, "Yeah, we want it now. Let us find our own killer apps." Whatever, we now have Carmack, Patron Saint of end-to-end Latency, and Samsung, kind of working together-ish in as concerted an effort as there's ever been to make VR work.

Oculus is bound to seem stone-aged even 15 years out, but it's a strong start.

One thing from the article that stood out, and that I haven't thought much about before, is the potential to use VR to achieve altered states (other than the standard immersion and nausea).


I'm starting to think Oculus is becoming a gimmick. It needs success in order to be successful. That's a tautology to an extent, but it's really that simple: it hasn't had any success yet, and success is nowhere on the horizon.

It's a cool experience, to be sure. And maybe someday it will be mainstream enough that everyone has an Oculus. But that day seems very far in the future.


Have you tried one? It is already very immersive and only going to get better. Within an hour of trying it for the first time a few weeks ago I had a fountain of ideas of possible uses for it and startups that would develop around it. I think it's going to be a whole new section of the tech economy.


How can it be a gimmick if it hasn't even launched a consumer version yet?


You misunderstand, it's a gimmick UNTIL there's a version consumers will buy.


I don't think that VR has quite the same implications as, e.g., the moon does. "Reality", to us, will always be the sum of our senses and experiences, and we're gonna be struggling against the uncanny valley (I'm betting) for the rest of our lives. At that point, it's hard to see its applications outside of entertainment.


I agree that landing on the moon and VR are not really comparable, but VR has huge applications, basically in every industry.

Telepresence for business meetings, surgery, remote repairs, military, not to mention the vast entertainment options like the sex industry, videogames, etc. It could also be a boon to people in long distance relationships or those with friends across the world who just want to hang out in a virtual living room together and shoot the shit.


I think the human mind is over-willing to get lost in dreams. Normal consciousness itself is sort of patchwork and fuzzy. I find it very easy to get lost to the point of immersion already in WoW and Minecraft on a medium-sized screen - heck, in a novel, or a daydream. So, I think our native hardware would help it work, if anything.


VR didn't take off until now because the right convergence of other technologies hadn't been reached yet. The smartphone explosion led to cheap, high-resolution displays, much like the popularity of the Wii brought the price of gyroscopic sensors down. Graphics hardware also became powerful enough to render scenes twice and reverse the fisheye effect of the cheap lenses used in the Oculus, which in my opinion was the real light-bulb moment for the project. The value and application of VR has been obvious for decades, but the previous attempts were hampered by some serious technological limitations.
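As a rough sketch of what that double-render-plus-warp amounts to: each eye's image gets warped radially so the lens distortion cancels out. The polynomial below is the standard radial-distortion model, but the coefficients and names are made up for illustration, not the Rift's actual calibration values:

    import numpy as np

    # Hypothetical radial coefficients k0..k3; real values come from
    # calibrating the specific lens.
    K = (1.0, 0.22, 0.24, 0.0)

    def prewarp(uv, center=(0.5, 0.5)):
        """Map output texture coords to the source coords to sample.

        uv: (N, 2) array of coordinates in [0, 1] for one eye's viewport.
        """
        d = uv - np.asarray(center)
        r2 = np.sum(d * d, axis=1, keepdims=True)
        scale = K[0] + K[1] * r2 + K[2] * r2**2 + K[3] * r2**3
        return np.asarray(center) + d * scale

    # Sample coords near the edge get pushed outward, so the displayed
    # image is squeezed toward the center ("barrel" distortion), which
    # the lens then re-expands.
    pts = np.array([[0.5, 0.5], [0.75, 0.5], [0.95, 0.5]])
    print(prewarp(pts))

Doing this per pixel, per eye, on top of rendering the scene twice, is where the GPU cost comes from.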


That seems like a weird statement by Cameron, if it's true. When did he say that? I would expect someone like him to still have nice things to say even if the technology were still a little off the mark (whether or not the Rift actually is).


A friend told me the other day that only "static" scenes really work, where you stay in one place, and that when the player starts to "move around," the lag causes nausea. Is this true?


Lag reduction is a big issue, and one that everyone is actively working on.

I tried out two Oculus Rift DK2 demos at SIGGRAPH this year.

The first didn't have the new head tracking, and you had to strap yourself into a walking pod to move around. That made me want to throw up after a few minutes, although I think that was mostly the exertion of walking in place in slip shoes on plastic that wasn't smooth enough, turning, and pushing against the edge of the pod.

For the second I was seated, and it did have the new head tracking on. No motion sickness, and it was a really immersive experience.

So possibly, anecdotally, that supports the idea that moving is a problem, although I'd still blame the walking pod being really uncomfortable more than the on-screen movement.


The Rift team is experimenting a lot with exactly what causes the nausea, and is specifically tackling those issues in the newer versions: http://www.polygon.com/2013/10/17/4850272/oculus-ceo-gets-si...

Women are more likely to get sick: http://www.zephoria.org/thoughts/archives/2014/04/03/is-the-...

It also varies a lot from person to person, but seems to get better with exposure.


Static meaning what? Games where the observer is stationary, like flight sims or racing sims, are more immersive and less likely to cause nausea, but lag would be the same in both experiences.


I won't speak for anyone else, but I've had a Rift dev kit for the past several months, and what I've found tends to confirm a lot of my expectations. In applications where your viewpoint matches your physical viewpoint (seated in a chair, or walking around a small area that stays in one place while you also walk in real life), nausea is pretty much not an issue, as long as there aren't other performance problems (if your hardware can't keep up with what you're asking it to render, the framerate drops and it "stutters," but that's more headache-inducing than nauseating).

Software like Elite: Dangerous or the tech demo "Sightline: The Chair" works great because your "avatar" is in the same seated position as you are. Once you try to shoehorn VR into a more traditional first-person video game setup, it gets dodgier. Since you don't have the physical range to actually walk around that far, people tend to use gamepads or the traditional WASD/mouse controls to move around. This can really throw you off, because in a game the mouse or analog stick controls your facing direction, but in VR, so does your head.

At the same time, in real life you can face your body one way and still move your head independently. In VR, it can take some practice to "get" that without feeling nausea. Any time your actual head's motions aren't 1:1 with the view on screen, it can get questionable. The closest thing I can think of is "the spins," if you've ever had too much to drink and tried to lie down: the seeming disconnect between head motion and viewpoint motion throws your balance for a loop.
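To make the head/body split concrete, here's a minimal sketch of one common way to handle it (all names hypothetical, not any particular SDK's API): the stick steers a body yaw, the HMD's head yaw stacks on top for the view, and movement is resolved against the body rather than the head:

    import math

    def step(body_yaw, stick_turn, stick_move, head_yaw, speed=0.05):
        """Advance one frame; angles in radians, stick axes in [-1, 1]."""
        body_yaw += stick_turn * 0.03      # the stick turns the body...
        view_yaw = body_yaw + head_yaw     # ...and the head adds to it 1:1
        # Forward/strafe are relative to the BODY, so looking sideways
        # doesn't change your direction of travel.
        fwd, strafe = stick_move
        dx = speed * (fwd * math.sin(body_yaw) + strafe * math.cos(body_yaw))
        dz = speed * (fwd * math.cos(body_yaw) - strafe * math.sin(body_yaw))
        return body_yaw, view_yaw, (dx, dz)

    # Head turned 90 degrees right while pushing the stick forward: the
    # avatar keeps walking along the body's facing, as a real body would.
    print(step(body_yaw=0.0, stick_turn=0.0, stick_move=(1.0, 0.0),
               head_yaw=math.pi / 2))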

Then there's the whole forward-backward-left-right thing. If you're sitting in a chair but your on-screen body is running around or jumping or falling, your inner ear is telling your brain a different story than your eyes are.

Honestly, this is why I have my issues with some projects and users that feel a traditional FPS-game experience is the holy grail of VR. I can see how it would seem that way at first glance because how much more first person can you get? Still, experiences like Elite or The Chair or even a game like Couch Knights where you're looking around at a virtual "game board" while your surroundings stay in place really do create an amazing sense of immersion and presence.

I think that while gaming will be the early focus due to the existing graphics engines and the fact that many gamers are used to buying high end computers and pricey peripherals, the real innovations will come with the projects that are just now getting started. Stuff like using depth cameras to recreate a 3d "movie" you can look around in or transmitting similar information over networks for 3d telepresence is going to be a much more mainstream and transformative development than playing the vidya games in VR.

The article touched on a point that I'm always reminded of. Those old devices like the Apple Newton or even the more popular successors like the Treos and the PocketPCs were awesome and promising but the smartphone market didn't really take off for several more years because the hardware and software needed to make them polished, fast, and ergonomic didn't exist yet. The Rift is like my old Treo but someday the tech will exist to deliver something more akin to a modern smartphone at equally (relatively) affordable prices for average people.


The "Elite: Dangerous" example is interesting. What's the difference between flying around in space and flying around a typical FPS game? I don't have personal experience with the Rift but here are some thoughts:

- The "context" of the cockpit is helpful in some way, it's an environment which "agrees" with your inner-ear sense. If the cockpit were removed from Elite, would "presence" be lost? If a cockpit of some kind were added to a typical FPS game, would that decrease the dissociated-motion effect?

- Movement is different in Elite. The environment that disagrees with your inner-ear (space and the things in it) is far away, moves more smoothly than a typical FPS, and is sparse. This probably makes it easier to accept than the fast, jerky movement inside small, constrained environments of a typical FPS.


Hey, at least the article mentions Artaud. His work has always seemed like the secret weapon for good VR work to me.


I was surprised to be nauseous within seconds of putting on an Oculus, because I don't feel disorientation watching 3D movies or riding amusement park rides. It was something more deep-seated than that, akin to motion sickness or seasickness, and I hadn't felt such an immediate onset since I was young.

So I’m thinking that the component that’s missing is not so much better hardware, but a sense of control, because right now we are along for the ride but we want to be driving. The big off-putting things for me were:

* The screen-door effect of seeing the mesh of mask area around pixels (which arguably shouldn't be present on an OLED)

* Difficulty getting the focal point and field of view right (because really this is just a cell phone and some reading glasses, see Google Cardboard)

* Incorrect or missing association between gravitational up and simulation up (inner ear confusion)

* Not seeing my own body in the scene (needs integration with Xbox Kinect)

* Possible lag because the Rift may not be using movement prediction/dead reckoning (though I didn't notice any, so maybe it is; a sketch of the idea is below)
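For what it's worth, the dead-reckoning idea in that last bullet is simple to sketch: extrapolate orientation forward by the expected pipeline latency using the gyro's angular velocity. The 20 ms latency figure and the names here are assumptions for illustration, not anything from the Rift SDK:

    def predict_yaw(yaw_deg, angular_velocity_dps, latency_s=0.020):
        """Extrapolate head yaw forward by the expected display latency."""
        return yaw_deg + angular_velocity_dps * latency_s

    # A head turning at 200 deg/s with 20 ms of pipeline latency: rendering
    # at the raw sensor yaw would trail the true pose by ~4 degrees.
    print(predict_yaw(10.0, 200.0))  # -> 14.0, the pose to render for

Constant-velocity extrapolation like this falls apart over longer horizons, which is presumably why cutting the latency itself matters more than clever prediction.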

Really these are quite minor and easy to fix compared to what's been accomplished so far. The only real expense is eventually going to be an affordable 4x HD display, because the webcams for optical tracking, the gyros, the tracking software, and the 3D drivers are all trivial or open source. Someday the controller will be a $5 chip, like USB controllers are, and work with any HDMI display.

The fact that the split-screen projection works with ordinary video is also helping all of this go viral.

I’m trying to wrap my head around how stereoscopic panoramic video recording will work, because that’s the last thing stopping VR (not hardware).

I’m thinking the best way to solve that will be to scan a moving 3D panoramic scene into a computer and use the difference between frames to infer distance to each pixel. I’m having trouble finding a link, the closest thing I could find was the 2005 ShoWest where they talked about remastering a 2D movie like Star Wars episode IV into 3D at 48 fps:

http://www.davidbordwell.net/blog/category/directors-cameron...

http://www.davidbordwell.net/blog/wp-content/uploads/3d_dire...

Some other ideas might be to use a laser rangefinder or ultrasound, light fields, or to split the image coming through the sphere with a partially reflective mirror onto two or more CCDs and have the computer infer the distance to each pixel by the correlation between the images. Anyone know of research/projects that explore recording panoramic 3D video?
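That "correlation between the images" idea is basically stereo block matching, which is easy to sketch in toy form: for each pixel in the left image, slide a small window along the same row of the right image and take the offset (disparity) with the lowest matching cost; depth then falls out as baseline * focal length / disparity. A bare-bones illustration, not production stereo:

    import numpy as np

    def disparity_map(left, right, max_disp=16, win=3):
        """Per-pixel disparity by brute-force window matching (SSD cost)."""
        h, w = left.shape
        disp = np.zeros((h, w))
        r = win // 2
        for y in range(r, h - r):
            for x in range(r + max_disp, w - r):
                patch = left[y - r:y + r + 1, x - r:x + r + 1]
                best, best_d = np.inf, 0
                for d in range(max_disp):
                    cand = right[y - r:y + r + 1, x - d - r:x - d + r + 1]
                    cost = np.sum((patch - cand) ** 2)
                    if cost < best:
                        best, best_d = cost, d
                disp[y, x] = best_d
        return disp

    # Synthetic test: shift a random image 4 pixels to fake a second camera.
    rng = np.random.default_rng(0)
    left = rng.random((32, 64))
    right = np.roll(left, -4, axis=1)  # right-camera view of the same scene
    print(np.round(disparity_map(left, right)[16, 24:40]))  # -> all 4s

Real systems add smoothing, sub-pixel refinement, and occlusion handling, and for a panoramic rig you'd presumably run this between neighboring cameras around the sphere.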



