The criticisms of tracked hand motion are spot on, at least with regard to the Leap Motion device. But even if the controller were to work well, having no feedback makes actually interacting with the environment extremely difficult.
So I hacked together a haptic glove out of an Arduino, a few vibrating motors, and a WebSocket-to-Serial-Port bridging program in a weekend. Took it to National Maker Faire and demoed it to a few hundred people. There were only a handful of skeptics, and of them, only two walked away unconvinced. Even basic haptics has a huge impact on the believability of the motion interaction.
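For the curious, the core of a setup like that is just framing per-finger intensities into bytes that the bridge shovels over the serial port. Here's a minimal sketch; the five-motor layout and sync-byte protocol are assumptions for illustration, not the actual firmware protocol:

```python
def encode_haptic_frame(intensities):
    """Pack per-finger vibration intensities (0.0-1.0) into a byte frame:
    a 0xFF sync byte followed by one byte per motor (0-254), so the sync
    value never appears in the payload and the Arduino can resynchronize."""
    if len(intensities) != 5:
        raise ValueError("expected one intensity per finger")
    payload = bytes(min(254, max(0, int(i * 254))) for i in intensities)
    return b"\xff" + payload

# Example: index fingertip touching a virtual surface at half strength.
frame = encode_haptic_frame([0.0, 0.5, 0.0, 0.0, 0.0])
```

On the Arduino side, each payload byte maps straight to a PWM duty cycle driving one motor.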
Hey there! Based on your "you guys" comment, I imagine you're actually thinking of NullspaceVR [0]. They are a team out of Rochester University developing a slew of haptic devices. I got to talk with them a bit; they're a very good group of people, and hopefully we can do some work together in the near future.
I was there with DCVR [1]. We're a social meetup group for entrepreneurs and hobbyists in the Washington DC area, centered on virtual reality. We decided to organize and put a booth together to help promote our individual projects, while also trying to grow our local community for our meetups. But as far as my own projects go, I was there on my own.
Since you sound like you're in the area, please come to our next DCVR meetup. They are a lot of fun and it's probably one of the best ways to get involved with VR from scratch right now.
Was I the only one disappointed to hear that the knife couldn't be used as a screwdriver? I would think that a game designer would be excited to allow the pocket knife to be used as a separate, emergent solution to the puzzle. Don't give the player a tool and then tell them it doesn't work. That's not immersion, it's a tease.
I liked "Fail Fast, and Follow the Fun" and the emphasis on immersion.
But this article, like Carmack's pro-VR article, convinced me that VR isn't ready. Before it was unacceptable latency for some motions; now it's motion sickness.
But these are clever guys. One day it will be ready.
You should try a current-generation VR system, or even last year's tech like the DK2. You would be surprised.
Motion sickness is now mostly an issue of badly designed software, not hardware. Developers porting content designed for 2D screens don't realize that their camera movements make people sick.
As someone who writes VR software, I think VR hardware unfairly gets blamed for bad VR software.
I cannot stress enough how true this post is. You've hit the nail on the head. So many people are building naive projects in Unity and, disappointed by the performance, are blaming VR in general.
The problem is that VR inverts the concerns for graphics developers. It used to be that fidelity was paramount, performance was secondary. Now, with VR, performance is paramount, fidelity is way, way secondary. It would be better to make PS1-era graphics that hit the high framerate, low latency requirements, and satisfy the proprioceptive issues than to try to push Unreal Engine to its limits if it means you're "only running at 30FPS".
I mean, we still accept 30FPS in desktop gaming. People are willing to run that slow if it means they can crank the latest AAA game to the "High" graphics settings. Arguing that they aren't having a good experience at that point mostly amounts to the same sort of hair-splitting audiophiles engage in. But with VR, if you can't hit the native refresh rate of the display, you're going to have a bad time.
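To put numbers on that budget gap, a back-of-the-envelope calculation:

```python
def frame_budget_ms(refresh_hz):
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

# A 90 Hz headset leaves ~11.1 ms per frame, a third of the ~33.3 ms a
# 30 FPS desktop game gets -- and VR renders the scene twice, once per
# eye, inside that smaller budget.
budget_vr = frame_budget_ms(90)       # ~11.1 ms
budget_desktop = frame_budget_ms(30)  # ~33.3 ms
```

Miss that 11 ms window even occasionally and you drop a frame the vestibular system notices, which is exactly the "bad time" above.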
My biggest problem with VR, and 3D goggles in general, is that eye accommodation is not necessary in the virtual world, with current technology. This means that the eye will get lazy if goggles are used too often.
Correct. But considering this is just first-generation consumer VR, I would expect this to improve.
For example, here's a prototype[1] using two layered panels to achieve a continuous 4D light field. This allows your eye to naturally accommodate and see the scene with proper depth of field. Just a prototype, yes, but perhaps we could see this in consumer tech in a few years.
Accommodation-vergence mismatch is definitely one of the worst parts of current generation VR, but I've seen a few promising solutions.
I have a lazy eye. If I do some eye exercises every so often, my eye isn't lazy. Watching a 3D movie or using properly calibrated VR goggles has the same effect as my exercises.
What you're talking about is the discrepancy between the eye lens focus and the stereoscopic focus. Luckily, it doesn't cause lazy eyes. Some people can naturally separate the two and everyone else gets headaches.
From what I've heard (keep in mind I have a bad memory and am not an expert), your eyes are supposed to work just as much, if not more, than with a regular screen. You focus your eyes all over the scene, while with a 2D screen you keep your eyes in the same position (center of screen, 20 inches away) for the duration of your gameplay.
And once you hit my age, your eyes have stopped accommodating for quite a while. I also wear glasses with two quite different prescriptions. I wonder how many of the VR systems will work for me.
Do you know if there's research into whether VR is harmful for your eyes? Obviously not short-term, but if you played a couple hours a day, would it be bad for your eyes? That's my biggest concern.
Okay, so I haven't published anything on this, but I have explicitly asked several optometrists. They have all said that the worst that can happen is a headache from eyestrain the first few times you use it, and that you might feel tired, but ultimately it goes away; there is no chance of permanent damage. Their view was that the muscles tire out before you can strain them to the point of injury: it just becomes uncomfortable and unworkable, not damaging.
Not much research, not many conclusions, so hard to say. There could be accommodation issues but more likely only if you're using VR >10 hours/day. And as I pointed out elsewhere in this thread, lightfield displays will eventually fix the accommodation issues.
I've heard that everyone is impressed with the lack of motion sickness with Valve & HTC's Vive headset. I think the major difference is the "low-persistence" display, which flashes each frame for a tiny amount of time instead of leaving the screen on all the time. That way the persistence-of-vision is a little more normal and it doesn't smear the image across your eyeball as much. The latest Oculus prototypes have the same feature.
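A rough way to see why persistence matters is to compute how far the scene sweeps across the retina while a frame stays lit. The head-turn speed and flash duration below are assumed, illustrative figures, not measured specs of either headset:

```python
def smear_degrees(head_velocity_dps, persistence_ms):
    """Angular smear: how far the head (and thus the virtual scene)
    rotates, in degrees, while a single frame remains illuminated."""
    return head_velocity_dps * persistence_ms / 1000.0

# A brisk 200 deg/s head turn: full persistence holds the frame for the
# whole ~11.1 ms at 90 Hz, while a low-persistence display flashes it
# for roughly 2 ms.
full = smear_degrees(200, 11.1)  # ~2.2 degrees of smear per frame
low = smear_degrees(200, 2.0)    # ~0.4 degrees
```

At typical headset pixel densities, that's the difference between a smear tens of pixels wide and one only a few pixels wide, which tracks with how much sharper the low-persistence panels look during head motion.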