The problem with this response is that it's mostly right, but it misses enough that it ends up being misguided, and for a reason the author never states.
Like the author, I worked with HMD systems while in the Air Force, specifically the F-35's integrated helmet-mounted display avionics. He is correct about the current lack of accommodation (depth perception) and about the vestibular mismatch for movement. Other posters discuss vestibular stimulation without realizing that such stimulation actually induces physical motion. The biological systems are too tightly connected to simulate physical movement in the brain without a physical reaction.
What he misses, and it's critical, is that certain virtual retinal displays (VRDs) can solve accommodation and render objects at multiple focal depths. This is the major thing Magic Leap has going for it; it's hard, but it's not impossible, and there are multiple ways to do it. We have our own multi-depth accommodation VRD patent filing.
There is no real solution to the movement problem, though... except to actually work with virtual objects in the real world, a.k.a. augmented reality.
So the actual solution to his problem doesn't lie in VR; it lies in AR.
Until we have BCIs good enough to work around the vestibular and ocular pathways, we aren't going to see Matrix-level presence in non-AR displays.
Re: depth displays, there are also light field displays, currently being demoed and, per manufacturer blather, hitting smartphones by 2018 (so probably 2020). That's not long, and they can work for the purpose.
Vestibular mismatch remains an issue, but for seated sims there are hydraulics, and for walking-around sims there is no mismatch. There's a crazy amount of experimentation going on right now with locomotion systems. One example: you walk across your space, turn 180 degrees, and the world stays fixed in your view, but visible dust in the virtual air becomes the main thing you notice rather than the faded-out world, and on you go.
For running/terrain, we just need backpack PCs and larger tracked areas. I already know of a startup looking to breathe new life into old indoor laser tag installations.
As to his anecdote: I stuck dozens of people in my DK2 and many were nauseous, myself included. On the Vive, nobody, and I've let far more people try it as a result of the overwhelmingly positive reaction; I had to pry my sixty-something mother out of it.
Re: post-simulation effects, I concur that's a problematic one. I can see very bad shit happening and it being ostensibly due to post-simulation experience distortion. I can vouch that it's very real, but also that you do find your legs. After my first few sessions in the Vive (never had it with the DK2, not immersive enough), I'd come out seeing things as weirdly unreal, too flat, and generally feeling a bit "spaced". I don't get that any more, and my exposure hasn't been that heavy: maybe 50 hours over the last three weeks or so.
>As to his anecdote - I stuck dozens of people in my dk2 and many were nauseous, myself included.
It's all about the software. I've been developing DK2-based experiences for trade shows and have seen thousands of people try VR for the first time. Perhaps 1 in 100 suffer from simulation sickness, but it depends heavily on the application. For example, one of my first prototypes made everyone who tried it very sick, myself included (this was due to a single camera movement mid-experience: flying over a rail and then spiralling up a staircase).
Unfortunately, beyond the scarce info on the subject of sim sickness, my experience has been derived entirely from trial and error.
I'd like to see more specific information shared on the subject of simulation sickness.
I felt somewhat sick watching a Let's Play of the Half-Life 2 bridge level. No VR involved there; that problem is simply the accurate depiction of incredibly unsafe movement through an incredibly unsafe environment.
It makes for an interesting conflict with the desire to provide an "extreme" experience, I think.
Well put! In my line of work it's a conflict between producing something that impresses the majority while preventing the minority from vomiting on the gear (touch wood, it hasn't happened yet).
Sickness and vertigo aren't always accompanied by vomiting. I encourage you to seek out another metric that respects the delicacy of the headspace of even those with iron stomachs. (I personally get headaches but no nausea.)
The Rift once made me cross-eyed for several hours. I had to sleep to get rid of the effect. I haven't heard about this since then, so I'm wondering if you've seen or heard of it.
It happened when I attempted to use the Rift with Portal, which at the time had very primitive Rift support. The game had a calibration mechanism which adjusts the field of view for each eye independently, as well as the "left-rightness" of objects as they recede into the distance.
I fiddled with the calibration for a little while, but everything looked pretty muddled. Suddenly it all came into focus, mostly, and I played the game for an hour or two. Which is to say, I stumbled around in this weirdly-calibrated world.
When I took off my Rift, I was astonished to find that I was cross-eyed, and couldn't uncross them. I tried closing my eyes, focusing in the distance and then back, etc, to no avail. After a few hours I gave up and went to sleep.
I woke up fine. It was one of the strangest experiences I've ever had, and I was wondering if anyone's experienced a similar effect.
EDIT: Interesting. Apparently the answer is "No, no one else has experienced this."
I should clarify some details. This was several years ago, back when Rift support was in its infancy. The only thing I remember is that there was some sort of program designed to inject Rift support into games. Back then, Mirror's Edge, Portal, etc didn't have Rift support, but using this program, you could hack it to work.
I think what happened is this: Point at the ceiling, then bring your finger close to your nose. Keep focusing on your finger. There's a distance where you have to cross your eyes in order to continue focusing on your finger. Bring it a little closer.
I'm pretty sure that's how I had the calibration set up. And I played like that for a couple hours, which caused my eyes to adapt to that situation. Hence, cross-eyed vision.
I've played many other games with no problems, so it was definitely a product of the bizarro calibration. It'd probably be a bad idea to try to replicate this, but there were no lasting effects.
If you can afford it, get checked out by a neurologist or an ophthalmologist; double vision brought on by eye fatigue can be the first sign of several different larger issues, most of which will benefit from being diagnosed early.
To be clear for anyone else that may read this, any sudden visual impairment warrants an immediate visit to the ER. Even though many problems will resolve themselves, there are plenty of others that can cause permanent vision loss if they aren't treated properly.
I'm not a doctor, but I've had similar symptoms unrelated to VR. After several tests (including an MRI), I was advised to wait and see if the problem recurs. Given how long you've been asymptomatic, just wait and see seems like a reasonable course[1]. On the other hand, my symptoms did come back after a while, and it sounds like that VR session put more strain on your eyes than I've ever subjected myself to.
[1] I am not a doctor; when in doubt, talk to anyone w/ a medical license
That's pretty scary. The extent of my problems with the Rift has been motion sickness (it's worse in some games than others). Overall, because of all the cables + headphones + motion sickness + driver issues + finding my mouse and keyboard without knocking anything over, I never really got to play my Oculus for more than 45 minutes before giving up. It's been in my closet for months. I bought both dev kits but am holding off for a while on the retail versions.
I own the DK1/2 and slowly built up a resistance to VR over time. I have been thinking about The Jerk with regards to VR for the last 2 years, glad to see I'm not alone.
That clip is hilariously relevant. I already have issues with my eyes due to screens, so I will heed the advice in this thread and avoid VR until it has matured.
Playing with HMDs and VR in the 90s, I came across a book that described the effect as a "virtual hangover", with effects that could last up to 72 hours, which was why Air Force pilots using them in training were grounded for that period.
It's not an uncommon effect for any close focusing to distort vision temporarily, especially if your eyes are tired. I visited an ophthalmologist last year because of persistent ghosting at night, and it turned out to be from reading while lying on my side in bed, causing one eye to partially cross but not fully uncross until morning.
The mixed reality continuum doesn't require different displays to provide different types of systems, so from a hardware implementation standpoint the distinction between AR and VR eventually goes away. If you build a VRD for AR, you are 90% of the way towards a combined system that could also work for VR. The trick then is figuring out how to "render black", arguably the hardest challenge for a combined system or even just an HMD-based AR system.
So yes, it's an issue with current systems, but it won't always be the case.
In terms of locomotion, I agree that you can't get there (at least for systems with 6DOF) through ocular-input-only systems without hitting the vergence problem. The caveat is if you can build a fast enough photon-to-cortex system.
> The trick then is figuring out how to "render black", arguably the hardest challenge for a combined system or even just an HMD-based AR system.
Why is that necessary? I bet I could have tons of fun with wireframe based augmented reality games with zero opacity in them. That would also be a good way to deliver heads-up real-time data.
The military/defense guys probably played around with this before. I'd love to hear their take on wireframe AR.
>Nvidia’s CEO reckons the challenges holding VR back aren’t going to be solved for 20 years, delivering a surprising dose of realism to a VR industry creaking under the weight of hype.
Thanks for your informed comment. I was going to say something similar about the Magic Leap, except from a background with less knowledge.
Last month's Wired article on Magic Leap says that they've got the magical depth-focus problem solved. Unfortunately it's frustratingly short on details. So I don't think we'll know for sure until it's actually released, but if this first generation of Vive/Oculus doesn't solve the issue, whatever generation Magic Leap ends up shipping in might.
All the articles on Magic Leap are short on details. All their YouTube clips have had some questionable element. All we have are breathless investors, a couple of reporters, and the due diligence we assume was behind the $500+ million they received.
Until they show something to the public, I'll give them zero credence for having really advanced the state of the art.
I won't talk any more about ML specifics but I would say that you should expect VRD to be the standard within 10 years regardless of what ML does or doesn't release.
No, actually I think he's just wrong about focal lengths. I have a Rift, and while moving viewpoints can be an issue, accommodation isn't a problem at all.
You're not understanding accommodation well enough.
The short version is that an OLED/LCD behind fixed optics puts everything at a single focal distance (effectively "infinite", or wherever the optics place it) and cannot do otherwise. So by default everything in the display is rendered at the same focal point. You can do some tricks like blurring elements, but these don't respond to the curvature of your eye's lens or how your eye muscles are working, so if you look at something that is software-blurred it will still look blurry. Even with eye tracking it's still really a hack and isn't naturally responsive: it's the software reacting to your gaze.
Display-based accommodation, however, renders objects at different focal points. So when your eye changes gaze point and your ocular muscles refocus to that depth, the objects come into focus. It's the reverse of the hack.
It sounds like a trivial difference, but it's completely different: significantly more convincing, and logistically much more complex.
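To make the "hack" concrete, here's a minimal Python sketch of gaze-contingent software blur; the function, numbers, and tuning are illustrative assumptions, not anyone's shipping implementation. Nothing in it moves the optical focal plane, which is exactly why it can't respond to your eye's real accommodation.

```python
# Hedged sketch of gaze-contingent software blur (the "hack" above),
# not of display-based accommodation. Names and numbers are illustrative.

def blur_radius_px(pixel_depth_m: float, gaze_depth_m: float,
                   strength: float = 4.0, max_radius: float = 12.0) -> float:
    """Blur a pixel more the further its virtual depth is from the depth
    of whatever the eye tracker says we are looking at."""
    # Work in diopters (1/metres), which is roughly how defocus scales.
    defocus = abs(1.0 / pixel_depth_m - 1.0 / gaze_depth_m)
    return min(max_radius, strength * defocus)

# Looking at an object 0.5 m away: the 5 m background gets blurred, but
# only because software decided so. The panel's optical focus never
# moves, so the blur cannot track the eye's real accommodation.
print(blur_radius_px(pixel_depth_m=0.5, gaze_depth_m=0.5))  # 0.0 (sharp)
print(blur_radius_px(pixel_depth_m=5.0, gaze_depth_m=0.5))  # ~7.2 (blurred)
```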
Thank you for providing technical information to a thread that is thriving with far too much anecdotal data. I'm careful not to call it noise, because in an early technology like VR it is good to have all this discussion and sharing of data, but I really appreciate the underlying science more than the anecdotal checkmarks. Anyhow, thanks.
I second that. I've got one (kickstarter) and the only thing that triggers the nausea for me is running perpendicular to the direction I'm looking. One of the games I've downloaded has letters and signs you have to lean in really close to read, and I've found no discomfort in that. Maybe it affects other people, though? For me, I'm a little myopic but I take my glasses off to wear the headset so my focusing muscles are a little bit frazzled anyway.
"There is no real solution to the movement problem though"
There is no absolute need to let the player experience real acceleration in unrealistic games; you can't differentiate between constant motion and being stationary anyway. What I'm getting at is that in unrealistic games you can make things move at only 5 km/h, or not at all. Not realistic, but possibly enough that fewer players get motion sick. Not a perfect solution either, but game developers have done amazing things within the limitations of 2D screens, and some people still get motion sick from shooters played on 2D screens. Not polished enough for true mainstream use, but polished enough for a very big market.
Alternatively, you could make the player accelerate in steps of 5 km/h:
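A minimal sketch of that idea (Python, with a made-up step size and cap; the speed change is applied within a single frame so the visible acceleration lasts as briefly as possible):

```python
# Minimal sketch of stepped ("snap") acceleration for comfort; step size,
# cap, and the toy loop below are illustrative assumptions. The point is
# that constant velocity causes no vestibular conflict, only acceleration
# does, so the speed change happens in one frame instead of a slow ramp.

STEP_KMH = 5.0
MAX_KMH = 20.0

def stepped_speed(current_kmh: float, throttle: int) -> float:
    """Snap speed up or down by one step: throttle is +1, -1 or 0."""
    return max(0.0, min(MAX_KMH, current_kmh + throttle * STEP_KMH))

speed = 0.0
for throttle in (+1, +1, 0, -1):        # player taps "faster" twice, then "slower"
    speed = stepped_speed(speed, throttle)
    print(f"speed = {speed:.0f} km/h")  # 5, 10, 10, 5
```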
He's certainly right that in AR (where you literally don't mess with a lot of the scenery, sometimes most of it, because you use a transparent visor), these problems are often less severe.
It's less of a "sufficiently smart compiler" argument and more of an "if you only touch 5% of the scene, the other 95% is just looking through glasses, and people do that just fine" argument.
Couldn't we fix the vestibular mismatch with a tilting chair? Isn't the problem more that the direction of acceleration is incorrect than its magnitude?
Probably not. Something might work for heavily coupled racing simulators where biologically similar locomotion (walking, running, jumping) is not a consideration. You would however need a very large space for this. I have seen one of these that was set up in a gym with a pulley/cable system and motors but it's not really practical for anything outside of very specific simulations.
A consumer focused solution really isn't feasible.
But then AR's going to have a similar motion problem when you're up close to synthetic imagery instead of only having a little bit of it as an overlay.
I always get a headache when using an iPad in the back seat of a car, but not on a train. I always wondered why, because it doesn't happen with my phone and never used to happen with my Game Boy as a kid. Is this the phenomenon you are referring to?
They're comparing apples to oranges. The $80,000 military VR headsets of 1989 absolutely did not use OLED or OLED light field displays. Panels with today's pixel density and refresh rate have only recently been available from manufacturers at any price (OLED has always had amazing response time, but refresh rates lagged until recently).
Even the most expensive research OLED panels with high enough pixel densities for VR have only existed since the early-mid '00s. The studies he linked to are from 1989.
The technology already exists (you need a pair of expensive MicroOLED displays and some special lenses), it's just not available at a consumer price point.
> Momentum
For someone involved with military devices, it's surprising that this guy has never heard of galvanic vestibular stimulation. GVS will never be easy and will always require very, very extensive calibration per-user, but it's still possible to use for simulating perceived movement.
This isn't that important for me personally, as I've never experienced VR sickness even when playing fast-paced high-momentum VR games for multiple hours at a time. I could see people who get VR sickness wanting this, as GVS would help prevent it from happening.
I think you're missing the bigger point that he's trying to make - our sensory systems work in concert to provide a full picture of reality and they're very sensitive and extremely well tuned together through a lifetime of experiences.
Just providing an accurate projection of the visual field to the eyes doesn't solve the problem completely. Visual perception is not a passive mechanism; our eyes 'explore' the visual field through quick movements and readjustments, and this process inherently locks the proprioceptive signals from the ciliary and ocular muscles to the stream of information from the retina. By only providing screens, we can't simulate this process in a natural way, and our body will detect it.
Similarly - our vestibular sense is strongly tied to our experiences of our body moving through the world. Just using a blanket stimulation of electrodes on the skin will not recreate these signals in anywhere near a convincing enough manner to fool the brain's perceptual processing.
All of our sensory perceptions are strongly linked together; if just one is perturbed in an unconvincing way (try wearing glasses that aren't prescribed to you for a while), our brain will do its "caveman brain" thing and make us sick.
On the other hand the brain is insanely adaptable. People made to wear glasses that flipped their vision upside down adjusted fine, and when they took them off, everything looked upside down.
Even your example is wrong. You can in fact adjust to glasses of the wrong prescription. People who need glasses have deformed eyes which is basically like having the wrong prescription. The brain adjusts, and people can go without glasses without vomiting everywhere.
The nausea reaction is an issue, but it doesn't affect the entire population. I have friends who can't play video games because of motion sickness, but no one argues that video games can't be commercially successful. And various tricks like removing head bob, adjusting the field of view, and adding a virtual "nose", have been shown to significantly alleviate that.
> our eyes 'explore' the visual field through quick movements and readjustments and this process inherently locks together the proprioceptive signals from the ciliary and ocular muscles to the stream of information from the retina. By only providing screens, we can't simulate this process in a natural way and our body will detect it.
That's the point of the light field displays GP mentioned. They also encode depth, allowing the eyes to refocus at the correct depth corresponding to the object's virtual position.
Put differently, an LFD reconstructs photon paths, as if you were looking through a window facing the recorded object.
The remaining engineering problem is that those microlens arrays step down the effective resolution of the display, so you need much higher DPI to get the same sharpness a non-LFD would give you.
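To put a rough number on that trade-off, a back-of-the-envelope sketch; the panel resolution and lenslet span below are assumptions for illustration, not specs of any real device.

```python
# Back-of-the-envelope sketch of the resolution cost of a microlens
# light field display. All numbers here are illustrative assumptions.

panel_px = (2160, 1200)      # assumed underlying panel resolution
lenslet_span_px = 5          # assumed pixels under each microlens, per axis

# Each microlens trades a block of pixels for angular (depth) samples,
# so spatial resolution drops by the lenslet span in each axis.
effective_px = (panel_px[0] // lenslet_span_px, panel_px[1] // lenslet_span_px)
angular_samples = lenslet_span_px ** 2

print(f"effective spatial resolution: {effective_px[0]}x{effective_px[1]}")
print(f"angular samples per lenslet:  {angular_samples}")
# -> 432x240 spatial with 25 angular samples: hence the push for much
#    denser panels before LFDs match today's non-LFD sharpness.
```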
The important costs for the current wave of VR, though, are hundreds of millions of dollars for OLED display manufacturing lines, plus many millions of dollars of engineer time from Nvidia / AMD / HTC etc. for custom drivers, low-persistence hardware, and so on. It's very, very likely that previous $80,000 or even $200,000 systems are not up to par with current mass-market hardware on the key metrics relevant to motion sickness, or even that the military bidding process specified those metrics at all, considering their importance was only discovered recently through user testing on hardware that wasn't previously available.
One other interesting thing to note (at least on the HTC Vive) is that the majority of the content for the platform is focused on standing/walking, with motion in real life matching 1:1 with motion in VR. That both removes a major source of motion sickness and rules out entire categories of software like flight simulators, which seems to be his entire work history.
Computer software and hardware are an interesting case where often, the more expensive something is, the worse it is; the best stuff is often quite cheap. This comes from the high upfront costs meeting very low marginal costs. A cutting-edge chip fab will cost $10b+ all told right now, but can produce millions of chips a year at trivial costs.
I knew as soon as he mentioned military VR results that his case was going to be based on ancient and inferior hardware and outdated results, and include no actual results or even first-hand experience with a Vive. (At the most generous reading of the article, he worked with a DK1 a while ago.)
1-to-1 motion certainly seems to eliminate motion sickness. I can't remember a single instance of getting nauseous with a room-scale Vive experience and I've been in that headset for numerous hours.
I've tried a couple seated flight-sims. You simply can't see distant objects as well as you'd expect due to the low resolution of the panels, which is far more important in flight sims than with most other things. I can't say I got sick, but it didn't feel real to me, and I can see how others might get sick.
Whoa! I have been kind of watching the whole VR thing from the sidelines, and I didn't expect this to be anything somebody was even thinking about.
Thanks for the hint! Are there any specific projects that you think are worth mentioning here?
There are consumer-oriented research GVS headsets currently under development; Samsung had one at CES this year. From what I've heard it's not hugely accurate, but the general sensation should be enough.
Light field displays have not solved anything. They introduced depth at the expense of resolution, and by a huge factor. Even regular VR headsets have huge pixels; the light field ones are so low resolution you will wish your 320x240 VGA was back.
If current progress with MicroOLED panels continues, we should expect 5000x5000 displays that are small enough for use in a glasses form factor with light field displays. This will significantly help with the reduced effective resolution.
The next problem will be figuring out how to render 10000x5000 at 120 Hz in real time on consumer GPUs. It will definitely require insanely beefy rigs to accomplish.
10000x5000 displays will take decades. That's a factor of 6 compared to 4K. We don't even have graphics cards that can play modern games at 4K / 120 Hz; we just got there for 1080p.
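For what it's worth, the raw pixel-rate arithmetic behind that factor of 6 (the display sizes come from this thread; the rest is just multiplication):

```python
# Rough pixel-throughput arithmetic. Frame rate and the 10000x5000 target
# are taken from the thread; everything else is simple multiplication.

def megapixels_per_second(width, height, hz):
    return width * height * hz / 1e6

print(megapixels_per_second(1920, 1080, 120))   # ~249 MP/s  (1080p @ 120 Hz)
print(megapixels_per_second(3840, 2160, 120))   # ~995 MP/s  (4K @ 120 Hz)
print(megapixels_per_second(10000, 5000, 120))  # 6000 MP/s  (hypothetical HMD)
# 10000x5000 is ~6x the pixels of 4K per frame, and ~24x 1080p.
```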
I'm talking about modern games, meaning 2016, not 2011. Maybe you can play some lame (graphics wise) game at 4K 120Hz, but not the ones that are actually huge.
You don't necessarily need to render everything at full resolution, just the area the eyes are focused on, which you can detect with good enough eye tracking.
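A toy sketch of that foveated-rendering argument; the fovea size and periphery scale are invented numbers, but they show why eye tracking changes the pixel budget so dramatically:

```python
# Minimal sketch of the foveated-rendering idea: render a small
# high-resolution region around the tracked gaze point and the rest of
# the frame at reduced resolution. Region sizes and scale factors are
# illustrative assumptions, not values from any shipping SDK.

FULL_RES = (10000, 5000)       # hypothetical per-eye target from the thread
FOVEA_FRACTION = 0.15          # assumed fraction of each axis rendered sharp
PERIPHERY_SCALE = 0.25         # assumed resolution scale for the periphery

def foveated_pixel_cost(full_res, fovea_fraction, periphery_scale):
    w, h = full_res
    fovea_px = (w * fovea_fraction) * (h * fovea_fraction)
    periphery_px = (w * periphery_scale) * (h * periphery_scale)
    return fovea_px + periphery_px

full_cost = FULL_RES[0] * FULL_RES[1]
fov_cost = foveated_pixel_cost(FULL_RES, FOVEA_FRACTION, PERIPHERY_SCALE)
print(f"naive:    {full_cost / 1e6:.1f} MP per frame")
print(f"foveated: {fov_cost / 1e6:.1f} MP per frame "
      f"(~{full_cost / fov_cost:.0f}x fewer pixels shaded)")
```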
The proof is in the pudding. I'm kind of amazed that the first response to this article is not either: "Correct. We tried Vive/Oculus on 1,000 subjects and 90% were sick within ten minutes," or "Incorrect. We tried it on 1,000 subjects and 10% were sick after 2 hours." Or some other clear test from real players. How can the viability of VR be this speculative this deep into launching to real customers?
I've personally demoed the Gear VR (via the Iceland flyby demo) to about 80 people.
Only two experienced significant nausea. Two or three more reported feeling a little queasy. The rest didn't have any nausea-related problems.
A few people reported vertigo, one desperately gripping the chair they were sitting on when they looked down. One literally threw the headset off when some VR dolphins in another demo got too close. These are unusual reactions.
I'd say about 90% of the people don't have any problems at all. The consumer Rift should be even more acceptable.
Still, these devices are all in their infancy. Expect devices a few generations in to be far, far better than the ones that mostly target early-adopters today.
I expect the main challenge to overcome for most users will be comfort. Few people will want to wear these things for long, once the novelty wears off. A lot of people don't even like wearing regular glasses. Wearing these bulky, hot, uncomfortable headsets for extended periods of time will not be worth the trouble for most people, once they get over the gee-whiz factor.
Both major VR headset manufacturers have measurements like that, including data on specific optical features which affect the rates of simulator sickness. But I don't think they're sharing any data that's that raw.
Anecdotally, I have a Rift, and do not get sim sickness except in rare circumstances involving at least an accelerating viewpoint, and even then not severely and only briefly.
Anecdotally, a guy brought his Vive into the office and a half-dozen people including myself tried it out. I mostly tried Into the Blue for about 15 minutes, watching the jellyfish. It's basically a static perspective and I did not feel any kind of discomfort. The very strange thing was that I felt uncomfortable for about 20 minutes after I took it off. Another person had more or less the same reaction and was ill enough to actually have to go to the restroom to throw up afterward.
Motion sickness tolerance varies a lot of course, I wouldn't hazard a guess as to the percentage who would have this problem with VR, but unlike say, watching a color television, the percentage is not insignificant.
No joke. Some people get sick after playing or watching games on a regular 2D monitor.
As for your discomfort after taking the headset off, this is mostly because current VR tech strains the eyes a bit due to everything always being in perfect focus.
Here's an optometrist having his first experience with VR and talking about what he thinks:
> Some people get sick after playing or watching games on a regular 2D monitor.
Yes. It's highly dependent on the game, too; certain games seem very prone to issues. For example, The Talos Principle, a first-person puzzle game, causes nausea in enough players -- including me! -- that there's a whole section in the configuration for "motion sickness options", like changing the field of view or disabling head bobbing.
(FWIW, I was able to play through the game comfortably after switching it to a third-person viewpoint.)
To be clear this isn't specifically a VR game I'm talking about; it's a (somewhat) mainstream FPS-style game on the PC. Head bob isn't uncommon in these games, but it doesn't usually bother people.
I know, right? The skepticism people are putting up toward something that's right there in front of them in the world right now, the amount of back-and-forth that's being treated as justified, is kind of mind-blowing. (I've actually found that seeing the confidence people have in their self-assured dismissals of VR's feasibility has altered the way I perceive the attitudes I encounter in other fields, particularly politics.)
Since (nearly) everybody in this thread seems to be under the impression that it's impossible that the vast majority of people who've used the Vive are actually telling the truth about the lack of nausea it incurs, here's one of several weekly videos put out by the developers of Fantastic Contraption, showing people playing their game unimpeded for two and a half hours: https://www.twitch.tv/colinnorthway/v/61979623
In hours and hours of real usage, the only thing you ever see anybody in these videos complain about with the headset is sweat.
The point is that the same VR that doesn't work for a bunch of situations because of nausea can work very well for content created with those limitations in mind.
All entertainment works within some constraints of the medium, and things that are great on one medium often are impossible or just bad on other medium. An expert saying "VR won't work for vehicle/racing sims because of acceleration/vestibular issues" is perfectly compatible with VR working great for real customers - if you look at the provided content, then they manage to provide quite a lot of fun and variety while simply avoiding any vehicle/racing sims, probably because of the same issues.
Yes! This is the kind of research I want to see. And I know that this kind of investigation has been done in private by Oculus and others, and they have honestly admitted in public that simulator sickness and nausea are issues that need to be worked on. We haven't, however, seen any data in public.
Some of that is related to the article talking about differing nausea when synthetic imagery was different distances away -- it depends a lot on what you're looking at.
One game that seems to work for me in VR is EVE: Valkyrie;
An arcade-style multiplayer dogfighting game in zero-g space.
As you are in space, the lack of an 'up' means you never feel like you are facing the 'wrong' direction.
It took some getting used to doing barrel rolls or adjusting yaw and pitch too quickly. As I improved, the sickness from more complicated manoeuvres subsided.
I think that attitudes that VR is somehow dangerous and impossible to do without sickness are foolish and ignore historical precedent. Many things throughout human development have felt 'unnatural' and required adjustment and training: driving, flying, skydiving, moving pictures; yet humanity has adjusted to these in short order.
I think this is a very valid point, coming myself from Elite Dangerous and Star Citizen. The very first thing I noticed was that I could make the choice that there is no up and down, so I shouldn't keep trying to right myself, and it completely changed my dizziness and perception. Whereas certain people can't even watch space flight on a monitor.
I've got a vive and at this point I've probably spent around 40 hours in it, including a few multi-hour sessions. Before that I played with an early Rift dev kit for several days.
The biggest issue is when the world moves without you - Windlands is a big culprit here. In general standing and using controller locomotion so far has been a recipe for nausea.
But outside of that, so far everyone who has used my unit (and that's about 20 some people currently) has had a blast. No cases of simulator sickness (and I've only seen two reports of that in the community as a whole so far) and no nausea other than an initial misattempt with Windlands.
Sorry, VR is probably here to stay, though whether it goes mainstream will be about whether or not the price can drop quickly.
Yeah this military guy can't see past flight simulators and seated experiences in vehicles. No one gets sick as long as your movement in VR is one-to-one with the real world.
Just like when mobile apps came out, existing games didn't work on the new platform. No one wanted to play an FPS on mobile. VR is a new platform and developers have to adjust for what works on it. Games like SPT, Holopoint and Zenblade get it right. They're all simple games inside a room, but they all create amazing experiences. I've been using my Vive almost every night for weeks and this is definitely here to stay.
I've never had any nausea issues with "room-scale" games where you move around by physically walking and maybe using an instant teleporter wand (https://www.youtube.com/watch?v=Q7dVaembmgc). Everything OP mentions about momentum is a total non-issue in games like this. Can't say I've had any particular problems with judging distances of objects either.
I've tried a few different games with controller locomotion (push the joystick forward to move your avatar forward, etc.) and I definitely had some trouble. Part of the trouble was constantly mixing up whether I should use the controller or real movements to move in-game, but there was a bit of nausea at times too.
>The Google Cardboard VR folks seem to know this and basically suggest that applications make it seem like we’re basically standing still and just turning our heads around (which we really are - so no nausea). That’ll work - but it’s not going to get you an immersive 1st person shooter game…or a car racing game…or…well, pretty much anything immersive.
Sure, not all currently-existing game genres will translate well to VR, but that doesn't mean that there's nothing VR is good for.
I've experienced quite a bit of nausea from goggle-only VR products like Gear VR. Even the menu was enough to get me sick: moving my head forward without the menu changing distance with me seemed to make my body think I should feel movement but didn't, or something like that. The nausea in these situations was quite bad and lasted for 30-60 minutes or more after taking off the goggles.
But I have to say that the experience was quite a bit different with the room scale VR like the HTC Vive. The only time I experienced any nausea in there was in Hover Junkers. The game has you riding a vehicle with frequent changes in direction at high speeds. This again seemed to be related to the perception of motion without corresponding sensations.
But the teleportation used in games like Budget Cuts didn't affect me at all. (There were some disturbing bouts of claustrophobia when I teleported too close to a wall, but that didn't induce any nausea.)
At the time that I tried the Vive, I spent about 20 hours total playing with it, spread across probably 4 sessions. I think my longest duration without a break was 6 hours. It was exhausting but not nauseating. The others with me had similar experiences. In fact, one of them normally gets so nauseated by things like this that they have prescription medication for nausea for use while riding in the car -- and they didn't experience any nausea at all.
This is all just empirically incorrect. VR headsets are already in the hands of consumers, and not causing simulator sickness in the circumstances described. Developers have now had lots of time to experiment and find out what does and doesn't cause it. Accelerating the viewpoint? Yes, that'll affect some people. Placing objects close to the viewpoint? That's completely fine, and the author's idea that this will somehow mess up people's depth perception is flat-out wrong.
And yet a few comments above yours are people saying that Oculus has publicly admitted that there are sickness problems, and anecdotes of a fairly high proportion of people who do get sick. How can there be such wildly different opinions about something that should be factual: real evidence of how current VR rigs affect people playing them for substantial periods with currently available games?
The idea is that you can fairly easily, with current hardware, make experiences that don't induce sickness. It's mostly about ensuring the framerate stays high and never moving the viewport for the user.
Unfortunately Oculus and HTC/Valve can't stop developers from doing the obvious things that make everyone sick. Dumb developers can and will.
Some people get sick simply looking at motion on a 2D monitor. So there will always be people getting sick in these experiences.
I can't really take this guy too seriously when the bolded "kicker" to his story is a set of citations to studies from 1989. Equipment and research from that time period had latency in the hundreds of milliseconds. (For context, modern studies (2009) show that 5 ms latencies are imperceptible in HMDs [1].)
I'd be interested to hear what vintage and what specs those $80K headsets really were. Knowing the specs of older $50-100K research HMDs, I have extreme doubts that resolution, latency, or tracking accuracy match today's consumer headsets even (most of the tech stemming from the billions and billions of dollars of investment spent on smartphone development from the past decade).
When talking about depth perception, hyperbole like "THAT problem can't be fixed by any known technological means" (in reference to focal/depth plane projection) is completely inaccurate, and contributes to the feeling of crotchety-old-man ranting. There are multiple VRD and light field solutions being worked on to address the multi/vari-focus issue (my current favorite approach is the micro-lensed near-eye light field displays, although I suspect that barrier-based LFDs will be what gets adopted first).
Vection is a big issue, but there's a huge amount of 1:1 motion experiences available, and based on personal/anecdotal experience, most people have no problem spending 15-30m in them w/o having strong adverse effects. Thousands of people have done the 10m+ the Oculus Connect and Vive Experience demos w/ very few issues. Based on that, I believe that for experiences w/o artificial vection, comfort is primarily a software problem at this point, not hardware.
Personally, I'm pretty bullish on all the locomotion experiments going on (teleportation, re-orientation, redirection). I'm a bit less convinced on vehicle/cockpit vection w/o linked motion control, but the price is certainly not an issue for high-end military simulators and lots of enthusiasts are making DIY 6DOF simulators.
Well, he's obviously wrong, because we have it now and it does work. I've tried the Vive and Rift, and while some things made me feel a bit sick, it was mostly when the camera moves without your direct control (e.g. in Lucky's Tale).
His section about momentum is stupid - the Vive especially allows 1:1 mapping from VR to real life so that isn't an issue.
Also, this comment is probably the most stupid thing in there:
> Sadly, the $80,000 googles we made for the US military had less latency, higher resolution displays, and more accurate head tracking than any of the current round of civilian VR goggles…and they definitely made people sick - so this seems unlikely.
Call me a skeptic but as if they had lower latency and higher resolution. $80k is a tiny budget for the military, and consumer displays and graphics hardware are the state-of-the-art, not military ones.
He's wrong because he's worked with the tech for decades, but you tried it and didn't get sick? The friendly article notes that only about half of users get sick within a few minutes, although most of the rest do as well within an hour.
1:1 mapping from VR to real life is fine if you're walking around a small room (and not puking all over the place). However, it's not fine if you want to do anything that involves a more interesting range of motion: driving, flying, even walking in a straight line for more than a few meters.
> consumer displays and graphics hardware are the state-of-the-art, not military ones
>The friendly article notes that only about half of users get sick within a few minutes, although most of the rest do as well within an hour.
Practically no one gets sick from Vive in 1-1 experiences, which is all the parent said. The article says VR won't work, but really he just means artificial vection won't work.
There are also plenty of ways of making vection work by limiting certain things. For example with Vive allowing easy 360 tracking of hand and headset, you can avoid all artificial rotation. Some people only get sick from artificial yaw.
Also there are things like FOV restriction you can do during movement which limits vection. Then you can snap back to full FOV once the artificial movement is done and let the user walk around the room again with full FOV: https://www.youtube.com/watch?v=lKnM5gC-XpY
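Roughly, that FOV-restriction trick amounts to something like the following sketch; the thresholds and fade speed are made-up tuning values, not from any actual engine:

```python
# Hedged sketch of the FOV-restriction ("comfort vignette") idea: narrow
# the visible field while artificial motion is happening, then restore it
# once the player is stationary again. All numbers are assumptions.

COMFORT_FOV_DEG = 60.0    # assumed narrowed field of view during motion
FULL_FOV_DEG = 110.0      # assumed full headset field of view
FADE_DEG_PER_SEC = 200.0  # assumed speed of the vignette transition

def update_fov(current_fov, artificially_moving, dt):
    """Ease the rendered FOV toward the comfort value while the camera is
    being moved by software, and back to full FOV otherwise."""
    target = COMFORT_FOV_DEG if artificially_moving else FULL_FOV_DEG
    step = FADE_DEG_PER_SEC * dt
    if current_fov < target:
        return min(current_fov + step, target)
    return max(current_fov - step, target)

# Example: half a second of artificial locomotion, then the player stops.
fov, dt = FULL_FOV_DEG, 1.0 / 90.0
for frame in range(90):
    fov = update_fov(fov, artificially_moving=(frame < 45), dt=dt)
print(f"FOV after one second: {fov:.0f} degrees")
```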
I would guess the article author is bound by non-competes locked into ancient military contracts, can't get in on the startup action in consumer VR, and is just bitter. People are probably asking him what he does at parties and responding with "oh, I know of the Vive and the Rift", and it's getting on his nerves, so he's just sort of trolling to lash out.
Flight sims with visible horizons do indeed cause tons of motion sickness in a lot of people though. But that doesn't say anything about VR in general.
> In 1997, MicroOptical demonstrated the eyewear display in which the viewing optics were incorporated in the eyeglass lens. The eyeglasses display provided a 320x240 pixel resolution with 8 bit greyscale and a field of view of approximately 8 degrees (horizontal).
And a Ford Model T is basically the same thing as a modern car. The modern car is just better in every category, but basically it is the same, and it was there first.
I've used both the MicroOptical HUD in 1999 (by which time it was full-color and around 640x480) and Google Glass in 2013. The display was essentially identical (which is unsurprising, since Google acquired many of the patents involved), the main change was in the power supply: the original MO required lugging around a huge box clipped to your belt, while Glass had an onboard battery.
So VR works; it's just that some content with motion characteristics like driving or flying won't work (yet) in VR. That's a limitation, but there are many genres that don't really work on some platforms. You don't generally play an RTS with a console controller, you don't play an FPS on a smartphone, and you won't play driving/flying sims on (current) VR. That doesn't mean VR doesn't work as such; it's simply suited to very different types of content right now, content that doesn't involve the world wildly moving around you while you stay still.
"moving images on the screen aroused a physiological feeling of horror in the audience (shots of an onrushing train) or physical nausea (shots taken from a great height or with a swaying camera)" -Jurij Lotman
You could research the Lumière brothers' films, and probably find some material on the matter pretty easily.
See also the history of IMAX (70mm film). Originally invented in 1929 and named Fox Grandeur, the format was shelved for decades because it was too realistic and was making audiences profoundly uncomfortable.
I even recall reading recently that color film and television caused lots of nausea when they first hit the market.
> He's wrong because he's worked with the tech for decades, but you tried it and didn't get sick?
Well.. yes? He says it's impossible to make VR that doesn't make people sick. We now have VR that doesn't make people sick. He's wrong. His decades of experience probably cloud his judgement, because until now VR was terrible.
>>> Call me a skeptic but as if they had lower latency and higher resolution. $80k is a tiny budget for the military, and consumer displays and graphics hardware are the state-of-the-art, not military ones.
Any argument for this statement, or just 'I tried Vive and it was great'? It doesn't say anywhere that $80k was a budget; it's rather clear that it was the cost of the unit.
Sorry, but you just make loads of big claims and all with nothing backing them.
Yeah I meant budget as in 'budget to make one of them'.
The argument is obvious - consumers are the biggest market for displays by a huge margin. The military don't have super-advanced displays that LG and Sony can't match. The military didn't make a 4K 5.5" screen before Sony. They don't have huge display research divisions.
I feel like the original author should have to show this $80k low latency, high res VR system before I can disprove it. All I could find was this:
It looks shit compared to the Vive. Look at how bad the field of view is! No numbers on latency or resolution but come on. There's almost no chance they're as good. Hell, the technology to get really low latency, like G-Sync and OLED displays didn't even exist until really recently.
Just because it's 'military' doesn't mean it is the best.
I think the real issue here isn't if or how often VR headsets cause simulator sickness. The real issue is that VR can fuck up your proprioception. If pilots, who are using it relatively infrequently, are having issues lasting for several hours to several days, what sort of issues are gamers going to have when they're using these things daily over the course of years?
If I recall correctly, this was one of the biggest issues that Sega ran into with Sega VR. They knew that short term exposure to 3d/VR would shut off some of the cues your body uses for depth perception, and they were concerned that prolonged use might have a more permanent effect. Because of this, their lawyers advised them not to release the product for fear of lawsuits if people suffered permanent damage.
Yes - when you get motion sickness badly it's amazing how bad it is and how long the effects last.
I experienced incredibly bad motion sickness at a racetrack once; the ole "vestibular disconnect". I had been gradually improving lap times and went to one circuit which was a continual left-right-left-right sequence of corners with very little letup (cornering g's around 1.5). After 3-4 laps I had the cold sweats and had to get out of the car... I tried 3 different sessions to shake it and couldn't. I was dizzy for the remainder of the day and wanted to throw up constantly. A good sleep and I came right. The dizziness was horrible; you definitely wouldn't want to have that experience then go on to operate heavy machinery :)
In my understanding, talking to a scientist friend in that field, everyone has different tolerance levels, but when it happens and your body goes into "caveman mode" the results are horrible. The underlying message in the author's answer (my read) is "be careful... you can really mess someone up".
After that incident at the track my empathy for motion sickness went up a few notches..
I get simulation sickness even from games I play on a 1080p screen from an arm's length away. I couldn't stand 3D glasses for more than 15 minutes, and therefore I'm too afraid to even try the VR headsets for more than ten seconds. (It gets really bad when it starts: a splitting headache and nausea combined.)
However, I feel like I'm in the minority on this. No one around me seems to have any trouble watching movies in 3D, for a start.
I guess, just as some people aren't fit to operate a vehicle (problem in eyesight or other reasons) there are some who aren't fit to use VR and I really doubt they (we) amount to the mentioned 50%.
An interesting anecdote: my friend, who absolutely cannot play or even watch someone play an FPS game, had no problems with the Samsung Gear (arguably a device on the lower end). You should give it a try and see how you fare.
My wife gets sick in 3D movies and she can handle the vive surprisingly well. When the image adjusts to your own movements it's less nausea inducing than viewing lots of motion in a movie while sitting still.
This actually probably has more to do with the fidelity of rendering in the Vive apps. It's important to remember that, due to hunter/gatherer specializations, men and women (testosterone- and estrogen-dominant individuals, actually; we tested this with hormone-treatment patients who were swapping genders) tend to perceive visual reality differently [0]. Indeed, the eye has the second-highest concentration of sex-differentiating receptors anywhere on your body (besides the genitalia, obviously). All of the stuff the OP was talking about with motion sickness and parallax is mostly a male concern (as men use parallax and motion as their primary visual perception systems). Women use shading and lighting as their primary systems, which makes them prone to sickness when those aspects are screwed up (as in older systems), but if rendered with enough fidelity (as we are beginning to be capable of) they may not have that problem. At the end of the day, we don't really understand all that well how vision affects different people's brains.
I've yet to try a VR headset, but because I have very different vision in one eye to the other, 3D films simply don't work for me. I can watch them, but I only see them in 2D; my experience with Google Cardboard was the same, so sadly I don't think I'm ever going to get to experience what others do.
I'm told by my optician that my brain never worked out how to use both eyes at once due to a squint when I was young - at any given time I'm only actually processing the input from a single eye. They attempted to correct it but my peculiar vision was picked up much too late in life so it just made me motion sick as the brain rapidly alternated from one eye to the other. It also means my depth perception is pretty appalling, at a distance I've learnt to compensate, but watching me try to put together two small objects close up can be pretty hilarious as I wildly miss.
It makes sense to me that if I'm only seeing the image for the left or right eye individually the 3D effect wouldn't work.
Using the sweeping assumptions embraced in this article, any 3D game that happens to cause motion sickness in a non-trivial proportion of players when played on a monitor or TV (due to the player having 'six degrees of freedom' in zero g), such as Descent https://en.wikipedia.org/wiki/Descent_(video_game), proves unequivocally that 3D games 'will not work' on TVs or monitors.
I vaguely recall many alarming articles written about Wolfenstein 3D causing motion sickness (even some talking about how it was permanent somehow), and how big of a problem this would be for 3D shooters.
He mentioned seasickness as an example, yet people still go on cruises.
I think the problem will be "solved" with a combination of improvements in the tech (light field, etc), conditioning (the brain has surprising plasticity, especially if you condition it from a very young age), custom designed applications (to avoid the motion problem), and maybe lots of nausea medication. :-)
It's worth noting that the folks in the article reporting simulator sickness and long term disorientation are pilots, meaning they have been screened and selected for their endurance and resistance to motion sickness. On top of that, the symptoms they are reporting go well beyond seasickness. Have a look at rowanH's comment in this thread for a better idea of the effects we are talking about.
I'd certainly agree that the brain can adjust to using VR, but at what cost? If using VR means losing your depth perception and sense of balance, so that you are walking around bumping into things (or worse driving), then it's probably not going to be worth it for many people, despite how fun it might be.
I'm still confused about the price comparison he keeps using. Is that really apples to apples? The Navy may have only wanted a dozen of these units at a time, whereas Oculus and HTC go into true mass production. I was always under the impression that really limited runs of anything are far more costly per unit.
> [the goggles] we made for the US military had less latency, higher resolution displays, and more accurate head tracking than any of the current round of civilian VR goggles
Dubious.
> I believe that the most major problem is with depth perception.
I would expect even a fairly incompetent engineer not to make such elementary mistakes... it's pretty easy stuff to measure, and even more trivial to know if you made the thing to start with :P
If the mismatch between focus distance and convergence distance caused nausea, how come 3D movies have no such issues (at least for the vast majority of viewers; I haven't heard anyone complaining)?
And if the distance disparity indicated hallucination and triggered vomiting as a defence mechanism against shrooms, how come existing FPS games have no such issues?
Some people claimed to have problems playing Mirror's Edge back in 2009, but it seemed like it was mostly hype and attention-craving. I myself never felt any issues, and neither did anyone I've talked to.
Anecdotally: I get very nauseous watching or playing FPS's.
Watching a friend play _DOOM_ in 1996, playing _Halo_ in the 2010s, playing _Fallout 4_ (on a 65" screen about 2 m distant) today: all these scenarios have induced nausea in me.
On more than one occasion, an hour of play has put me in and out of sleep for 12 hours.
Me too. I get motion sick (relatively) easily; in fact, I can't play Halo full-screen (but I have finished Halo 1 some dozen times on Legendary in co-op split screen). And I have no problems playing "Elite: Dangerous" on the Oculus DK2.
On the same headset, the "Villa on a hill" demo scene is problematic, though, because of some issues with geometry inside the house (warping angles), and more generally, "walking" around while sitting down.
I think it'll be interesting to see how motion sickness from VR compares to seasickness -- I've talked to many people that have been sailors -- and started out with lots of problems, but eventually "got over it". And I've also heard stories of people that never got over it.
I don't think everyone will enjoy the same VR games and experiences, but I don't see how that's any different from other games and software. There are some FPS games I simply can't enjoy -- but those games still sell well. So I don't see why VR can't be a great success for a market that doesn't include everyone.
As a counter-point, I know people that can't focus on text on screens that have a resolution below the "retina"-screens, and there are of course a number of people with various sight-related issues (See eg: Moore's "Color Forth" system, and why he made it the way he did).
Perhaps VR will create a new, substantial section of the populace that will need to be classified as "limited function" due to not being able to use VR. No one is claiming we should retire 2D screens because they are of no use to blind people. That's not to say there won't be controversy, but the fact that many people already enjoy VR should be enough to point out that some form of VR will likely remain useful in the future, especially now that it's arguably rather affordable.
People experience dizziness, headaches and nausea in 3D movies.
> how comes existing FPS games have no such issues?
People can get very seasick. FPS gamers are a self-selected group; people who feel nausea stop playing. I often get fever-like nausea if I play too long. Not too bad, though.
3D movies are so far from reality that we still experience them as moving pictures in relief. They don't have correct scale, depth, field of view, or even correct head position. We have all of these in VR, which triggers the brain much more powerfully.
> so driving a car while “under the influence” of post-VR disorientation is probably as dangerous as drunk-driving.
That seems like a bit of a stretch to me. The only evidence cited is that the military issued a warning about driving after exiting a military flight simulator. I'm not questioning that the effects of VR can persist after exiting the simulation, I just don't think they can be compared to the effects of alcohol without significantly more investigation.
The author's points about both ocular and vestibular sources of VR nausea are valid, and both will probably be resolved. There are two solution paths: technical and biological. The author spent a bit of time on the technical solutions but none on the biological ones. The simplest biological solution is one that we all already possess to varying degrees: neural plasticity. The brain is a plastic organ; in young children it is even more plastic. I doubt that the studies have yet been done, but I'd bet that kids trained with high-quality VR goggles from the age of three would not suffer from either form of nausea. I also doubt that they would suffer from the after-effects of VR. Old-timers like myself will experience nausea. For my wife, it will be so bad that she just won't do it. I expect that the majority of today's adults will reject VR. But as with all things, the children are the future.
AR is a different thing entirely. The HoloLens holoportation demo was one of the most amazing tech demos I've ever seen. Watching that, you know where the future lies. The gear manufacturers may not like to admit it, but there will likely be a long "arcade" phase to AR entertainment where we pay a visit to a "holodeck" facility that has the half a million dollars in gear required to make it a great experience. I have no problem with that; I already pay to go see a movie, go bowling, or visit an amusement park.
Why, then, is the HTC Vive subreddit filled with post after post of people having a great time playing games for hours, and very few reports of nausea or sickness? I'm not suggesting we ignore health concerns related to VR, but the huge gulf between the anecdotal evidence post-launch and this guy's VR-sickness apocalypse makes me think something is off.
People do talk about it a lot; it's often referred to in an optimistic way: "getting your vr legs", instead of a negative manner. That said, the early adopters of VR are self-selecting and probably the least likely to get nausea. I know a few people who are prime candidates for VR (graphics developers, tech nerds, etc) who either sold their devices or refuse to buy one because they know they'll never get over the nausea.
I have no real problems with nausea (or motion sickness in general) and I love my Rift. That said, I do get mild headaches if I play it for more than an hour. So yeah, it's a thing.
I am one that is pretty prone to nausea but got into VR. I don't think VR early adopters are necessarily self-selecting as far as nausea goes. Personally, I found different experiences gave me varying degrees of nausea. Games where locomotion was controlled by controller got me sick almost instantly, whereas games where I was sitting (like Elite: Dangerous) did not.
Hype, plus people not wanting to upvote nay-saying (if they even bother to report it), would be my guess, assuming the nausea actually is a serious problem.
I agree, hype could be an explanation, but it seems unlikely.
Multiple multinational companies sink many billions of R&D dollars into fundamentally flawed and dangerous consumer technology. Said consumers, not wanting to accept the truth, conspire to almost completely suppress any meaningful spread of reports of the flaw.
Feels sorta unlikely to me! That's not really how the world works. Surely the simplest explanation is the most likely one: first-gen consumer VR is in most cases "good enough", and a significant minority of consumers experience some amount of sickness, but not enough to invalidate the industry.
Here's a video of people using the Vive for two and a half hours with no motion sickness, one of several put out by the developers of Fantastic Contraption, every week: https://www.twitch.tv/colinnorthway/v/61979623
Why on Earth are people in this thread putting so much effort into weighing in with their two cents "assuming nausea is actually a serious problem"? I understand that Hacker News loves playing devil's advocate, but devising theories to explain non-existent phenomena is just being tedious.
Microsoft promo, sure, but I doubt the architect would want to make a total ass of himself. I work in CAD for construction, and while I'm not directly involved in HoloLens projects, people mostly see this as transformational for their clients. So there might be actual value here, but the market has not yet spoken.
In previous VR surges, designers were not so versed in digital tools and the workflows weren't there. VR was a thing in itself; now it's one more view into an established process and dataset. That makes it an incremental change rather than a profound one (and thus likelier to succeed).
So we’re continually estimating range using the tensions in two sets of muscles - one for focus, the other for convergence. When the brain gets the right signals, these two mechanisms agree perfectly.
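To make that concrete, here's a rough back-of-the-envelope sketch (mine, not from the article) of why the two cues disagree inside a headset: the eyes converge for the virtual object's distance, but the lens has to stay focused at the optics' fixed focal plane. The 63 mm IPD and the 2 m focal distance are assumed round numbers, not the specs of any particular headset.

    import math

    IPD_M = 0.063           # assumed interpupillary distance, ~63 mm
    HMD_FOCAL_DIST_M = 2.0  # assumed fixed focal distance of the headset optics

    def vergence_angle_deg(distance_m):
        # Angle between the two eyes' lines of sight for an object at distance_m.
        return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

    for d in [0.5, 1.0, 2.0, 10.0]:
        print(f"virtual object at {d:>4} m: eyes converge at {vergence_angle_deg(d):5.2f} deg, "
              f"but focus stays at {HMD_FOCAL_DIST_M} m")

At 2 m the cues happen to agree; at half a metre the eyes cross noticeably while the focus cue still insists the scene is four times farther away, and that disagreement is exactly the mismatch being described.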
Just got my HTC Vive. Nausea isn't a problem for me, but age has definitely made me farsighted. Muscles in my eye might strain to let me focus close, but that's just not happening nowadays. Since this has happened gradually over the years, my brain has gotten used to it.
Oddly enough, when I tried the Oculus Rift, I found I could focus more clearly on nearby virtual objects than far away ones. Not only is VR an alternate reality where I can play with force fields and laser beams, it's a reality where my eyes aren't old anymore and work correctly.
Cardboard isn't real VR. Honestly. Today's smartphones DO NOT have the proper screens or sensors. You will get sick. Don't force it on others who might not be willing to tolerate that.
Google's Daydream is all about creating standards to improve this situation.
Part of me wonders: if children are exposed to this technology from a young age, will they experience this type of sickness at all? Like how our parents couldn't figure out a Nintendo or can barely open an email, maybe our children will be like "duh dad, you puke after 10 min in VR while I can live here all day!"
I think children who grow up at sea adapt to seasickness; maybe it's similar here.
"The peak incidence occurs in children under 12 years, but it is uncommon in infants. ... Children under two years old are highly resistant to motion sickness, as they are often supine and do not use visual cues for spatial orientation. Susceptibility peaks around 10–12 years of age."
I don't have a VR headset so I can't speak from direct experience about VR, but I don't buy the author's argument that the disparity between the divergence system and the focusing system makes elimination of nausea impossible. I have two pairs of single-vision prescription glasses: one for walking and driving, and one for looking at computer screens. They have different focusing characteristics, and yet I can switch from one pair to the other without any disabling effects. Hell, I even wear a set of bifocals sometimes and switch back and forth from one second to the next. Some people even like wearing "progressive" lenses that have continuously varying focal lengths from the top of the lens to the bottom of the lens. Lots of people like using reading glasses too; I've never heard of anyone getting sick from that. I think the author is underestimating people's ability to get used to different focal lengths.
I quite enjoy using the current VR systems; the Vive in particular is very impressive, although not at all practical for most people.
PSVR is impressive and, I think, will be the most successful system, as it's the best value and already has a ~40m-user install base where it will "just work".
For me the biggest issue is that using any of these systems for more than 20-30 minutes is uncomfortable, because you have this big unit sitting on your face. When I sit down to game I like to do so for a couple of hours, but that just isn't possible (for me) with VR.
In addition to the distance cues the brain derives from eye musculature he mentions, there are also distance cues based on audio perception. This perhaps explains why dramamine has an odd impact on VR motion sickness.
The audio tech in this area is pretty weak. Dolby Surround is about "gee whiz" not detailed modeling of environments with customizations for the impulse response of each moviegoer's ear canal. Lots of work to be done.
We do have positional audio in modern consumer VR headsets. (And I'd bet the military units mentioned in OP don't have it). This goes way beyond binaural audio.
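Even the most basic binaural cue is easy to reason about, for what it's worth. Here's a toy sketch (mine, not from any headset SDK) of the interaural time difference using Woodworth's spherical-head approximation; real positional-audio engines layer HRTFs, room acoustics, and head-tracked re-rendering on top of cues like this.

    import math

    HEAD_RADIUS_M = 0.0875   # assumed average head radius
    SPEED_OF_SOUND = 343.0   # m/s in room-temperature air

    def itd_seconds(azimuth_deg):
        # Woodworth's estimate of the interaural time difference for a
        # distant source at the given azimuth (0 deg = straight ahead).
        theta = math.radians(azimuth_deg)
        return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

    for az in [0, 15, 45, 90]:
        print(f"azimuth {az:>2} deg -> ITD ~{itd_seconds(az) * 1e6:6.1f} microseconds")

The brain localises sound from delays of a few hundred microseconds, so sloppy or laggy audio is perfectly capable of contradicting the visuals.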
The headphones are pretty bad -- and not isolated. So part of my brain is hearing "living room" while the rest is hearing "wide open space." I posit this contributes to the problem. Also two imperfect immersive sources (video + audio) that each lag is far worse than one.
I was glad to see Oculus was tackling the problem... then I got a look at what they were actually doing. Can't say more other than to point out there's a lot more work to be done.
I used to play Doom and Quake and would have to lie down and curse the developers. It was almost identical to having been below deck on a sailboat in rough seas. It's terrible.
I wonder if the same seasickness remedies used on water would work with VR. Frankly, after what I got from games 20 years ago, I'm really afraid more modern realistic VR will be even worse.
I've yet to blow chunks in a VR headset, but in 2015 I came close with a homemade headset built around a Nexus 7 running the Cardboard app 'Drive City Rollercoster'.
Despite nearly falling down the stairs with a Nexus 7 strapped to my head (I broke my fall with my left arm), I'm sure this tech will take off this year.
I have a hard time believing that driving after using VR is the same as driving drunk. The problems described are real, but we need to work on solving them sooner or later, so why not now? It will keep getting better over time.
For me, the neck pain from heavy VR headsets is a bigger problem than the nausea. Nausea only sets in after maybe a couple of hours, but I have to take the headset off regularly just to give my neck a break.
I also feel VR will end up the way 3D TV did: something that sounds great at first but can't be adopted by many people. You get dizzy and sick after a while, and that's enough to stop using it.
I'm wondering why the Oculus developers didn't take those things into account, especially since they are so well known to experts. The most interesting challenges of VR seem, as yet, unsolved.
The whole momentum section has been solved. If you take an existing video game where a controller moves the avatar, that will make you sick; but if you map the game's controls 1:1 to the user's motion, the problem disappears (a rough sketch of the difference is below).
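To spell out what "1:1" means here, a toy sketch (mine, not any particular engine's API): in a room-scale setup the render camera is just the tracked headset position plus a fixed offset into the virtual world, whereas stick locomotion injects motion the head never made.

    from dataclasses import dataclass

    @dataclass
    class Vec3:
        x: float
        y: float
        z: float

    def room_scale_camera(hmd_pos: Vec3, world_anchor: Vec3) -> Vec3:
        # 1:1 mapping: the camera follows the tracked headset exactly, offset
        # only by where the play space is anchored in the virtual world.
        # Seen motion always equals felt motion, so there is no mismatch.
        return Vec3(world_anchor.x + hmd_pos.x,
                    world_anchor.y + hmd_pos.y,
                    world_anchor.z + hmd_pos.z)

    def stick_locomotion_camera(camera: Vec3, stick_forward: float, dt: float,
                                speed: float = 3.0) -> Vec3:
        # Gamepad-style locomotion: the camera translates even though the head
        # does not move. That visual/vestibular mismatch is the classic trigger.
        return Vec3(camera.x, camera.y, camera.z + speed * stick_forward * dt)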
What he neglects to mention is that both studies raise concerns specifically about helicopter simulations. From the abstract of the first study:
"The simulators which exhibited the highest incidences of sickness were helicopter simulators with cathode ray tube (CRT) infinity optics and six-degrees-of-freedom moving base systems. Of those studied, fixed-wing, fixed-base, dome displays had relatively low incidence of simulator sickness."
If you read the second study, the VR environments tested were exclusively those of piloting helicopters.
Since helicopters involve a lot of lateral and rotational acceleration, it's plausible that this is why the simulation sickness was so severe.
Perspective projection is the problem. Try the following experiment yourself:
Stand in front of a tall building or chimney. Look at the base of the building, then slowly raise your eyes from the bottom to the top. You perceive the building's dimensions as the same the whole time.
Now do the same through a camera. You'll notice the building's apparent dimensions change as you raise the view.
This effect will make you feel sick when it's right next to your eyes. The brain simply does far more work than we realise to make our perception hold together.
Solution: capture the light field of the scene using a renderer with a thin-lens camera model and then reconstruct the correct rays to display in the headset in real time.
Oh wait, we already have the software to do it, though I'm again fairly sure the military units and projects discussed in the OP don't.
Not really. Even at a 60-degree FOV the artifacts are still there. It's simply a matter of projection by similar triangles onto a plane. The retina is curved, and the eyes are also micro-moving all the time, among other things, to estimate depth. Perspective projection is the elephant in the room of CG/VR; there have been multiple PhDs on how to perceptually correct it, but everybody sticks with the standard projection matrix because it's convenient and 1/z is linear in screen space for quick depth comparison. Remember Wolfenstein 3D? The raycasting there didn't deform objects' dimensions; Doom, however, does. Start W3D in front of a wall and look left and right: the wall's dimensions don't change. Now do the same in Doom. Vertigo!
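For anyone who wants to see the geometry rather than take it on faith, a quick numeric sketch (mine, just the similar-triangles argument above): an object that always subtends the same angle at the eye occupies more and more of a flat image plane as it moves off-axis, which is why the rendered building appears to stretch as the view tilts up while the real one does not.

    import math

    def planar_image_extent(theta_deg, angular_size_deg, focal=1.0):
        # On-screen extent of an object that subtends angular_size_deg,
        # centred theta_deg off the optical axis, projected onto a flat
        # image plane at the given focal distance.
        half = math.radians(angular_size_deg) / 2
        t = math.radians(theta_deg)
        return focal * (math.tan(t + half) - math.tan(t - half))

    ANGULAR_SIZE_DEG = 5.0  # the object always subtends 5 degrees at the eye
    for theta in [0, 10, 20, 30]:
        print(f"{theta:>2} deg off-axis: angular size {ANGULAR_SIZE_DEG} deg (constant), "
              f"flat-plane image size {planar_image_extent(theta, ANGULAR_SIZE_DEG):.4f}")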
I've had an Oculus for weeks and have none of these problems. I even played Ethan Carter without comfort controls, and as long as I'm not standing up it's fine. Standing up while moving in VR just makes me lose my balance.
The only time I've gotten sick at all on vr is playing project cars, and that was only from reversing and going forwards a few times after a wreck.
It really sounds like he's basing this on no experience with the current generation of headsets.
He states in the article that he worked on an Oculus project, and in general has worked with both military and civilian simulators for years. He's basing his conclusions on extensive data.
It will work, it just won't catch on because people don't want to wear shit on their face all day and look like a moron talking to beings 90% of the population can't see. It's just a toy for gamers and developers right now. No telling when the general population will start using VR regularly but probably not until the devices start to look like a regular pair of glasses.
Oh no. Oh no no no. I think it's pretty clear that VR headsets make you look like a moron at a whole new level. Take away a person's eyes—hide their eyes so that they seem to have none—and you take away a lot. Put them in touch with a private world and you make them even more alien. Radio, telephone, even television can't compete with that level of moron-looking.
I assure you people said those exact same things when radio, telephone, television and smartphones came on the scene. Since that last one is pretty recent, I'm sure you would remember all the comics and satire around people being absorbed in their iPhones and whatnot.
> it just won't catch on because people don't want to wear shit on their face all day and look like a moron
Social factors are a big thing, and while it pains me to say it (and your comment was worded in an unnecessarily offensive way) I do think that you might be at least partially right.
You are certainly right that social factors have a huge effect on the uptake of new technologies, but it's often hard to predict how those social factors will change and shape the uptake of new technology.
My example is text messages. Doesn't that seem like something only a total dork would like? I was a fairly early adopter of text messaging myself, and I certainly fit the 'poorly socialized nerd' stereotype; my perception at the time was that text messaging was for people too poorly socialized or nervous to talk by voice. But that's not how text messaging came to dominate. As far as I can tell, it came to dominate because it's seen as somehow rude to talk with someone who isn't physically in the room when other people are in the room with you. (It doesn't seem like it ought to be rude to me, but my social responses are not normal. I can certainly observe that the vast majority of people find that sort of thing rude, and I act accordingly.)
My point here is just that you are right to say social perceptions matter in the uptake of these technologies, but probably premature in confidently stating that you understand which direction those social perceptions will push.
I'm guessing that the future will be AR. Heads up displays that annotate reality, e.g. turn static printed books and magazines into hyperlinked text you can tap and expand like a Kindle. When the glasses or projection devices become sophisticated enough, the possibilities will be endless.