IL-2 Sturmovik and Oculus Rift (rockpapershotgun.com)
93 points by javanix on Sept 9, 2013 | hide | past | favorite | 65 comments



As a Rift owner, the most immersive demos I've tried by far have been ones where you are sitting in the cockpit of a ship or car. It's much easier for my brain to "believe" those demos, probably because it's not getting psyched out by the lack of three-dimensional spatial tracking or the 'floating head' feeling you get when playing FPSes.


The moment I saw what the Rift was capable of, I started thinking of a simulation renaissance.

Mechwarriors, Descents, Freelancers - anything where your in-game "avatar" isn't tied to the camera can benefit immensely from the technology.


Wow. Can we please have a MechWarrior / Descent renaissance?


Descent is undergoing a renaissance.

You can buy the original games from gog.com and then download the updated clients for D1/D2 from http://www.dxx-rebirth.com/ or find the patches for D3 via the links at http://descentbb.net/viewtopic.php?f=1&t=4953 . There's a game tracker at http://www.descent-servers.net/ and built-in trackers in D1 and D2.

Some groups that are still active include http://descentrangers.com/Home/Index (primarily D2 and D1 team play), http://rbb.ripteam.com/ (D3 teams), and http://descentchampions.org/ (a 1v1 ladder for mostly D1).

You can find other Descent pilots on facebook at https://www.facebook.com/groups/descentpilots/ , https://www.facebook.com/groups/kalikahn/ , and https://www.facebook.com/groups/583400181716926/ .

Disclosure: I am one of the founders/administrators of the Descent Champions Ladder.


Retrovirus was okay-ish, and despite being 6DOF it was very nice to use with a mouse. I hope they make a sequel and include Rift support.

The Descent 1/2 source is freely available, so some brave soul will probably add Rift support.


The first experience I ever had with a stereoscopic head tracking display was with the original D1 in '96 or so, so some sort of stereoscopic support must already be in the code.

Coincidentally the second game I tried with it that day was Mechwarrior 2.


Seconded. All demos make me want to barf after 10 minutes (Minecraft is the worst, because I want to love it but it makes me feel so nauseous), except for the ones where your avatar is static. `Blue Marble` and `Titans of Space` are great for that (ToS especially, as you can actually "see" your body when you look down; of course it doesn't move, but it adds to the immersion. Adding a Kinect to the mix to make your arms actually move would be amazing).


The best experience I've had so far was with the Proton Pulse Rift demo, probably because the head tracking is the only means of movement or interaction. If you're finding cockpit-based games easier to process than FPSes, I'd recommend PPR.


I'd like a dev environment to be made for the Oculus Rift. Why limit ourselves to games (though they're very cool and fun)? To me it seems like the Oculus allows for an entirely separate reality to be created. Imagine putting on the Rift and, instead of your desktop, having a wall with 100 monitors in it. Move your editor to the side, have a build running above you, etc.


I bought into the Kickstarter with the express intention of doing just that.

I sold mine. In 5 years, maybe I'll try again. It needs 2-3 generations of improvement, at least, before it's suitable for development work. Currently, the resolution is so bad you can't even display legible text within the 3d world.


Well, the point behind the currently-available revision is that it's a rough model for developers to build things that will look better on the higher-def release model.

By all accounts, the HD prototype that's floating around looks really, really swell.


I have no doubt it does, for games. But due to the nature of its optics, the perceived resolution of the Rift will still not be up to snuff for a while. You're essentially looking at the screen through a magnifying glass; I'm not sure there's any display with a good-enough pixel density out there, yet.


Thanks for saying this. I've tried the developer version, and while it was great fun, the resolution, screen-door, and shimmer issues were really disappointing.

I agree that it needs a couple of generations of improvement before it becomes useful for text. And if you check the specs of the "HD" version it's not going to be sufficiently improved to make the difference.


I tried the HD prototype at Unite 2013 and it looks really really good. Just look for the video of RogueCraft and you'll see text appear just fine.


A friend told me this the other day as well. It is rather limitless... and sounds amazing to code in. You could even have great visualizations for things such as working with different branches of code. You'd be as in the "zone" as you can get. You can change the scenery to whatever you like as well. Dark room? Done. Wonderful vista? Done.

It's not that farfetched to imagine a future where plenty of programmers sit 'jacked in' to an Oculus Rift to code.


"It's not that farfetched to imagine a future where plenty of programmers sit 'jacked' in, into an Oculus Rift to code."

Life imitates art. Sadly, William Gibson novels may seem less beautifully strange to future generations. ;)


Though I immediately thought of the corresponding passage in Snow Crash. :)

> But his real reason for being in Flatland is that Hiro Protagonist, last of the freelance hackers, is hacking. And when hackers are hacking, they don't mess around with the superficial world of Metaverses and avatars. They descend below this surface layer and into the netherworld of code and tangled nam-shubs that supports it, where everything that you see in the Metaverse, no matter how lifelike and beautiful and three-dimensional, reduces to a simple text file: a series of letters on an electronic page. It is a throwback to the days when people programmed computers through primitive teletypes and IBM punch cards. Since then, pretty and user-friendly programming tools have been developed. It's possible to program a computer now by sitting at your desk in the Metaverse and manually connecting little preprogrammed units, like Tinkertoys. But a real hacker would never use such techniques, any more than a master auto mechanic would try to fix a car by sliding in behind the steering wheel and watching the idiot lights on the dashboard.


It is a great idea, but I think the Rift will need wildly better resolution. A 1920x1080 real screen isn't going to be able to show you a very good virtual screen in 3D space.


An 8k version would be enough, according to this: http://www.gamasutra.com/view/feature/199361/a_conversation_...


What's worse is that it's not even 1920x1080 - the screen is divided into two halves, one for each eye. So the visible resolution is 960x1080. But since those pixels are also stretched over an area three times wider than your normal monitor's field of view, the pixels are huge. Half as many pixels stretched over ~10x the area.

In the end it looks like good ol' 1981-era IBM PC 320x200 CGA graphics - and if you do the math on the pixel size you'll see that's about the same angular pixel density.
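
For what it's worth, here's the back-of-the-envelope version of that math in Python. A rough sketch only: the ~90 degree per-eye FOV and ~30 degree monitor FOV are my assumptions, not measured values.

    # Rough angular pixel density comparison (assumed FOV numbers).
    rift_px_horizontal = 960    # half of a 1920x1080 panel, per the comment above
    rift_fov_deg = 90           # assumed horizontal FOV per eye
    cga_px_horizontal = 320     # 320x200 CGA
    monitor_fov_deg = 30        # assumed FOV of a monitor at normal viewing distance

    rift_density = rift_px_horizontal / rift_fov_deg    # ~10.7 px/deg
    cga_density = cga_px_horizontal / monitor_fov_deg   # ~10.7 px/deg
    print(rift_density, cga_density)

With those assumptions the two work out to roughly the same pixels per degree, which is the point being made above.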


With smartphone PPI pushing above 400, this will be solved by the second or third generation. It would be possible today too, but you'd have to jack up the price threefold.


You also have to think about latency. If I'm not mistaken they already want 60 FPS as the default, and I know Carmack wants 120 FPS eventually, because framerate is apparently a lot more important in that kind of virtual environment, where dropped frames are much more obvious. So 120 FPS is probably "ideal", but not practical yet (the screens aren't there either).

Add to that 2x or 4x (4k) the resolution, and you're going to need really powerful hardware and very fast cables. HDMI 2.0 and Thunderbolt 2.0 barely support 4k @ 60 FPS, but I suppose that's getting pretty close. There are probably other bottlenecks, though.
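
A quick sanity check on the cable bandwidth (uncompressed, 24 bits per pixel, ignoring blanking intervals and protocol overhead, so real figures run somewhat higher):

    def gbps(w, h, fps, bpp=24):
        # raw pixel data rate in gigabits per second
        return w * h * fps * bpp / 1e9

    print(gbps(1920, 1080, 120))  # ~6.0  Gbit/s, 1080p @ 120 FPS
    print(gbps(3840, 2160, 60))   # ~11.9 Gbit/s, 4k @ 60 FPS
    print(gbps(3840, 2160, 120))  # ~23.9 Gbit/s, 4k @ 120 FPS, already past
                                  # HDMI 2.0 (18 Gbit/s) and Thunderbolt 2 (20 Gbit/s)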


120Hz monitors are already standard equipment for professional PC FPS players: apparently it makes aim significantly easier. It does seem to usually take some drastically lowered settings to get anything resembling 120Hz minimum from the video card, though. Of course there's also the problem of getting 120Hz panels at the sort of DPI that Oculus needs. (The 120Hz LCDs were apparently an unintended by-product of the LCD manufacturers' drive for 60Hz 3D.) This is one area in which LCDs are still playing catch-up to CRTs, by the way.


It doesn't make "aim" easier; it reduces the perceived latency.


It reduces the perceived latency, thus making aim easier. If you want to make a habit of being rude to strangers in public, you should be very, very careful to have your facts exactly right before you do it.


woot, getting threats on the public HN!

Reducing latency doesn't make aim easier; it makes prediction easier. Aiming is pointing at the right pixel at a given moment. Since I figured there could be confusion, I put quotes on "aim", too.

If you were playing FPSes at a high level you'd also probably tell the difference between prediction and aim itself, because they're two different skills that you train differently.

My words were actually too exact for your sake, ironically, and not actually rude but honest.

So, now that you have been, I'll just let you know that you, sir, are just a random asshole. (Note: I still have my facts exactly right.)


> woot, getting threats on the public HN!

Excuse me? I didn't threaten you in the least.

> Reducing latency doesn't make aim easier; it makes prediction easier. Aiming is pointing at the right pixel at a given moment. Since I figured there could be confusion, I put quotes on "aim", too.

> If you were playing FPSes at a high level you'd also probably tell the difference between prediction and aim itself, because they're two different skills that you train differently.

I have repeatedly heard/read high-level TF2 players describe 120Hz as improving their 'aim' - their choice of words. This applies to hitscan as well as projectile aim, and most certainly includes flick hitscan aim, where any kind of prediction is at a minimum. While we're at it, here's a Q3 professional http://www.stermy.com/shoutbox.php :

> Yes, 120hz LCD are WAY better than 75hz LCDs and will improve your aim/accuracy quite alot.

On the other hand, I think I've yet to see an FPS pro refer to 120Hz as improving 'prediction' as such (though no doubt it does).


Version 0.01 of this is available now with Deskope: http://www.youtube.com/watch?v=QhjTNwa-6f8

At the moment, the dev kit Rift's low resolution is not quite suitable for serious text editing. Also, we are clearly going to eventually want free-floating windows instead of Deskope's side-by-side desktops. But it's a start, and the path forward to near-future VR desktops is pretty clear.


Virtual desktops do that quite admirably today, though.


I've never been able to effectively use virtual desktops. I guess I tend to forget where I put windows.

With one huge virtual display I'd be able to see where things are out of the corner of my eye or with a quick glance.


I am a virtual desktop fan as well (I even prefer it to the two-or-more-monitor approach), but just moving your head to look at any other open window seems superior. Plus, maybe shake your head a bit to move a window to the center position; that sounds awesome to me.

Too bad the Rift is focused on Windows for now; I'd love to see a Compiz DE make use of this.


You know there are SDKs for Linux and OS X, right? It's just the 3rd party developers who are focusing on Windows, as usual with gaming.


The more I read about the Oculus Rift, the more I want to try one.

When the first generation of a technology looks this good, it really makes you excited for the 2nd and 3rd generation.


When the first generation of a technology looks this good, and John Carmack is going to help make the future versions (as CTO), holy crap does it make you excited.


In a certain sense this isn't first-generation. This is really a long-delayed second generation. Most of the gotchas of VR were examined over a decade ago, and this new kick at the can has come out of cellphone technology making the old systems viable.


Right time, right place. To me that seems like a pretty standard way (some) tech breaks into the mainstream. Tablets also existed at the beginning of the last decade (maybe even earlier in some form, I don’t know) but they never broke out of their niche.

It took the right technology (low-power SoCs that are nevertheless fast enough for desktop PC level UI performance, better batteries, compact capacitive touch screens, SSDs – and all that together at the price of a budget PC) plus the right software (not just a desktop OS) to make it work.

It seems, and I hope, that we are at a similar point with VR. The tech is finally good and cheap enough, and the software is mature enough to allow developers easy integration.

Hopefully it will work out.

(Also relevant: the Hype Cycle. http://en.wikipedia.org/wiki/Hype_cycle I'm never sure how much confirmation bias is in that way of seeing the world, but it does seem to apply to a great many things.)


I remember a guy around '97-'98 who was trying to get me involved in his research around giving the impression of movement in VR systems. Electrodes would be applied to the base of your skull and they'd be used to stimulate your cerebellum (or somesuch, I can't recall) which would give you a sense of movement. This was amplified by standing on a gel mat.

Talking with other people, they said that a side effect of this process was that the electrodes kind of hurt, and none of them would do it again.


I've had my doubts about the Rift, but most of those were centered around problems I see with using the Rift in games like FPSes. The author of the article has totally sold me on the Rift for simulator games though. I want one.


I can't understand how you are going to reach for the controls, though. In all simulators you need to use the keyboard extensively - power up the engines, flaps, landing gear. Then you need to remember weird hotkeys for the radios, autopilot, or rarely used functions. You need to type in a lot of numbers when entering routes or changing frequencies. I don't know how you can type with this thing on your head. And then you'll need to find your mouse/stick to resume flying normally. Something is still missing here.


Realistically, serious full sims are going to face a lot of problems. Light sims that can be played on an Xbox gamepad, and thus won't require the full keyboard, should fare much better, and likewise sims that fit nicely onto a standard flightstick-plus-throttle controller.


Some of the current generation of sims (FSX, DCS) focus on "clickable cockpits" where you can operate basically the entire thing via a HOTAS and your mouse, flipping switches and turning knobs. While playing DCS I actually rarely touch the keyboard and move it completely out of the way.

The current head tracking solutions like TrackIR actually make it pretty hard to flip switches on the sides of the cockpit because they drastically exaggerate your head movements.

So I could see Oculus Rift working pretty okay for these. For the ones where I'd still need a keyboard, I could probably hit the right key most of the time without looking, or so I'd like to think.


Something like Orbiter with NASSP, Virtual AGC, 3d modeled cockpit, and a Rift would be fantastic.


LeapMotion might be the missing piece. Doesn't have haptic feedback, obviously, but it would be better than trying to type.


I think it could be usable for an FPS but it needs something ... more.

This article hit the nail on the head when the author stated that the Rift is already a perfect fit for simulator games.


Check out the Rift game/demo called Crashland. It uses the Sixense Hydra controllers for 1-to-1 aiming and works surprisingly well. You turn the reference orientation of your body/view but otherwise you are looking around and aiming with your arms. I really hope all FPS games for the Rift make this an available control scheme. Also helps that the game mechanics are fun and giant spiders are nasty.


I've seen demos with the controller that straps across your chest so you can move with your torso, and those look pretty cool, but you're still left with the problem of wanting to walk around but not being able to (because you'd step on your cat 'in meatspace'). There are solutions to that too, but they aren't great, take up space, are expensive, etc.

In a game where you are strapped into a seat, you don't have that problem. You remain seated in your chair at home and remain seated in game, so everything works great. The Sixense Hydra controllers, or similar, could probably be great for manipulating switches and levers in your cockpit.


Here's a link to one of the walking solutions mentioned above, the Virtuix Omni: http://www.virtuix.com/


I'm dying to race Formula 1 Cars with this thing. It's going to be a ton of fun.


iRacing.com already supports Oculus Rift. I believe it has Formula 1.

https://www.youtube.com/watch?v=xf65VGDs8OA


Yes, it has the cars of one scuderia, Williams F1.

And Euro Truck Simulator 2 has Rift support!

The whole spectrum of wheeled vehicles is covered by the Rift.


Well, there are more than a few distressing music videos done with this device in mind - essentially porn paradise, if not shut-ins' fun. While it can have some great gaming, if not medical, benefits, people tend to their darker impulses first.

Here's a tame video, https://www.youtube.com/watch?v=7bytIGCeGxo , but I would not call it SFW. Search YouTube for "Hydradeck oculus rift" for a large collection of vids; most are very safe for work.


We've got one in the office. The tech is obviously really solid, but the resolution of the dev model is abysmal and no one has really figured out a great control scheme for "on foot" first-person shooters. For first-person driving/flying games, where the direction you move and the direction you are looking are orthogonal, the experience is seamless.
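
A minimal sketch of what that decoupling looks like in code (not from any particular engine; the names and the 2D simplification are mine):

    import math

    def update(pos, vehicle_heading_deg, speed, hmd_yaw_deg, dt):
        # Movement always follows the vehicle's heading, never the player's head.
        h = math.radians(vehicle_heading_deg)
        pos = (pos[0] + math.cos(h) * speed * dt,
               pos[1] + math.sin(h) * speed * dt)
        # The camera is the vehicle's orientation plus the head-tracked offset,
        # so looking around never changes where you're going.
        camera_yaw_deg = vehicle_heading_deg + hmd_yaw_deg
        return pos, camera_yaw_deg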


The fact that simulators work so well leads me to believe that there has to be some semantic hack to get an FPS-like game feeling right. What sits between MechWarrior and Quake? How about Adventures of Professor X?

That said, regardless, space sims are going to be insanely awesome. I'm looking at you, Star Citizen.


Could it just be that in MechWarrior / car sims your avatar is 'sitting', and so your brain expects your body to be sitting at the same time?


Easy to test. Someone needs to make an FPS mod in which players are moving in electric wheelchairs or something similar.


On that note, I feel like the OculusVR could provide for a really illuminating experience as to what it's like to go through life in a wheelchair in the first place.


When playing the old IL-2, I remember looking around with a joystick hat taking me out of immersion right away (and being a really hard way to get situational awareness, too). As soon as I heard descriptions of Oculus Rift, I knew I'd want to play similar games with it, but I didn't realize IL-2 itself was getting a remake. Very cool.


In the meantime, I've heard that TrackIR works really well with sims (some say it's almost as essential as a joystick).


Yes, flight sims have always been the core base of support for TrackIR.

Mind you, it seems that's partly because TrackIR's manufacturer, NaturalPoint, had trouble attracting support from FPS developers. From the guy who was apparently the product manager for NaturalPoint (http://forum.il2sturmovik.com/topic/385-oculus-rift-headtrac...):

> Thanks for the kind words. My larger point is that they have already won the battle, whereas NP and TrackIR only conquered one genre - flight-sim. OR doesn't really need the sim community, unless it completely flops with shooters. We make up less than 1% of gamers. I'm more sensitive to this stuff because I was in charge of TrackIR business development while I was at NP and I mainly left because I could not crack the shooter genre. It was really frustrating and I remember all these shooter devs and engine devs (including Gabe Newell when I showed him our mod for HL2 with head tracking)) laughing at me when I explained how cool TrackIR was and what it could do. They ALL told me it was a cheat and they would never take the time to implement special code to support a 3rd party device outside of a mouse and keyboard. Drives me nuts. What made them change their mind? The stupid screen? Weird.


Interesting. Never heard of that before. Looking at a video, the only thing I can see that might be a problem is the motion scaling (where rotations are scaled up, so that a normal monitor will stay in your field of vision even while you look "behind" you). For people who've tried it - is that something that's easy to adjust to, or does your brain nag that things aren't quite right?


As an avid flight simulator enthusiast, here's my experience with head tracking:

At first, the motion scaling and latency are awkward, but you still use it just because it frees up your hands.

After a few hours, though, you adjust to it, and it feels natural and immersive, like you're actually looking around outside.

For simulation games, head tracking is less important than having a joystick, but only just.


Rift owner here as well. I haven't tried Sturmovik, but I have been playing a similar game called War Thunder, which I found to be really immersive. The first 15 minutes of playing brought out that childlike wonder I haven't felt since I stood in an aisle of Toys R Us watching someone play Super Mario 64 (the first 3D game I'd ever seen).

The last few days I've been having an absolute blast playing Half-Life 2. Hands down the best experience I've had in the Rift so far. While some say FPSes are a difficult fit for head tracking, I'm finding it to be very natural/immersive.

And this is all with the first-generation, low-res dev kit, so I'm really excited to see what the next generations bring.


It's pretty cheap to pick up IL2 on gog. I think I got it for around 10 bucks. Worthwhile!


That's an old version though, IL-2 1946, while the article is about the sequel, IL-2 Battle of Stalingrad, which is still in development http://il2sturmovik.com/ . Fortunately it seems that someone has got both 1946 and WWI sim Rise of Flight working with the Rift: http://www.youtube.com/watch?v=ConJqygVOqE .


This is the first time I've seen "rhubarb" used as a verb.



