
John Carmack: Latency Mitigation Strategies - bigdubs
http://www.altdevblogaday.com/2013/02/22/latency-mitigation-strategies/
======
breckinloggins
It sounds like we would benefit from GPU and display manufacturers providing a
set of standard low-level control primitives so that people like Carmack who
know what they are doing can really play around with the entire pipeline
without having to worry about all the things the cards and the displays are
doing behind their backs.

For example, a GPU could have a set of standard settings with full buffering
and all the other things that "help them win the framerate wars", but a
developer should be able to turn all of that off when needed.

It's the same with displays. LCD manufacturers could, I suppose, allow a
modern day "CRT Mode 13h" where you just have scan lines mapped to memory
buffers and whatever shows up in those buffers gets turned into a pixel as
quickly as possible.

Are there technical challenges preventing this from happening, or is it mainly
inertia and lack of need from the current market?

~~~
Scene_Cast2
Regarding your last point: it's getting better. Modern TVs have a "game mode"
where latency is minimized. The latest LCD panels use embedded DisplayPort
(eDP), and the latest eDP standard introduces a frame buffer on the LCD
itself, to which you can just write deltas and tell it to "swap". I think this
is quite similar in concept to what you're proposing.

~~~
bbatchelder
You may already know this, but the "Game Mode" on modern TVs is just turning
off the 120Hz/240Hz interpolation that is done to "improve" the picture (or is
required for certain 3D systems to work).

The interpolation introduces significant latency that is obvious (and
frustrating) during gaming.

I actually run my TV in game mode by default, because the interpolation done
in 120/240Hz mode makes everything look like it is slightly unreal and shot on
video. Definitely uncanny valley territory.

~~~
newman314
This.

Some people apparently aren't able to tell, but it absolutely bothers the heck
out of me. Probably also the reason why I'm not a fan of 48fps.

I still think it would be useful to have a variable rate player. 24fps for
normal scenes and 48fps for action.

~~~
gizmo686
Is there a detectable (to humans) difference between a 24fps screen and a
48fps screen where the image only changes every other frame? I can see how
this would matter with film-based projectors, but my understanding of TVs is
that pixels are always on and simply change state between frames, so
'changing' to an identical frame should have no effect.

~~~
Tuna-Fish
Having the frame change only every second frame would mean that the screen
changes the picture only half the time, whereas it is normally always changing
the color of some pixel.

I'm not sure this isn't just a pure win.

------
dchichkov
> Conventional computer interfaces are generally not as latency demanding as
> virtual reality, but sensitive users can tell the difference in mouse
> response down to the same 20 milliseconds or so, making it worthwhile to
> apply these techniques even in applications without a VR focus.

I wish people would start treating the text editors they develop this way: as
hard real-time systems. I don't care much about virtual reality, but I'm sick
of text (and code) editors with unpredictable latency in basic operations
(like typing a character, running a find, etc.).

~~~
dman
Would you be willing to pay a recurring fee for using an editor? I have toyed
with writing an editor at multiple points, but have backed away because I
think that in the grand scheme of things it's a one-time sale to a pretty
small market (the number of people who care enough about text editing to pay
for a good editor when they see one).

~~~
jiggy2011
Doesn't seem to be too small a market for Sublime or TextMate.

What I would pay for would be an editor in the style of ST2 but with a few IDE
features like being able to jump to method definitions by shift clicking etc.

Basically a very lightweight IDE with good text editing features (like
Vim-style keyboard shortcuts). Integrate a terminal into it, but do it in a
very slick way, with the ability to run "recipe"-type commands.

For example if I forget the switches for a git command, I should be able to
search for roughly what I want and have a result return which will populate a
command in the terminal.

This might mean hyper focusing on a few languages however.

I'm not sure I would like the recurring model for just _using_ the editor,
since that would likely mean implementing some sort of always-on cloud
integration, which would annoy me and make me worry about losing access to my
editor.

What I might pay for on a recurring basis would be access to a repo of high-
quality plugins that were well integrated with every version of the editor.
Also maybe some functionality to allow me to share the editor session with
someone remotely.

~~~
dman
I don't want to gimp the user experience or implement cloud features just to
get people to pay. At the same time, if I am going to spend 2-3 years building
something, I would rather it be something that has a recurring income stream.

~~~
jiggy2011
What would happen if someone didn't pay for a month? Would you somehow shut
off their editor?

It would be extremely annoying for that to happen because a payment got
screwed up or something.

If you make an editor so good that everyone wants to use it you can probably
make enough from one off sales assuming you set the price right.

By and large when it comes to tools I want to buy them and keep them rather
than lease them.

~~~
dman
This is a good question - I don't think I have a good answer. I lean towards
not implementing a check that stops your use. Maybe a monthly reminder for one
or two months saying you have not paid; after that I would just assume that
the user has stopped using my editor.

~~~
jiggy2011
In order to justify a monthly payment I feel you would really have to be
pushing out frequent high quality updates that just kept making it better and
better.

If development started to stagnate I would probably "forget" to renew after my
CC expired or something like that. Or at some point I'd just feel "I've given
this guy enough money" and cancel, even if I kept using the editor.

------
modeless
I think head-mounted-display virtual reality is a huge missed opportunity for
next-gen consoles. They could have really done it right in a way that PCs
can't yet, with special hardware support for low latency and stereo rendering,
and a guaranteed large audience so game studios could justify applying their
full budgets to AAA VR titles. It would be a huge differentiator at a time
when consoles seem to be converging to a very similar place. It could be as
big as the Wii was.

~~~
untog
I think VR is fundamentally broken, because you're going to try to react to
things in a way that doesn't work. With a head-mounted display, the game can
react as you move your head. Great. But then you'll instinctively try to walk
towards something and it won't work (or you'll hit the wall). Same with
picking things up, etc.

It seems like it's deep in the uncanny valley of experience.

~~~
WesleyJohnson
I think it's just a matter of time before the other pieces to the VR puzzle
catch up. Omnidirectional treadmills already exist and are a basic step
towards letting you walk in a VR environment. Developments in prosthetics with
'feeling' could help pave the way towards shoes, gloves, or other garments
that provide the sensory experience of walking on different surfaces or
picking up in-game items.

~~~
untog
But at that point you've gone from a games console being something that takes
up a small amount of space underneath your TV to something that takes up an
entire room. From requiring you to pick up a controller to putting on an
entire outfit.

I'm just not sure the average consumer is in any way ready/willing for that.

~~~
purplelobster
I think the first successful VR you'll see will be closer to an arcade machine
than a home console.

~~~
untog
Agreed. But that's at odds with the OP's assertion that "head-mounted-display
virtual reality is a huge missed opportunity for next-gen consoles."

------
networked
A related article from Valve's Michael Abrash that may be a better
introduction to the matter:

http://blogs.valvesoftware.com/abrash/latency-the-sine-qua-non-of-ar-and-vr/

<https://news.ycombinator.com/item?id=4985100>

------
pdog
With the proliferation of smartphones and projects like Google Glass and
Oculus Rift taking off, I really think we'll start to see custom chipsets to
improve the performance of augmented and virtual reality systems (battery
life, lower latency, and more processing power).

------
stcredzero
_> Updating the imagery in a head mounted display (HMD) based on a head
tracking sensor is a subtly different challenge than most human / computer
interactions._

Doesn't the military already have this solved for head mounted displays for
attack helicopters and 4++ generation fighter jets? Heck, they have augmented
reality displays for that matter.

EDIT: Many of these problems could be solved by putting an entire purpose-
built gaming rig in the headset. There are nicely capable chips for mobile
devices with lots of power in the GPU. Design such a system from the ground-up
for low latency. Accelerometer and head-tracking inputs would be low-level
interrupts, for example.

I wonder if this is how the military contractors solved this?

~~~
knodi
Yes, they have, with very powerful hardware that's not very consumer-wallet
friendly.

~~~
stcredzero
_> very powerful hardware that's not very consumer-wallet friendly._

Yes, but the mobile landscape may well have changed this.

------
ansible
You could cut latency further if you were able to measure the muscle
contractions in your neck area.

Another option is smart eye tracking. Generally, if you are going to move your
head in a particular direction, you'll likely start moving your eyes first.

------
efdee
I feel stupid.

~~~
breckinloggins
The view update part was the most confusing, but I tried to understand it this
way:

Imagine you had a magical game engine that rendered the entire world perfectly
accurately for every point in space and direction a viewer could possibly be
looking at. All you had to do was say:

    
    
    RenderGameFrameForEveryPossiblePoint();
    ... // Who knows how much time
    viewerPosition = QuicklyGetViewerPosition();
    TellTheGPUToShowWorldAccordingTo(viewerPosition);
    

Well then, you could postpone calling those last two functions until the
absolute last minute. This way you have very little or no movement of the
viewer's head between when you read their position and when you show the view
for that position.

But naturally, RenderGameFrameForEveryPossiblePoint() is _slightly_ out of
bounds of current technology. A lot of what Carmack was discussing, as I
understood it, was simulating this effect as closely as possible. The way to
do that, it seems, is:

    
    
    StartRenderingGameFrameAccordingTo(lastViewerPosition);
    ... // Stuff
    viewerPosition = QuicklyGetViewerPosition();
    viewMatrix = ComputeViewMatrixDelta(lastViewerPosition, viewerPosition);
    FinalizeGPUFrameRenderWithNewViewMatrix(viewMatrix);
    

That final bit is just a perspective transformation of a bunch of rendering
that was already computed and given to the GPU. But if the viewer moves too
quickly, you can easily move somewhere in the world that wasn't actually
rendered, or your perspective could shift such that an object that was once
occluded is now visible, or vice versa. It seems a lot of the complexity is
there.

The last thing he talked about, time warping, seems to be a similar thing only
it's scanline by scanline. So in effect you're saying "hey, video card and
display, I know you're going to force me to draw a whole frame at once, so I'm
going to give you a frame where each scanline gets rendered a little bit into
the future according to where the player is moving."

The effect on a monitor would probably look like a forward shear, but on an
HMD (if done correctly), it would correct for the natural shear caused by
having to "freeze frame" the viewer's perspective for one entire frame instead
of just a scanline.
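As a toy illustration of the scanline-by-scanline idea (purely hypothetical
numbers; no real driver exposes it like this), each line of a frame can be
shifted by an amount proportional to when it will actually hit the display
during scan-out, which on a flat test image produces exactly that forward
shear:

```python
# Toy per-scanline time warp sketch; all constants are illustrative.
# Later scanlines reach the display later, so they get a larger
# corrective shift toward where the head will be pointing by then.

SCANLINES = 4
WIDTH = 8
FRAME_TIME_MS = 16.0     # time to scan out one full frame
YAW_RATE = 0.5           # apparent head motion, in pixels per millisecond

def warp_scanline(line, row):
    t = FRAME_TIME_MS * row / SCANLINES    # when this line is scanned out
    shift = round(YAW_RATE * t)
    # Wrap-around is a simplification; a real implementation would have
    # to deal with the unrendered-region holes instead.
    return [line[(i + shift) % WIDTH] for i in range(WIDTH)]

frame = [[row * WIDTH + col for col in range(WIDTH)] for row in range(SCANLINES)]
warped = [warp_scanline(line, row) for row, line in enumerate(frame)]
```

Row 0 is untouched while each later row shifts further: the shear on a fixed
monitor, or the shear-cancelling correction on an HMD.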

Some of this may be woefully incorrect, but it was how I explained it to
myself. Please correct anything that's wrong or overly simplified.

~~~
T-hawk
That time warping has already been implemented in the Lagless MAME project.
It compensates for input and display lag by always rendering a few frames into
the future. It's commonly used for games where frame-accurate timing is
critical, notably 2D scrolling shmups and fighting games.

Lagless MAME renders into the future assuming that the state of the input
controls remains constant over that future time, and saves the emulation state
every frame. When a button is pressed or released or whatever, Lagless MAME
rewinds to the saved state for that frame and quickly re-emulates from that
point forward. So the result is to send your input back in time past the lag,
to the moment in the emulation exactly synchronized to when you saw it on the
screen. The experience isn't perfect -- your spaceship would jump a few pixels
then move smoothly -- but by and large it's far superior to playing with the
actual lag.
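A minimal sketch of that rewind-and-re-emulate loop, in the spirit of the
rollback technique described above (this is a toy illustration, not actual
MAME code; all names like `RollbackSim` and the toy `step` function are
hypothetical):

```python
# Toy rollback sketch: run ahead predicting inputs stay constant, save
# state every frame, and when a real input arrives for a past frame,
# rewind to that frame's saved state and re-simulate forward.

def step(state, button_held):
    # One deterministic frame of a toy "game": the ship accelerates
    # while the button is held, otherwise it coasts.
    x, v = state
    if button_held:
        v += 1
    return (x + v, v)

class RollbackSim:
    def __init__(self):
        self.state = (0, 0)              # (position, velocity)
        self.inputs = {}                 # frame -> real input, once known
        self.history = {0: self.state}   # saved state at start of each frame
        self.frame = 0

    def advance(self, predicted_input=False):
        # Run one frame ahead, predicting the input stays constant.
        self.state = step(self.state, self.inputs.get(self.frame, predicted_input))
        self.frame += 1
        self.history[self.frame] = self.state

    def input_at(self, frame, button_held):
        # A real input arrives for a past frame: rewind and re-simulate,
        # effectively sending the input back in time past the lag.
        self.inputs[frame] = button_held
        self.state = self.history[frame]
        for f in range(frame, self.frame):
            self.state = step(self.state, self.inputs.get(f, False))
            self.history[f + 1] = self.state

sim = RollbackSim()
for _ in range(3):            # run ahead, predicting no button presses
    sim.advance()
sim.input_at(1, button_held=True)   # the frame-1 press arrives late
```

After the late input, the present-frame state reflects the press as if it had
landed on time; the visible "jump a few pixels" is the state snapping from the
mispredicted timeline to the corrected one.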

This technique could be used for lag compensation in almost any environment.
The limiting factor is the cost of re-computing several frames of game state
on every input action. Of course, as Carmack says, actually eliminating lag is
far preferable to masking it with such techniques.

The lag compensation in Guitar Hero and Rock Band games works essentially this
way too.

------
cousin_it
It's amazing how much time Carmack spends working around the "added
cleverness" of systems created by people, rather than the inherent
difficulties of the problem. These days it also seems true about programming
in general.

------
Suan
Misread as "Legacy Mitigation Strategies" - that would have been a good read
as well.

------
stesch
Anyone any experience with a Wrap 1200VR from Vuzix?

------
sultezdukes
Is there anybody smarter than Carmack? Maybe Sweeney.

