Super Smash Bros. Melee: a game where input lag really matters (and the reason we still play on CRTs) not only has several frames of input lag, but the lag isn't even a constant number of frames; it can range from 2 to 5 frames.
Tetris: The Grand Master series: another game where input lag has a huge impact. It not only has several frames of input lag (people have resorted to all sorts of ways to reduce it, including AHK scripts that constantly move the mouse on Windows XP), but the first two games in the series don't even run at 60Hz. TGM1 runs at a slightly lower rate, which isn't really significant, but TGM2+ runs at 62.68Hz, which is quite significant and makes some of the challenges a tad harder.
Both of these communities took latency measurement a few steps further than what can be done with isitsnappy: they connected LEDs to the buttons so it was easier to tell precisely in which frame a button was pressed.
Someone in the Melee community also placed photoresistors close to the screen and used an oscilloscope to know exactly when the brightness changed.
Not exactly the most relevant anecdotes, but I felt like sharing.
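For anyone wanting to automate the LED approach on a recorded video, here is a rough sketch of scanning for the first frame where the LED region lights up. It assumes OpenCV, and the filename, coordinates, and brightness threshold are all made up:

    import cv2

    cap = cv2.VideoCapture("capture.mp4")   # hypothetical capture file
    x, y, w, h = 100, 200, 20, 20           # made-up region covering the button LED
    frame_no = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        roi = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
        if roi.mean() > 200:                # LED on: region far brighter than baseline
            print("button pressed at frame", frame_no)
            break
        frame_no += 1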
I tried to measure the latency of my own system a little while back. I used a digital camera to record video at 240 fps and measured the time it took for a button press on a DS4 connected over Bluetooth to be reflected on a Mega Drive emulator running the 240p test suite. I can't remember the exact latency, but I think it was around 80ms, which is okay, though there is definitely room for improvement.
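The arithmetic behind that technique is straightforward; with illustrative numbers (the actual frame count from that session isn't given above):

    camera_fps = 240
    frames_elapsed = 19   # frames counted between the button press and the screen change
    print(frames_elapsed / camera_fps * 1000)   # ~79.2 ms, give or take 1 frame (~4 ms)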
I noticed this myself when I upgraded my wireless Magic Keyboard a few weeks ago. I hadn't noticed any difference between the two keyboards before, and had assumed the new one felt snappier due to its shorter key travel.
This is a misleading and mostly false statement.
Human consciousness requires time for the brain to combine discrete sensory data into what we perceive.
It is NOT instantaneous in real life. There are multiple forms of biological latency, from the maximum speed of electrical impulse propagation through neurons to the "compile time" the brain needs to assemble and modify information into a cohesive output for internal human consumption.
It may seem pointless to point this out, but many sources indicate that the delay between real life and perception is ~80 milliseconds, around a tenth of a second.
These timescales are highly relevant to the discussion of software lag: a program running 60 times per second updates several times during the 80ms of "brain lag".
(Oh, and the reason you can overcome that tenth-of-a-second lag and have your hand move in lockstep with reality is a variety of compensation mechanisms, like proprioception, which let you predict where your hand will be in relation to reality and hit that target successfully.)
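To make the comparison concrete:

    frame_ms = 1000 / 60     # one update of a 60 Hz program, ~16.7 ms
    brain_lag_ms = 80        # the perceptual delay cited above
    print(brain_lag_ms / frame_ms)   # ~4.8 frames elapse within the 'brain lag'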
I can imagine screenshot test results from your app becoming a standard way to show the latency of a given game setup on internet forums. Ideally, an easily shareable result would show several things all in one image:
- 1 frame before you pressed the button
- the frame when you pressed the button
- 1 frame before something changed on the screen
- the frame when something changed on the screen
- plus timing data, obviously.
Since changes are sometimes quite small (like when Mario jumps but is sort of in the background because you have the joypad in the same shot), one would have to be able to zoom into a part of the screen for each of the four images, and maybe mark a part of the screen with a circle (see the sketch below).
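A rough sketch of how such a shareable composite could be put together, purely as an illustration; the filenames, crop box, and timing text here are all made up:

    from PIL import Image, ImageDraw

    # Hypothetical frame grabs exported from the capture.
    frames = ["press-1.png", "press.png", "change-1.png", "change.png"]
    labels = ["before press", "press", "before change", "change"]
    crop = (400, 300, 656, 492)   # made-up region of interest to zoom into

    tiles = [Image.open(f).crop(crop).resize((512, 384)) for f in frames]
    sheet = Image.new("RGB", (4 * 512, 384 + 48), "black")
    draw = ImageDraw.Draw(sheet)
    for i, (tile, label) in enumerate(zip(tiles, labels)):
        sheet.paste(tile, (i * 512, 0))
        draw.text((i * 512 + 8, 388), label, fill="white")
    # draw.ellipse(...) could additionally circle the changed region
    draw.text((8, 408), "latency: 4 frames @ 240 fps = 16.7 ms (example)", fill="white")
    sheet.save("latency-result.png")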
If you could do these changes, you've got a certified hit on your hands! Keep up the great work!
Better check your video card driver's settings. For NVIDIA on Windows, you can for example alter the number of pre-rendered frames, which IIRC defaults to at least 3. Setting that to 0 and turning off a bunch of other things listed in the settings, decent NVIDIA cards on decent hardware can get down to a latency of 2 frames, which is probably the bare minimum. At 60Hz, with a screen that doesn't add additional latency, that's 33.3ms between the software issuing a render call and the actual output on the screen.
At work we measure latency using an oscilloscope, a photodiode pointed at the (top left of the) screen, and some input event, e.g. from a fast digital output card (less than a millisecond to get an output high from software). In software we set an output high in the piece of code of interest, then just measure the time between that rising edge and the photodiode firing. Using a camera is a neat, though somewhat more manual, process.
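For the software side of that, a minimal sketch of the "set an output high" step, assuming a National Instruments card and the nidaqmx Python library (the comment doesn't name the actual hardware or language used):

    import nidaqmx   # assumption: NI hardware; the real card and library are unspecified

    with nidaqmx.Task() as task:
        task.do_channels.add_do_chan("Dev1/port0/line0")   # hypothetical line name
        task.write(False)    # known-low baseline for the scope trigger
        # ... the code path of interest starts here ...
        task.write(True)     # rising edge; the scope measures from here to the photodiode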
I tested in Notepad with the "Hide pointer while typing" setting enabled, and found that the first thing to happen on screen is the pointer becoming hidden. That happens after 50 ms. The character shows up in Notepad 16.7 ms (exactly one 60 Hz frame) later.
This was measured with a 120 FPS camera and an old 60 Hz monitor.
The otherwise invisible IR LED in the remote lights up when viewed through a phone camera as you press buttons on the remote.
Handy if you suspect the remote is broken, but aren't sure.
Source: I use the front camera to debug one of our products, which uses IR LEDs as a flash to check for obstructions.
I would like to see a visualization of the recorded sound, so that I can set the "input" frame to exactly when I hit the keyboard. I'll even pay to have this accurately aligned.
Good job so far!
I will definitely give this tool a try; it could vastly improve future measurements using the same technique :D
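A sketch of what that audio-based alignment could look like under the hood, assuming the capture's audio has been extracted to 16-bit mono WAV (the filenames, frame rate, and threshold are all assumptions):

    import wave
    import numpy as np

    # Hypothetical file, e.g. extracted with: ffmpeg -i capture.mp4 capture.wav
    with wave.open("capture.wav") as w:
        rate = w.getframerate()
        samples = np.frombuffer(w.readframes(w.getnframes()),
                                dtype=np.int16).astype(np.int32)

    # First sample where the key "click" clearly rises above the noise floor.
    threshold = 0.5 * np.abs(samples).max()
    onset = int(np.argmax(np.abs(samples) > threshold))

    video_fps = 240   # assumed camera frame rate
    print("hit at %.4f s, video frame %d" % (onset / rate, onset / rate * video_fps))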
> The response latency of the real world is zero
I was simply pointing out that even the interactions between nearby atoms propagate at the speed of light, so it should not be called zero. Nothing to do with the rest of the article or the excellent app idea.
I ridiculed a friend when he said he could see a stroboscopic effect from wheels in normal daylight.
But it is actually a real thing which I've experienced myself.
You also have to consider the speed of the neural pathway.
This MSR video from a few years back is also pretty cool wrt touch interface latency: https://www.youtube.com/watch?v=vOvQCPLkPt4
Thanks for this project, it's really cool.
Heh, that's not so surprising to me. There are several vim plugins for Haskell, and one of them had some very slow features that made the latency go into the "many seconds" range :D
The MacBook keyboards one is ridiculous though.
This was mentioned on HN late last year regarding 2016 MacBook compatibility.
Question: what is the standard human physiological "click" time?
How long does it take to make a click? A mouse click takes longer than a screen tap, no?
Sometimes one will hold a mouse click if they are unsure.
Maybe a better way would be to record a scenario and measure the response time on replay?
Cool idea regardless
According to Disney in the 1960s, it takes from 3/4 of a second to a full second. :)
which I mentioned in a comment above.
The app crashes when attempting to delete captures, though.
But maybe UNIX time isn't precise enough for that sort of thing? I actually don't know.
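For what it's worth, plain UNIX time in whole seconds would indeed be too coarse, but operating systems expose much finer clocks; in Python, for instance:

    import time

    print(time.time())              # UNIX time as a float, typically ~microsecond steps
    print(time.perf_counter_ns())   # monotonic high-resolution counter in nanoseconds
    print(time.get_clock_info("time").resolution)   # the platform's actual resolution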