I posted the problem on one of the newsgroups at the time and got a reply back from John Carmack himself (which, naturally, fixed the problem pretty quickly).
I remember being extremely excited then, as I'm sure the questioner is now on SO.
You never know who is reading, so don't say anything you couldn't vouch for in real life (which increasingly overlaps with our Internet life).
Which is not unlike how it was when communities were constrained in population. In ancient Athens, for example, you could have (if you lived at the right point in time and weren't a slave) a conversation with the greatest minds of the era, from Socrates to Plato to the mathematicians...
So, a "global village" indeed...
 "According to the Ancient Greek historian Thucydides, the Athenian citizens at the beginning of the Peloponnesian War (5th century BC) numbered 40,000, making with their families a total of 140,000 people in all. The metics, i.e. those who did not have citizen rights and paid for the right to reside in Athens, numbered a further 70,000, whilst slaves were estimated at between 150,000 to 400,000. Hence, approximately a tenth of the population were adult male citizens, eligible to meet and vote in the Assembly and be elected to office." http://en.wikipedia.org/wiki/History_of_Athens#Geographical_...
And the even more contentious question is: how many are in the same place today? Lack of money growing up, and circumstance, can be as brutal a force in denying someone his "possible future" as anything else. How many HN readers are San Francisco natives, and how many are, say, from Mississippi and Alabama combined?
With ~2 million people in jail, some million homeless, and several tens of millions getting by on food coupons and soup kitchens, one of the basic differences now is that we have the luxury (hypocrisy?) of blaming them instead of some institution like slavery.
For those who are interested:
If CS were a game where players just popped up out of nowhere (and then stayed static), that would be somewhat comparable. But when you aim at a player you take lots of things into consideration, and most importantly, aiming at and hitting a player has very little to do with reaction times. Hitting a player who walks into your crosshair can probably be done with a few ms of precision, because we extrapolate the movements and subconsciously even take things like input lag into consideration.
Not saying those 5 ms are important; if anything, a constant 25 ms is better than 20 ms with +5 ms spikes. Reaction times have very little to do with actually hitting the other player. But when you have server-side hit detection, the lag really is important.
Lag correction is key here.
Competitive CS runs at a very high framerate (often around 120 Hz). So at a 20 ms ping, the client is about 2.4 frames behind the server; at 25 ms, it's almost exactly 3 frames behind. That means that, on average, the client needs about 20% fewer frames of lag correction at the lower ping, which is a pretty huge amount for player movement.
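The frame arithmetic above is easy to check; this little sketch just multiplies out the numbers from the comment (120 Hz and the two ping values are taken straight from it):

```python
def frames_behind(ping_ms: float, framerate_hz: float) -> float:
    """Number of rendered frames that elapse while a packet is in flight."""
    return ping_ms / 1000.0 * framerate_hz

# At a competitive 120 Hz framerate:
print(round(frames_behind(20, 120), 2))  # 2.4 frames behind
print(round(frames_behind(25, 120), 2))  # 3.0 frames behind
```

The 20% figure then falls out as (3.0 - 2.4) / 3.0.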
A better example is mixing and recording music. Performers are recorded individually playing along to music they hear in their headphones.
Acceptable latency is under 20ms. http://www.soundonsound.com/sos/jan05/articles/pcmusician.ht...
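A quick sketch of why buffer size dominates that 20 ms figure in recording software. The buffer sizes are just typical illustrative values, not from the linked article:

```python
def buffer_latency_ms(buffer_frames: int, sample_rate_hz: int) -> float:
    """One-way latency added by an audio buffer of the given size."""
    return buffer_frames / sample_rate_hz * 1000.0

# Common buffer sizes at 44.1 kHz (sizes are illustrative):
for frames in (128, 256, 512, 1024):
    ms = buffer_latency_ms(frames, 44100)
    print(f"{frames:5d} frames -> {ms:5.1f} ms")
```

A 1024-frame buffer alone already exceeds the 20 ms budget, before any other latency in the chain.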
Now, to expand this scenario, add a random delay between each light: first 10 ms, then 15 ms, 50 ms, etc. Then see how well a person can click when that last light lights up. This could be a better test for measuring "human response time vs. jitter," or whatever the appropriate term is for what we subconsciously compensate for in FPS games.
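A toy model of that test (entirely hypothetical numbers and a deliberately naive "player" who expects the next light after the average interval): with zero jitter the player's anticipation is perfect, and the measured "reaction time" grows with the jitter alone.

```python
import random

def mean_anticipation_error_ms(base_interval_ms: float, jitter_ms: float,
                               trials: int = 10_000, seed: int = 0) -> float:
    """Toy model: the player expects the next light after the mean interval;
    uniform random jitter shifts the actual time, and the absolute difference
    is the part of the measured 'reaction time' caused purely by jitter."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        actual = base_interval_ms + rng.uniform(-jitter_ms, jitter_ms)
        total += abs(actual - base_interval_ms)
    return total / trials

print(mean_anticipation_error_ms(500, 0))    # 0.0 -- perfect anticipation
print(mean_anticipation_error_ms(500, 50))   # roughly 25 ms of added error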
I can easily tell the difference between 40 ms and 80 ms playing CS, but I doubt my reaction time is that low (not to mention the time it takes my finger to click).
I ran a lot of tests with LCD TV input lag, and 30 ms lag vs. 100 ms lag makes a huge difference in playability. At 120 ms+, FPS games are unplayable.
To be more precise, you know what to expect and when. Even without counting exact seconds, if you see someone running and then disappear behind something, you know when to expect him to enter your view again. Then you've got sound, which additionally helps with orientation. Then there's the time between spotting something and getting the correct aim: while you watch the target move, you're aligning your aim with the moving object and deciding when they'll meet, rather than just measuring a response to an impulse.
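That kind of anticipation is essentially dead reckoning. A minimal sketch (the positions, velocity, and latency are made-up illustrative numbers; real games do something similar server-side as part of lag compensation):

```python
def extrapolate_position(pos: tuple, vel: tuple, latency_s: float) -> tuple:
    """Dead-reckon where a target will be after latency_s seconds,
    assuming constant velocity -- a crude version of the prediction
    players do subconsciously when leading a moving target."""
    return tuple(p + v * latency_s for p, v in zip(pos, vel))

# Target at (10, 0) moving 250 units/s along x; 25 ms of total lag:
print(extrapolate_position((10.0, 0.0), (250.0, 0.0), 0.025))  # about (16.25, 0.0)
```

The point is that you aim at the extrapolated position, not the rendered one, which is why raw reaction time matters less than consistent latency.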
You should return false from your event handler to prevent that (but only do that during an actual test, not all the time). Or remove the focus from the start over button.
I also tried not focusing on it (aren't the rod cells at the edge of your vision faster at detecting movement than the cone cells in the center?), but it didn't amount to much.
Or maybe you think I'm lying about .007, which is a score I actually got...
Either way, lighten up, HN...
So the short answer is: it takes that long to go through all of the processing stages (input controller/keyboard/mouse, USB, CPU, processing latency, GPU, more processing latency, RAMDAC/digital output, LCD pre-processing, LCD output, LCD post-processing, pixel transistor).
He doesn't mention using "game mode" on that display; maybe it doesn't have one. Game mode is supposed to cut pre- and post-processing to minimize LCD latency, exactly for this reason.
Also note it takes about that long for the signal to hit your eye, go through your brain, trip the switch that fires an action, and travel down your arm into your finger: ~113 ms.
(How does that line up with 100ms game tick cycles? I don't know.)
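Adding the whole chain up gives a sense of the end-to-end budget. All the per-stage numbers below are rough assumptions for illustration, not measurements; only the ~113 ms human figure comes from the comment above:

```python
# Illustrative per-stage latencies in ms for the pipeline listed above.
# These are rough assumptions, NOT measured values.
pipeline_ms = {
    "input controller / USB polling": 8,
    "game processing (~1 frame @ 60 Hz)": 16,
    "GPU render + buffer swap": 16,
    "LCD pre/post-processing": 20,
    "pixel response": 10,
}
machine_total = sum(pipeline_ms.values())
human_ms = 113  # eye -> brain -> arm -> finger, per the comment above

print(f"machine pipeline: {machine_total} ms")
print(f"human reaction:   {human_ms} ms")
print(f"end to end:       {machine_total + human_ms} ms")
```

Under these assumptions the machine side alone is comparable to a 100 ms game tick, which may be why the tick length is hard to notice in isolation.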
"The time to send a packet to a remote host is half the time reported by ping, which measures a round trip time.
The display I was measuring was a Sony HMZ-T1 head mounted display connected to a PC.
To measure display latency, I have a small program that sits in a spin loop polling a game controller, doing a clear to a different color and swapping buffers whenever a button is pressed. I video record showing both the game controller and the screen with a 240 fps camera, then count the number of frames between the button being pressed and the screen starting to show a change.
The game controller updates at 250 hz, but there is no direct way to measure the latency on the input path (I wish I could still wire things to a parallel port and use in/out asm instructions). As a control experiment, I do the same test on an old CRT display with a 170hz vertical retrace. Aero and multimon can introduce extra latency, but under optimal conditions you will usually see a color change starting at some point on the screen (vsync disabled) two 240 hz frames after the button goes down. It seems there is 8ms or so of latency going through the USB HID processing, but I would like to nail this down better in the future.
It is not uncommon to see desktop LCD monitors take 10+ 240hz frames to show a change on the screen. The Sony HMZ averaged around 18 frames, or 70+ total milliseconds."
(1 frame at 240 fps = 1/240 s ≈ 4.1667 ms; 18 frames × 4.1667 ms ≈ 75 ms)
> "you will usually see a color change starting at some point on the screen (vsync disabled) two 240 hz frames after the button goes down"
That's less than 1/100th of a second from the button press to the screen.
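The frame-counting method converts to milliseconds trivially; this just redoes the arithmetic from the quote with the 240 fps camera it describes:

```python
CAMERA_FPS = 240

def frames_to_ms(frame_count: int, fps: int = CAMERA_FPS) -> float:
    """Convert a count of high-speed-camera frames to milliseconds."""
    return frame_count * 1000.0 / fps

print(frames_to_ms(2))   # best case in the quote: ~8.3 ms
print(frames_to_ms(10))  # "10+ frames" desktop LCD: ~41.7 ms
print(frames_to_ms(18))  # Sony HMZ average: 75 ms
```

Two camera frames is indeed under 10 ms, which is the "less than 1/100th of a second" figure.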
I think it'd be better to touch wires together or something more obvious to reduce the margin of error.
A better trigger could be built by interfacing a microcontroller over USB HID and having it send a simulated keystroke or button press at the same time as turning on an LED.
The variance on that would be about one frame, which would add up to 4 ms for each test. However, since the input device did not change between experiments, it's a constant error added into every test, making it a wash.
Also, the actual pressing of the button could use more clarification. When did measurement start? On the button down signal? On the button up signal? As soon as the finger touches the button?
Then you realize no one was cheating; your hardware is just crap ;-)
Did he trigger the ping from the same game controller button?
Did the game controller trigger the frame buffer swap on the graphics card?
At 240Hz the lower-right pixel on an LCD will be painted 4ms after the upper-right pixel. Where were the measurements taken?
Yes, LCDs are scanned devices, just like CRTs; you don't see it because of the slow response time.
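The scanout delay at a given screen position can be sketched with a simple model (assuming scanout spreads one frame time evenly over the rows; 240 Hz and 1080 rows are just the figures used above):

```python
def scanout_delay_ms(row: int, total_rows: int, refresh_hz: float) -> float:
    """Time after the start of a refresh at which a given row is painted,
    assuming scanout spreads one frame time evenly over all rows."""
    frame_time_ms = 1000.0 / refresh_hz
    return frame_time_ms * row / total_rows

# On a 240 Hz panel with 1080 rows:
print(scanout_delay_ms(1080, 1080, 240))  # bottom row: ~4.2 ms after the top
print(scanout_delay_ms(540, 1080, 240))   # middle row: ~2.1 ms
```

So where on the screen you sample the color change shifts the measurement by up to a full frame time.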
Lastly, I didn't see any numbers. Where are the measurements?
Just adding a router can add up to 30ms to a hop, if it sucks.
Copper doesn't have much to do with this, rather what is important are the impedance characteristics of the transmission line. 0.66*C is a reasonable number for thin coax cable. Split speaker wire on the other hand is also often copper and has a propagation velocity of 95+% of the speed of light. Generally the dielectric used and configuration of the transmission line is far more important than the type of material used as the conductor.
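The difference those velocity factors make is easy to put in numbers; this sketch just computes one-way delay for a hypothetical 100 m run using the two factors mentioned above:

```python
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def propagation_delay_ns(length_m: float, velocity_factor: float) -> float:
    """One-way delay through a transmission line with the given velocity factor."""
    return length_m / (velocity_factor * C_M_PER_S) * 1e9

# 100 m runs; velocity factors from the comment above:
print(round(propagation_delay_ns(100, 0.66)))  # thin coax: ~505 ns
print(round(propagation_delay_ns(100, 0.95)))  # split speaker wire: ~351 ns
```

Both are a rounding error next to a 30 ms router hop, which is the broader point: geometry and dielectric set the velocity factor, but neither dominates end-to-end latency.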
As for processors, signals still move around on metal wiring; the propagation velocity through Si isn't really important. What matters is the propagation delay of gates, which is less about the speed of the wavefront and more about how long it takes to turn a transistor on or off. While there is some component of propagation velocity involved in this, it's not really significant in comparison to other factors in gate design (see Intel's Tri-Gate).