
Carmack on why transatlantic ping is faster than pushing a pixel to the screen - eavc
http://superuser.com/questions/419070/transatlantic-ping-faster-than-sending-a-pixel-to-the-screen/419167#419167
======
CodeCube
I know fanboy gushing isn't really productive. But I'd just like to say that
it's so awesome to live in a time when we can start a topic of conversation
about someone of note, and there's a chance that this individual will join the
conversation personally.

~~~
klmr
I pinged John on Twitter with a link to elicit an answer. I have to agree with
you, having this proximity and immediacy through things like Twitter, Stack
Overflow and services like HN is quite exciting. Just a few years ago this
would have been almost unheard of (exceptions always prove the rule).

~~~
icefox
A few years ago (eek, a decade) a lot of the tech community hung out on
Slashdot, and I remember seeing this same thing happen. We are social
creatures, and before tweeting (I know, shocking) there were other ways, and
there will be new ways when no one tweets anymore.

------
karlbunch
LOL, I used to own a computer game center and had two dedicated T1s, bonded
and traffic-managed. The CS players would complain if the ping time to a
server spiked from 20ms to 25ms and say it was causing them to miss shots. I
reviewed the connections for jitter and all sorts of things, and they would
never believe that the 5ms didn't make the difference.

To prove the point I downloaded a simple JavaScript stoplight app that would
measure reaction time and told them that if anyone could beat my times I'd
give them an all-day pass. It never happened... not even once... and the
times were lucky to be in the 210+ms range. 5ms wasn't causing them to miss
the shot.

For those who are interested:

<http://getyourwebsitehere.com/jswb/rttest01.html>
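The arithmetic behind the point above can be sketched in a few lines. A minimal back-of-the-envelope check in Python, where the jitter figure is my own illustrative assumption (human trial-to-trial reaction variance is typically tens of milliseconds), not a measurement from the game center:

```python
# Compare the disputed 5 ms ping spike against human reaction time.
# All figures are illustrative, taken from the anecdote above or assumed.
ping_delta_ms = 25 - 20      # the spike the players complained about
reaction_ms = 210            # roughly the best times on the stoplight test
trial_jitter_ms = 30         # assumed human trial-to-trial variation

print(f"ping delta: {ping_delta_ms} ms")
print(f"as a fraction of reaction time: {ping_delta_ms / reaction_ms:.1%}")
print(f"vs. assumed human jitter:       {ping_delta_ms / trial_jitter_ms:.1%}")
```

The 5 ms delta is a few percent of the reaction time and well inside normal human variation, which is the karlbunch argument in numeric form.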

~~~
sosuke
Ha, I love stuff like that test. I couldn't break 210ms though; 213ms was my
best.

~~~
to3m
You should hold the key down. I got 10ms average.

------
stephengillie
Carmack probably read the Anandtech article (from 2009) on this topic:
<http://www.anandtech.com/print/2803>

So the short answer is: it takes that long to go through _all_ of the
processors (input controller/keyb/mouse, usb, cpu, processing latency, gpu,
more processing latency, RAMDAC/digital output, LCD pre-processing, LCD
output, LCD post-processing, pixel transistor)

He doesn't mention using "game mode" on that display - maybe it doesn't have
one. Game mode is supposed to cut pre- and post-processing to reduce LCD
latency, exactly for this reason.

Also note that it takes about that long for a signal to hit your eye, go
through your brain, trigger an action, and travel down your arm into your
finger: ~113ms.

(How does that line up with 100ms game tick cycles? I don't know.)
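The stage list above can be turned into a rough end-to-end budget. A hypothetical sketch in Python; the USB figure is Carmack's estimate from the thread, the cable/processing/response figures echo the Anandtech-style numbers quoted later, and the two 16 ms entries are my own assumed worst-case 60 Hz ticks:

```python
# Hypothetical latency budget for the input-to-photon pipeline above.
# Every figure is an illustrative assumption, not a measurement.
stages_ms = {
    "USB HID polling/processing": 8,   # Carmack's estimate in the thread
    "game loop / CPU processing": 16,  # assumed: one 60 Hz tick, worst case
    "GPU render + buffer swap":   16,  # assumed: another 60 Hz frame
    "video cable transmission":   17,  # ~one 60 Hz frame on the wire
    "LCD pre/post-processing":    17,
    "LCD pixel response":          4,
}
total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:28s} {ms:3d} ms")
print(f"{'total':28s} {total:3d} ms")
```

Even with optimistic per-stage numbers the sum lands in the tens of milliseconds, which is why the transatlantic comparison holds.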

~~~
angersock
If you read the link, he actually describes his methodology:

 _"The time to send a packet to a remote host is half the time reported by
ping, which measures a round trip time.

The display I was measuring was a Sony HMZ-T1 head mounted display connected
to a PC.

To measure display latency, I have a small program that sits in a spin loop
polling a game controller, doing a clear to a different color and swapping
buffers whenever a button is pressed. I video record showing both the game
controller and the screen with a 240 fps camera, then count the number of
frames between the button being pressed and the screen starting to show a
change.

The game controller updates at 250 hz, but there is no direct way to measure
the latency on the input path (I wish I could still wire things to a parallel
port and use in/out asm instructions). As a control experiment, I do the same
test on an old CRT display with a 170hz vertical retrace. Aero and multimon
can introduce extra latency, but under optimal conditions you will usually see
a color change starting at some point on the screen (vsync disabled) two 240
hz frames after the button goes down. It seems there is 8ms or so of latency
going through the USB HID processing, but I would like to nail this down
better in the future.

It is not uncommon to see desktop LCD monitors take 10+ 240hz frames to show a
change on the screen. The Sony HMZ averaged around 18 frames, or 70+ total
milliseconds."_
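Carmack's measurement reduces to counting 240 fps camera frames between the button press and the first screen change. A minimal sketch of that conversion, assuming only the frame counts quoted above:

```python
def frames_to_ms(frames, camera_fps=240):
    """Convert a count of high-speed camera frames to milliseconds."""
    return frames * 1000.0 / camera_fps

# Two camera frames (his best-case CRT result) is ~8.3 ms;
# the Sony HMZ's ~18 frames works out to 75 ms, his "70+ total ms".
print(frames_to_ms(2))
print(frames_to_ms(18))
```

Note the quantization: each camera frame is ~4.2 ms, so any single measurement carries that much uncertainty on both the press and the change.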

~~~
stephengillie
This agrees with Anand's results - with vsync disabled and very fast input
processing, you could have about 17ms for transmission from the video card (in
the video _cable_), another 17ms for processing, and 4ms more for LCD
response, giving the ~40ms that 10 frames would give.

(1/240 sec = 0.075 sec / 18 frames ≈ 4.1667 ms per frame, i.e. ~75 ms for 18 frames)

~~~
Terretta
This is why I play Geometry Wars on a Sony Trinitron CRT TV.

> _"you will usually see a color change starting at some point on the screen
> (vsync disabled) two 240 hz frames after the button goes down"_

That's less than 1/100th of a second from the button press to the screen.

------
driverdan
I wonder what the margin of error is on marking when the button push is made.
Controller button range of travel makes it nearly impossible to know exactly
when the electrical connection takes place from a video.

I think it'd be better to touch wires together or something more obvious to
reduce the margin of error.

~~~
nrp
It probably depends on how the button is debounced as well. Software and
physical debounces are both generally tuned for at least a few milliseconds
and potentially over ten milliseconds of latency.

A better trigger could be built by interfacing a microcontroller over USB HID
and having it send a simulated keystroke or button press at the same time as
turning on an LED.
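The few-to-ten-millisecond debounce delay mentioned above comes from requiring the contact to read stable for several consecutive samples before a press is accepted. A minimal counter-based software debounce sketch (the sample data and thresholds are illustrative assumptions):

```python
def debounce(samples, stable_count=5):
    """Accept a press only after `stable_count` consecutive pressed samples.

    With 1 ms sampling, stable_count=5 builds in ~5 ms of latency by
    design, which is the tuning trade-off discussed above.
    """
    run = 0
    for i, pressed in enumerate(samples):
        run = run + 1 if pressed else 0
        if run == stable_count:
            return i  # index of the sample where the press is accepted
    return None

# A bouncy press: the contact chatters for a few samples, then holds.
bouncy = [1, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1]
print(debounce(bouncy))
```

Every bounce resets the counter, so a switch that chatters for 20-25 ms (as joezydeco describes below) pushes the accepted press out by at least that long.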

~~~
joezydeco
I think you're being generous. I've had microswitches that needed a good 20-25
ms to fully debounce - good quality sealed switches (like Cherry) too, not
badly plated counterfeit stuff.

------
gavanwoolery
Maybe I missed it, but he did not seem to go into detail as to how the pixel
was plotted. If the pixel was placed in a buffer, and then a texture was
locked and the pixel was transferred, this would obviously induce significant
overhead on its own (in short, any transfer between CPU and GPU memory
induces a huge lag)...although Carmack knows this better than anyone else.

Also, the actual pressing of the button could use more clarification. When did
measurement start? On the button down signal? On the button up signal? As soon
as the finger touches the button?

~~~
jra101
He is doing a clear which is pretty much the fastest path in hardware.

~~~
gavanwoolery
Oops, I did miss that. I would argue that a clear is actually not the fastest
method -- it would be faster to keep two buffers of different colors (pre-
rendered) and swap them on key press...although the difference should be
pretty trivial ;)

~~~
rys
You're right, clear can sometimes be quite slow on some hardware. Fast clears
tend to be limited to all 0s or all 1s, and only full screen. In the worst
case, the hardware has no support at all and has to set up and rasterise a
full-screen quad and send that down the pipeline.

------
zobzu
In fast FPSes this makes a huge difference regardless of your ping, even in
game mode on some monitor/video card setups. You only see it well during LANs,
when the guy sees you half a second to a full second BEFORE you see him.

Then you realize no one was cheating; your hardware is just crap ;-)

------
robomartin
I am not really clear on the measurement methodology. Why is a game controller
involved?

Did he trigger the ping from the same game controller button?

Did the game controller trigger the frame buffer swap on the graphics card?

At 240Hz the lower-right pixel on an LCD will be painted about 4ms after the
upper-left pixel. Where were the measurements taken?

Yes, LCDs are scanned devices, just like CRTs; you don't see it because of
the slow response time.

Lastly, I didn't see any numbers. Where are the measurements?

------
zeteo
The total length of wire in a microprocessor is currently on the order of
100km. A signal doesn't have to go through all of it, but processing often
involves loops etc. Also considering the delays at logic gates (waiting for
clock signals), it's not that surprising that a speed-of-light signal needing
complex processing may take longer inside a microprocessor than on a straight
journey of a few thousand miles across the ocean.
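The ocean leg itself is easy to bound. A quick sketch, assuming an illustrative ~5,500 km transatlantic path and light propagating at roughly two-thirds of c in optical fiber (both assumed figures, not route data):

```python
# One-way propagation time for light in fiber across the Atlantic.
# Illustrative assumptions: ~5,500 km path, ~2e8 m/s in glass (~2/3 c).
distance_km = 5500
speed_km_per_s = 200_000
one_way_ms = distance_km / speed_km_per_s * 1000
print(f"{one_way_ms:.1f} ms one way")
```

So raw propagation is on the order of 30 ms one way; the rest of a transatlantic ping is routing and processing, which is exactly the comparison being made in the headline.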

~~~
phillmv
Uh, the packet still has to traverse the same loops of microprocessors. More
loops, I would wager, than just sending some bits to a display.

~~~
zeteo
I didn't say that network packets take no processing, just that the processing
path matters. Contrary to your intuition, this path is significantly shorter
for network packets, and an Ethernet board (first produced in the 1980s) is a
much simpler circuit than the latest GPUs.

~~~
jsprinkles
Transoceanic cables aren't just Ethernet boards on both ends. There's
extensive, extensive gear on either end as well as at the peering points
between you and the destination. I'd wager that the same packet traverses
tens of microprocessors between you and the destination, so your point is
questionable.

Just adding a router can add up to 30ms to a hop, if it sucks.

