
Excellent article.

Thinking about it, 24 fps (good old-fashioned film), even though laughably low by the standards of modern displays, manages >1 frame per perceptual time-quantum, which is enough to give the illusion of motion. Any lower than that, and you start getting "frames" where there's no change in the film frame, and the motion illusion breaks down.




That's not quite correct.

a) 24 FPS is easily detected as flickering; our true flicker fusion frequency is around 60-80 Hz.

b) Perceptual time discretisation heavily depends on the modality and even the task in question. For motion extraction, this "limit" may be much lower or higher than for olfactory tasks or memory retrieval. Overall, it's really hard to put a time delta on neural processes -- after all, biological systems are more or less continuous.


> 24 FPS is easily detected as flickering; our true flicker fusion frequency is around 60-80 Hz.

Shooter player here. This is a laughable statement. Pretty much anybody can tell the difference between 60 Hz and 120 Hz - even the mouse cursor moves more smoothly. Good players try to play at rates higher than 120 Hz, because it does make a difference.


Those statements are not necessarily in contradiction. When I quickly move the mouse in a circle on my computer, I see the pointer displayed at several locations quite far apart, seemingly all at once. It only looks simultaneous because of temporal fusion.

If the mouse were moving continuously you would be able to see the full trace of its path and not just a few discrete locations along the way. A higher frequency would get closer to this.

Also, in an interactive scenario, the frame rate affects how quickly you get feedback on your actions. At 60 Hz, each frame takes ~16.7 ms, so the next frame is on average 8.3 ms away. If you perform an in-game action, you get visual feedback in ~8.3 ms on average, whereas at 120 Hz it comes in ~4.2 ms. There is probably some kind of buffering, so your action will likely not end up in the currently drawn frame but the one after that. The response then comes, on average, 3/2 frames after your action: 25 ms later at 60 Hz vs. 12.5 ms at 120 Hz. A ~12.5 ms faster response should be noticeable.
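As a rough back-of-the-envelope sketch (Python; the 3/2-frames-of-buffering figure is the assumption from the paragraph above, not a measured value):

    # Average input-to-photon latency under the buffering assumption
    # above: the action lands in the frame after next, i.e. 3/2 frames
    # on average. Numbers are illustrative, not measured.
    def avg_latency_ms(refresh_hz, frames_of_buffering=1.5):
        frame_time_ms = 1000.0 / refresh_hz  # duration of one frame in ms
        return frames_of_buffering * frame_time_ms

    for hz in (60, 120, 144):
        print(f"{hz:>3} Hz -> ~{avg_latency_ms(hz):.1f} ms to visual feedback")
    #  60 Hz -> ~25.0 ms
    # 120 Hz -> ~12.5 ms
    # 144 Hz -> ~10.4 ms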


Hmmm, their statement doesn't say that nobody can tell the difference, only that some people can't: those for whom the value is 60 Hz or less. What it does say is that around 80 Hz becomes indistinguishable from anything higher.


Likely because of how the various parts of a game engine interact.


No. I'm telling you, get yourself a true 120 Hz or 144 Hz monitor, and you will see the difference for yourself.


Or turn off vsync...


I imagine they both do something. VSync, however, does not change the refresh rate of the pixels on the screen; it changes the refresh rate of the pixels in the buffer. Monitors can only update each pixel at a given rate, and that's determined in hardware first, software second.


I think that's why they said 60 Hz to 120 Hz: going up by a whole factor actually changes the true refresh frequency.


Only with motion blur. Movies seem fluid even at 18 fps because the motion is so blurred. If you try to play a video game without motion blur at 18 fps it just looks awful.

It also depends on darkness and color, so I really doubt the eye's fps has anything to do with the brain's perception of time.

Here's a great article on this subject: http://www.100fps.com/how_many_frames_can_humans_see.htm


For those who have a hard time seeing why this matters, it can be thought of as a matter of discrete sampling versus integration.

Games are essentially sampled screenshots, with virtually no hint of what happened between samples. In one frame an item is at coordinate X; in the next, it is at Y.

Movies, on the other hand, are not discrete samples of time. If an item moved from point X to point Y between frames A and B, it leaves a blur between them.

That is, the information is distorted, but it is there, unlike with discrete sampling, where the information is simply lost. A toy example follows.
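Here's a small sketch of that difference (Python; the motion function and pixel numbers are made up for illustration):

    # Toy model: an object sweeps across the screen during one 24 fps
    # frame (~41.7 ms). A game-style sample records a single instant;
    # a film-style exposure accumulates everywhere the object was while
    # the shutter stayed open.
    FRAME_MS = 1000.0 / 24.0

    def position(t_ms):
        return 10.0 * t_ms  # hypothetical linear motion, px per ms

    # Game: one discrete sample at the start of the frame.
    game_sample = position(0.0)

    # Film: accumulate the motion across the open-shutter interval.
    steps = 100
    film_trace = [position(FRAME_MS * i / steps) for i in range(steps)]

    print(f"game frame: object at {game_sample:.0f} px, nothing in between")
    print(f"film frame: smear from {min(film_trace):.0f} to {max(film_trace):.0f} px")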


That's only true if the camera keeps the shutter open for the full interval of the frame.

Shutter speed is different from framerate.

In bright conditions, you won't end up with (much) motion blur.
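To make the distinction concrete, here's a quick sketch (Python; the 180-degree shutter is the standard film convention, the other angle is just an example):

    # Shutter angle sets what fraction of each frame interval the
    # shutter is open; blur scales with that exposure time, not with fps.
    def exposure_ms(fps, shutter_angle_deg):
        return (shutter_angle_deg / 360.0) * 1000.0 / fps

    print(exposure_ms(24, 180))  # classic film look: ~20.8 ms of blur per frame
    print(exposure_ms(24, 45))   # bright scene, fast shutter: ~5.2 ms, far less blur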


Right, I hadn't meant to say that you keep all of the information, just that you don't typically lose all of it, in contrast with perfectly discrete samples.


With the old-fashioned film you are probably referring to, projectors are designed to flash each image twice or even three times, meaning that a 24 fps movie has a flicker rate of 48 or 72 Hz[0].

The takeaway is that the perceptual frame rate for brightness is higher than the perceptual frame rate for motion.

[0] http://en.wikipedia.org/wiki/Movie_projector#Shutter
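As a quick worked example of that (Python; the 2x/3x flash counts come from the projector article above), flashing multiplies the flicker rate without changing the motion rate:

    # Double/triple-flash projection: motion stays at 24 fps, but the
    # light flickers at 24 * flashes_per_frame Hz, pushing it above the
    # 60-80 Hz fusion range mentioned earlier in the thread.
    for flashes in (1, 2, 3):
        print(f"{flashes} flash(es)/frame -> {24 * flashes} Hz flicker, motion still 24 fps")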



