30 fps vs. 60 fps: Am I crazy for not seeing a difference?
6 points by ldd on Jan 17, 2015 | 17 comments
The only time someone has ever insulted me for my beliefs was when I claimed that I could honestly not see the difference between 30fps and 60fps. Now, either I am lying, you are lying, or neither of us is lying. Could it be possible that, just like colour blindness, some people are just incapable of seeing a difference? I am asking now in the hopes that the discussion remains civil.


I would think you have something significantly wrong with your visual system if you can't see the difference. Then again, people have gone "blind" while having no physical defect in their eyes, so it is possible you really can't see it.

I don't think you should be insulted for making this claim, but there are a lot of trolls out there that make it. So it is a bit like getting annoyed by a bunch of trolls running around claiming the sky is green and then running into someone who really sees the sky as green.

I've included a link to a demo that to me shows clearly the difference in frame rate.

http://www.30vs60fps.com/


Interesting demo. When I took a quick look, I couldn't see the difference. Then, after it had been going for a couple of minutes and I just sat back and toggled the button while staring at the screen, I could see the difference as clear as water and sand.

That makes me think it's a training & focus thing. Much like with MP3s and audio quality: if you concentrate on the quality itself, you can tell the difference; if you're just there for the content, you can't. Most casual web viewers are looking for the content.


Yeah, if you are focusing on points in the background you probably won't notice it (because your eyes are moving along with the image as it moves).

A better demo would probably be something like a fight scene. Close-ups of fast-moving objects show the difference very obviously (a classic mistake of bad cinematographers, which they often try to mask with darkness).


Thanks for your comment. I'll simply say that I am glad you can see the difference, and I acknowledge that there is a difference for you and for the majority of people out there. It just so happens that for me, even after carefully watching the examples, I cannot see a difference.


I'm not sure of any names, but I know there are neuroscientists who study vision and who might be interested in studying you.

The human visual system is fascinatingly complex, so something like this isn't too surprising.


In that example I found it much easier to see the difference in non-fullscreen, and by paying attention to the skyline rather than the bottom half of the screen.


Situation matters sometimes, but you can perceive the difference between 30 fps and 60 fps. Military and commercial aerospace flight simulators learned this early on and they MUST hold 60 fps or pilots get sick.

High-end digital television and cinema are having a hard time jumping to 60 fps, even though that would simplify things for many reasons (the rest of consumer electronics already works at that rate); traditional cinema is locked at 24 fps. There are various stories around, which you should be able to find, of cinema experimenting with 60 fps (or even 48 fps), or of televisions that automatically interpolate the frame rate upward: consumers complain that it looks "cheap" because it matches what they see on YouTube, their home video camera, or the evening news. People have been conditioned to accept 24 fps as high-quality cinema, so they think higher frame rates look "cheap" when in reality they are getting more fidelity. (You should be able to find old reviews of The Hobbit in which people compare it to the evening news.)

To your original point though: in these cases people were able to distinguish the different frame rates. The difference just doesn't necessarily register as "frame rate" to people, or, paradoxically, as necessarily better.


Nope, not crazy. 24fps has long been the standard for "as smooth as real life" motion in the movie & TV industries. The human eye needs a minimum of about 12 fps to perceive images as smooth motion. I used to do 2D Javascript games at 15 fps and while it clearly wasn't cinema-quality motion, it was perfectly playable.
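For illustration, here's a minimal sketch of what such a frame-capped loop can look like in browser JavaScript. The canvas id and the moving square are made up for the example; requestAnimationFrame fires at the display rate, and we simply skip redraws until 1/15 s has passed:

    // Render loop capped at 15 fps. Assumes a <canvas id="game"> exists.
    const TARGET_FPS = 15;
    const FRAME_MS = 1000 / TARGET_FPS;

    const canvas = document.getElementById('game'); // hypothetical element
    const ctx = canvas.getContext('2d');

    let x = 0;
    let last = performance.now();

    function tick(now) {
      if (now - last >= FRAME_MS) {
        last = now;
        ctx.clearRect(0, 0, canvas.width, canvas.height);
        ctx.fillRect(x % canvas.width, 20, 10, 10); // simple moving square
        x += 4;
      }
      requestAnimationFrame(tick); // called at display rate, draws at 15 fps
    }
    requestAnimationFrame(tick);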

The reason 60fps has become such a buzzword lately is that most laptop & phone screens refresh at 60Hz, so you physically can't do better than that. At that frame rate, the electronics of the display become the bottleneck rather than your CPU/GPU power, and most engineers would rather point to an engineering limit as the bottleneck than deal with the fuzziness of human perception.
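That cap is easy to observe for yourself. Here's a rough sketch (my own, using only standard browser APIs) that counts requestAnimationFrame callbacks per second; on a 60Hz panel it should print roughly 60 no matter how fast the machine is:

    // Count animation-frame callbacks per second. The browser won't
    // paint faster than the display refreshes, so a 60 Hz screen
    // yields roughly 60 here regardless of CPU/GPU headroom.
    let frames = 0;
    let windowStart = performance.now();

    function count(now) {
      frames++;
      if (now - windowStart >= 1000) {
        console.log('~' + frames + ' fps');
        frames = 0;
        windowStart = now;
      }
      requestAnimationFrame(count);
    }
    requestAnimationFrame(count);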


>24fps has long been the standard for "as smooth as real life" motion in the movie & TV industries.

Which is fine only for non-action scenes; film also has motion blur that makes movement look smoother.

Action scenes in games (especially without pseudo motion blur) look significantly better at higher than 30 fps.

24 fps was also chosen as "the minimum we can get away with", so I see no reason not to increase it when you don't have to pay for film stock.


That 12 fps number is absolutely arbitrary. There is no threshold that changes between 11 and 12. This topic always seems to attract so many myths.

Even the difference between 110 and 120 fps is clearly visible to most people. And while sensitivity may vary, I doubt that anyone would be unable to instantly see the difference between 30 and 60. How was this tested? Are you sure the frames were actually being delivered?
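For what it's worth, one way to sanity-check delivery in a browser demo is to log late frames. This is just a sketch assuming a nominal 60Hz display:

    // Flag frame intervals noticeably longer than the nominal
    // 1000/60 ms. If a "60 fps" demo logs many of these, the frames
    // were not actually being delivered at 60 fps.
    const NOMINAL_MS = 1000 / 60;
    let prev = performance.now();

    function check(now) {
      const dt = now - prev;
      prev = now;
      if (dt > NOMINAL_MS * 1.5) {
        console.log('late/dropped frame: ' + dt.toFixed(1) + ' ms');
      }
      requestAnimationFrame(check);
    }
    requestAnimationFrame(check);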


Not really an answer, but the flicker fusion frequency varies between subjects and depends hugely on both the subject's physiological state and the kind of flicker. http://en.m.wikipedia.org/wiki/Flicker_fusion_threshold lists seven factors (size, color, where it occurs on the retina, contrast, etc.).

http://www.scholarpedia.org/article/User:Eugene_M._Izhikevic... shows curves illustrating the effect of several of these factors.


I am sure that people have different refresh times in their eyes. I don't consume a lot of video, so the only time I remember noticing a poor video frame rate was when watching Up! in 3-D. I think the theatre just divided the 30fps between the two eyes, and panning shots bothered me so much I had to close my eyes a few times.

In everyday life I am constantly bothered by LED tail lights and cheap video projectors: I clearly see the individual blinks of the LEDs and the color cycling of the projectors. I find it quite distracting.

Most people can't see what I am talking about, even when I describe it and they go looking for it.


As someone who sees the difference like night and day, I imagine there must be something that varies per individual... though part of me still feels like you must be lying. So I totally understand where you're coming from.


As I told another user, I am very aware that mine is a minority position, and I must deduce that most people can in fact see a difference (there is no reason for everyone in this thread to lie to a complete stranger). And yeah, I can see why you'd think I am lying, but all I can say is that my experience is what it is.


Do you consume much media? I ask because I used to be similar to you; I never used to be able to distinguish 30fps from 60fps, but I find that the more media I consume, the more apparent the difference becomes to me.

Aside from that, one of the videos where the difference is clearest to me is: https://www.youtube.com/watch?v=C-dOuBcxMlk - the commentary in that video isn't very constructive, but I see a rather large difference between the gameplay clips.


Honestly, I didn't see a difference between the two. I'm sure that 60fps is better, but I also just don't "see" it.


I couldn't see it during the Hearthstone example, but it was abundantly clear during the Counter-Strike one.



