The speed of sight: Individual variation in critical flicker fusion thresholds (plos.org)
28 points by bookofjoe on April 3, 2024 | 20 comments



If I understand correctly, the test is a single blinking LED. I think that's much harder than detecting flicker during motion.

I can definitely tell 30 vs 60 vs 120 FPS apart when moving a mouse cursor, for example.

For flickering LED light bulbs (120/100 Hz) I usually notice during a rapid head movement that something is off. Or that the light is blindingly bright to look at while, at the same time, what it illuminates doesn't feel as bright as it should. I've learned to assume that this means the light is flickering, and I can easily confirm it by recording a slow-motion video with my phone.


Yup, flicker fusion is the bare minimum. My second-hand knowledge is that we can reliably identify a single flash as short as roughly 1/900 s. We are even subtly conscious of single photons.

What I can say for certain is that the difference between 75 FPS and 60 FPS is the difference between enjoying VR and leaving your lunch on the floor.


It’s not testing flicker fusion threshold unless it’s activating the exact same spot on the retina each time it flickers. When you move, the spots are distributed along a path across your retina.


Yes, I think that the light being stationary is an important distinction, though it's a good thing to control for because motion complicates things.

I came to the comments here because of a flickering effect (different from what they're testing for) I often observe: the tail lights of cars. Sometimes when I move my eyes (not my head) very quickly I will see a dashed trail from the red tail lights of certain cars (I'm assuming they're pulse modulating for some reason). I've been thinking about this in terms of the often-cited phenomenon where our brains 'black out' our vision briefly when moving our eyes. I think this (the brain 'blacking out' our vision) is not what happens, and some others have similar evidence: https://www.science.org/doi/10.1126/sciadv.abf2218
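Back-of-the-envelope, the dashing makes sense if the tail lights are PWM-driven. The saccade speed and PWM frequency in this sketch are assumptions for illustration, not measured values:

    # Rough estimate of the angular spacing of the "dashes" seen when the eye
    # sweeps across a PWM-driven tail light during a saccade.
    saccade_velocity_deg_per_s = 400.0  # assumed peak eye velocity during a saccade
    pwm_frequency_hz = 200.0            # assumed tail-light PWM rate

    dash_spacing_deg = saccade_velocity_deg_per_s / pwm_frequency_hz
    print(f"Spacing between dashes: ~{dash_spacing_deg:.1f} degrees of visual angle")
    # ~2 degrees is easily coarse enough to resolve as separate dashes,
    # so the light smears into a dotted trail rather than a smooth streak.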


I don't think I see a dashed trail, but I will pay more attention. I mostly find that many modern car LED lights are flickering very obnoxiously. And it is worse during motion.


I hate those light bulbs. They blink at, I think, 120 Hz? I don't know, maybe that saves energy since the light is off half the time. And maybe most people can't tell.

As an object moves in sunlight its motion looks smooth to your eyes, but under a blinking light you only see it during the moments that are lit up. This is much more obvious for faster motion.
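A minimal sketch of that sampling effect, using made-up numbers for the object speed and flicker rate:

    # Under a light flickering at 120 Hz, a moving object is only visible at
    # discrete instants, so its apparent motion is a series of jumps.
    object_speed_m_per_s = 2.0  # e.g. a quickly waved hand (illustrative)
    flicker_rate_hz = 120.0     # typical for a cheap bulb on 60 Hz mains

    jump_cm = 100.0 * object_speed_m_per_s / flicker_rate_hz
    print(f"Apparent jump between lit positions: ~{jump_cm:.1f} cm")
    # Faster motion means larger jumps, which is why the strobing is much
    # more obvious for fast-moving objects.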

I’m not 100% certain, but I think this sort of lighting triggers bad headaches for me. I get bad headaches sometimes, and I’ve always been pretty sure the lighting was why, but I’ve only recently started confirming with my phone that all the places where I get headaches use blinking overhead lights for the entire room. More and more places around me have adopted them, and it’s awful. I now have tons of slow-motion clips of random locations on my phone, many of which show these lights. I can’t walk through a store that’s using them for very long.


Very possible. I’ve suffered a brain injury that has made me more aware of this kind of discontinuous input. It is definitely more taxing for the brain.

LED lights with poor drivers feel like an act of rudeness to me now.


120 Hz in the US, 100 Hz in the EU. It flickers at twice the mains frequency because the LED driver / power supply electronics are too cheap to include enough capacitance to keep the LED powered while the alternating voltage crosses zero (see the rough sketch below).

I used to get headaches from flickering CFL lights in stores when I was a kid. These days only the worst LEDs make me uncomfortable.

Some LED bulbs will flicker with a really sharp cutoff; those are the worst. Many will flicker, but not all the way down to zero light. This does make a difference.
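Here's a minimal toy simulation of that, assuming a full-wave rectified supply, an ideal diode, and made-up capacitor and load values. It shows both the 2x-mains flicker and why a slightly bigger cap keeps the light from dropping all the way to zero:

    import math

    # Toy model: full-wave rectified mains feeding an LED through a smoothing
    # capacitor. Too little capacitance lets the output collapse toward zero
    # at twice the mains frequency (100/120 Hz), which is the visible flicker.
    MAINS_HZ = 60.0        # 50 Hz in the EU -> 100 Hz flicker
    LOAD_CURRENT_A = 0.02  # assumed LED current
    V_PEAK = 12.0          # assumed driver output peak

    def output_swing(cap_farads, duration_s=0.1, steps=100_000):
        dt = duration_s / steps
        v_cap = 0.0
        v_min, v_max = float("inf"), 0.0
        for i in range(steps):
            t = i * dt
            v_in = abs(V_PEAK * math.sin(2 * math.pi * MAINS_HZ * t))  # rectified input
            if v_in > v_cap:
                v_cap = v_in  # ideal diode charges the capacitor
            else:
                v_cap = max(v_cap - LOAD_CURRENT_A * dt / cap_farads, 0.0)  # load drains it
            if t > duration_s / 2:  # skip the start-up transient
                v_min, v_max = min(v_min, v_cap), max(v_max, v_cap)
        return v_min, v_max

    for cap in (10e-6, 100e-6, 1000e-6):
        lo, hi = output_swing(cap)
        print(f"{cap * 1e6:6.0f} uF: output swings {lo:.1f} V .. {hi:.1f} V")
    # A tiny cap lets the voltage hit zero every half-cycle (harsh flicker);
    # a larger cap rides through the zero crossings so the dip stays shallow.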


This shows the flicker fusion threshold bottoming out just over 60 Hz; that conflicts with the fact that a large fraction of the people I know can see flickering in fluorescent lights and CRTs at 60 Hz. One roommate I had could tell whether my monitor was set below or above 72 Hz (this was for a static image, so unrelated to FPS).


This comes up a lot in an ancient and often heated discussion on game boards about the "optimal" FPS to have in a game - one group saying anything under 100 FPS is "unplayable" for them, and another group saying basically anything over 60 is either imperceptible to the human eye or barely noticeable.

I'm starting to see the pop-science gamer-journalism interpretation of this study as "study proves some humans see at different FPS", with all the 100+ FPS gamers going "AHA! Told you!", when really the results of the study support the ~60 FPS conclusion that's been mainstream for a while (if I'm interpreting correctly). I think the science here is pretty clear: there is variation in human visual temporal resolution, and while this paper is a bit over my head, if I'm understanding correctly the variation isn't much in practical terms (at least at this sample size and in the gamer context - I'm aware they concluded the variation between individuals was large).

TLDR: if you're fussing that you're "only" getting 90 FPS vs 120 or whatever, and genuinely feel like it's affecting your performance - don't worry; it seems very similar to the audiophile stuff to me. There are probably big outliers in this range (I'm thinking specifically of professional baseball players), but perhaps the "perceived" difference is that at 130 FPS your framerate is likely to always stay above your perceptual threshold, while at 70-90 it will occasionally dip below it, causing you to perceive a difference.
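To make that last idea concrete, here's a tiny sketch with an assumed ~60 Hz perceptual threshold and made-up frame-time jitter:

    import random

    # Toy model: a game averaging a given FPS with Gaussian frame-time jitter.
    # Count how often an individual frame is slower than a ~60 Hz threshold.
    THRESHOLD_MS = 1000.0 / 60.0  # assumed perceptual threshold (~16.7 ms)
    JITTER_MS = 4.0               # assumed frame-time standard deviation

    def slow_frame_fraction(avg_fps, frames=100_000, seed=0):
        rng = random.Random(seed)
        mean_ms = 1000.0 / avg_fps
        slow = sum(rng.gauss(mean_ms, JITTER_MS) > THRESHOLD_MS
                   for _ in range(frames))
        return slow / frames

    for fps in (75, 90, 120, 144):
        print(f"{fps:>3} FPS average: {slow_frame_fraction(fps):5.1%} "
              f"of frames slower than 60 Hz")
    # At 120+ FPS essentially the whole distribution stays above the threshold;
    # at 75-90 FPS a noticeable fraction of frames dips below it, which could
    # be what people actually perceive.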


Gamers have testably higher in-game performance at higher framerates.

https://www.youtube.com/watch?v=OX31kZbAXsA


The US military has done research to find the optimum refresh rate for pilots, and their conclusion was the same: beyond 60 FPS there is no benefit.

There was other research which discussed a latency in the brain of around 15 ms. Even if the visual system could detect a change, not much could be done with it due to the inherent latency.


>There was other research which discussed a latency in the brain of around 15 ms. Even if the visual system could detect a change, not much could be done with it due to the inherent latency.

That claim does not support that conclusion. 15 ms is roughly the frame time at 60 fps, meaning framerate potentially accounts for a significant and reducible portion of overall latency.
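For reference, the frame-time arithmetic:

    # Frame time at common refresh rates, next to the quoted ~15 ms brain latency.
    for fps in (30, 60, 120, 240):
        print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")
    # 60 FPS is ~16.7 ms per frame, i.e. comparable to the quoted 15 ms, so the
    # frame time itself is a significant and reducible slice of total latency.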


A 45,000-pound airplane has a slower response time than a cursor on a videogame.


The research was in relation to HUDs for fighter aircraft, from what I recall. Nothing to do with how the aircraft responds; rather, it was about identifying targets on screen.


Note that hearing and touch have higher "frame" rates.


It's a bit difficult to extend conclusions from this critical flicker fusion paper to the entire visual field and to the detection of changes in position or appearance.

My takeaway from the paper is that there is variation in the CFF across individuals for foveated visual targets that are much brighter than the surround. The effects for targets that are dimmer, lower contrast, coloured, or peripherally placed remain to be seen.


I’m quite happy to have grown up in the N64 era; I’m totally happy with 30 FPS. And you can get some sweet effects and high resolution on a cheap card at 30 FPS.

OTOH, some games seem to add, like, a frame or two of delay between input and response? Whatever it is, it feels really annoying. But I’ve played plenty of games that feel perfectly tight and responsive at 30 FPS, so I think there must be something else wrong. I’m sure it could be fixed with a higher frame rate, but I’d rather not resort to that.


To complexify things a bit more, there's also a difference in testing/implications between:

1. Someone can detect a difference between rate X and Y.

2. Someone can detect that X is lower than Y.

3. Someone receives information better when increasing from X to Y.

It's self-evident that there are values of X and Y where those are all true, but it's not certain that they will all become false under the same conditions.


This is fascinating. I often wonder about the different "clock speeds" of various functions in the brain and the mechanisms in play to keep these functions in sync.

Tangentially, I wonder if the brain's "FPS" (sample rate?) in hearing and processing audio differs between individuals in a similar way.



