
However, the audience is used to shutter angle as part of the visual vernacular (e.g., a narrow shutter angle for hyper-edgy rap videos, a wide shutter angle for dreamy retro vibes). If rendered content can't speak the same visual language, a tool is missing, regardless of framerate (up to a point... at 400Hz, I'd be impressed by someone really seeing the difference).

What's interesting about rendered content is that it can extend that vocabulary (e.g., a shutter angle beyond 360 degrees, exposing for longer than a frame's duration), playing with something we thought we had a handle on.




I first learned about shutter angle from reading about its use in Saving Private Ryan; the narrow 45- and 90-degree shutters made the opening scenes much more visceral, with so much flying through the air.

https://cinemashock.org/2012/07/30/45-degree-shutter-in-savi...


That was a really interesting article, thanks!


> At 400Hz, I'd be impressed by someone really seeing the difference.

Trivial. Drag your white cursor quickly across the screen against a black background. You will see clear gaps between the individual cursor afterimages on your retina.

Double the FPS and you halve the size of the gaps. On a 240Hz monitor the gaps are so clear that even at half their size they would still be easily visible. Ergo, 400Hz would still be easily distinguishable from continuous motion.

To put numbers on this, consider a vertical line 1 pixel wide moving across a 4K screen in 1 second (that's not even that fast). At 480Hz that's a shift of 8 pixels per frame, so you'd need at least 8x the framerate for the motion of this line to appear continuous.
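A quick back-of-the-envelope sketch of that arithmetic (Python; the 3840px width and 1s crossing time are the assumed numbers from above):

    # How far does a 1px line shift between frames, and what refresh
    # rate would keep the shift to at most 1px per frame?
    screen_width_px = 3840   # horizontal resolution of a 4K screen
    crossing_time_s = 1.0    # time for the line to cross the screen
    refresh_hz = 480         # candidate refresh rate

    speed_px_per_s = screen_width_px / crossing_time_s
    shift_px_per_frame = speed_px_per_s / refresh_hz
    required_hz = speed_px_per_s   # rate for at most 1px shift per frame

    print(f"shift per frame at {refresh_hz}Hz: {shift_px_per_frame:.0f}px")  # 8px
    print(f"refresh for 1px/frame: {required_hz:.0f}Hz")                     # 3840Hz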


Another example is multiplexed 7-segment LED displays. These distort in different ways depending on how you move your eyes. Even 400Hz is too low to render a typical one realistically, and if you use motion blur you'll lose the eye-movement-dependent distortion of the real thing.
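To make the multiplexing concrete, here's a rough timing model (Python; the 4-digit display and 400Hz scan rate are assumed example numbers, not from any particular part):

    # In a multiplexed display the digits are lit one at a time in
    # rotation, so each digit is a short repeating pulse -- a strobe.
    num_digits = 4
    scan_rate_hz = 400    # full cycles through all digits per second

    pulse_rate_hz = scan_rate_hz                      # one flash per scan
    on_time_ms = 1000 / (scan_rate_hz * num_digits)   # duty cycle 1/num_digits

    print(f"each digit: {pulse_rate_hz}Hz pulses, lit {on_time_ms:.2f}ms each")
    # -> 400Hz pulses lit 0.62ms at a time; sweep your eyes across it
    #    and the retina sees separated dashes, not steady digits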


Some experiments by a colleague a few years ago indicated 700Hz might be the limit. Will take some verification, obvs.


I read a study that put people in a dark room with a strobing LED and told them to dart their eyes left and right. 1000Hz was the limit at which everyone stopped seeing glowing dashes and saw a solid streak of light instead.


I was researching this because I was wondering how fast you can make an LED flicker for lighting effects before it looks like constant brightness.

I found most of the information on Wikipedia[0]. The static limit seems to be at about 80Hz, but combined with movement, some people can perceive stroboscopic effects up to 10kHz.

[0] https://en.wikipedia.org/wiki/Flicker_fusion_threshold#Strob...


I'm reminded of an old Microsoft input research video, where a 1ms latency response is what's needed for the most lifelike response when touch-drawing on a screen: https://m.youtube.com/watch?v=vOvQCPLkPt4


That's definitely not the limit. At 700Hz, a 700px-wide screen with a 1px line crossing it every 1s would qualify as reaching the limit in that situation. But speed that line up so it crosses every 0.5s, and it's no longer good enough: you've introduced an artifact. The object now looks like multiple copies, equally spaced apart by 1px gaps, not a smooth blur. The display never displayed a gap, but human eyes merge together an afterimage, so we see multiple frames at once over the last ~1/30s.

Now go to 1400Hz, but double the speed of the line so it crosses in 1/4s. Now you need 2800Hz to eliminate the artifact. Or you can artificially render motion blur, but then it won't look right if your eye follows the line as it crosses. So it's also a function of how fast your eye muscles can move across the screen.

Thirdly, we can't limit ourselves to a 700px screen: a screen filling the human field of view would need to be on the order of tens of thousands of pixels wide before one could no longer resolve a 1px gap between two vertical lines.

There is eventually a completely artifact-free limit, but it's way higher than 700Hz. Of course, 700Hz is nice, and if you fudge the criteria (how often do you see a 1px line moving across your field of view at high speed in real life?) you can argue it's good enough.
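The scaling argument fits in a couple of lines (Python; screen width and crossing times taken from the examples above):

    # "No visible gaps" needs >= 1 frame per pixel of travel, so the
    # required refresh rate scales linearly with object speed.
    def required_hz(screen_width_px: int, crossing_time_s: float) -> float:
        return screen_width_px / crossing_time_s  # px/s == min Hz for 1px steps

    for t in (1.0, 0.5, 0.25):
        print(f"700px screen, {t}s crossing -> {required_hz(700, t):.0f}Hz")
    # 700Hz, 1400Hz, 2800Hz: double the speed, double the requirement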


Try watching ping pong under a 1000Hz strobe.


> Trivial. Drag your white cursor quickly across the screen against a black background. You will see clear gaps between the individual cursor afterimages on your retina.

That's the outcome of aliasing, not of the FPS limitation itself. You could analytically supersample the motion in the time domain and then blur it just enough to remove the aliasing, and the distinct images would then disappear. Motion blur approximates the same result.
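A minimal sketch of that idea (Python/NumPy; the 64px strip, line speed, and sample count are a made-up toy setup): average many sub-frame positions of the line into each displayed frame, which amounts to a box-filter motion blur.

    import numpy as np

    width, fps, speed_px_s = 64, 60, 960   # line moves 16px per frame
    subframes = 16                         # temporal samples per frame

    frame = np.zeros(width)
    for k in range(subframes):
        t = (k / subframes) / fps          # sample time within the frame
        x = int(speed_px_s * t) % width    # line position at that instant
        frame[x] += 1.0 / subframes        # averaging = box motion blur

    print(np.nonzero(frame)[0])  # one contiguous 16px smear, no gaps

With a single sample per frame you'd get one bright pixel and a 15px jump to the next frame; the averaged version fills the travel distance instead.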


Yeah, but if your eyes were tracking it smoothly, it would not appear blurry. You could try to approximate _that_ with eye tracking, but achieving such low latency might be even harder than cranking up the FPS.


If your eyes were tracking the motion smoothly, the moving object would not appear blurry, but the static background absolutely would. So you'd need to apply anti-aliasing to the background while keeping the object images sharp. (Similar to how a line segment that's aligned with the pixel grid will still look sharp after applying line anti-aliasing. Motion tracking here amounts to a skewing of the pixel grid.)
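A toy illustration of blurring in the tracked object's reference frame (Python/NumPy; the striped background, eye speed, and object position are assumed example values):

    import numpy as np

    width, subframes, eye_speed_px = 32, 8, 8  # eye sweeps 8px per frame
    background = (np.arange(width) % 2).astype(float)  # static stripes
    obj_x = 12                                 # object the eye is tracking

    frame = np.zeros(width)
    for k in range(subframes):
        shift = round(eye_speed_px * k / subframes)
        view = np.roll(background, -shift)     # background slides past the eye
        view[obj_x] = 5.0                      # tracked object stays put in view
        frame += view / subframes

    print(frame)  # stripes average toward 0.5 (blurred); frame[12] stays 5.0

Sampling in the object's frame keeps it sharp while the static background smears, which is the skewed-pixel-grid picture described above.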


That's not true at all. When moving multiple pixels per frame, there will always be visible jumps. Aliasing only becomes relevant at the pixel level.


"Pixels" and "frames" are the exact same thing analytically, only in the spatial vs. time domain. This is very much an instance of aliasing, which is why blur happens to correct it.


Alright, but supersampling would just be another approximation of (real) motion blur, and it would still suffer from the same issues (like not becoming sharp when you track it with your eyes).


It's also used for subject isolation in much the same way as depth of field.

Only objects not moving relative to the frame can be seen sharply.


Can you recommend any resources on how shutter angle is used to communicate different messages in film? The Wikipedia article only gives some very basic examples (a narrow shutter angle when you want to capture particles in the air, or for action scenes).




