
Why movies look weird at 48fps, and games are better at 60fps - jfuhrman
http://accidentalscientist.com/2014/12/why-movies-look-weird-at-48fps-and-games-are-better-at-60fps-and-the-uncanny-valley.html?
======
sray
I liked the article, but, as a game developer who does not specialize in
graphics, I really liked one of the comments:

 _Joe Kilner - One extra issue with games is that you are outputting an image
sampled from a single point in time, whereas a frame of film / TV footage is
typically an integration of a set of images over some non-infinitesimal time._

This is something that, once stated, is blatantly obvious to me, but it's
something I simply never thought deeply about. What it's saying is that when
you render a frame in a game, say the frame at t=1.0 in a game running at 60
FPS, what you're doing is capturing and displaying the visual state of the
world at a discrete point in time (i.e. t=1.0). Doing the analogous operation
with a physical video camera means you are capturing and
compositing the "set of images" between t=1.0 and t=1.016667, because the
physical camera doesn't capture a discrete point in time, but rather opens its
shutter for 1/60th of a second (0.016667 seconds) and captures for that entire
interval. This is why physical cameras have motion blur, but virtual cameras
do not (without additional processing, anyway).

This is obvious to anyone with knowledge of 3D graphics or real-world cameras,
but it was a cool little revelation for me. In fact, it's sparked my interest
enough to start getting more familiar with the subject. I love it when that
happens!
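To make the shutter-integration idea concrete, here is a toy Python sketch (my own numbers, not from the comment): a "game frame" samples a moving object at a single instant, while a "camera frame" approximates the integral over the open-shutter interval by averaging many sub-samples; the smear between those sub-samples is motion blur.

```python
def position(t):
    """Position of an object moving at 600 units/second."""
    return 600.0 * t

def game_frame(t):
    # A game renders the world's state at one discrete instant.
    return position(t)

def camera_frame(t, shutter=1/60, subsamples=100):
    # A camera integrates over the interval the shutter is open.
    # Approximate that integral by averaging instantaneous samples.
    total = sum(position(t + shutter * i / subsamples) for i in range(subsamples))
    return total / subsamples

print(game_frame(1.0))    # 600.0: a razor-sharp point in time
print(camera_frame(1.0))  # ~604.95: smeared across [600, 610)
```

In a real renderer this kind of averaging is what multi-sample motion blur does; games usually approximate it with cheaper post-processing instead.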

~~~
Retra
I always use this fact as a kind of analogue to explain position-momentum
uncertainty in physics. From a blurry photo of an object, you can easily
measure the speed, but the position is uncertain due to the blur. From a very
crisp photo, you can tell exactly where it is, but you can't tell how fast it
is moving because it is a still photo.

It's a good way to start building an intuition about state dependencies.

~~~
shiven
Welcome to Heisenberg's uncertainty principle[0] in the macroscopic world!

[0]
[http://en.m.wikipedia.org/wiki/Uncertainty_principle](http://en.m.wikipedia.org/wiki/Uncertainty_principle)

~~~
unknownian
Would it not be the observer effect instead?

~~~
malka
True that pointing a camera at someone can change their behavior.

~~~
Retra
I hear it can even add ten pounds...

------
bhauer
This article is detailed and scientific.

However, anecdotally speaking, the concern I have with evaluating high-frame
rate in film is that we have very little context—most of us have only ever
seen Peter Jackson's Hobbit films in HFR. In other words, I have never seen
how other directors' work would be affected by HFR.

Speaking exclusively about the Hobbit series in HFR, I too observed an uncanny
valley that traditional films intrinsically avoid with their low frame rate.
The Hobbit films felt more like a _play_ than a film. A play with elaborate
stage effects, but a play nonetheless.

In fact, my chief criticism of Jackson's directing with HFR is that the
feeling of watching a play is amplified by how he mixes the sound and directs
his extras. The extras routinely just mumble nonsense in the background,
leaving only the character you're intended to be focused on speaking clearly.
It's the same thing you see in a play when there is background dialog, and
it's completely unnatural. You find yourself sometimes distracted by the
characters in the background and realizing they're not actually doing anything
meaningful or having real conversations. For example, in the most recent film,
I found myself more distracted by the unnatural audio in early scenes (such as
the beach scene) than the HFR video.

Combine that with the poor acting by the minor characters in the first 45
minutes of the recent film and I think HFR gets a bad rap in large part
because the Hobbit films alone are our point of reference.

~~~
eridius
Maybe I'm in the minority here, but I thought HFR dramatically improved the
Hobbit films. The first 5 minutes of each movie felt very odd, as if everyone
was moving in fast motion without actually moving any faster (sorry, that's
the best way I can describe it), but after those first 5 minutes, the movie
looked completely natural and amazing. Especially the 3D, it looked so good
that I want every single 3D movie to be produced in HFR now, and I lament the
fact that there's no way to get HFR at home.

As for feeling like a play, I guess I didn't pay attention to the background
characters much because I never noticed any of this nonsense mumbling that
you're talking about.

~~~
MichaelGG
Well if you have a PC set up to play videos, there's a system that'll
interpolate any video to a higher frame rate:
[http://www.svp-team.com/](http://www.svp-team.com/)

I watched ST TNG using that and it certainly made it feel a bit more
realistic.

~~~
eridius
Most modern TVs tend to have motion smoothing available as an option as well
(that page says higher-end TVs but I've seen it on plenty of cheaper TVs as
well, though I don't know when that was written). But it's nowhere near as
good as HFR; in scenes with a lot of motion especially, it tends to make
things blurrier.

------
lchengify
> At 48Hz, you’re going to pull out more details at 48Hz from the scene than
> at 24Hz, both in terms of motion and spatial detail. It’s going to be more
> than 2x the information than you’d expect just from doubling the spatial
> frequency, because you’re also going to get motion-information integrated
> into the signal alongside the spatial information.

I had a conversation with a friend at Pixar about exactly this topic.

The issue goes beyond just pulling more spatial information out of a shorter
timeframe; it's also that all the current techniques for filmmaking assume
24fps.

Everything has a budget of time and money, and when you, say, make 1000 extra
costumes for a shot, you cut corners in certain ways based on your training as
a costume designer. Your training is based on trade techniques, which are
based on the assumption that the director of photography (DOP) and director
are viewing the work at 24fps with a certain amount of spatial detail.
Doubling the frame rate means some of those techniques need to be more
detailed, whereas others might be completely useless.

Given everything that goes into a shot (hair/makeup, set design, lighting,
costume design, props, pyrotechnics, etc), it's unlikely everyone working on a
high-fps film is going to be aware of exactly which techniques do and do not
work. As a result, you get lots of subtle flaws exposed that don't work with
twice the detail. The sum of these flaws contribute heavily to making the shot
look 'videoish'.

~~~
cpeterso
Did your friend say what frame rate Pixar uses for its animated films?

~~~
stephenboyd
Last year, I was told by one of the technical managers for animation at Pixar
that they're all in 24 fps. They've used all the increased processing power
since the '90s to increase detail, so the result is that the time it takes to
render each frame has barely changed since Toy Story. Each frame of Monsters
University took an average of 29 hours to render.

This article has some good info:
[http://venturebeat.com/2013/04/24/the-making-of-pixars-latest-technological-marvel-monsters-university/](http://venturebeat.com/2013/04/24/the-making-of-pixars-latest-technological-marvel-monsters-university/)

They seem to choose being film-like over being realistic when they have a
choice. In one of their most recent short films, they actually lowered the
framerate for a slow-motion effect to simulate a classic film technique.

~~~
KayEss
> Monsters University took an average of 29 hours to render

Wow, when I was doing CGI work in the mid-90s our budget was generally 20
minutes per frame. The highest quality work we did was around 1 hour per
frame, and this was for TV adverts and idents -- still, we didn't have Pixar's
budgets to play with.

------
dperfect
The most interesting point made in the article (for me) is that the presence
of noise/grain - which effectively reduces real detail in an individual image
- can actually _improve_ the perceived detail across time with high frame
rates.

At first, I thought this extra "detail" could be explained as an illusion
(since noise/grain can mask a lack of resolution), but then I read the
abstract quoted near the end of the article:

"...visual cortical cell responses to moving stimuli with very small
amplitudes can be enhanced by adding a small amount of noise to the motion
pattern of the stimulus. This situation mimics the micro-movements of the eye
during fixation and shows that these movements could enhance the performance
of the cells"[1]

So if I understand right: our visual system is tuned to extract extra detail
by supersampling across time, and a small amount of noise/grain (mimicking the
natural micro-movements of the eye) enhances that ability, so the noise
actually helps us extract more _real_ detail.

It seems counterintuitive to add noise for more detail, but the explanation is
fascinating.

[1] Stochastic resonance in visual cortical neurons: does the eye-tremor
actually improve visual acuity? – Hennig, Kerscher, Funke, Wörgötter;
Neurocomputing Vol 44-46, June 2002, p.115-120

~~~
coldtea
I wonder how that is related to the dithering algorithms in music production
-- the ones that add noise when downsampling an audio track.

~~~
unholiness
Actually, these are two completely distinct phenomena.

Dithering exists in both the visual and the audio realms, as corysama pointed
out in their linked article [1]. The term's use is identical in both realms,
but neither is really related to stochastic resonance.

Dithering is a technique of adding noise to disguise sampling boundaries. It
hides unrealistic artifacts from digital downsampling of analog signals, but
it does not add more realistic detail. Critically, it does not increase the
signal-to-noise ratio.

The phenomenon being described here occurs on a biological level. As described
above, the additional noise improves your visual acuity. That is, you can
extract /additional/ details with the noise added which could not be resolved
before. This is due to a nonlinear change invoked in the system receiving the
signal, not from the change to the signal itself. So, while adding the noise
initially decreased the signal-to-noise ratio, the overall effect on the
entire system is an increase in signal-to-noise ratio.

Here the mechanism, explained in the Hennig paper, appears to be purely
biological, but stochastic resonance is not strictly a biological phenomenon;
it can also be seen in the electrical domain [2].

[1]
[http://loopit.dk/banding_in_games.pdf](http://loopit.dk/banding_in_games.pdf)
[2]
[http://en.wikipedia.org/wiki/Stochastic_resonance](http://en.wikipedia.org/wiki/Stochastic_resonance)

------
baby
I personally love HFR and have gone out of my way to watch all three Hobbit
films in HFR (I traveled to Paris, the only place in France where they show
them in HFR).

When people complain about 48fps being weird, I feel like they're just not
used to it. It does look weird, but after 20 minutes it looks amazing. I'm
personally tired of not understanding anything in action movies that use
24fps. It is kind of a luxury for the eyes to have 48 fps, and I predict that
in a few years we'll have the same debate we now have with consoles (60 fps is
better than 30 fps).

We got used to 24 fps, and so we're making up justifications for why it looks
better when it clearly doesn't if you take a step back.

~~~
wmeredith
You're trying to make an opinion a fact. "Better" has always been subjective
when it comes to the senses, and I suspect it always will be. See the whole
ridiculous audiophile cottage industry that can sell people $900 wooden volume
knobs for their hi-fi systems.

~~~
abandonliberty
That's fair. 'Better' is complex.

Perhaps 'closer to reality' is a better statement. Reality doesn't render at
24fps.

------
AndrewDucker
So, basically, at 24FPS things are blurry enough that you can't see the fine
details, which means that special effects and costumes look realistic.

Increase the frequency to 48FPS and the blur goes away, meaning that we can
see the fine detail, and suddenly sets look like sets, costumes look like
costumes, and CGI looks like a computer game.

~~~
TillE
To be fair, CGI creatures look like a computer game regardless. A very pretty
computer game, but still.

Sometimes it seems like people _want_ to believe that CGI is a whole lot
better than it actually is. People raved about Gollum in the LOTR films, but
go back and look at it. Even at 24fps, it's not great. It certainly doesn't
look real.

~~~
lotsofmangos
CGI backgrounds are very good though. Most people are not aware of just how
much of Fight Club is CGI.

[https://www.youtube.com/watch?v=Dlpr6CnKDFM](https://www.youtube.com/watch?v=Dlpr6CnKDFM)

~~~
vilhelm_s
Incidentally, have you re-watched Fight Club recently? I remember when I
watched it in 1999 I was completely blown away by the "impossible" cuts. I
watched it again a couple of months ago, and all I could think was that it looked
like a Half Life 2 cut-scene. The standard for CGI rendering has gotten a lot
higher. :)

------
EpicEng
Seems to be down. Cached version:
[http://webcache.googleusercontent.com/search?q=cache:Hu955ZD...](http://webcache.googleusercontent.com/search?q=cache:Hu955ZDoOSQJ:accidentalscientist.com/2014/12/why-movies-look-weird-at-48fps-and-games-are-better-at-60fps-and-the-uncanny-valley.html+&cd=1&hl=en&ct=clnk&gl=us)

------
Jyaif
I disagree on both explanations:

1/ The "soap opera effect" explains the 48 fps issue.

2/ The lack of motion blur in games is the reason why higher fps are better
(see
[https://www.shadertoy.com/view/XdXXz4](https://www.shadertoy.com/view/XdXXz4)
for a great visualisation).

------
UhUhUhUh
There's also a high-level processing aspect. The brain excels at extracting
relevant information, which includes discordant information. Back in the day,
a solo violin was tuned slightly off to allow the audience to hear it over the
orchestra. Barthes also came up with the "punctum" idea, whereby an odd detail
in a picture will generate an impression. What I'm saying is that higher-level
processing is probably responsible for a number of "impressions" that might
have little to do with fps.

------
Qiasfah
Most serious FPS gamers swear by screens that have a higher update rate than
60Hz.

In the past this was achieved by setting your CRT to a low resolution and
upping the refresh rate. More recently you can get TN LCD panels that offer
120 or 144Hz update rates.

Moving the mouse in small quick circles on a 144Hz screen compared to a 60Hz
screen is a very different experience. On a 60Hz screen you can see distinct
points in the circle where the cursor gets drawn. At 144Hz you can still see
the same effect if you go fast enough, but it is way smoother.

This makes a huge difference for being able to perceive fast-paced movements
in twitch-style games, and is the reason there has been a shift to these
monitors across every competitive shooter.

My thinking is that this behavior is similar to signal sampling theory.
Specifically, the Nyquist theorem says that you have to sample at at least 2x
the max frequency of a signal to accurately represent that frequency.
frequency. For signal generation this means that you have to generate a signal
at at least twice the rate of the max frequency you want to display. If you
want to accurately reconstruct the shape of that signal you need 10x the max
frequency (for example two samples in one period of a sine wave makes it look
like a sawtooth wave, ten samples makes it look like a sine wave).
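The two-samples-look-like-a-sawtooth claim is really about naive connect-the-dots reconstruction. A quick sketch of that intuition (my own, in Python): sample a 1 Hz sine and rebuild it with straight-line interpolation. At exactly 2 samples per period the drawn shape can be wildly wrong (here the samples land on the zero crossings), while 10 samples per period traces the sine closely.

```python
import math

def linear_reconstruction_error(samples_per_period, points=1000):
    """Worst-case error of connect-the-dots reconstruction of sin(2*pi*t)."""
    dt = 1.0 / samples_per_period
    worst = 0.0
    for i in range(points):
        t = i / points
        k = int(t // dt)                      # nearest sample to the left
        t0, t1 = k * dt, (k + 1) * dt
        y0, y1 = math.sin(2 * math.pi * t0), math.sin(2 * math.pi * t1)
        y = y0 + (y1 - y0) * (t - t0) / dt    # straight line between samples
        worst = max(worst, abs(y - math.sin(2 * math.pi * t)))
    return worst

print(linear_reconstruction_error(2))   # 1.0: both samples hit zero crossings
print(linear_reconstruction_error(10))  # ~0.05: visually a sine again
```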

So, if you're moving your mouse cursor quickly on a screen or playing a game
with fast-paced model movement, even if your eyes can only really sample at
something like 50-100Hz, the ideal monitor frequency might be 1000Hz. There's a
lot of complexity throughout the system before we can get anything close to
this (game engines being able to run at that high of a framerate, video
interfaces with enough bandwidth to drive that high of a framerate, monitor
technology being able to switch the crystals that fast, etc.).

Yes, 48fps movies typically look less cinematic, but I think this is a flaw in
movie-making technology and not of the framerate. The fight scenes in The
Hobbit sometimes look fake because you can start to tell that the actors
aren't actually beating up the other person. That detail is lost at 24fps,
which is why filmmakers have been able to get away with these techniques.

~~~
adamcanady
Wait a second... can you explain the 10x the max frequency part to accurately
reconstruct the shape of the signal?

It's my understanding that you just need 2x (two points in a sine wave) to
construct a unique wave. If you're getting a sawtooth, it means that you're
sampling a wave that is composed of very high frequencies, and you're
accurately sampling it, so a DAC can reconstruct it uniquely.

~~~
Qiasfah
There's some discussion of it at the beginning of this article:
[http://www.ni.com/white-paper/10669/en/](http://www.ni.com/white-paper/10669/en/)

~~~
pedrocr
What that whitepaper is saying is "if you only sample at 2xMaxFreq and then
connect the dots with straight lines it doesn't really look like a sine wave,
so buy 5x as much instrument from us". That's a total cheat, as the sawtooth
graph they show is only possible if you allow higher frequencies. If the
signal is bandwidth-limited at the frequency of the sine wave, the points you
sample at 2xFreq have only one possible solution for the graph (the sine wave
again). There are some great recent videos about this by Xiph's Monty:

[https://www.xiph.org/video/](https://www.xiph.org/video/)

So if you sample at 2xMaxFreq you have samples that describe the full signal
and can reconstruct it exactly. If our eyes really sample at 100Hz, then, we
can't see anything above 50Hz. That seems to align well with the ~50/60Hz
threshold for flicker-free viewing. Apparently higher framerates are only
useful when we have fast movement across the field of view, which would be the
case for FPS games:

[https://en.wikipedia.org/wiki/Flicker_fusion_threshold#Visua...](https://en.wikipedia.org/wiki/Flicker_fusion_threshold#Visual_phenomena)

~~~
hackinthebochs
I just finished going through a Fourier Transform course. The technical answer
is that you don't interpolate the samples with lines, but with the sinc
function. The sinc function is sinusoidal and so it more naturally
approximates waves. In this case 2xMaxFreq is enough to reproduce it exactly.
Using linear interpolation in the whitepaper is a blatant lie.
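A sketch of what that reconstruction looks like in code (my own, following the Whittaker-Shannon interpolation formula): a 1 Hz sine sampled at 2.5 Hz, just above Nyquist, is rebuilt almost exactly at an off-sample instant by summing shifted sinc functions, with no straight lines involved.

```python
import math

def sinc(x):
    # The normalized sinc function used as the interpolation kernel.
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(samples, dt, t):
    # Whittaker-Shannon interpolation: a weighted sum of shifted sincs.
    return sum(s * sinc(t / dt - n) for n, s in enumerate(samples))

f = 1.0            # a 1 Hz sine...
rate = 2.5         # ...sampled just above its 2 Hz Nyquist rate
dt = 1.0 / rate
samples = [math.sin(2 * math.pi * f * n * dt) for n in range(401)]

t = 200 * dt + 0.123  # an off-sample instant near the middle of the window
approx = reconstruct(samples, dt, t)
exact = math.sin(2 * math.pi * f * t)
print(abs(approx - exact))  # tiny; the samples pin the sine down
```

The residual error here comes only from the finite sample window; the infinite sum reconstructs a bandlimited signal perfectly.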

>So if our eyes really are 100Hz we can't see anything above 50Hz.

I'm not sure this follows, as we're not perceiving waveforms when light hits
our eyes; we're perceiving the intensity of energy hitting our receptors.

------
dsugarman
I see the same arguments arise about HFR as I do about stereoscopy, and the
rhetoric follows the same pattern as the switch from vinyl to digital music
formats: it is no longer art. It feels like you lose the artistic effect when
you add a multiple of information to your brain. The reality is that artists
need to learn to be mindful of the new medium, and the old tricks they used to
overcome the older medium's defects need to be removed from the process (e.g.
overuse of makeup). I am excited because we have a bright future with better
media technology, and pioneers like James Cameron are leading the way.

~~~
MLR
Film is art - it's important people remember that. HFR is just another tool;
it shouldn't be forced upon people.

You won't see people claiming 3D is an inherently superior format to film in,
and we shouldn't see the same claim for HFR.

Conversely if a director feels it's best for their film to use HFR, in full or
in parts, people shouldn't be jumping on their back about it until they've
seen the end product.

------
Animats
James Cameron (Titanic, Avatar, etc.) wants to get frame rates up to at least
48FPS. He considers that more important than resolution, pointing out that
higher resolution only benefits the first three rows in a theater.

With the low 24FPS frame rate, pans over detailed backgrounds look awful. This
is a serious constraint on filmmaking. Cameron's films tend to have
beautifully detailed backgrounds, and he has to be careful with pan rates to
avoid "judder". "The rule of thumb is to pan no faster than a full image
width every seven seconds, otherwise judder will become too detrimental."
([http://www.red.com/learn/red-101/camera-panning-speed](http://www.red.com/learn/red-101/camera-panning-speed))
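Some quick arithmetic on that rule of thumb (my own numbers, assuming a 1920-pixel-wide frame): doubling the frame rate halves how far the background moves per frame during the fastest acceptable pan.

```python
frame_width_px = 1920        # assume a 1080p frame
seconds_per_width = 7.0      # the seven-second rule quoted above

for fps in (24, 48):
    px_per_frame = frame_width_px / (seconds_per_width * fps)
    print(f"{fps} fps: ~{px_per_frame:.1f} px of background motion per frame")
```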

There are some movies from the 1950s and 1960s where this is seriously
annoying. That was when good color and wide screens came in, and films
contained gorgeous outdoor shots of beautiful locations. With, of course,
pans. Some of the better Westerns of the period have serious judder problems.
Directors then discovered the seven-second rule. Or defocused the background
slightly, if there was action in the foreground. Some TVs and DVD/BD players
now have interpolation hardware to deal with this.

The author's analysis of the human visual system is irrelevant for pans. For
pans, the viewer's eyes track the moving background, so the image is not
moving with respect to the retina.

------
jfuhrman
Site isn't loading. Google cache:
[http://webcache.googleusercontent.com/search?q=cache:Hu955ZD...](http://webcache.googleusercontent.com/search?q=cache:Hu955ZDoOSQJ:accidentalscientist.com/2014/12/why-movies-look-weird-at-48fps-and-games-are-better-at-60fps-and-the-uncanny-valley.html+&cd=1&hl=en&ct=clnk&gl=us)

------
dkbrk
> Add temporal antialiasing, jitter or noise/film grain to mask over things
> and allow for more detail extraction. As long as you’re actually changing
> your sampling pattern per pixel, and simulating real noise – not just
> applying white noise – you should get better results.

This could be a viable alternative to supersampling for antialiasing. Rather
than averaging multiple subsamples for each pixel fragment, this suggests that
if a _single_ subsample were taken stochastically, the results could be as
good, or even better, so long as the frame rate stays high enough.

Antialiasing doesn't quite have the same impact on rendering performance in
modern games that it used to, mainly due to new algorithms such as SMAA and
the increased cost of postprocessing relative to rasterisation, but this could
nonetheless lead to tangible performance improvements.
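As a toy model of that idea (my own sketch, with a made-up edge position): classic supersampling averages many subsamples of a pixel's coverage in one frame, while a single stochastically jittered subsample per frame converges to the same coverage once averaged across frames, e.g. in an accumulation buffer or by the viewer's visual system.

```python
import random

random.seed(1)

def coverage(x):
    # A hypothetical edge crossing the pixel: the left 37% is "inside".
    return 1.0 if x < 0.37 else 0.0

def supersampled_pixel(n=64):
    # Classic supersampling: n stratified subsamples in a single frame.
    return sum(coverage((i + 0.5) / n) for i in range(n)) / n

def temporally_jittered_pixel(frames=64):
    # One random subsample per frame, averaged over many frames.
    return sum(coverage(random.random()) for _ in range(frames)) / frames

ss = supersampled_pixel()
tj = temporally_jittered_pixel()
print(ss)  # 0.375: close to the true 37% coverage, paid for in one frame
print(tj)  # also near 0.37, with the cost spread across 64 frames
```

The temporal version is noisier per frame, which ties back to the article's point that a bit of per-frame noise is tolerable, and possibly even helpful, at high frame rates.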

------
suchow
Does anyone know of a good demo of different frame rates that I can view on a
laptop? Is this even possible with LCDs?

------
anonymfus
Could it be that the description of 24 fps as "dreamy" is subjective? Usually
my dreams don't have such an effect. I like plays, and I liked the 48 fps
Hobbit.

Maybe it's like how, in the days of monochrome media, black-and-white dreams
were the norm, but today they are the exception.

------
abandonliberty
I dug into high-FPS film when I read that 24 fps was designed to be viewed in
a dark theatre, where human eyes blur images due to switching between rods and
cones.

Most of us no longer watch content in darkness. James Cameron is of the
opinion that improving FPS is more significant than moving up from HD. I
figured I should trust the professional who devotes his life to this.

To truly evaluate high FPS movies and video content, you have to watch it for
a while.

The SmoothVideo Project (SVP) is pretty awesome. It's made by volunteers,
needs some good hardware, and takes some work to get set up well.

It struggles in scenes with lots of detail, but panning scenes are incredibly
beautiful.

Going back is a bit difficult.

~~~
nominated1
> To truly evaluate high FPS movies and video content, you have to watch it
> for a while.

Simply doubling the framerate of existing film is the wrong approach. To truly
evaluate high FPS the director must take the framerate into account during
filming.

I'll use SVP/InterFrame with low-FPS sports, homemade video, and occasionally
anime, but NEVER live-action film. It cheapens the whole experience and undoes
everything the director intended.

------
doomrobo
I don't quite understand. If a video is playing at 41fps, then your eye can
sample each frame twice, with a difference of one microtremor to increase
resolution. But if a video is playing at 83fps, you only get one sample per
frame with no added benefit from the microtremor. The article states the
opposite: that the latter framerate allows for a higher perceived resolution.
Can anyone explain?

------
cpncrunch
Duplicate.
[https://news.ycombinator.com/item?id=8791721](https://news.ycombinator.com/item?id=8791721)

------
nitrogen
Is anyone else redirected to a 403 error on a completely different site
(broadbandtvnews) when following the link?

------
leonatan
But... Ubisoft said some games are better and more "cinematic" at 30fps. Derp!

