
Motion smoothing is ruining cinema - fanf2
https://www.vulture.com/2019/07/motion-smoothing-is-ruining-cinema.html
======
wernerb
Turned off the "feature" as soon as the TV was on. Have been advising friends and
adjusting their expensive OLED TVs as well. In every case they were grateful
for me "fixing" their TVs, which only took a few minutes.

People just don't want to muddle around in the settings. And herein lies the
problem. It would be great if manufacturers opened up their interface
settings so they could be controlled by media players or set-top boxes. Those,
in turn, could have a rule set: if 24p, disable motion smoothing; otherwise
enable it. If sports, enable it. And so on.
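
A set-top box rule set like the one described could be sketched roughly like this (the function name, content labels, and frame-rate cutoff are all hypothetical; no real TV control API is assumed):

```python
def choose_motion_smoothing(content_type: str, fps: float) -> bool:
    """True = enable motion interpolation on the TV, False = disable it."""
    if content_type == "sports":
        return True          # fast, unpredictable motion benefits from smoothing
    if fps < 30:             # 24p/25p film material
        return False         # preserve the filmic cadence
    return True              # video-native 50/60 fps content

# A player could apply this per title, e.g.:
assert choose_motion_smoothing("film", 23.976) is False
assert choose_motion_smoothing("sports", 59.94) is True
```

In practice the player would read the content type and frame rate from the stream's metadata and push the chosen mode to the display over whatever control channel is available.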

It's not even limited to motion smoothing: I have low-light and normal-light
profiles I'd like to have switched automatically based on the actual light in
my room.

~~~
nemosaltat
I work in the lighting control industry. Your final paragraph piques my
interest. I’m curious, why automatically change the TV if you could
automatically change the room? Shades drop, light intensity lowers, CCT warms,
etc...?

~~~
JaimeThompson
Because the vast majority of people don't have multi-thousand dollar room
control setups but they would find some benefit from their entertainment
devices being able to describe their features to each other.

~~~
nemosaltat
Yes, cost is always a factor. That said, if the room control was affordable,
would changing the space be an objectively better solution than changing the
display?

~~~
dllthomas
I think ultimately you probably want both, because sometimes you (... or at
least I) have different wants around viewing. Sometimes I'm going for a
cinematic experience, where I want the room dark and all focus on the media.
Other times I want to be half watching something while playing a board game or
doing some cleaning. Optimal lighting conditions for these scenarios clearly
differ, and probably optimal display settings to match.

------
bovine3dom
This whole thread and article are like they're from an alternate universe to me.
All of my tech-savvy friends go to great lengths to turn motion smoothing on
everywhere they can ("Smooth Video Project" is the best I know of for PC) so that
panning shots don't look awful. They wouldn't play video games at 24Hz, so why
should they watch a film at 24Hz?

It does introduce artefacts. The solution to all of this is for filmmakers to
move on from the 1930s and film at 120 fps or higher. If people only want to
watch a quarter of the frames, it can be downsampled easily.

The argument reminds me of musicians who didn't want their albums chopped up
and sold as individual tracks, to be consumed however listeners desired.

~~~
dperfect
Panning shots probably look awful if you're watching 24 fps content on a
device running at 60Hz (many consumer devices, computers, and Blu-ray players
do this by default). Since 24 frames can't be evenly spread out over 60
frames, you'll see motion judder, and it does look terrible.

However, 24 fps running at a true 24Hz is perfectly fine, and you will _not_
see flicker - at least not if produced by competent filmmakers. That's because
the camera's shutter angle is normally set such that motion blur makes the
motion look smooth. Real cinematographers also know exactly how fast they can
pan with a given lens without causing any noticeable strobing motion.

Comparisons to video games completely miss the point. Most video games are
unable to properly simulate motion blur (some try, but it doesn't usually work
well), so you _have_ to have high frame rates for things to look smooth. Since
you need to react quickly in video games, high frame rates (combined with the
lack of motion blur) are also helpful in preserving crisp edges on moving
objects (so it's a practical advantage). And finally, video games generally
try to simulate reality for the player, so players are more concerned that the
technology makes the experience _believable_ rather than intentionally
_suspending disbelief_ (as in film or theater) to passively unpack the
narrative nuance of artwork unfolding on a screen.

For those same reasons, sports also do well with higher frame rates, but when
fiction (which is obviously disconnected with the present reality) tries to
use high frame rates, it falls into "uncanny valley" territory - much like the
computer-generated humans of movies like "The Polar Express". As others have
noted, a few directors have really tried to push HFR film to the public, but
it has never been received well (whether or not HFR is recognized as being
responsible for the uneasiness felt by audiences watching HFR content).

As for HFR content being "easily" downsampled - it really isn't, at least not
with respect to motion blur (which is an essential point that many people miss
in these discussions).

~~~
bovine3dom
I'm not talking about 3:2 pulldown; I'm exactly the sort of sad sod who has
spent time writing scripts to match my display's refresh rate to whatever
media is playing. I strongly dislike watching panning shots at 24Hz; I do not
think they look perfectly fine.

You're right about motion blur making 24fps look less awful, though. I'm sure
you know that filmmakers often reduce it in fight scenes to deliberately make
viewers a bit uncomfortable. Thanks for pointing out that downsampling
wouldn't work without some extra effort (e.g. keeping the same amount of blur
across the extra frames so that the downsampled result looks normal), as that
isn't something I had considered.

I find the suggestion that videogames don't need to suspend disbelief, but
films do, very strange.

~~~
dperfect
> I find the suggestion that videogames don't need to suspend disbelief, but
> films do, very strange.

True, the distinction may seem strange to some, and it's not always a clear
distinction. Here's how I'd describe it in more detail (just my opinion of
course):

In film, we see a person depicted on the screen with certain characteristics
that convey emotion (just as we do with a group of lines comprising a stick
figure with similar emotion), but we are forced to activate parts of our brain
that span the gap between what we see (which obviously isn't reality) and what
that character might feel in his/her universe.

To some degree that same experience happens in video games, but I would argue
that instead of _seeing_ the emotions and trying to understand them, we (as
players) _are_ the person feeling the emotion. In one, we are witnessing
events happen to someone else (and exercising empathy), whereas in the other,
we are experiencing the events first-hand (even in a 3rd person view, we're
playing as if we are the character in view).

A first-hand experience feels most believable when it's actually realistic
(hence the need for high frame rates), but a simple stick figure can feel
"believable" if we're engaged and sharing the emotion we believe it to be
feeling.

As a side note, maybe that's why I personally prefer classic (8-bit era) video
games over modern 3D video games :)

------
DogOnTheWeb
The ultimate solution here, I think, is to allow creators to encode the
desired setting, frame rate, or content category in the metadata so that a
user can benefit from the technology when watching sports without it
detracting from a film later on. Then the default setting could be "auto" and
diehards could override one way or the other.

I'm glad to see a brief nod to that solution in the article and hope it
becomes part of an industry specification.

~~~
vegetablepotpie
An industry-standard media metadata interface for motion smoothing would be
the best solution. It may not necessarily make it to the TV industry, though.
Although industry standards have been created and widely adopted before (such as
USB), there are plenty of others that failed to be adopted (FireWire) despite
technological superiority (data rates). Adopting FireWire didn't give a company
much advantage: USB worked well enough for most use cases, and Apple wanted too
much in royalties.

Ultimately any decision a company makes is about money. From the article: "It's
meant to create a little bit of eye candy in the store." TV companies think
motion smoothing increases sales. It doesn't matter what film directors say
about it.

The reality is that TV manufacturers are making a commodity, and they don't want
to be. No one wants to sell a commodity; there's no money in it. It's a
constant race to the bottom on price. That's why we see 3D TVs, curved TVs, and
a myriad of other gimmicks. Everyone wants to differentiate themselves. If
there's a new industry standard that makes consumers and content creators happy,
who cares? If I hire an engineer to implement it, will it be worth it?

------
Brian_K_White
"looks like a soap opera" exposes everything you need to know about the
ignorance behind this position.

Reality does not judder.

If you want judder for some artistic purpose, you can have that, just like any
other reality-distorting filter.

Some shots are intentionally blurry for valid artistic purposes. Should all
movies and tvs be blurry at all times? Some shots are intentionally monochrome
for valid artistic purposes. Should all movies and all tvs be b&w or red
tinted at all times? Some shots are intentionally too bright or too dark, for
valid artistic purposes... You want stroboscopic stop-motion, go ahead and use
it.

Superior reproductive technology in no way prevents you from creating a scene
that has that and any other kind of artificially applied crappiness, like
static interference, low resolution, or analog scan lines... But crappy
reproductive technology does preclude a faithful reproduction the other 99.9%
of the time, when you don't want an artificially crappy one.

Video tape was cheaper than film, and tvs and video tape ran at higher frame
rates than film, and soap operas were cheaper than movies, and so the one
superior aspect of soap operas became associated with "cheap".

It's a freaking ignorant _association_ is all. It's obvious as hell, and you
have to be a moron not to recognize why you think "this looks like a soap
opera" and see past it.

~~~
dperfect
> Superior reproductive technology in no way prevents you from creating a
> scene that has that and any other kind of artificially applied crappiness...

The technology to produce movies at high frame rates has been around for a
long time (and projection/display technology could have supported it much
sooner if the content were there), and yet directors have _deliberately_,
almost universally chosen what you categorize as "artificially applied
crappiness" (24 fps). Practically speaking, it _is_ artificially applied, and
that's exactly it - it's an intentionally distorted version of reality, and
it's that way on purpose. You're free to attempt to remove that filter (by
smoothing motion on your TV), but that doesn't represent what the director
intended.

Even with "superior reproductive technology", directors are _still_ choosing
to produce films at 24 fps, so can you at least appreciate and respect that
that's how most of them intend their work to be displayed? True, "reality does
not judder" (proper 24 fps content doesn't either, by the way), but reality
isn't what filmmaking is about. To argue otherwise misses the point of film as
an artistic medium. For other types of content, I agree - reality is the
target, but not in film.

------
codefreakxff
The premise of the 24 FPS movie rate is that the human eye can't detect the
difference at higher frame rates. But the higher-frame-rate TV demos look
amazing, and resampled movies look like crap to people precisely because they
CAN see more than 24 FPS.

I'm not saying the resampling is right, but I am seriously questioning this
frame rate in a digital age.

I'm also questioning the algorithm that just creates a messy blur of nonsense
between two frames. Surely we can create a smarter algorithm.

~~~
33mhz
24FPS is chosen because of the work it engages the brain in; it's a sweet spot
between seeing film and seeing a series of pictures.

~~~
mrob
Ozu Yasujirō was famous for his use of static cameras and lack of action
scenes. His films are probably the most low-motion mainstream films ever made,
which means they do not engage your brain very much in interpreting the 24fps
motion. And yet they are still highly regarded by critics, and in my opinion
highly engaging in terms of characters and story. Would they have been
improved by shooting at 12fps, to make your brain work as hard as it does with
conventionally shot films?

~~~
greedo
Watch La Jetee (Chris Marker). A "film" comprised mostly of stills with just a
few actual scenes with movement.

~~~
teddyh
> _A "film" comprised mostly of stills with just a few actual scenes with
> movement._

…which was remade into a high-budget Hollywood CGI-laden production, starring
Bruce Willis and Brad Pitt. I'm not even joking; it's called _12 Monkeys_,
and was released in 1995.

~~~
greedo
And into a television series on SyFy.

------
rowanG077
The problem is the refusal of hollywood to release true 60fps content. I'd
take a 60fps version of a movie any day over 24 or 30 fps.

~~~
bitL
60fps changes your perception of a movie. For some reason, at ~24fps you get a
cinematic feeling; at 60fps you get an "amateurish videographer" feeling. Not
sure what the science behind that is.

~~~
cma
24fps lets you focus attention with motion blur (in a tracking shot of two
characters walking, they'll be in sharp focus while everything else is
blurred but still conveys movement, etc.). It's a tool similar to depth of
field. However, a 60Hz movie could just have 30Hz portions. Stuff like panning
cameras over a landscape looks terrible at 24Hz. I don't know how you could
combine the two without it feeling like the technology keeps changing
throughout the movie, though.

~~~
AdamHede
You could easily add high-quality motion blur at 60 FPS if that was the
desired look.

More FPS is really no different than increased resolution. There is nothing
sacred about 24, 480, or 1080.

It's all conditioning ;)

~~~
cma
How would you add lush blur to a tracked shot at 60fps? The exposure wouldn't
last as long, so you'd have to do it digitally, wouldn't you? Then it can be
hard to keep the tracked subjects unblurred, or things can end up looking
interpolated.

~~~
brokenmachine
The lenses would still have a limited depth of field though, like a photo from
an SLR.

Wouldn't that make the background be blurred? Or is that kind of blur somehow
different to motion blur?

~~~
cma
Motion blur indicates direction, you could maybe use a weird aperture shape to
simulate it, but things at different depths wouldn't blur right due to not
picking up different parallax movements the same way, and other moving objects
that weren't in focus wouldn't have their motion blur picked up correctly.

~~~
brokenmachine
Oh yeah, that makes sense.

------
korm
What TV manufacturers are doing is unacceptable. It introduces strange
artifacts and the motion looks very artificial compared to a native 60fps
video. So it really doesn't solve anything.

The argument that it's the filmmakers' fault for using 24fps is flawed. What
about animation? What about stop motion? Those may never be produced at a
higher framerate. Movies with high VFX budgets will be much more expensive due
to the additional frames when rotoscoping and compositing.

So if it is inferior and only makes sense for a subset of productions, why is
it the default?

~~~
mrob
In many cases the bulk of the VFX budget is artist wages, not CPU time. If you
already have the motion paths set up then it doesn't take much more human
effort to render more intermediate frames. You might even be able to reduce
the cost per frame by exploiting the smaller differences between frames, e.g.
with temporal noise reduction.

Hand drawn or hand posed animation is the rare case where increased frame-rate
really would take more human effort, but it's often already produced at extra-
low framerate (e.g. 12fps or 8fps), with frames duplicated for playback. The
same duplication can happen with higher playback framerates. Modern video
codecs handle duplicated frames well.

~~~
dreamcompiler
And yet many fully computer-generated movies today are converted to 3D in
post, rather than being created in 3D in the first place. For CGI movies,
doing 3D at creation time gives better results and requires zero human labor
(it's just a translation of the camera's matrix), so why isn't it always done?

~~~
Mathnerd314
CG movies are created in 3D, they use 3D modeling software. There's nothing to
"convert" to 3D unless they lose the original models and have to retrace them
or something. Which apparently is a quite common occurrence, according to some
articles on the 2D to 3D conversion of Toy Story and Shrek. So collateral
damage from capitalism is the main explanation for old movies.

For new movies, where there is proper data, the 3D camera for each scene still
requires more than "zero human labor"; you have to specify the inter-ocular
distance, convergence plane, and tweak the scene a little for cinematic effect
or 3D weirdness. And since there are thousands of shots, and the work is
fairly tedious, it makes little sense to work on it in parallel with 2D during
production when shots might just be cut entirely. So you end up with a bias
towards 2D or 3D during the creation process.

DreamWorks apparently has a made-for-3D animated movie division
([https://animatedviews.com/2007/dreamworks-goes-3d/](https://animatedviews.com/2007/dreamworks-goes-3d/)),
where presumably the films are flattened to 2D in post. Reviews of their movies
([https://www.3dor2d.com/reviews/how-train-dragon-hidden-world-3d-3-d-movie-review](https://www.3dor2d.com/reviews/how-train-dragon-hidden-world-3d-3-d-movie-review))
unsurprisingly say "great use of 3D". Since that's the goal of the division,
I'd expect them to continue making 3D-first movies.

In contrast Pixar so far has stuck to a more 2D style. In this interview
([https://www.youtube.com/watch?v=HepIGDJK98s](https://www.youtube.com/watch?v=HepIGDJK98s))
their stereo guy says they do 3D in post. The 3D in their movies is described
as "well done but a minor aspect"
([https://www.3dor2d.com/reviews/Toy-Story-4-3D-3-D-Movie-Review](https://www.3dor2d.com/reviews/Toy-Story-4-3D-3-D-Movie-Review)).
In the interview the guy basically says that he
could use more 3D but he's more concerned with keeping the director happy so
he doesn't. Pixar is now focusing on originals so maybe one of their next
films will be 3D oriented. But they're making tons of money anyway so who
knows if the right director will come along.

~~~
dreamcompiler
> CG movies are created in 3D, they use 3D modeling software.

Of course. I should have been more specific; I was talking about stereo
conversion. You addressed that. Thanks for your explanation. It's not as
labor-free as I had assumed.

------
discreditable
One problem is that TVs run at 60Hz and don't support adaptive sync. 60 is
not evenly divisible by 24. Without some kind of trick, you're either padding
to 30fps by duplicating every 4th frame, or trimming to 20fps by dropping
every 6th frame.

Padding is usually what's done. You can see it in NTSC (30fps) DVDs which come
from something originally produced for PAL (25fps). Every 5th frame is
duplicated.
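
As a sketch of the arithmetic behind those cadences (the function and its name are mine, not from any real player): padding by duplication gives a regular cadence only when the frame-count difference divides the source rate evenly, which is why 24->30 repeats every 4th frame and 25->30 every 5th.

```python
def pad_pattern(src_fps: int, dst_fps: int) -> list:
    """For each source frame in one second, how many times it is shown
    when padding src_fps up to dst_fps by duplicating frames."""
    extra = dst_fps - src_fps        # frames that must be shown twice
    assert extra > 0 and src_fps % extra == 0, "cadence would be irregular"
    period = src_fps // extra        # every `period`-th frame repeats
    return [2 if (i + 1) % period == 0 else 1 for i in range(src_fps)]

# 24 fps padded to 30 fps: every 4th frame is duplicated.
assert pad_pattern(24, 30) == [1, 1, 1, 2] * 6
# PAL 25 fps to NTSC 30 fps: every 5th frame is duplicated.
assert pad_pattern(25, 30) == [1, 1, 1, 1, 2] * 5
```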

It makes me curious what framerate movie projectors run at. Maybe the "cinema
experience" includes goofy frame duplication and nobody realizes it.

~~~
dbcurtis
Traditionally, each film frame is projected 3 times. The psycho-perceptual
reason is that the phi-phenomenon (“light chaser effect”) kicks in about 12–15
Hz, so camera frame rate must exceed that to give a perception of smooth
motion. The critical flicker frequency is around 50 Hz for most people, so
projected frame rate must exceed that to avoid the appearance of flickering.

Silent movies were shot at 18 fps and projected at 54 fps. The frame rate for
talkies was boosted to allow enough bandwidth for the audio. The reason silent
movies look jerky today is that, when played back at 24 fps, every third frame
is projected twice. They are smooth when properly projected.
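
The cadence arithmetic above can be checked directly (a toy sketch of the numbers, not projector firmware):

```python
# Silent-era projection: each 18 fps frame is flashed three times,
# putting the flash rate above the ~50 Hz flicker threshold.
shot_fps = 18
flashes = shot_fps * 3
assert flashes == 54

# Playing the same 18 fps film back at 24 fps by doubling every
# third frame (one extra showing per three frames) is what makes
# silent movies look jerky today.
played = shot_fps + shot_fps // 3
assert played == 24
```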

~~~
mrob
50Hz is below the flicker fusion threshold for most people. Even 60Hz visibly
flickers for many people.

Silent movies were shot and projected at many different framerates. Some early
silent movies were shot at 18fps, but the average frame rate increased over
time, and the standard 24fps was picked as a compromise that was around the
average frame rate at which films were displayed at that time.

------
dahart
My kids aren’t bothered by high frame rates, so personally I suspect it only
bothers adults who are used to 24fps.

> And an entire cinematic language has developed around the rate of 24 frames
> per second — the way actors perform, the way shots are composed and cut and
> cameras move. (This is why an awards show or a news broadcast shot on video
> at a higher frame rate looks and feels different from a film.)

This is a real stretch. Very, very few filmmakers are doing anything specific
for 24fps that they wouldn’t do in 48 or 60. If they did, they’d slow things
down so you could see them, but instead we have ever faster and faster fight
sequences in Marvel and Transformer movies where you can’t even see the
details during the action.

The “cinematic language” of a news show vs an action movie is different, but
has almost nothing to do with frame rate, and if frame rate was the main
issue, we’d be doing news in low frame rate and action movies in high frame
rate.

Horizontal pans in 24fps have started to drive me crazy. Films do it all the
time, and you can barely see _anything_ while that’s going on. Higher frame
rate pans are _so_ much easier to watch, even if they make the movie look like
BBC TV.

~~~
soulofmischief
It's more than just a higher framerate. It's false motion, which some are
hypersensitive to because it's not natural looking.

~~~
dahart
> It’s false motion

I agree if we're talking about motion interpolation. I'm personally sensitive
to artifacts from the 24->60 conversion, and I sometimes turn off frame-rate
up-sampling, unless the show has a lot of panning, in which case I prefer the
up-sampling despite the artifacts it sometimes causes. Game of Thrones, for
example, has so much panning that I kept turning on the motion smoothing. But
then the artifacts were so bad I kept turning it off. There isn't a good
option, so I wish it were filmed at a higher frame rate and then displayed
lower for people who prefer that.

Above I was really only referring to high frame rates in general and
deconstructing the argument that 24 is inherently better for artistic reasons.
While many filmmakers and viewers like 24fps, most filmmakers aren't choosing
24fps, nor are they doing anything specific with it. 24 is just a standard
that almost everyone is stuck with; most people wouldn't even consider it a
choice today, save the few directors with big enough budgets and studio
backing to distribute films at a non-standard frame rate.

------
jim-a-1020401
I don't understand why TV manufacturers can't just change the frame rate
dynamically to match the content, like Apple ProMotion
([https://m.gsmarena.com/understanding_apples_promotion_display_on_the_new_ipad_pro-news-25446.php](https://m.gsmarena.com/understanding_apples_promotion_display_on_the_new_ipad_pro-news-25446.php))

~~~
brokenmachine
Newer TVs are starting to support dynamic refresh rates, ostensibly for gaming
modes.

I wonder, if this tech becomes mainstream, whether it will eventually be used
for normal content.

------
MikusR
And ebook readers that allow changing of fonts are ruining book publishing.

~~~
tgb
I can't tell if you're joking, but as an undergrad I purchased a math textbook
I needed last-minute as a Kindle ebook, and it was an awful mistake, in part
because of text reflow. You'd see a formula at the top of a page, but days
later when you went back to look for it, it would be at the bottom because the
text had reflowed. It really threw things off.

There were other problems: the index was garbled and useless (it still had the
page numbers of the paper book in it, and the formatting was completely off),
and while the text could be resized, the formulas were rasterized, did not
scale with the text, and were frankly hideous. This was from a major publisher.
I'd much rather have had a PDF.

But for novels, changing font size and hence reflow of text are super nice.

~~~
_nalply
To help you, perhaps you can get the PDF from Library Genesis. Google
"libgen", then enter the title or ISBN of your book.

------
noisy_boy
So this is why the new smart TVs at the showroom felt "too realistic" and
"less cinematic" to me. I wrongly thought it was a side effect of the
super-high resolution; I didn't even know this was a thing before I read this
piece.

This has no value except being a gimmick that can be used to say "look how
smooth/real it is". Worse is that it's set to "on" by default. They should
have just had a button on the remote with a marketing-inspired name, like
"motion flow" or something, which people could have turned on if they wanted
this.

~~~
brokenmachine
I don't mind the high frame rate so much as the obvious artifacts.

I can't help myself and I end up hunting for artifacts to be annoyed by more
than enjoying the show.

Every TV that uses motion interpolation is the same: it's horrible.

------
Merrill
For decades broadcasters have converted movies to NTSC video using three-two
pull down, where each movie frame is converted to either three or two video
fields in order to convert the 24 fps frame rate to the 60 Hz field rate (well,
actually 23.976 frames/second to 59.94 fields/second).
[https://en.wikipedia.org/wiki/Three-two_pull_down](https://en.wikipedia.org/wiki/Three-two_pull_down)

How can the new smoothing conversion be worse than that?

~~~
sbergot
My issue with TV smoothing is that it is inconsistent. It can revert back to
the original framerate when the scene is too complex.

~~~
mrob
I agree that it's a problem, but a constant 24fps is also inconsistent in how
much blur it produces. 24fps is fine for a static talking scene, but as soon
as the camera moves everything turns blurry and jerky looking. That harms my
immersion just as much as inconsistent motion smoothing.

~~~
sbergot
I guess these things are subjective. I don't mind some motion blur, the same
way I don't mind hand-animation smears.

------
comboy
To me it's like music creators telling people not to listen to their music on
v-shaped headphones. Yeah, it's not what you intended. But that's what they've
chosen to use.

Also, if they want the picture to be displayed in a certain way, couldn't they
push the movie as 60 frames per second when in fact it's 24 frames per second
with duplicated frames? Wouldn't that effectively disable motion smoothing?

~~~
mwfunk
Nobody’s telling anyone to do anything. The problem people are talking about
is that this setting looks like absolute garbage and it’s inexplicably turned
on by default on almost all TVs these days. The only people who think
otherwise are people who just literally do not care at all about picture
quality and probably also do stuff like stretching the picture to get rid of
letterboxing. These are not the people that default settings should be based
on.

~~~
paulryanrogers
Quite an opinion you've got there. I too think the smoothing is odd, but I'm
over 35.

~~~
mwfunk
I'm certain age has nothing to do with it. The only thing that determines
someone's opinion here is how much they care about accurate AV reproduction,
and that's not driven by age, only passion for the content.

~~~
paulryanrogers
It's possible those who will grow up with higher frame rates will consider the
smoothing normal. And they may consider 24fps choppy. Time will tell.

~~~
mwfunk
I wonder if we're focusing on two different things here. I'm not thinking so
much about the merits of the 24fps video experience vs. the 60fps video
experience. I'm just focused on clean playback of the source content. If
something was recorded in 24fps, I want to see it in 24fps. If something was
recorded in 60fps, I want to see it in 60fps.

I'm more complaining about the automatic 24->60 (or 120 or 240) conversion
that's enabled by default on a lot of new TVs. I'm not at all opposed to
higher frame rate video, I like it. I've just never watched something on a TV
where the 24->60 conversion didn't look significantly worse to me than
playback at the native frame rate.

Maybe my unconscious assumption is that people's preferences about the ideal
frame rate can change over the years, but that reinterpolating the video to a
different frame rate will always look worse than native, regardless of the
source and destination frame rates. Maybe this isn't true, but that is my
assumption.

------
iforgotpassword
I must admit, I might not even care that much if the processing weren't so
flawed. What drives me mad is those weird patterns or glitches, e.g. when
someone walks in front of a background with a high contrast pattern on it
while the camera pans along while zooming out slightly. It usually creates
very visible artifacts around the person.

------
tgb
A year or two ago, the TV my roommates and I had would give weird artefacts
while watching sports. I wasn't sure if it was the TV or the broadcast but the
effects were things like people's eyes would move relative to their head or
the logo on the shirt would move relative to the shirt. They'd sort of bounce
up and down. Is this motion smoothing?

My guess at the time was it was an image stabilization algorithm put on by the
studio since sports broadcasts involve long-range videography. Presumably the
logo or eyeballs were getting "stabilized" in place while the rest of the
person wasn't, making them appear to jiggle up and down compared to the
person's movement. It was eerie. But these were major sports streams, like
world series baseball, and I'd be surprised if they messed it up that badly.

~~~
brokenmachine
That almost sounds like interlacing, but when you say it was only parts of the
image that were moving, I don't know.

------
eddyg
If you want to experience for yourself the effect frame rate (and the related
motion blur) has, this is a really great demo:

[https://frames-per-second.appspot.com](https://frames-per-second.appspot.com)

------
thinkloop
I don't get it: what's the "debate"? According to the article there is
literally no advantage to motion smoothing. With motion smoothing off, video
shot at a high frame rate, like TV or sports, is displayed that way, as people
like; and film shot at 24 fps is displayed like that, as people also prefer.
The article seems to try to engineer controversy by vaguely hinting that sales
could be lost if smoothing weren't on for certain key edge cases, like
in-store demos, but that's not the case. So the question is: why is it the
default at all? What's the issue?

------
TrackerFF
I remember when a friend of mine got their first HD TV and Blu-ray; none of us
had ever seen that stuff. We were still used to CRT screens, and at best DVD.

We borrowed a movie, fired it up, and everyone thought there was something
wrong with it. The image quality looked almost like something from a handheld
camera, just very non-cinematic. We returned the movie and went with another
one.

Nope, same problem. We just figured that's the way HD looked.

------
Aardwolf
Excuse me for being ignorant, I don't really watch much TV, but:

Why do they need this feature for sports? Couldn't they broadcast the sports
at 60Hz? If they don't broadcast sports at 60Hz, then what framerate are they
broadcasting at, and what are the 60Hz TVs intended for in the first place?

~~~
mrob
60fps is still too low for very fast motion, of which sports is a notable
example. And the focal point of most sports is a ball, which moves along
predictable paths, so motion smoothing can do a good job of increasing the
frame rate to the point where the sample-and-hold blur of 60fps isn't visible.
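To make the idea concrete, here is a toy sketch (my own illustration, not what any TV actually does) of generating in-between frames. Real motion smoothing estimates per-block motion vectors and warps pixels along them; plain linear blending, as below, merely cross-fades, which is why it smears instead of tracking a ball along its path:

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_inserted=1):
    """Generate n_inserted in-between frames by linear blending.

    This is only a cross-fade; motion-compensated interpolation
    would move pixels along estimated motion vectors instead.
    """
    frames = []
    for i in range(1, n_inserted + 1):
        t = i / (n_inserted + 1)              # position between the two frames
        blend = (1 - t) * frame_a + t * frame_b
        frames.append(blend.astype(frame_a.dtype))
    return frames

# 60 fps -> 120 fps needs one inserted frame per source pair:
a = np.zeros((4, 4), dtype=np.float64)
b = np.full((4, 4), 100.0)
mid, = interpolate_frames(a, b, n_inserted=1)
print(mid[0, 0])  # 50.0, halfway between the two source pixels
```

The cross-fade is also why naive interpolation produces ghosting on fast motion: the "in-between" frame contains both positions of a moving object at reduced opacity rather than the object at an intermediate position.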

~~~
Aardwolf
So do the televisions use 120 fps then?

Phew, then at least displaying 24 fps movies without interpolation should be
possible because 120 is an integer multiple of 24.

~~~
mrob
Yes, or higher than 120fps. And 120fps can indeed be used to show 24fps with
consistent frame timing (as well as 30fps and 60fps, so it's a convenient
frame rate).
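The arithmetic behind that convenience is easy to check: 120 divides evenly by all the common source rates, so each source frame can be held for a whole number of refreshes with perfectly even cadence (unlike 3:2 pulldown on a 60 Hz display):

```python
# A 120 Hz panel divides evenly by every common source frame rate,
# so each source frame is held for a whole number of refreshes.
for src_fps in (24, 30, 60):
    assert 120 % src_fps == 0
    repeats = 120 // src_fps
    print(f"{src_fps} fps -> hold each frame for {repeats} refreshes at 120 Hz")
```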

~~~
Aardwolf
It's got to be bad for cycling or motorcycle sports, by the way.

I saw some videos on YouTube showing motion smoothing in slow motion, and
heads and bodies were moving at different speeds between frames.

------
dreamcompiler
This is yet another example of technology designed to impress consumers in the
store which actually worsens the experience once they get the product home.
Apple's business strategy for computers has been centered around this trick
since Steve Jobs died.

------
sneak
My new top-of-the-range Sony 4K UHD HDR OLED has this setting turned on - FOR
EACH OF FOUR INPUTS. You have to turn it off (as well as a bunch of other
adaptive contrast junk) _per input_. It’s maddening.

------
PorterDuff
Just as an aside, there was a lot of work on this kind of thing (i.e.
'improving' 3:2 pulldown) before the modern flat-panel TV. Improving slow
motion sucked up a few kilobucks of engineering time.
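For anyone unfamiliar with the term: 3:2 pulldown maps 24 fps film onto 60 fields/s interlaced video by alternately holding each film frame for three fields and then two, and that uneven cadence is exactly the judder those engineers were trying to smooth. A minimal sketch of the cadence:

```python
def pulldown_32(frames):
    """Map 24 fps film frames onto 60i fields with the classic 3:2 cadence.

    Frames alternately contribute 3 fields and 2 fields, so every 4 film
    frames become 10 fields (5 interlaced video frames): 24 fps -> 60
    fields/s.
    """
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

fields = pulldown_32(["A", "B", "C", "D"])
print(fields)       # ['A','A','A','B','B','C','C','C','D','D']
print(len(fields))  # 10 fields from 4 film frames
```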

------
lousken
I've tried SVP, which is motion-interpolation software, and while anime looks
amazing with it, it totally destroys any real motion picture.

------
jajag
Is the article actually suggesting that the brain subconsciously detects the
frame rate, and then "frames" the movie to match the perceived medium? High
frame rate means TV implies soap opera, low frame rate means film implies more
serious, weighty content? Or am I completely misunderstanding what this "soap
opera" effect is?

~~~
jim-a-1020401
I think it is perceptual, but not that specific to the medium. A faster frame
rate “feels” closer to reality, and the 24fps frame rate is far enough from
reality that it’s easier to perceive as “something else”, which allows fiction
or dramatic content to be perceived differently. I tend to agree with that
conception of it: when the frame rate gets high enough not to be noticeable,
the brain treats the image more like real-world perception, holds it to a
higher standard, and judges it more harshly, so it ends up seeming like
lower-quality reality instead of fantasy.

------
lagadu
The root cause of the problem is cinema still using 24fps instead of a proper
standard in 2019 (with a rare exception here and there).

~~~
mrob
100% agreed. 24fps is far too low for fast motion. People in the industry talk
about "art" to justify their low frame-rate, but I don't believe them for a
moment. 24fps lets film-makers get away with sloppy technique that would be
exposed if we could actually see moving things clearly. 24fps is cost-cutting.

TV makers are just trying their best to improve things under the constraint of
bad quality source material. Motion smoothing isn't perfect, but I'm not
convinced it's worse than the alternative.

~~~
andybak
It's just peculiar that all the directors named in that article, and almost
everyone I know who thinks in any depth about this issue, disagree with you.

It's conceivable I might "get used to it", but that's like telling someone who
hates cilantro that they will "get used to it". They don't want to and neither
do I.

~~~
mrob
It's not peculiar in the slightest when, as Upton Sinclair famously put it,
"it is difficult to get a man to understand something, when his salary depends
upon his not understanding it." The industry is under increasing financial
pressure (e.g. see [https://www.vanityfair.com/news/2017/01/why-hollywood-as-
we-...](https://www.vanityfair.com/news/2017/01/why-hollywood-as-we-know-it-
is-already-over) ), so why increase costs when customers are still buying
24fps material?

~~~
greedo
How on earth would increasing the frame rate increase costs? It's all (mostly)
shot on digital media, so there are no extra film costs like in the
pre-digital era. Storage costs are trivial for digital media. It wouldn't
increase the time for setting up shots, and probably wouldn't cost extra in
post-production.

~~~
mrob
Brighter lighting required. Scenes become impossible to shoot with natural
light. Props and sets need more detail. Unless the actors are very good, more
takes are needed to get the acting right. More work needed on continuity, and
more reshoots because a mistake is obvious. More VFX render time (and more
manual work in some cases, see
[https://news.ycombinator.com/item?id=20600107](https://news.ycombinator.com/item?id=20600107)
).

And storage costs are not trivial. Assuming a modest 10:1 shooting ratio, 8K
resolution, 120fps, and 36bit color (12 bits per channel), that's 1.29PB of
raw data for a 2 hour movie. Some genres (notably documentaries) have much
higher shooting ratio.
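That figure checks out as a back-of-the-envelope calculation for uncompressed raw footage:

```python
# Back-of-the-envelope check of the 1.29 PB figure (uncompressed raw).
pixels = 7680 * 4320                  # one 8K UHD frame
bytes_per_frame = pixels * 36 / 8     # 36-bit color (12 bits per channel)
total = bytes_per_frame * 120 * 2 * 3600 * 10   # 120 fps, 2 h, 10:1 ratio
print(f"{total / 1e15:.2f} PB")       # -> 1.29 PB
```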

~~~
greedo
1. Most scenes are lit with extra lights. It's extremely rare to shoot with
just natural light.

2. Prop and set detail has nothing to do with FPS, but with final image
resolution.

3. Actors are always good. They're pros. You'd be amazed at how good.
Regardless, the FPS has no bearing on the amount of retakes due to talent
performance.

4. FPS has nothing to do with continuity. Either the script supervisor catches
the error, or it's noticed in post and corrected then.

5. Storage costs are trivial compared to most production costs. Above-the-line
costs dwarf storage.

------
rv-de
Actually this effect has been observable for several decades, going back to
cathode-ray-tube TVs. Recordings from ordinary amateur video cameras look more
fluid, or at least differently fluid, compared to scenes from movies. This is
particularly obvious in some movies from the 90s that cut in scenes recorded
with regular video cameras, for example for dramatic effect. Oddly, few people
seem able to recognize this at all. To me this motion smoothing is absolutely
obvious, and I find it mostly terrible. Real HFR adds to the realism of
animation, in my opinion. But it seems that, for some subtle reason, some
people's visual processing is repulsed by this fake HFR, like an uncanny
valley: it's smoother, but in a way that doesn't seem right, even if you're
not sure why.

