Motion smoothing is ruining cinema (vulture.com)
79 points by fanf2 77 days ago | 167 comments



Turned off the "feature" as soon as the TV was on. I've been advising friends and adjusting their expensive OLED TVs as well. In every case they were grateful for me "fixing" their TVs, which only took a few minutes.

People just don't want to muddle around in the settings. And herein lies the problem. It would be great if manufacturers would open up their interface settings to be influenced more easily by media players or set-top boxes. Those, in turn, could have a ruleset: if 24p, disable motion blur; otherwise enable it. If sports, enable motion blur. And so on.

It's not even limited to motion blur: I have low-light and normal-light profiles I'd like to have changed automatically based on the actual light in my room.
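
A rough sketch of what such a ruleset could look like (the API and field names here are entirely made up; no TV exposes anything like this today):

    # Hypothetical ruleset a set-top box could push to the TV, if TVs
    # exposed their picture settings. None of these APIs exist today.
    def picture_settings(content):
        settings = {"motion_smoothing": False, "profile": "normal"}
        if content.get("category") == "sports":
            settings["motion_smoothing"] = True
        elif content.get("frame_rate") == 24:  # 24p film: leave it off
            settings["motion_smoothing"] = False
        if content.get("ambient_lux", 200) < 50:  # dim room
            settings["profile"] = "low_light"
        return settings

    print(picture_settings({"frame_rate": 24, "ambient_lux": 30}))
    # {'motion_smoothing': False, 'profile': 'low_light'}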


Motion smoothing is designed to reduce motion blur. It doesn't always work, because it's just heuristic processing that attempts to estimate the missing information, but when the motion is easily predictable (pans are an obvious case), and the exposure time (shutter angle) is low, it can do an excellent job of interpolating the missing frames. This eliminates the sample-and-hold blur[0] that would otherwise appear. It's motion sharpening, not motion blur.

[0] Blur Busters has a good introduction that's relevant to all non-flickering displays, not just OLED: https://www.blurbusters.com/faq/oled-motion-blur/
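
To make the "heuristic" part concrete, here is the naive alternative that motion interpolation has to improve upon, as a sketch assuming frames are numpy arrays. Plain averaging only produces ghosting on moving objects, which is why real interpolators estimate motion first:

    import numpy as np

    def naive_midpoint(frame_a, frame_b):
        # Blend two frames 50/50. This is NOT what motion smoothing does:
        # on anything that moves it produces a double exposure (ghosting).
        # Real interpolators first estimate per-block motion vectors and
        # shift pixels halfway along them before blending.
        a = frame_a.astype(np.uint16)
        b = frame_b.astype(np.uint16)
        return ((a + b) // 2).astype(np.uint8)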


I work in the lighting control industry. Your final paragraph piques my interest. I’m curious, why automatically change the TV if you could automatically change the room? Shades drop, light intensity lowers, CCT warms, etc...?


Because the vast majority of people don't have multi-thousand dollar room control setups but they would find some benefit from their entertainment devices being able to describe their features to each other.


Yes, cost is always a factor. That said, if the room control was affordable, would changing the space be an objectively better solution than changing the display?


I think ultimately you probably want both, because sometimes you (... or at least I) have different wants around viewing. Sometimes I'm going for a cinematic experience, where I want the room dark and all focus on the media. Other times I want to be half watching something while playing a board game or doing some cleaning. Optimal lighting conditions for these scenarios clearly differ, and probably optimal display settings to match.


Most people aren't exclusively using their TV to watch movies. When turning on the TV in the afternoon to watch your favorite soap opera with the finest motion smoothing, you might not care that much and just want the TV to be a little brighter.


Fair enough. Wonder if any SmartTVs have an API endpoint for brightness/profiles. It’d be cool to have an external/remote ambient light sensor that could change your TV setting automatically.


Sensible defaults, oh where did I hear that from?


This whole thread and article is like it's from an alternative universe to me. All of my tech-savvy friends go to great lengths to turn motion smoothing on everywhere ("smooth video project" is the best I know for PC) they can so that panning shots don't look awful. They wouldn't play videogames at 24Hz, so why should they watch a film at 24Hz?

It does introduce artefacts. The solution to all of this is for filmmakers to move on from the 1930s and film at 120 fps or higher. If people only want to watch a quarter of the frames, it can be downsampled easily.

The argument reminds me of musicians who didn't want their albums chopped up and sold as individual tracks, to be consumed however listeners desired.


Panning shots probably look awful if you're watching 24 fps content on a device running at 60Hz (many consumer devices, computers, and Blu Ray players do this by default). Since 24 frames can't be evenly spread out over 60 frames, you'll see motion judder, and it does look terrible.

However, 24 fps running at a true 24Hz is perfectly fine, and you will not see flicker - at least not if produced by competent filmmakers. That's because the camera's shutter angle is normally set such that motion blur makes the motion look smooth. Real cinematographers also know exactly how fast they can pan with a given lens without causing any noticeable strobing motion.

Comparisons to video games completely miss the point. Most video games are unable to properly simulate motion blur (some try, but it doesn't usually work well), so you have to have high frame rates for things to look smooth. Since you need to react quickly in video games, high frame rates (combined with the lack of motion blur) are also helpful in preserving crisp edges on moving objects (so it's a practical advantage). And finally, video games generally try to simulate reality for the player, so players are more concerned that the technology makes the experience believable rather than intentionally suspending disbelief (as in film or theater) to passively unpack the narrative nuance of artwork unfolding on a screen.

For those same reasons, sports also do well with higher frame rates, but when fiction (which is obviously disconnected from present reality) tries to use high frame rates, it falls into "uncanny valley" territory - much like the computer-generated humans of movies like "The Polar Express". As others have noted, a few directors have really tried to push HFR film to the public, but it has never been received well (whether or not HFR is recognized as being responsible for the uneasiness felt by audiences watching HFR content).

As for HFR content being "easily" downsampled - it really isn't, at least not with respect to motion blur (which is an essential point that many people miss in these discussions).


I'm not talking about 3:2 pulldown; I'm exactly the sort of sad sod who has spent the time writing scripts to match my display's refresh rate with the media that was running. I strongly dislike watching panning shots at 24Hz; I do not think it looks perfectly fine.

You're right about motion blur making 24fps look less awful, though. I'm sure you know that filmmakers often reduce it in fight scenes to deliberately make viewers a bit uncomfortable. Thanks for pointing out that downsampling wouldn't work without some extra effort (e.g. keeping the same amount of blur but with extra frames, so that it looks normal when downsampled), as that isn't something I had considered.

I find the suggestion that videogames don't need to suspend disbelief, but films do, very strange.


> I find the suggestion that videogames don't need to suspend disbelief, but films do, very strange.

True, the distinction may seem strange to some, and it's not always a clear distinction. Here's how I'd describe it in more detail (just my opinion of course):

In film, we see a person depicted on the screen with certain characteristics that convey emotion (just as we do with a group of lines comprising a stick figure with similar emotion), but we are forced to activate parts of our brain that span the gap between what we see (which obviously isn't reality) and what that character might feel in his/her universe.

To some degree that same experience happens in video games, but I would argue that instead of seeing the emotions and trying to understand them, we (as players) are the person feeling the emotion. In one, we are witnessing events happen to someone else (and exercising empathy), whereas in the other, we are experiencing the events first-hand (even in a 3rd person view, we're playing as if we are the character in view).

A first-hand experience feels most believable when it's actually realistic (hence the need for high frame rates), but a simple stick figure can feel "believable" if we're engaged and sharing the emotion we believe it to be feeling.

As a side note, maybe that's why I personally prefer classic (8-bit era) video games over modern 3D video games :)


24fps pans don't flicker, but they do easily lose the sensation of motion and look like a series of static pictures, even with perfect zero-judder frame timing. This can feel a lot like flicker, and even if you can perceive it as motion it's still very bad looking motion.

Closed down shutter angles are only needed to mitigate the blur of low frame rates. If you're shooting at 120fps you can use a fully open shutter (which helps with the lighting requirements too), which gives you a choice of 5 different simulated shutter angles when you downsample to 24fps by blending frames. Blending 2 consecutive frames then deleting 3 gives you a look very much like modern 24fps film.
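
A minimal sketch of that blend-2-drop-3 downsample, assuming frames is a list of numpy arrays shot at 120fps with a near-360° shutter:

    import numpy as np

    def downsample_120_to_24(frames):
        # Average 2 consecutive frames, discard the next 3. Each output
        # frame then integrates 2/120 s of a 1/24 s frame period, i.e.
        # a simulated 144-degree shutter angle.
        out = []
        for i in range(0, len(frames) - 1, 5):
            blend = (frames[i].astype(np.float32) +
                     frames[i + 1].astype(np.float32)) / 2
            out.append(blend.astype(frames[i].dtype))
        return out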


>As for HFR content being "easily" downsampled - it really isn't, at least not with respect to motion blur

Er, why not? Motion blur is simply time integration. Why can't you simply average adjacent frames during your framerate decimation?

In fact, since "real" motion blur from shuttering uses a likely-suboptimal square windowing function (the shutter is binary open/closed), I'd hazard that motion blur computed from a higher source framerate could potentially be more perceptually pleasing (by using a smoother windowing function). You have more information to work with, after all.
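
As a sketch of that idea (assuming a 120fps numpy source decimated 5:1 to 24fps), the box window of a physical shutter can be swapped for a smooth Hann window when summing each group of frames:

    import numpy as np

    def decimate_with_window(frames, taps=5):
        # Weight each group of `taps` source frames with a smooth (Hann)
        # window instead of the box window of a physical shutter, then
        # keep one blended frame per group (120fps / 5 = 24fps).
        w = np.hanning(taps + 2)[1:-1]  # drop the zero endpoints
        w = w / w.sum()
        out = []
        for i in range(0, len(frames) - taps + 1, taps):
            group = np.stack(frames[i:i + taps]).astype(np.float32)
            out.append(np.tensordot(w, group, axes=1).astype(frames[0].dtype))
        return out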


> Why can't you simply average adjacent frames during your framerate decimation?

In theory you could, but in practice, it doesn't work very well and you tend to get some really nasty artifacts depending on the motion of the scene. There's a lot more to it than simply averaging adjacent frames since those source frames likely aren't capturing perfect 360° shutter angle time slices that seamlessly align (temporally) back-to-back.


Ang Lee already tried the 120fps thing, it was a colossal flop. No one likes watching movies in anything other than 24fps dude.

Filmmakers filming in 120fps would destroy the film industry overnight because NO ONE would watch what they produced.


> No one likes watching movies in anything other than 24fps dude.

I know of at least 6 people who prefer higher framerates, myself included. I don't know anyone who has expressed an opinion the other way. As I said, this whole thread is like some kind of alternative universe to me.

Motion smoothing apparently hasn't stopped people buying TVs.


Many people hate motion smoothing though; my mom was complaining that their new TV made everything look like a soap opera until I disabled it.

Like many other things, this is all down to personal preferences; similarly I have friends who can't stand to game at anything <120FPS and one who gets motion sick at 60FPS, while I would rather turn up the graphical detail and survive with 60. Neither opinion is more correct, and you have had several people upthread saying they prefer 24 for films (myself included).


You've missed the part where 120 is divisible by 24. You can losslessly go from 120 to the other popular frame rates, including 24.

Edit: nope, recreating the motion blur that's common in 24fps from a higher-fps source would be way more expensive.


> You can losslessly go from 120 to the other popular frame rates, including 24.

Not true. Something filmed natively at 24 fps and a 180° shutter angle is vastly different in terms of perceived motion than 120 fps converted to 24 fps.


If you shoot 120fps with 360° shutter angle you can downsample to 24fps with 144° shutter angle, which is close enough to the traditional 180°. And in any case cinematographers often vary the shutter angle, so viewers are already used to changes. Narrow shutter angle is common in modern action movies.


Yes, cinematographers do vary the shutter angle for artistic purposes, but that just confirms the fact that the amount of motion blur present in a film is a deliberate artistic decision rather than a side effect of using an "antiquated" frame rate.

As for downsampling, I believe your math is off. 120 fps at 360° means each frame is exposed at 1/120s. Conforming that to 24 fps gives you the same exposure duration per frame (you're just throwing out 4/5 frames), so you're effectively at a 72° shutter angle - less than half of what's considered "normal".


I only said that 144° effective shutter angle is possible, not that you can achieve it by deleting 4/5 frames (which does give you effective 72° shutter angle). You actually need to delete 3/5 and blend 2/5 (and the 360° shutter angle of the 120fps source needs to be accurate enough that you don't see the join).
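
Spelling out the arithmetic (effective shutter angle = exposure time per output frame / output frame period x 360°):

    src_exposure = 1 / 120  # seconds each 120fps frame exposes (360° shutter)
    out_period = 1 / 24     # duration of one 24fps output frame

    drop_4_of_5 = 1 * src_exposure / out_period * 360     # 72.0 degrees
    blend_2_drop_3 = 2 * src_exposure / out_period * 360  # 144.0 degrees
    print(drop_4_of_5, blend_2_drop_3)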


Are 1/48s exposures really that common in cinematography?


Yes.

"By far the most common setting for cinema has been a shutter angle near 180°, which equates to a shutter speed near 1/48 of a second at 24 fps." [1]

"A 180° shutter angle is considered normal." [2]

[1] https://www.red.com/red-101/shutter-angle-tutorial

[2] https://en.wikipedia.org/wiki/Rotary_disc_shutter


Oh.

Right, replicating that exactly requires 240fps with fast shutter and discarding half of the frames. Which is a way more expensive proposition.


The ultimate solution here, I think, is to allow creators to encode the desired setting, frame rate, or content category in the metadata so that a user can benefit from the technology when watching sports without it detracting from a film later on. Then the default setting could be "auto" and diehards could override one way or the other.

I'm glad to see a brief tip to that solution in the article and hope it becomes part of an industry specification.
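
Hypothetically, the per-title metadata could be as small as a few fields; the names below are invented, since no such standard exists:

    # Invented field names, purely illustrative - no such standard exists.
    stream_metadata = {
        "native_frame_rate": 24,
        "content_category": "film",      # vs. "sports", "news", ...
        "motion_smoothing_hint": "off",  # "off", "on", or "auto"
    }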


An industry-standard media metadata interface for motion smoothing would be the best solution. It may not necessarily make it to the TV industry though. Although industry standards have been made and adopted widely before (such as USB), there are plenty of other standards that failed to be adopted (FireWire) despite technological superiority (data rates). The FireWire standard didn't really give a company much advantage for adopting it: USB worked well enough for most use cases, and Apple wanted too much in royalties.

Ultimately any decision a company makes is about money. From the article: "It's meant to create a little bit of eye candy in the store". TV companies think motion smoothing increases sales. It doesn't matter what film directors say about it.

The reality is that TV manufacturers are making a commodity, and they don't want to be. No one wants to sell a commodity; there's no money in it. It's a constant race to the bottom on price. That's why we see 3D TVs, curved TVs, and a myriad of gimmicks. Everyone wants to differentiate themselves. If there's a new industry standard that makes consumers and content creators happy, who cares? If I hire an engineer to implement it, will it be worth it?


"looks like a soap opera" exposes everything you need to know about the ignorance behind this position.

Reality does not judder.

If you want judder for some artistic purpose, you can have that, just like any other reality-distorting filter.

Some shots are intentionally blurry for valid artistic purposes. Should all movies and tvs be blurry at all times? Some shots are intentionally monochrome for valid artistic purposes. Should all movies and all tvs be b&w or red tinted at all times? Some shots are intentionally too bright or too dark, for valid artistic purposes... You want stroboscopic stop-motion, go ahead and use it.

Superior reproductive technology in no way prevents you from creating a scene that has that and any other kind of artificially applied crappiness like static interference or low resolution or analog scan lines... But crappy reproductive technology does preclude the other 99.9% of the time when you don't want an artificially crappy reproduction.

Video tape was cheaper than film, and tvs and video tape ran at higher frame rates than film, and soap operas were cheaper than movies, and so the one superior aspect of soap operas became associated with "cheap".

It's a freaking ignorant association is all. It's obvious as hell, and you have to be a moron not to recognize why you think "this looks like a soap opera" and see past that.


Seriously, if people are this determined to tie themselves into knots trying to defend their ridiculous bias that smooth motion = soap opera, it kind of makes me despair about whether it's actually possible to change anybody's mind about anything, ever.

Luckily, the fact that TVs with motion smoothing are selling suggests that the rest of us have spent the 2 minutes necessary to get used to it and realized that it's superior.


> Superior reproductive technology in no way prevents you from creating a scene that has that and any other kind of artificially applied crappiness...

The technology to produce movies at high frame rates has been around for a long time (and projection/display technology could have supported it much sooner if the content were there), and yet directors have deliberately, almost universally chosen what you categorize as "artificially applied crappiness" (24 fps). Practically speaking, it is artificially applied, and that's exactly it - it's an intentionally distorted version of reality, and it's that way on purpose. You're free to attempt to remove that filter (by smoothing motion on your TV), but that doesn't represent what the director intended.

Even with "superior reproductive technology", directors are still choosing to produce films at 24 fps, so can you at least appreciate and respect that that's how most of them intend their work to be displayed? True, "reality does not judder" (proper 24 fps content doesn't either, by the way), but reality isn't what filmmaking is about. To argue otherwise misses the point of film as an artistic medium. For other types of content, I agree - reality is the target, but not in film.


> Reality does not judder.

Neither do 24fps movies, except when you play them back on home equipment that doesn't display at an integer multiple of 24fps.

Judder is an artifact of the difference between home and theatrical display and preserving it (and giving alternatives a bad rep) is a deliberate attempt to keep the theatrical experience as a premium venue for viewing studio output.

Fighting for judder isn't about preserving anyone's artistic vision. And the people saying it is aren't even good liars.


The premise of the 24 FPS movie rate is that the human eye can't detect the difference at higher frame rates. But the higher frame rate TV demos look amazing, and resampled movies look like crap to people, precisely because they CAN see at more than 24 FPS.

I’m not saying that the resampling is right, but I am super questioning this film rate in a digital age.

I’m also questioning the algorithm that just creates a messy blur of nonsense between two frames. Surely we can create a smarter algorithm.


This isn’t true really, it’s been known since before the dawn of cinema that humans can appreciate higher frame rates.

24FPS was chosen as much for economic reasons as anything else - it’s one of the slowest frame rates cinema could get away with while still delivering a satisfactory experience.

Remember that higher frame rates meant higher consumption of expensive film before the digital era. Lowering the FPS to the lowest you could reasonably get away with reduced the cost of materials significantly.

That it’s still popular today is at least partly testament to the conditioning effect of decades of 24FPS movie consumption. Audiences often find the higher FPS look and “feel” strange, at least in a cinema context, such as the backlash the 48 FPS cinema presentation of the Hobbit movies received.

I strongly believe that if industry had settled on say 48FPS as the standard we’d be conditioned to “expect” that look and complain that 24FPS looks strange.


For context on when the "digital era" started: it was newsworthy when a middle season of Dr House switched to digital.

Also: cinematic film is what keeps Kodak alive.


The best software for this I’ve seen is Tachyon but it’s way above consumer technology level for cost and compute resources. https://cinnafilm.com/product/tachyon/

Like the interface and OS I assume the TV manufacturers are mostly using cheap versions of what they think they can get away with without pushing up their costs.


24FPS is chosen because of the work it engages the brain in; it's a sweet spot between seeing film and seeing a series of pictures.


Ozu Yasujirō was famous for his use of static cameras and lack of action scenes. His films are probably the most low-motion mainstream films ever made, which means they do not engage your brain very much in interpreting the 24fps motion. And yet they are still highly regarded by critics, and in my opinion highly engaging in terms of characters and story. Would they have been improved by shooting at 12fps, to make your brain work as hard as it does with conventionally shot films?


Watch La Jetee (Chris Marker). A "film" comprised mostly of stills with just a few actual scenes with movement.


> A "film" comprised mostly of stills with just a few actual scenes with movement.

…which was remade into a high-budget Hollywood CGI-laden production, starring Bruce Willis and Brad Pitt. I’m not even joking; it’s called 12 Monkeys, and was released in 1995.


And into a television series on SyFy.


"La Jetée", the original "12 Monkeys".


Also consider that a film is watched in a different environment and lighting than a TV.


The problem is the refusal of hollywood to release true 60fps content. I'd take a 60fps version of a movie any day over 24 or 30 fps.


Agreed. It might seem weird at first, but I think that in the end it'll be like transitioning from B&W to color. Our brains will adjust and we'll realize that we were missing out on all that fidelity.


60fps changes your perception of a movie. For some reason, at ~24fps you get the cinematic feeling, while with 60fps you get an "amateurish videographer" feeling. Not sure what the science behind that is.


It's just conditioning. If it had been the opposite, you'd find 24 jarring and 60 natural.


24fps lets you focus attention with motion blur (in a tracking shot on two characters walking, they will be in sharp focus while everything else is blurred but still conveys movement, etc.). It's a similar tool to depth of field. However, a 60Hz movie could just have 30Hz portions. Stuff like panning cameras over a landscape looks terrible at 24Hz. I don't know how you could combine the two without making it feel like the technology keeps changing throughout the movie, though.


You could easily add high quality motion blur in 60 FPS if that was the desired look.

More FPS is really no different than increased resolution. There is nothing sacred about 24, 480, or 1080.

It's all conditioning ;)


How would you add lush blur to a tracked shot in 60fps? The exposure wouldn't last as long so you'd have to do it digitally wouldn't you? Then it can be hard to keep the tracked subjects unblurred or can make things look interpolated.


The lenses would still have a limited depth of field though, like a photo from an SLR.

Wouldn't that make the background be blurred? Or is that kind of blur somehow different to motion blur?


Motion blur indicates direction, you could maybe use a weird aperture shape to simulate it, but things at different depths wouldn't blur right due to not picking up different parallax movements the same way, and other moving objects that weren't in focus wouldn't have their motion blur picked up correctly.


Oh yeah, that makes sense.


Faster shutter = less motion blur?


I, for one, don’t want to watch movies that aren’t in 24fps.


Why? Seriously. This sounds as crazy to me as someone saying: "I, for one, don't want to watch movies that aren't 320p."


Yet, black and white photography is alive and well


What's your point? Outhouses are still alive and well too. That doesn't mean modern sanitation isn't vastly superior.

Not sure what that has to do with this discussion.


You seem to have doubted that rational people might choose limited tech, in the very post I replied to. So I provided an example of a well-respected class of art where much of the image information is discarded, ignoring decades of progress.


I don't doubt rational people choose limited tech in specific circumstances. I doubt rational people choose limited tech in all circumstances.

What would you think about someone that only wants to see black and white photography and won't even take a look at full color photographs?


There's a big difference between appreciating black and white photography and thinking that color photography should not exist.


Higher resolution is unambiguously better.

Higher frame rates are not.


Why? Every single time I see 60fps content it feels like higher quality than standard.


Most people who are into film do not agree with you, hence the fact that higher frame rates being better is ambiguous at best.

Nobody is disputing that higher resolution is better. You have most of an industry screaming about the terrible trend of higher frame rates.

I’m glad that you prefer it, but that doesn’t make it an unambiguous benefit.


Where do film people scream about higher framerates? I know at least that James Cameron wants to move to 60 [1].

[1] https://www.cinemablend.com/new/James-Cameron-Still-Making-T...


What TV manufacturers are doing is unacceptable. It introduces strange artifacts and the motion looks very artificial compared to a native 60fps video. So it really doesn't solve anything.

The argument that it's the filmmakers' fault for using 24fps is flawed. What about animation? What about stop motion? Those may never be produced at a higher framerate. Movies with high VFX budgets will be much more expensive due to the additional frames when rotoscoping and compositing.

So if it is inferior and only makes sense for a subset of productions, why is it the default?


In many cases the bulk of the VFX budget is artist wages, not CPU time. If you already have the motion paths set up then it doesn't take much more human effort to render more intermediate frames. You might even be able to reduce the cost per frame by exploiting the smaller differences between frames, e.g. with temporal noise reduction.

Hand drawn or hand posed animation is the rare case where increased frame-rate really would take more human effort, but it's often already produced at extra-low framerate (e.g. 12fps or 8fps), with frames duplicated for playback. The same duplication can happen with higher playback framerates. Modern video codecs handle duplicated frames well.


And yet many fully computer-generated movies today are converted to 3D in post, rather than being created in 3D in the first place. For CGI movies, doing 3D at creation time gives better results and requires zero human labor (it's just a translation of the camera's matrix), so why isn't it always done?


CG movies are created in 3D, they use 3D modeling software. There's nothing to "convert" to 3D unless they lose the original models and have to retrace them or something. Which apparently is a quite common occurrence, according to some articles on the 2D to 3D conversion of Toy Story and Shrek. So collateral damage from capitalism is the main explanation for old movies.

For new movies, where there is proper data, the 3D camera for each scene still requires more than "zero human labor"; you have to specify the inter-ocular distance, convergence plane, and tweak the scene a little for cinematic effect or 3D weirdness. And since there are thousands of shots, and the work is fairly tedious, it makes little sense to work on it in parallel with 2D during production when shots might just be cut entirely. So you end up with a bias towards 2D or 3D during the creation process.

DreamWorks apparently has a made-for-3D animated movie division (https://animatedviews.com/2007/dreamworks-goes-3d/), where presumably the movies are flattened to 2D in post. Reviews of their movies (https://www.3dor2d.com/reviews/how-train-dragon-hidden-world...) unsurprisingly say "great use of 3D". Since that's the goal of the division, I'd expect them to continue doing 3D-first movies.

In contrast Pixar so far has stuck to a more 2D style. In this interview (https://www.youtube.com/watch?v=HepIGDJK98s) their stereo guy says they do 3D in post. The 3D in their movies is described as "well done but a minor aspect" (https://www.3dor2d.com/reviews/Toy-Story-4-3D-3-D-Movie-Revi...). In the interview the guy basically says that he could use more 3D but he's more concerned with keeping the director happy so he doesn't. Pixar is now focusing on originals so maybe one of their next films will be 3D oriented. But they're making tons of money anyway so who knows if the right director will come along.


> CG movies are created in 3D, they use 3D modeling software.

Of course. I should have been more specific; I was talking about stereo conversion. You addressed that. Thanks for your explanation. It's not as labor-free as I had assumed.


That's why I specifically referred to rotoscoping and compositing, not rendering. The artist has to edit frame by frame.


Isn't MPEG or any other encoding doing similar things?


MPEG does motion compensation for compression AFAIK, which is different to motion interpolation, which is what is being discussed.


One problem is that TVs run at 60fps and don't support adaptive sync. 60 is not evenly divisible by 24. Without some kind of trick you're padding to 30fps by duplicating every 4th frame, or trimming to 20fps by dropping every 6th frame.

Padding is usually what's done. You can see it in NTSC (30fps) DVDs which come from something originally produced for PAL (25fps). Every 5th frame is duplicated.

It makes me curious what framerate movie projectors run at. Maybe the "cinema experience" includes goofy frame duplication and nobody realizes it.


Traditionally, each film frame is projected 3 times. The psycho-perceptual reason is that the phi-phenomenon (“light chaser effect”) kicks in about 12–15 Hz, so camera frame rate must exceed that to give a perception of smooth motion. The critical flicker frequency is around 50 Hz for most people, so projected frame rate must exceed that to avoid the appearance of flickering.

Silent movies were shot 18 fps and projected 54 fps. The frame rate for talkies was boosted to allow enough bandwidth for the audio. The reason silent movies look jerky today is every third frame is projected twice. They are smooth when properly projected.


50Hz is below the flicker fusion threshold for most people. Even 60Hz visibly flickers for many people.

Silent movies were shot and projected at many different framerates. Some early silent movies were shot at 18fps, but the average frame rate increased over time, and the standard 24fps was picked as a compromise that was around the average frame rate at which films were displayed at that time.


Nowadays with digital TVs that conversion between frame rates is completely pointless. The playback device just switches to a different frame rate and resolution and the tv should adapt. My cheap TCL TV accepts different rates just fine. I can somewhat understand that they keep doing that for DVD releases since someone somewhere might watch it on their old trusty CRT, but other than that you might as well just import the release that has the original frame rate. Oh wait, there are those region codes right? opens the pirate bay ...



My kids aren’t bothered by high frame rates, so personally I suspect it only bothers adults who are used to 24fps.

> And an entire cinematic language has developed around the rate of 24 frames per second — the way actors perform, the way shots are composed and cut and cameras move. (This is why an awards show or a news broadcast shot on video at a higher frame rate looks and feels different from a film.)

This is a real stretch. Very, very few filmmakers are doing anything specific for 24fps that they wouldn’t do in 48 or 60. If they did, they’d slow things down so you could see them, but instead we have ever faster and faster fight sequences in Marvel and Transformer movies where you can’t even see the details during the action.

The “cinematic language” of a news show vs an action movie is different, but has almost nothing to do with frame rate, and if frame rate was the main issue, we’d be doing news in low frame rate and action movies in high frame rate.

Horizontal pans in 24fps have started to drive me crazy. Films do it all the time, and you can barely see anything while that’s going on. Higher frame rate pans are so much easier to watch, even if they make the movie look like BBC tv.


It's more than just a higher framerate. It's false motion, which some are hypersensitive to because it's not natural looking.


> It’s false motion

I agree if we’re talking about motion interpolation. I’m personally sensitive to artifacts from the 24->60 conversion, and I sometimes turn off frame rate up-sampling, unless the show has a lot of panning, and then I prefer the frame rate up-sampling despite the artifacts it causes sometimes. Like Game of Thrones for example has so much panning that I kept turning on the motion smoothing. But then the artifacts were so bad I kept turning it off. There isn’t a good option, so I wish it was filmed at a higher frame rate, and then displayed lower for people who prefer that.

Above I was really only referring to high frame rates in general and deconstructing the argument that 24 is inherently better for artistic reasons. While many filmmakers and viewers like 24fps, most filmmakers aren’t choosing 24fps nor are they doing anything specific with 24fps. 24 is just a standard that almost everyone is stuck with, most people wouldn’t even consider it a choice today, save the few directors with big enough budgets and studio backing that they can distribute films in a non-standard frame rate.


Most adults aren’t bothered, either.


I don’t understand why TV manufacturers can’t just have dynamic frame rate change to match the content like Apple ProMotion (https://m.gsmarena.com/understanding_apples_promotion_displa...)


Newer TVs are starting to support dynamic refresh rates, ostensibly for gaming modes.

I wonder, if this tech becomes mainstream, whether it will eventually be used for normal content.


All TVs are capable of displaying 50Hz European content natively. Converting 24fps to that is trivial: play it at 25fps (not noticeable) and show every frame twice.


Money.


And ebook readers that allow changing of fonts are ruining book publishing.


I can't tell if you're joking, but as an undergrad I purchased a math textbook I needed last-minute as a Kindle ebook, and it was an awful mistake, in part because of text reflow. You'd see a formula at the top of a page, but days later when you went back to look for it, it would be at the bottom because the text had reflowed. It really threw things off.

There were other problems: the index was garbled and useless (it still had the page numbers of the paper book in it and the formatting was completely off), and the text could be resized but the formulas were rasterized, did not scale with the text, and were frankly hideous. This was from a major publisher. I'd much rather have had a PDF.

But for novels, changing font size and hence reflow of text are super nice.


To help you, perhaps you can get the PDF from Library Genesis. Google "libgen", then enter the title or ISBN of your book.


So this is what made watching the new smart TVs at the showroom feel "too realistic" and "less cinematic" to me. I wrongly thought it was a side-effect of the super high resolution - I didn't even know this was a thing before I read this piece.

This has no value except being a gimmick which can be used to say "look how smooth/real it is". Worse is setting it to "on" by default. They should have just had a button on the remote with a marketing-inspired name, like "motion flow" or something, which people could have turned on if they wanted this.


I don't mind the high frame rate so much as the obvious artifacts.

I can't help myself and I end up hunting for artifacts to be annoyed by more than enjoying the show.

Every TV that uses motion interpolation is the same, it's horrible.


This is exactly the reason, and it makes everything look like a horrible soap opera. 24fps and motion blur are a lot of what makes movies look like movies. Otherwise it turns into someone's home video.


For decades broadcasters have converted movies to NTSC video using three-two pull down, where each movie frame is converted to either three or two video fields in order to convert the 24 frame rate to the 60 field rate (well, actually 29.97 frames/second to 59.94 fields/second). https://en.wikipedia.org/wiki/Three-two_pull_down

How can the new smoothing conversion be worse than that?
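
For reference, the 3:2 cadence itself is mechanical - every 4 film frames become 10 interlaced fields, as in this sketch:

    def three_two_pulldown(film_frames):
        # Film frames alternately contribute 3 and 2 interlaced fields,
        # so 4 frames at 24fps become 10 fields at 60 fields/s.
        fields = []
        for i, frame in enumerate(film_frames):
            for _ in range(3 if i % 2 == 0 else 2):
                parity = "top" if len(fields) % 2 == 0 else "bottom"
                fields.append((frame, parity))
        return fields

    print(three_two_pulldown(["A", "B", "C", "D"]))
    # A,A,A  B,B  C,C,C  D,D across alternating top/bottom fields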


My issue with TV smoothing is that it is inconsistent. It can revert back to the original framerate when the scene is too complex.


I agree that it's a problem, but a constant 24fps is also inconsistent in how much blur it produces. 24fps is fine for a static talking scene, but as soon as the camera moves everything turns blurry and jerky looking. That harms my immersion just as much as inconsistent motion smoothing.


I guess those things are subjective. I don't mind some motion blur, in the same way I don't mind hand-animation smears.


That is copying the original frames pixel for pixel. Motion smoothing involves interpolation heuristics (if you're lucky) between two original frames, creating an artificial frame that the film creator never intended nor vetted.


To me it's like music creators telling people not to listen to their music on v-shaped headphones. Yeah, it's not what you intended. But that's what they've chosen to use.

Also, if they want the picture to be displayed in a certain way, couldn't they push the movie as 60 frames per second when it's in fact 24 frames per second with duplicated frames? Wouldn't that effectively disable motion smoothing?


Nobody’s telling anyone to do anything. The problem people are talking about is that this setting looks like absolute garbage and it’s inexplicably turned on by default on almost all TVs these days. The only people who think otherwise are people who just literally do not care at all about picture quality and probably also do stuff like stretching the picture to get rid of letterboxing. These are not the people that default settings should be based on.


If it says 200Hz on the box, it must be better than the one that's only 120Hz, right?

This is the amount of research people do before buying something they'll stare at for 4h per day for the next 5 years... Never mind that the native framerate of the panel is only 60Hz.

This is why we get Vivid HDR mode ("Torch mode") and motion smoothing enabled by default.


Quite an opinion you've got there. I too think the smoothing is odd, but I'm over 35.


I'm certain age has nothing to do with it. The only thing that determines someone's opinion here is how much they care about accurate AV reproduction, and that's not driven by age, only passion for the content.


It's possible those who will grow up with higher frame rates will consider the smoothing normal. And they may consider 24fps choppy. Time will tell.


I wonder if we're focusing on two different things here. I'm not thinking so much about the merits of the 24fps video experience vs. the 60fps video experience. I'm just focused on clean playback of the source content. If something was recorded in 24fps, I want to see it in 24fps. If something was recorded in 60fps, I want to see it in 60fps.

I'm more complaining about the automatic 24->60 (or 120 or 240) conversion that's enabled by default on a lot of new TVs. I'm not at all opposed to higher frame rate video, I like it. I've just never watched something on a TV where the 24->60 conversion didn't look significantly worse to me than playback at the native frame rate.

Maybe my unconscious assumption is that people's preferences about the ideal frame rate can change over the years, but that reinterpolating the video to a different frame rate will always look worse than native, regardless of the source and destination frame rates. Maybe this isn't true, but that is my assumption.


And to me it is like a painter complaining about a museum which shines a RGB-Disco-Light onto a piece of work whose colors have been purposefully considered.

Music as well as film are pieces of work that only really exist in their projection: a playthrough or a concert. Everything that tampers with that projection without being a willful choice of those projecting it makes it harder for artists to speak to an audience in the way they intended.

There are perfectly fine reasons to change a projection, e.g. for the visually impaired or because you just prefer it that way, but ultimately it should be a willful decision to deviate from the expected default.


It's not a cinema. So it's not a painter complaining about a museum. It's a painter complaining about the light in your living room.

I understand the artist, but most people won't get best of anything. You don't see coffee companies complaining about your grinder settings even though you definitely could get a better taste if you only paid attention.


I take it that you aren't a fan of pan and scan.


Indeed : )


That has its own issues - you can't completely "smoothly" convert 24FPS source material to 60FPS, and many viewers can detect the uneven frame cadence when this is done, due to the inconsistent introduction of extra frames needed to fit the 60FPS cadence. You can often see it in slow panning shots, where it introduces a perceptible judder to the video.

While the Wikipedia article on 3:2 pulldowns mainly discusses converting 24FPS film to 30/60 FPS broadcast, the same conversion is virtually always done on digital material as well, in software at playback time or by the video codec changing the frame rate.

As mrob also writes, you usually only get smooth frame rate changes if the new frame rate is an integer multiple of the original one, as then the cadence of extra frames is consistent. This is why 30FPS video doesn't have this problem at 60Hz, but 24FPS does.

> https://en.wikipedia.org/wiki/Three-two_pull_down


60 isn't an integer multiple of 24, so it would result in uneven frame timing. It would look worse than playing 24fps with every frame shown for the same time, which modern TVs can do.


That solution sounds incredibly convoluted. Besides, the next (or current?) generation of TVs would probably analyse the frames anyway.


It's just a few additional bits since everything is compressed anyway.


I must admit, I might not even care that much if the processing weren't so flawed. What drives me mad is those weird patterns or glitches, e.g. when someone walks in front of a background with a high contrast pattern on it while the camera pans along while zooming out slightly. It usually creates very visible artifacts around the person.


A year or two ago, the TV my roommates and I had would give weird artefacts while watching sports. I wasn't sure if it was the TV or the broadcast but the effects were things like people's eyes would move relative to their head or the logo on the shirt would move relative to the shirt. They'd sort of bounce up and down. Is this motion smoothing?

My guess at the time was that it was an image stabilization algorithm applied by the studio, since sports broadcasts involve long-range videography. Presumably the logo or eyeballs were getting "stabilized" in place while the rest of the person wasn't, making them appear to jiggle up and down relative to the person's movement. It was eerie. But these were major sports streams, like World Series baseball, and I'd be surprised if they messed it up that badly.


That almost sounds like interlacing, but when you say it was only parts of the image that were moving, I don't know.


If you want to experience for yourself the effect frame rate (and the related motion blur) has, this is a really great demo:

https://frames-per-second.appspot.com


I don't get it - what's the "debate"? According to the article there is literally no advantage to motion smoothing. When motion smoothing is off, if the video is shot at high frame rates, like TV or sports, it displays them that way, as people like, and if the video is film at 24 fps, it displays it like that, as people also prefer. The article seems to try to engineer controversy by vaguely hinting that sales could be lost if smoothing weren't on for certain key edge cases, like in-store demos, but that's not the case. So the question is: why is it the default at all? What's the issue?


I remember when a friend of mine got their first HD TV and Blu-ray, none of us had ever seen that stuff. We were still used to CRT screens, and at best DVD.

We borrowed a movie, fired it up - and everyone thought there was something wrong with the movie. The image quality looked almost like something from handheld camera, just very non-cinematic. We returned the movie, and went with another one.

Nope, same problem. We just figured that's what HD looked like.


Excuse me for being ignorant, I don't really watch much TV, but:

Why do they need this feature for sports? Couldn't they broadcast the sports at 60Hz? If they don't broadcast sports at 60Hz, then what framerate are they broadcasting at, and what are the 60Hz TVs intended for in the first place?


60fps is still too low for very fast motion, of which sports is a notable example. And the focal point of most sports is a ball, which moves along predictable paths, so motion smoothing can do a good job of increasing the frame rate to the point where the sample-and-hold blur of 60fps isn't visible.


So do the televisions use 120 fps then?

Phew, then at least displaying 24 fps movies without interpolation should be possible because 120 is an integer multiple of 24.


Yes, or higher than 120fps. And 120fps can indeed be used to show 24fps with consistent frame timing (as well as 30fps and 60fps, so it's a convenient frame rate).
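
The repeat counts on a 120Hz panel work out evenly for all the common rates:

    panel_hz = 120
    for fps in (24, 30, 60):
        print(f"{fps} fps: show each frame {panel_hz // fps} times")
    # 24 fps: show each frame 5 times
    # 30 fps: show each frame 4 times
    # 60 fps: show each frame 2 times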


It's got to be bad for cycle or motorcycle sport by the way.

I saw some videos on youtube showing the motion smoothing in slo-mo, and heads and bodies were moving at different speeds between frames with this.


This is yet another example of technology designed to impress consumers in the store which actually worsens the experience once they get the product home. Apple's business strategy for computers has been centered around this trick since Steve Jobs died.


My new top-of-the-range Sony 4K UHD HDR oled has this setting turned on - FOR EACH OF FOUR INPUTS. You have to turn it off (as well as a bunch of other adaptive contrast junk) per input. It’s maddening.


Just as an aside, there was a lot of work on this kind of thing (ie. 'improving' 3:2 pulldown) before the modern flat panel tv. Improving slow motion sucked up a few kilobucks of engineering time.


I've tried SVP, which is motion interpolation software, and while anime looks amazing, it totally destroys any real motion picture.


Is the article actually suggesting that the brain subconsciously detects the frame rate, and then "frames" the movie to match the perceived medium? High frame rate means TV implies soap opera, low frame rate means film implies more serious, weighty content? Or am I completely misunderstanding what this "soap opera" effect is?


I think it is perceptual, but not that specific to the medium. A faster frame rate "feels" closer to reality, while the 24fps frame rate is far enough from reality that it's easier to perceive as "something else", which allows fiction or dramatic content to be perceived differently. I tend to agree with that conception of it. The idea is that when the frame rate gets high enough not to be noticeable, the brain treats it more as a real-world perception; the standard is then higher, so it is more harshly judged and ends up seeming like lower-quality reality instead of fantasy mode.


The root cause of the problem is cinema still using 24fps instead of a proper standard in 2019 (with a rare exception here and there).


24 fps might be arbitrary (could have been 23 or 26 or close), but it's a mighty fine standard for achieving the desired motion blur.

We don't want ultra-sharp motion (as in 60 fps, for example), and we don't want a blurry mess either.

This is a parameter directors control, and an artistic choice; aside from very fast action and sports, 60 fps looks like crap.

It's not about what one is "used to" in seeing in cinema either. Real life also has inherent "motion blur" which we still want present in a film. Try moving your hands fast in front of your eyes...

People wanting "a proper standard in 2019" are like those programming newbs that heard that assembly is faster in their freshman year, so think that we should rewrite everything in assembly.

Higher != better always. Also a reason we don't get speakers that can push 200 dB (surely better than a puny 100-120 dB right?)


>Real life also has inherent "motion blur" which we still want present in a film.

The inherent motion blur is always there no matter what frame rate you chose. You watch movies with your eyes, and you can't exceed the capabilities of them. The problem with low frame rates like 24fps is the additional motion blur it adds over that inherent motion blur.


> it’s a mighty fine standard for achieving the desired motion blur.

Why? You can have nice motion blur in 60fps, what’s special about 24? Why would 24 be more desirable than, say, 36fps? And maybe more specifically: your own example demonstrates that it’s your eyes that are blurring, not the image. When you wave your hand around, it’s not blurry, you only see it that way. That suggests that 1000 FPS with no blur should be better than 24fps with a lot of blur, and that you will still perceive 1000fps as motion blurred.

> We don’t want ultra-sharp motion, and we don’t want a blurry mess either

Pans in 24fps are an absolute blurry mess no matter what you do. I can’t stand pans in movies anymore. I do want ultra sharp movies with higher frame rates.

> It’s not about what one is “used to” in seeing in cinema either.

I’ve noticed that kids tend to like higher frame rates and don’t find them weird. This seems to indicate it really is about what one is “used to”.

> Higher != better always. Also a reason we don’t get speakers that can push 200 dB (surely better than a puny 100-120 dB right?)

Hahaha! Like I totally agree with the blanket statement that higher is not always better. But 200dB is physically impossible (the maximum sound pressure level for an undistorted wave in Earth's atmosphere is ~194dB), and if it were possible it would actually make the listener deaf or maybe kill them. https://en.m.wikipedia.org/wiki/Sound_pressure#Examples

It’s a bad analogy. In audio, sampling rates and resolution do go higher, and it is considered better. People master audio tracks at 192kHz rather than 44.1kHz, and they use 24 or 32 bits of resolution rather than 16.
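
For what it's worth, the ~194dB ceiling falls straight out of the SPL formula, since a sound wave in air can't rarefy below vacuum:

    import math

    p_atm = 101_325  # Pa, standard atmospheric pressure
    p_ref = 20e-6    # Pa, reference pressure for dB SPL

    # A sound wave can't rarefy below vacuum, so its peak pressure
    # amplitude in air is capped at roughly one atmosphere:
    print(20 * math.log10(p_atm / p_ref))  # ~194.1 dB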


> and aside from very fast action and sports, 60 fps looks like crap.

Lots of people disagree, that's why youtube is full of interpolated 60 fps music videos and 60 fps live performances.


Lots of people also think McDonalds is the best food, and Drake is a genius, so there's that...


Actually, production companies are the ones uploading 60 fps live performances. Producers are well aware that 60 fps looks better, they are just not capable of doing it for all the content they produce.



I'll quote some of Abe Dillon's comment on that video:

> This is an industry insider telling you, "No. More FPS is not more better. Adherence to 24 FPS is a deliberate choice by film makers themselves."

> Go check out a video on Adam Savage's Tested channel about kit bashing (entitled "Adam Savage's One Day Builds: Kit-Bashing and Scratch-Building!"). Listen to how he describes working with styrene. How much he loves the medium because "it hides a ton of crimes" and "there's lots of forgiveness within the process". It's kind of the same with film. Higher FPS tends to look cheap and is less forgiving. The medium starts to work against you instead of with you. It's harder to pull off tricks like the faux long-shot church fight in Kingsman: The Secret Service or Birdman.

In other words, they can't make 60 fps look good enough. I guess the only hope for us is some sort of deep learning motion interpolation post processing that some day might be able to generate almost perfect missing frames.


Peter Jackson encouraged people to see The Hobbit in 48 fps. Personally, I hate 24 fps - it looks so stuttery. But it's purely personal preference.


Yeah 24 FPS hides crimes by disorienting you and making you lose track of what is going on. Kind of like that insane sequence of Liam Neeson climbing over a fence.

I don't see how that is a good thing.


> 24 FPS hides crimes by disorienting you and making you lose track of what is going on.

The point is that 24 FPS hides technical/production crimes so you aren't distracted by them and can more fully immerse yourself in the story. I guess if you mostly watch demo videos of steaks frying or furling fabric these TVs are an improvement, otherwise, no.


Come on, surely everyone has seen a fight scene or some other action scene shot with a shaky cam at 24 FPS that is just impossible to follow? It doesn't let you immerse yourself; if anything it pulls me out of the action, because I have to look away or I'll be sick.


100% agreed. 24fps is far too low for fast motion. People in the industry talk about "art" to justify their low frame-rate, but I don't believe them for a moment. 24fps lets film-makers get away with sloppy technique that would be exposed if we could actually see moving things clearly. 24fps is cost-cutting.

TV makers are just trying their best to improve things under the constraint of bad quality source material. Motion smoothing isn't perfect, but I'm not convinced it's worse than the alternative.


As a former animator, let me tell you that cost-cutting tricks are insanely important parts of the process. Hell, it’s a part of any process - there is a balance to be found between making a thing “perfect” and making it in a sane amount of time. Very few projects have infinite deadlines and budgets.

24 is good enough to produce a caricature of reality, with slightly different rules that are much easier to bend. That’s pretty damn useful for storytellers.


It's just peculiar that all the directors named in that article, and almost everyone I know who thinks in any depth about this issue, disagree with you.

It's conceivable I might "get used to it", but that's like telling someone who hates cilantro that they will "get used to it". They don't want to, and neither do I.


It's not peculiar in the slightest when, as Upton Sinclair famously put it, "it is difficult to get a man to understand something, when his salary depends upon his not understanding it." The industry is under increasing financial pressure (e.g. see https://www.vanityfair.com/news/2017/01/why-hollywood-as-we-... ), so why increase costs when customers are still buying 24fps material?


So you're saying the listed film directors have some vested interest in 24fps? And all the film lovers are just following their lead?

I don't buy it.


How on earth would increasing the frame rate increase costs? It's all (mostly) shot on digital media, so there's no extra film costs like in the pre-digital era. Storage costs are trivial for digital media. It wouldn't increase the time for setting up shots, and probably wouldn't cost extra in postprod.


Brighter lighting required. Scenes become impossible to shoot with natural light. Props and sets need more detail. Unless the actors are very good, more takes are needed to get the acting right. More work needed on continuity, and more reshoots because a mistake is obvious. More VFX render time (and more manual work in some cases, see https://news.ycombinator.com/item?id=20600107 ).

And storage costs are not trivial. Assuming a modest 10:1 shooting ratio, 8K resolution, 120fps, and 36bit color (12 bits per channel), that's 1.29PB of raw data for a 2 hour movie. Some genres (notably documentaries) have much higher shooting ratio.
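
That figure checks out, for anyone who wants to verify the arithmetic:

    pixels_8k = 7680 * 4320             # pixels per 8K frame
    frame_bytes = pixels_8k * 36 / 8    # 36 bits of color per pixel
    per_second = frame_bytes * 120      # 120 fps -> ~17.9 GB/s
    final_cut = per_second * 2 * 3600   # 2-hour movie -> ~129 TB
    raw_total = final_cut * 10          # 10:1 shooting ratio
    print(raw_total / 1e15, "PB")       # ~1.29 PB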


1. Most scenes are lit with extra lights. It's extremely rare to shoot with just natural light.
2. Prop and set detail has nothing to do with FPS, but with final image resolution.
3. Actors are always good. They're pros. You'd be amazed at how good. Regardless, the FPS has no bearing on the amount of retakes due to talent performance.
4. FPS has nothing to do with continuity. Either the script supervisor catches the error, or it's noticed in post and corrected then.
5. Storage costs are trivial compared to most production costs. Above-the-line costs dwarf storage.


> that’s like telling someone who hates cilantro that that will “get used to it”. They don’t want to and neither do I.

This is an important point, and has more to do with the debate than whether or not 24 vs 60 is actually better. Many people don’t care if 60 is better and don’t want to hear the argument.

I expect that TV manufacturers aren’t intentionally trying to change anyone’s mind, but rather are looking at the future generations of viewers. Kids don’t seem to mind HFR as much, and kids outnumber adults who prefer 24fps. Today’s kids are tomorrow’s TV buyers.

A good answer might be to appease today’s vocal film lovers by adding settings or changing defaults, but they might also reasonably choose to cater to the preferences of the next generation over the current one.


It's interesting that you, of all things, chose cilantro for your simile, when there are indications that dislike for cilantro is actually primarily genetic[1].

[1]: https://en.wikipedia.org/wiki/Coriander#Taste_and_smell


Cilantro. The worst! In 24fps or otherwise.


Directors are artists. They pretty much always prefer the old, inferior way because it is more arty. Cf. film vs. digital.


You say "arty" like it's a bad thing. How peculiar.


It is in this context. Directors like 24 FPS because it is more arty than 60 FPS, but really it is strictly inferior. Even if you prefer a lower frame rate for weird aesthetic reasons, just shoot at a high frame rate and then cut every other frame so people can choose.

It's like, nobody thinks 1080p is better than 4k. Or that 8 bit dynamic range is better than 10 bit. So why do some directors think that 24 FPS is better than 60? Art. Right.


Course they do. If 1080p is all you need, the extra cost of 4k is pure waste. Bigger numbers do not mean better.


4k is definitely better than 1080p, unless you have pretty bad eyesight and no glasses.


>100% agreed. 24fps is far too low for fast motion. People in the industry talk about "art" to justify their low frame-rate, but I don't believe them for a moment.

Yeah, armchair critics know more than artists and movie industry professionals...

And $100 million+ movie budgets have "sloppy technique" to hide with "low" 24 fps frame rates, whereas your imaginary movies will have good technique to justify higher fps?

(In fact directors shoot with different shutter speeds to get the artistic motion capture they like to have -- from regular real-life like motion blur at 180 degree shutter, to fine motion for action scenes. The frame rate is not the determining factor here).

Not to mention that besides the unrealistic "I'm on cocaine" look, higher frame rates mean dimmer images and more noise - but it's not like people wanting "higher" understand those things; they just know "higher is always better", and they get their understanding from video games...


The first time I watched 60fps film was over 20 years ago, when I saw a short Showscan movie at Futuroscope in Poitiers, France. I hadn't even heard the phrase "frame rate" back then. But I immediately knew that Showscan looked amazingly better than anything I'd seen before. It was the first film I'd seen that looked anywhere close to realistic, and nothing else at Futuroscope could compare. I watched it again because I liked it so much.

I think motion smoothing is on by default because people like it, not because TV manufacturers are idiots.


Showscan and other high frame rate formats are indeed great for simulating reality. That’s why sports is mentioned in the article. If you’re building a flight simulator, sure go and use 120fps.

But cinema at anything higher than 24 loses its sense of Impressionism. 24 is not the way we see the world in our normal life, and that's exactly the point. Look at an Impressionist painting: it just isn't what we see in normal life, and again, that's the point. Its flaws are the point. That's what makes Impressionism its own medium vs. realism. Movies at 24 are simply Impressionism. Yes, it was an accident 100 years ago, but it worked.


Tokyo Story consistently places at or near the top when directors or critics are polled for best film. It's a superb film IMO, and it's not impressionistic at all. Actors look directly at the camera, which is static and positioned to make it seem like you're actually sitting there with them. You care about the characters because it feels so immersive. The realistic style works despite the low frame rate because there's so little motion.

What other great artistic achievements are we missing because the film industry refused to move past impressionism?


Yes, it could be that people are tacky.

After all most movies watched (especially today) are crap.


I agree that we should move past 24fps. Panning shots judder to my eyes in a very unpleasant way, even if not panning particularly quickly. It's not just action shots that suffer at the low frame rate. If people hate it because they're used to 24fps, well, then they'll get used to a new standard soon enough.


60 fps is probably not that important for most movies and tv shows though, but it's a bit annoying to watch music videos at 24 fps.


Actually, this effect has been observable for several decades with cathode ray tube TVs. Recordings from ordinary amateur video cameras look more (or differently) fluid compared to scenes from movies. This is particularly obvious in some movies from the 90s, where scenes recorded with regular video cameras are cut into the movie - for dramatic effect, for example. Oddly, few people seem able to recognize this at all. To me this motion smoothing is absolutely obvious, and I find it mostly terrible. Real HFR adds to the realism of animations, in my opinion. But it seems that for some subtle reason, some people's visual processing is repulsed by this fake HFR - like the uncanny valley: it's smoother, but in a way that doesn't seem right, while you're not sure why.



