People just don't want to muddle around in the settings, and therein lies the problem. It would be great if manufacturers opened up their picture settings so they could be controlled by media players or set-top boxes. Those, in turn, could apply a ruleset: if the source is 24p, disable motion smoothing, otherwise enable it; if it's sports, enable it. And so on.
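A hypothetical sketch of such a ruleset (Python; the content types and setting names are made up for illustration, not any real TV API):

```python
# Hypothetical sketch of a set-top box choosing picture settings per source.
# The setting and content-type names are invented, not a real TV API.

def pick_settings(content_type: str, frame_rate: float) -> dict:
    """Suggest TV picture settings for the content being played."""
    if content_type == "sports":
        return {"motion_smoothing": True}
    if frame_rate == 24.0:  # 24p film content
        return {"motion_smoothing": False}
    return {"motion_smoothing": True}

print(pick_settings("film", 24.0))     # {'motion_smoothing': False}
print(pick_settings("sports", 59.94))  # {'motion_smoothing': True}
```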
It's not even limited to motion blur, I have low light and normal light profiles I'd like to have changed automatically based on the actual light in my room.
 Blur Busters has a good introduction, that's relevant to all non-flickering displays, not just OLED: https://www.blurbusters.com/faq/oled-motion-blur/
It does introduce artefacts. The solution to all of this is for filmmakers to move on from the 1930s and film at 120 fps or higher. If people only want to watch a quarter of the frames, it can be downsampled easily.
The argument reminds me of musicians who didn't want their albums chopped up and sold as individual tracks, to be consumed however listeners desired.
However, 24 fps running at a true 24Hz is perfectly fine, and you will not see flicker - at least not if produced by competent filmmakers. That's because the camera's shutter angle is normally set such that motion blur makes the motion look smooth. Real cinematographers also know exactly how fast they can pan with a given lens without causing any noticeable strobing motion.
Comparisons to video games completely miss the point. Most video games are unable to properly simulate motion blur (some try, but it doesn't usually work well), so you have to have high frame rates for things to look smooth. Since you need to react quickly in video games, high frame rates (combined with the lack of motion blur) are also helpful in preserving crisp edges on moving objects (so it's a practical advantage). And finally, video games generally try to simulate reality for the player, so players are more concerned that the technology makes the experience believable rather than intentionally suspending disbelief (as in film or theater) to passively unpack the narrative nuance of artwork unfolding on a screen.
For those same reasons, sports also do well with higher frame rates, but when fiction (which is obviously disconnected from the present reality) tries to use high frame rates, it falls into "uncanny valley" territory - much like the computer-generated humans of movies like "The Polar Express". As others have noted, a few directors have really tried to push HFR film to the public, but it has never been received well (whether or not HFR is recognized as being responsible for the uneasiness felt by audiences watching HFR content).
As for HFR content being "easily" downsampled - it really isn't, at least not with respect to motion blur (which is an essential point that many people miss in these discussions).
You're right about motion blur making 24fps look less awful, though. I'm sure you know that filmmakers often reduce it in fight scenes to deliberately make viewers a bit uncomfortable. Thanks for pointing out that downsampling wouldn't work without some extra effort (e.g. keeping the same amount of per-frame blur in the extra frames, so that the downsampled result looks normal), as that isn't something I had considered.
I find the suggestion that video games don't need to suspend disbelief, but films do, very strange.
True, the distinction may seem strange to some, and it's not always a clear distinction. Here's how I'd describe it in more detail (just my opinion of course):
In film, we see a person depicted on the screen with certain characteristics that convey emotion (just as we do with a group of lines comprising a stick figure with similar emotion), but we are forced to activate parts of our brain that span the gap between what we see (which obviously isn't reality) and what that character might feel in his/her universe.
To some degree that same experience happens in video games, but I would argue that instead of seeing the emotions and trying to understand them, we (as players) are the person feeling the emotion. In one, we are witnessing events happen to someone else (and exercising empathy), whereas in the other, we are experiencing the events first-hand (even in a 3rd person view, we're playing as if we are the character in view).
A first-hand experience feels most believable when it's actually realistic (hence the need for high frame rates), but a simple stick figure can feel "believable" if we're engaged and sharing the emotion we believe it to be feeling.
As a side note, maybe that's why I personally prefer classic (8-bit era) video games over modern 3D video games :)
Closed down shutter angles are only needed to mitigate the blur of low frame rates. If you're shooting at 120fps you can use a fully open shutter (which helps with the lighting requirements too), which gives you a choice of 5 different simulated shutter angles when you downsample to 24fps by blending frames. Blending 2 consecutive frames then deleting 3 gives you a look very much like modern 24fps film.
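A minimal sketch of that blend-and-decimate idea (assuming raw frames as NumPy arrays; blending 2 of every 5 frames of 360°-shutter 120fps footage simulates a 144° shutter at 24fps):

```python
import numpy as np

def downsample_120_to_24(frames, blend=2):
    """Blend `blend` consecutive 120fps frames out of every group of 5 and
    discard the rest. blend=1..5 simulates shutter angles of
    72/144/216/288/360 degrees at 24fps."""
    out = []
    for i in range(0, len(frames) - blend + 1, 5):
        group = np.stack(frames[i:i + blend]).astype(np.float32)
        out.append(group.mean(axis=0).astype(frames[0].dtype))
    return out

# 120 frames covering 1 second of footage -> 24 output frames
src = [np.full((1080, 1920), g, dtype=np.uint8) for g in range(120)]
print(len(downsample_120_to_24(src)))  # 24
```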
Er, why not? Motion blur is simply time integration. Why can't you simply average adjacent frames during your framerate decimation?
In fact, since "real" motion blur from shuttering uses a likely-suboptimal square windowing function (the shutter is binary open/closed), I'd hazard that motion blur computed from a higher source framerate could be more perceptually pleasing (by using a smoother windowing function). You have more information to work with, after all.
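For instance, a minimal sketch (again assuming frames as NumPy arrays) of swapping the flat "square shutter" window for a smoother Hann window:

```python
import numpy as np

def blend_window(frames, weights):
    """Weighted temporal average of consecutive frames; `weights` samples
    the windowing function at the source frame times."""
    w = np.asarray(weights, dtype=np.float32)
    w /= w.sum()  # normalize so overall brightness is preserved
    stack = np.stack(frames).astype(np.float32)
    return np.tensordot(w, stack, axes=1)

frames5 = [np.random.rand(4, 4) for _ in range(5)]  # stand-in frames
flat = [1, 1, 1, 1, 1]      # equivalent to a square shutter window
hann = np.hanning(7)[1:-1]  # smoother window, zero endpoints dropped
out_flat = blend_window(frames5, flat)
out_hann = blend_window(frames5, hann)
```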
In theory you could, but in practice, it doesn't work very well and you tend to get some really nasty artifacts depending on the motion of the scene. There's a lot more to it than simply averaging adjacent frames since those source frames likely aren't capturing perfect 360° shutter angle time slices that seamlessly align (temporally) back-to-back.
Filmmakers filming in 120fps would destroy the film industry overnight because NO ONE would watch what they produced.
I know of at least 6 people who prefer higher framerates, myself included. I don't know anyone who has expressed an opinion the other way. As I said, this whole thread is like some kind of alternative universe to me.
Motion smoothing apparently hasn't stopped people buying TVs.
Like many other things, this is all down to personal preferences; similarly I have friends who can't stand to game at anything <120FPS and one who gets motion sick at 60FPS, while I would rather turn up the graphical detail and survive with 60. Neither opinion is more correct, and you have had several people upthread saying they prefer 24 for films (myself included).
Edit: nope, recreating the motion blur that's typical of 24fps from higher-fps footage would be way more expensive.
Not true. Something filmed natively at 24 fps and a 180° shutter angle is vastly different in terms of perceived motion than 120 fps converted to 24 fps.
As for downsampling, I believe your math is off. 120 fps at 360° means each frame is exposed at 1/120s. Conforming that to 24 fps gives you the same exposure duration per frame (you're just throwing out 4/5 frames), so you're effectively at a 72° shutter angle - less than half of what's considered "normal".
"By far the most common setting for cinema has been a shutter angle near 180°, which equates to a shutter speed near 1/48 of a second at 24 fps." 
"A 180° shutter angle is considered normal." 
Right, replicating that exactly requires 240fps with fast shutter and discarding half of the frames. Which is a way more expensive proposition.
I'm glad to see a brief tip to that solution in the article and hope it becomes part of an industry specification.
Ultimately any decision a company makes is about money. From the article “It’s meant to create a little bit of eye candy in the store”. TV companies think Motion smoothing increases sales. It doesn’t matter what film directors say about it.
The reality is that TV manufacturers are making a commodity, and they don't want to be. No one wants to sell a commodity; there's no money in it. It's a constant race to the bottom on price. That's why we see 3D TVs, curved TVs, and a myriad of other gimmicks: everyone wants to differentiate themselves. If there's a new industry standard that makes consumers and content creators happy, who cares? The question a manufacturer asks is: if I hire an engineer to implement it, will it be worth it?
Reality does not judder.
If you want judder for some artistic purpose, you can have that, just like any other reality-distorting filter.
Some shots are intentionally blurry for valid artistic purposes. Should all movies and TVs be blurry at all times? Some shots are intentionally monochrome for valid artistic purposes. Should all movies and all TVs be B&W or red-tinted at all times? Some shots are intentionally too bright or too dark, for valid artistic purposes... If you want stroboscopic stop-motion, go ahead and use it.
Superior reproductive technology in no way prevents you from creating a scene that has that and any other kind of artificially applied crappiness like static interference or low resolution or analog scan lines... But crappy reproductive technology does preclude the other 99.9% of the time when you don't want an artificially crappy reproduction.
Video tape was cheaper than film, TVs and video tape ran at higher frame rates than film, and soap operas were cheaper than movies; so the one superior aspect of soap operas became associated with "cheap".
It's a freaking ignorant association, is all. It's obvious as hell, and you have to be a moron not to recognize why you think "this looks like a soap opera" and see past it.
Luckily, the fact that TVs with motion smoothing are selling suggests that the rest of us have spent the two minutes necessary to get used to it and realized that it's superior.
The technology to produce movies at high frame rates has been around for a long time (and projection/display technology could have supported it much sooner if the content were there), and yet directors have deliberately, almost universally chosen what you categorize as "artificially applied crappiness" (24 fps). Practically speaking, it is artificially applied, and that's exactly it - it's an intentionally distorted version of reality, and it's that way on purpose. You're free to attempt to remove that filter (by smoothing motion on your TV), but that doesn't represent what the director intended.
Even with "superior reproductive technology", directors are still choosing to produce films at 24 fps, so can you at least appreciate and respect that that's how most of them intend their work to be displayed? True, "reality does not judder" (proper 24 fps content doesn't either, by the way), but reality isn't what filmmaking is about. To argue otherwise misses the point of film as an artistic medium. For other types of content, I agree - reality is the target, but not in film.
Neither do 24fps movies, except when you play them back on home equipment that doesn't display at an integer multiple of 24fps.
Judder is an artifact of the difference between home and theatrical display and preserving it (and giving alternatives a bad rep) is a deliberate attempt to keep the theatrical experience as a premium venue for viewing studio output.
Fighting for judder isn't about preserving anyone's artistic vision. And the people saying it is aren't even good liars.
I'm not saying that the resampling is right, but I am seriously questioning this film-era frame rate in a digital age.
I'm also questioning the algorithm that just creates a messy blur of nonsense between two frames. Surely we can create a smarter algorithm.
24FPS was chosen as much for economic reasons as anything else - it’s one of the slowest frame rates cinema could get away with while still delivering a satisfactory experience.
Remember that higher frame rates meant higher consumption of expensive film before the digital era. Lowering the FPS to the lowest you could reasonably get away with reduced the cost of materials significantly.
That it's still popular today is at least partly testament to the conditioning effect of decades of 24FPS movie consumption. Audiences often find the higher-FPS look and "feel" strange, at least in a cinema context; witness the backlash against the 48 FPS cinema presentation of the Hobbit movies.
I strongly believe that if industry had settled on say 48FPS as the standard we’d be conditioned to “expect” that look and complain that 24FPS looks strange.
Also: cinematic film is what keeps Kodak alive.
As with the interface and OS, I assume TV manufacturers mostly use the cheapest versions of whatever they think they can get away with, without pushing up their costs.
…which was remade into a high-budget Hollywood CGI-laden production, starring Bruce Willis and Brad Pitt. I’m not even joking; it’s called 12 Monkeys, and was released in 1995.
More FPS is really no different than increased resolution. There is nothing sacred about 24, 480, or 1080.
It's all conditioning ;)
Wouldn't that make the background be blurred? Or is that kind of blur somehow different to motion blur?
Not sure what that has to do with this discussion.
What would you think about someone that only wants to see black and white photography and won't even take a look at full color photographs?
Higher frame rates are not.
Nobody is disputing that higher resolution is better. You have most of an industry screaming about the terrible trend of higher frame rates.
I’m glad that you prefer it, but that doesn’t make it an unambiguous benefit.
The argument that it's the filmmakers' fault for using 24fps is flawed. What about animation? What about stop motion? Those may never be produced at a higher framerate. Movies with high VFX budgets will be much more expensive due to the additional frames when rotoscoping and compositing.
So if it is inferior and only makes sense for a subset of productions, why is it the default?
Hand drawn or hand posed animation is the rare case where increased frame-rate really would take more human effort, but it's often already produced at extra-low framerate (e.g. 12fps or 8fps), with frames duplicated for playback. The same duplication can happen with higher playback framerates. Modern video codecs handle duplicated frames well.
For new movies, where there is proper data, the 3D camera for each scene still requires more than "zero human labor"; you have to specify the inter-ocular distance, convergence plane, and tweak the scene a little for cinematic effect or 3D weirdness. And since there are thousands of shots, and the work is fairly tedious, it makes little sense to work on it in parallel with 2D during production when shots might just be cut entirely. So you end up with a bias towards 2D or 3D during the creation process.
DreamWorks apparently has a made-for-3D animated movie division (https://animatedviews.com/2007/dreamworks-goes-3d/), where presumably the movies are flattened to 2D in post. Reviews of their movies (https://www.3dor2d.com/reviews/how-train-dragon-hidden-world...) unsurprisingly say "great use of 3D". Since that's the goal of the division, I'd expect them to continue doing 3D-first movies.
In contrast Pixar so far has stuck to a more 2D style. In this interview (https://www.youtube.com/watch?v=HepIGDJK98s) their stereo guy says they do 3D in post. The 3D in their movies is described as "well done but a minor aspect" (https://www.3dor2d.com/reviews/Toy-Story-4-3D-3-D-Movie-Revi...). In the interview the guy basically says that he could use more 3D but he's more concerned with keeping the director happy so he doesn't. Pixar is now focusing on originals so maybe one of their next films will be 3D oriented. But they're making tons of money anyway so who knows if the right director will come along.
Of course. I should have been more specific; I was talking about stereo conversion. You addressed that. Thanks for your explanation. It's not as labor-free as I had assumed.
Padding is usually what's done. You can see it in NTSC (30fps) DVDs which come from something originally produced for PAL (25fps). Every 5th frame is duplicated.
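A quick sketch of that padding (Python, treating frames as list items):

```python
def pal_to_ntsc(frames):
    """25fps -> 30fps by duplicating every 5th frame (25 + 5 = 30)."""
    out = []
    for i, frame in enumerate(frames):
        out.append(frame)
        if i % 5 == 4:  # every 5th frame gets shown twice
            out.append(frame)
    return out

print(len(pal_to_ntsc(list(range(25)))))  # 30
```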
It makes me curious what framerate movie projectors run at. Maybe the "cinema experience" includes goofy frame duplication and nobody realizes it.
Silent movies were shot at 18 fps and projected at 54 fps. The frame rate for talkies was boosted to allow enough bandwidth for the audio. The reason silent movies look jerky today is that every third frame is projected twice. They are smooth when properly projected.
Silent movies were shot and projected at many different framerates. Some early silent movies were shot at 18fps, but the average frame rate increased over time, and the standard 24fps was picked as a compromise that was around the average frame rate at which films were displayed at that time.
> And an entire cinematic language has developed around the rate of 24 frames per second — the way actors perform, the way shots are composed and cut and cameras move. (This is why an awards show or a news broadcast shot on video at a higher frame rate looks and feels different from a film.)
This is a real stretch. Very, very few filmmakers are doing anything specific for 24fps that they wouldn’t do in 48 or 60. If they did, they’d slow things down so you could see them, but instead we have ever faster and faster fight sequences in Marvel and Transformer movies where you can’t even see the details during the action.
The “cinematic language” of a news show vs an action movie is different, but has almost nothing to do with frame rate, and if frame rate was the main issue, we’d be doing news in low frame rate and action movies in high frame rate.
Horizontal pans in 24fps have started to drive me crazy. Films do it all the time, and you can barely see anything while that’s going on. Higher frame rate pans are so much easier to watch, even if they make the movie look like BBC tv.
I agree if we’re talking about motion interpolation. I’m personally sensitive to artifacts from the 24->60 conversion, and I sometimes turn off frame rate up-sampling, unless the show has a lot of panning, and then I prefer the frame rate up-sampling despite the artifacts it causes sometimes. Like Game of Thrones for example has so much panning that I kept turning on the motion smoothing. But then the artifacts were so bad I kept turning it off. There isn’t a good option, so I wish it was filmed at a higher frame rate, and then displayed lower for people who prefer that.
Above I was really only referring to high frame rates in general and deconstructing the argument that 24 is inherently better for artistic reasons. While many filmmakers and viewers like 24fps, most filmmakers aren’t choosing 24fps nor are they doing anything specific with 24fps. 24 is just a standard that almost everyone is stuck with, most people wouldn’t even consider it a choice today, save the few directors with big enough budgets and studio backing that they can distribute films in a non-standard frame rate.
I wonder whether, if this tech becomes mainstream, it will eventually start to be used for normal content.
There were other problems: the index was garbled and useless (it still had the page numbers of the paper book in it, and the formatting was completely off), and the text could be resized but the formulas were rasterized, did not scale with the text, and were frankly hideous. This was from a major publisher. I'd much rather have had a PDF.
But for novels, changing the font size, and hence reflowing the text, is super nice.
This has no value except being a gimmick that can be used to say "look how smooth/real it is". Worse is setting it to "on" by default. They should have just had a button on the remote with a marketing-inspired name, like "motion flow" or something, which people could have turned on if they wanted this.
I can't help myself and I end up hunting for artifacts to be annoyed by more than enjoying the show.
Every TV that uses motion interpolation is the same, it's horrible.
How can the new smoothing conversion be worse than that?
Also, if they want the picture to be displayed in a certain way, couldn't they push the movie as 60 frames per second when it's in fact 24 frames per second with duplicated frames? Wouldn't that effectively disable motion smoothing?
This is the amount of research people do before buying something they'll stare at for 4h per day for the next 5 years... Never mind that the native framerate of the panel is only 60Hz.
This is why we get Vivid HDR mode ("Torch mode") and motion smoothing enabled by default.
I'm more complaining about the automatic 24->60 (or 120 or 240) conversion that's enabled by default on a lot of new TVs. I'm not at all opposed to higher frame rate video, I like it. I've just never watched something on a TV where the 24->60 conversion didn't look significantly worse to me than playback at the native frame rate.
Maybe my unconscious assumption is that people's preferences about the ideal frame rate can change over the years, but that reinterpolating the video to a different frame rate will always look worse than native, regardless of the source and destination frame rates. Maybe this isn't true, but that is my assumption.
Music, like film, is a kind of work that only really exists in its projection: a playthrough, a concert. Anything that mangles that projection without being a willful choice of those projecting it makes it harder for artists to speak to an audience in the way they intended.
There are perfectly fine reasons to change a projection, e.g. for the hearing- or visually-impaired, or because you just prefer it that way, but ultimately it should be a willful decision to deviate from the expected default.
I understand the artist, but most people won't get best of anything. You don't see coffee companies complaining about your grinder settings even though you definitely could get a better taste if you only paid attention.
While the Wikipedia article on 3:2 pulldowns mainly discusses converting 24FPS film to 30/60 FPS broadcast, the same conversion is virtually always done on digital material as well, in software at playback time or by the video codec changing the frame rate.
As mrob also writes, you usually only get smooth frame-rate conversion if the new frame rate is an integer multiple of the original one, since then the extra frames are introduced consistently. This is why 30FPS video doesn't have this problem at 60Hz, but 24FPS does.
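For the 24FPS-at-60Hz case, the classic 3:2 pulldown holds frames for alternating counts of display refreshes; a minimal sketch:

```python
def pulldown_24_to_60(frames):
    """24fps -> 60Hz: hold frames for 3, 2, 3, 2, ... refreshes.
    The uneven hold times are exactly what gets perceived as judder."""
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

print(len(pulldown_24_to_60(list(range(24)))))  # 60
```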
My guess at the time was that it was an image stabilization algorithm applied by the studio, since sports broadcasts involve long-range videography. Presumably the logo or eyeballs were getting "stabilized" in place while the rest of the person wasn't, making them appear to jiggle up and down relative to the person's movement. It was eerie. But these were major sports streams, like World Series baseball, and I'd be surprised if they messed it up that badly.
We borrowed a movie, fired it up, and everyone thought there was something wrong with it. The image quality looked almost like something from a handheld camera, just very non-cinematic. We returned the movie and went with another one.
Nope, same problem. We just figured that's the way HD looked.
Why do they need this feature for sports? Couldn't they broadcast the sports at 60Hz? If they don't broadcast sports at 60Hz, then what framerate are they broadcasting it at, and what are the 60Hz TV's intended for in the first place?
Phew, then at least displaying 24 fps movies without interpolation should be possible because 120 is an integer multiple of 24.
I saw some videos on youtube showing the motion smoothing in slo-mo, and heads and bodies were moving at different speeds between frames with this.
We don't want ultra-sharp movies (as in 60 fps, for example), and we don't want a blurry mess either.
This is a parameter directors control, and an artistic choice; aside from very fast action and sports, 60 fps looks like crap.
It's not about what one is "used to" in seeing in cinema either. Real life also has inherent "motion blur" which we still want present in a film. Try moving your hands fast in front of your eyes...
People wanting "a proper standard in 2019" are like those programming newbs that heard that assembly is faster in their freshman year, so think that we should rewrite everything in assembly.
Higher != better always. It's also a reason we don't get speakers that can push 200 dB (surely better than a puny 100-120 dB, right?).
The inherent motion blur is always there no matter what frame rate you chose. You watch movies with your eyes, and you can't exceed the capabilities of them. The problem with low frame rates like 24fps is the additional motion blur it adds over that inherent motion blur.
Why? You can have nice motion blur in 60fps, what’s special about 24? Why would 24 be more desirable than, say, 36fps? And maybe more specifically: your own example demonstrates that it’s your eyes that are blurring, not the image. When you wave your hand around, it’s not blurry, you only see it that way. That suggests that 1000 FPS with no blur should be better than 24fps with a lot of blur, and that you will still perceive 1000fps as motion blurred.
> We don’t want ultra sharp movies, and we don’t want a blurry mess either
Pans in 24fps are an absolute blurry mess no matter what you do. I can’t stand pans in movies anymore. I do want ultra sharp movies with higher frame rates.
> It’s not about what one is “used to” in seeing in cinema either.
I’ve noticed that kids tend to like higher frame rates and don’t find them weird. This seems to indicate it really is about what one is “used to”.
> Higher != better always. Also a reason we don’t get speakers that can push 200 dB (surely better than a puny 100-120 dB right?)
Hahaha! Like, I totally agree with the blanket statement that higher is not always better. But 200dB is physically impossible (~194dB is the ceiling at atmospheric pressure, where the wave's rarefaction hits a vacuum), and if it were possible, it would deafen or maybe kill the listener. https://en.m.wikipedia.org/wiki/Sound_pressure#Examples
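That ~194dB ceiling falls straight out of the dB SPL definition, taking one atmosphere as the peak pressure:

```python
import math

p_atm = 101_325  # Pa, one standard atmosphere (largest possible rarefaction)
p_ref = 20e-6    # Pa, reference pressure for dB SPL

print(20 * math.log10(p_atm / p_ref))  # ~194.09 dB
```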
It's a bad analogy. In audio, sampling rates and resolution do go higher, and it is considered better. People master audio tracks at 96 or 192 kHz rather than 44.1 kHz, and they use 24 or 32 bits of resolution rather than 16.
Lots of people disagree, that's why youtube is full of interpolated 60 fps music videos and 60 fps live performances.
> This is an industry insider telling you, "No. More FPS is not more better. Adherence to 24 FPS is a deliberate choice by film makers themselves."
> Go check out a video on Adam Savage's Tested channel about kit bashing (entitled "Adam Savage's One Day Builds: Kit-Bashing and Scratch-Building!"). Listen to how he describes working with styrene. How much he loves the medium because "it hides a ton of crimes" and "there's lots of forgiveness within the process". It's kind of the same with film. Higher FPS tends to look cheap and is less forgiving. The medium starts to work against you instead of with you. It's harder to pull off tricks like the faux-long-shot church fight in Kingsman: The Secret Service or Birdman.
In other words, they can't make 60 fps look good enough. I guess the only hope for us is some sort of deep learning motion interpolation post processing that some day might be able to generate almost perfect missing frames.
I don't see how that is a good thing.
The point is that 24 FPS hides technical/production crimes so you aren't distracted by them and can more fully immerse yourself in the story. I guess if you mostly watch demo videos of steaks frying or furling fabric these TVs are an improvement, otherwise, no.
TV makers are just trying their best to improve things under the constraint of bad quality source material. Motion smoothing isn't perfect, but I'm not convinced it's worse than the alternative.
24 is good enough to produce a caricature of reality, with slightly different rules that are much easier to bend. That’s pretty damn useful for storytellers.
It's conceivable I might "get used to it", but that's like telling someone who hates cilantro that they will "get used to it". They don't want to, and neither do I.
I don't buy it.
And storage costs are not trivial. Assuming a modest 10:1 shooting ratio, 8K resolution, 120fps, and 36-bit color (12 bits per channel), that's 1.29PB of raw data for a 2-hour movie. Some genres (notably documentaries) have much higher shooting ratios.
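A quick back-of-the-envelope check of that figure (uncompressed, per the assumptions above):

```python
pixels   = 7680 * 4320    # 8K frame
bytes_px = 36 / 8         # 36-bit color
fps      = 120
seconds  = 2 * 3600 * 10  # 2-hour movie at a 10:1 shooting ratio

total_bytes = pixels * bytes_px * fps * seconds
print(total_bytes / 1e15)  # ~1.29 PB of raw footage
```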
This is an important point, and has more to do with the debate than whether or not 24 vs 60 is actually better. Many people don’t care if 60 is better and don’t want to hear the argument.
I expect that TV manufacturers aren’t intentionally trying to change anyone’s mind, but rather are looking at the future generations of viewers. Kids don’t seem to mind HFR as much, and kids outnumber adults who prefer 24fps. Today’s kids are tomorrow’s TV buyers.
A good answer might be to appease today’s vocal film lovers by adding settings or changing defaults, but they might also reasonably choose to cater to the preferences of the next generation over the current one.
It's like, nobody thinks 1080p is better than 4k. Or that 8 bit dynamic range is better than 10 bit. So why do some directors think that 24 FPS is better than 60? Art. Right.
Yeah, armchair critics know more than artists and movie industry professionals...
And $100-million-plus movie budgets have "sloppy technique" to hide with "low" 24 fps frame rates, whereas your imaginary movies will have good technique to justify higher fps.
(In fact directors shoot with different shutter speeds to get the artistic motion capture they like to have -- from regular real-life like motion blur at 180 degree shutter, to fine motion for action scenes. The frame rate is not the determining factor here).
Not to mention that beside the unrealistic "I'm on cocaine" look, higher frame rates mean dimmer images and more noise -- but it's not like people wanting "higher" understand those things, they just know "higher is always better", and they get their understanding from video games...
I think motion smoothing is on by default because people like it, not because TV manufacturers are idiots.
But cinema at anything higher than 24 loses its sense of Impressionism. 24 is not the way we see the world in our normal life, and that's just the point. Look at an Impressionist painting: it just isn't what we see in our normal life, and again, that's just the point. Its flaws are the point. It's what makes Impressionism its own medium vs. realism. Movies at 24 are simply Impressionism. Yes, it was an accident 100 years ago, but it worked.
What other great artistic achievements are we missing because the film industry refused to move past impressionism?
After all most movies watched (especially today) are crap.