I saw Avatar in 3D and while the effects were fun to watch, I thought that the expressiveness of the depth-of-field shots was actually reduced.
In 2D cinema, the viewer's eyes adapt to the shot that the director intended, which often involves focusing on the foreground or the background... once in a while a shot puts both in focus, but it's rare.
The visual cue of depth of field is why cinematographers continue to use it even though it's been technically feasible for a long time to simply make all objects in the shot appear in focus.
In Avatar, the critters zoomed toward the audience and the effect was immersive, but I thought many of the close-in shots lacked the intimacy that good 2D shots can often attain.
3D vision is a perceptual use of depth of field to heighten the brain's ability to discern objects. There's equally no reason why the human visual system couldn't have evolved to perceive objects in focus over a wider range of distances.
In fact, the human perception of a constant, detailed visual field is an illusion that the brain stitches together from the data collected by the narrow beam of detailed focus directly in front of our gaze.
So I think doing accurate 3D requires simulating the way the brain picks out objects at different depths, but this can't be done simply by offering one depth of field for all viewers: our eyes can't bounce back and forth, alternately focusing on near and far objects to create the increased acuity we get in normal vision.
3D film is unreal in that it requires us to suspend our normal method of discerning space, and watching it is something we must learn to do, just as we must learn to interpret 2D depth of field as both an attentional and spatial cue.
Depth of field isn't the only way for a cinematographer to lead the viewer's gaze. There's also lighting, movement, scene (art direction)... tricks that have been used in live theater for centuries. The depth issue really boils down to a lack of cinematic artistry, probably due to the lack of experience that filmmakers and audiences have with 3D.
Martin Scorsese's "Hugo" is probably the most artistic use of 3D I've seen to date.
> Martin Scorsese's "Hugo" is probably the most artistic use of 3D I've seen to date.
I'm going to have to watch it. Do you know of a way to see it in 3D using a normal laptop or TV?
Even seeing it in 2D you get a sense of the 3-dimensionality.
When you increase the width of the aperture of a lens (your pupil) you decrease the depth of field in sharp focus. It's a fundamental limit of optics.
On a bright day you have a lot of light and your pupil is narrowest, bringing as much as possible into focus.
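If you want to put rough numbers on that, here's a quick back-of-envelope sketch using the standard thin-lens depth-of-field approximations (in Python; the lens and distances are made-up example values, not anything from this thread):

    def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.03):
        """Return (near, far) limits of acceptable focus, in meters."""
        # Hyperfocal distance: H = f^2 / (N * c) + f
        h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
        s = subject_m * 1000.0  # work in millimeters
        near = h * s / (h + (s - focal_mm))
        far = float("inf") if s - focal_mm >= h else h * s / (h - (s - focal_mm))
        return near / 1000.0, far / 1000.0

    # Same 50 mm lens, subject at 3 m: opening the aperture (smaller
    # f-number = wider pupil) shrinks the zone of sharp focus, exactly
    # as with the eye's pupil above.
    for f_number in (16, 5.6, 1.4):
        near, far = depth_of_field(50, f_number, 3)
        print(f"f/{f_number}: sharp from {near:.2f} m to {far:.2f} m")

Running it gives roughly 1.9-6.8 m in focus at f/16, but only 2.9-3.2 m at f/1.4.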
> So I think doing accurate 3D requires simulating the way the brain picks out objects at different depths, but this can't be done simply by offering one depth of field for all viewers: our eyes can't bounce back and forth, alternately focusing on near and far objects to create the increased acuity we get in normal vision.
I watched The Hobbit in 3D and thought it was awful. You had this tiny sweet spot right where the director intended you to look, and everything else was a sea of blur.
I realized how much I like looking at the sets, because I was frustrated every time I tried.
That was my experience as well. I find it headache-inducing. It really doesn't emulate true 3D vision much at all, and I personally would not pay any extra money for the experience.
Also, the movie is Star Wars so there's no drama whatsoever. Literally everyone in the film is killed within seconds of their mission being accomplished.
I had a similar experience.
If the cameras are further apart than your eyes, things will look smaller in size and have an exaggerated stereo effect also. If they are closer, the stereo effect is less pronounced and things appear larger in size.
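To put rough numbers on that, here's a little sketch of the geometry (mine, not the parent's; it assumes the commonly cited 64 mm average IPD and a small-angle approximation for the perceived distance):

    import math

    EYE_IPD_M = 0.064  # commonly cited average human eye separation

    def vergence_angle_deg(baseline_m, distance_m):
        """Convergence angle between the two views of a point."""
        return math.degrees(2 * math.atan((baseline_m / 2) / distance_m))

    def perceived_distance_m(baseline_m, distance_m):
        """Distance at which your own eyes would see this much disparity
        (small-angle approximation)."""
        return distance_m * EYE_IPD_M / baseline_m

    # An object 10 m from the camera rig, shot with three baselines:
    for baseline in (0.032, 0.064, 0.200):
        angle = vergence_angle_deg(baseline, 10.0)
        print(f"baseline {baseline * 1000:.0f} mm: {angle:.2f} deg, "
              f"reads as ~{perceived_distance_m(baseline, 10.0):.1f} m away")
    # The 200 mm rig makes the 10 m object read as ~3.2 m away: same
    # image size, closer apparent distance, so it looks like a miniature.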
It's actually the sense of scale I like the most in 3D (and VR). I don't care that much about things popping out at me on a 3D TV. I much prefer 3D that treats the screen like a window you look through, where the 3D effect comes mostly from things receding into the background. As an example, I loved the parts in Star Wars: The Force Awakens where you had characters shooting at things in the distance and the camera sat over their shoulder with them in the foreground.
There's also another issue here - you can't just take any game and 'upscale' it into 3D using a driver or such. The game's geometry needs to be properly modelled to scale for it to really work. There are other more egregious issues too - things like lens flares appearing in the wrong plane, because the game happens to be rendering them to a separate 2D plane.
These and other issues like them are the same reasons that you can't turn a non-VR game into a VR game with a driver and have it be as good as a native made-for-VR experience (although many games do come somewhat close, if you don't have a native VR version of them for comparison).
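To make the lens-flare point concrete, here's a toy projection calculation (my own sketch with made-up numbers, not any real engine's API):

    # Toy numbers (mine, not the parent's) showing why a screen-space
    # lens flare breaks in stereo: a correct stereo render projects the
    # light source separately for each eye, producing disparity, while
    # a flat 2D overlay pastes the flare at the same pixel for both
    # eyes, i.e. zero disparity = glued to the screen plane.

    IPD = 0.064       # viewer eye separation, meters
    SCREEN_Z = 2.0    # distance of the virtual screen plane, meters
    SUN_Z = 1000.0    # distance of the light source causing the flare

    def projected_x(eye_x, point_x, point_z):
        """Horizontal hit point on the screen plane (pinhole projection)."""
        return eye_x + (point_x - eye_x) * (SCREEN_Z / point_z)

    left_eye, right_eye = -IPD / 2, +IPD / 2
    lx = projected_x(left_eye, 0.0, SUN_Z)   # per-eye projection of the sun
    rx = projected_x(right_eye, 0.0, SUN_Z)
    print(f"correct flare: {lx:+.4f} m vs {rx:+.4f} m on screen "
          f"(separation ~ IPD, i.e. perceived at infinity)")
    print("2D overlay: same pixel in both eyes, zero disparity, so the "
          "flare sits at the 2 m screen plane while its source is at 1 km")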
FWIW I think the 'average' IPD is 64mm.
Apparently you've misunderstood either 3D (or, more accurately, stereoscopic) vision, depth of field, or both. Cue detailed HN-worthy explanation of each... but frankly, you'd be better off perusing the myriad resources available on these two subjects.
My point is that stereoscopic vision lets our brains ignore more information by identifying the depth of interest and ignoring the rest.
Similarly, trying to identify the direction of a sound using only one ear is very difficult; a much worse signal-to-noise ratio is available to our perceptual system. The same applies to stereoscopic vision.
Couldn't agree more.
It's the same mentality that makes people declare that "[arbitrary tv show] should've been cancelled long ago!"; or what a travesty it is that [arbitrary beloved IP] is getting a sequel/remake/reboot.
If the Simpsons runs for fifty seasons, it doesn't undo the first ten.
If they reboot Batman a dozen more times, it doesn't keep you from thinking that Adam West was the best Batman.
And to categorize 3D movies as simply a fad totally disregards works for which 3D was an integral part of the creative vision, e.g. Avatar, Gravity, Love.
There's simply no good reason to discourage anyone from contributing anything creative to the culture, even if you question their motives.
I don't know about that; there are very good reasons not to, in my opinion. It's the same reason we don't build on top of the Giza pyramids, or replace the Mona Lisa with a more modern version.
We like to preserve the legacy of things, and when you pollute it with things not in the original design it's easy to end up reshaping what it means and diluting its history.
I certainly don't think that the Luxor hotel and casino in Las Vegas diminishes the legacy of the Great Pyramids in any way. In fact, I think it's really interesting that they were able to use an ancient form to achieve something architecturally unique from a modern standpoint. It allowed them to create an uncommonly large open lobby area, which is really cool to see, and wouldn't have happened if someone had said 'no pyramids, pyramids have been done.'
A remake/reboot is certainly nothing like replacing the Mona Lisa. It's more like a Banksy type person painting their own take on the Mona Lisa on a concrete wall or something.
Do you think that no artist after da Vinci should ever reference the Mona Lisa in any way? Because derivative works were being made by prominent artists before da Vinci even considered his finished. In fact, the famous Mona Lisa arguably wasn't even the only version of the Mona Lisa that da Vinci himself contributed to.
'Meaning' is extremely personal and dependent entirely on the context within which a work is received. No work will ever hold the exact same meaning for two individuals. Furthermore, a work's meaning to a particular individual is likely to evolve continuously throughout their life.
To discourage derivative works is like saying that you don't want to have to consider that alternate meanings are even possible.
Derivative work is a little different from building a sequel, though. A sequel may stand on its own, but it can also take away from the original intent of its predecessor, because it usually further builds out the universe or plot that took place in the original.
Movie producers often reach for sequels because they are guaranteed cash grabs and prop them up with nothing more than a shallow plot and a few A-list celebrities. To many, this takes away from the intent and meaning of the original.
Look at all the superhero origin stories that have come out lately, and summarize to yourself the story in the movie, without including any of the establishing. Even in the best ones... in fact, to some extent, especially in the best ones... the story itself is quite simple.
The sequel will have a chance to spread its wings much more. The fact that so many fail to do so is... well... an interesting discussion on its own. But the really great sequels are often movies that had to be sequels, because they told a story too big to also have 45-75 minutes of establishing in there with them.
Making sequels in no way affects the integrity of the original.
They are wholly different circumstances.
Sequels absolutely can affect the integrity of the original by expanding the universe of the plot in ways that its original design never intended, or counter to the way it was originally perceived.
If you think that some new works expand the universe in a stupid direction, there's no reason you can't pretend they don't exist and still derive the same meaning from your favorite works.
Disney does this as a matter of policy.
Countless Shakespeare productions and remakes and reinterpretations don't diminish the original form.
That statement does nothing but detract from the conversation. Of course I'm not obliged to pay attention to any of them. I'm not making the generalization that all sequels are bad.
Batman needs some comedy, and it needs to be proud of it, and there has been very little of that in the reboots. In fact, with the exception of a few scenes in the Michael Keaton version (which is my favourite in fact), I've only really seen it in Suicide Squad, and it only had about 30 seconds which included Batman.
The Adam West version is a comedy show, and it does a pretty good job at it.
That said, I like the campier interpretations of the TV show and Burton/Keaton, but I also like the grittier Miller/Mazzucchelli Year One and Nolan/Bale interpretations, too. There's a proud tradition of both, and they complement each other (one interpretation playing straight man to the other, if you will).
Recently, I've been watching Sherlock. After doing some reading, I learned that Sherlock Holmes is the most portrayed character ever, "with more than 70 actors playing the part in over 200 films". Talking with a workmate, he said his favorite portrayal was Jeremy Brett's. I also learned that the original works are in the public domain, which is probably why they're so often used as source material.
 - https://en.wikipedia.org/wiki/Sherlock_Holmes#Adaptations_in...
Yes it does, absolutely.
That's like saying an awful ending to a movie can't ruin the whole movie; people can still enjoy the beginning.
It's totally possible to ruin people's memories and nostalgia.
Like jackbooted thugs from Fox show up on your doorstep and say "we heard you write comedy; you're coming with us."
There's no shortage of talented creative people, and contracts usually don't last forever.
Nobody's trying to oppress you personally or trying to control your experience out of principle. (I hope.)
3D is typically enhanced price, and serves as a way of extracting more money from the segment of the population willing to pay more to see movies in the theater. Without 3D pulling in that segment, the profit-maximizing 2D ticket price would probably be higher.
I'd pay the 3D fee to get the most modern 2D theatre experience possible. I also get frustrated that 2D showing times can be quite limited if I want to see it on a system better than what I have at home.
Not a joke: on Amazon you can find polarized glasses that work with most theaters, where both lenses are polarized to a single camera (left 2D or right 2D). I've got a friend who experiences motion sickness at 3D showings and uses them to see movies when everyone else wants to see 3D. One of the lesser-utilized but awesome tricks with some of the home 3D TVs (like my LG) is to have one player wear left-2D glasses and the other right-2D glasses and play a "full screen" co-op game.
That's a very friendly way to put it :)
I personally think 3D sucks because it is a gimmick that affects the cinematography of a movie. Even if you can see a movie in 2D, you are consuming a product that was creatively compromised by filming it in 3D. I've seen a lot of films in both 2D and 3D, and I do appreciate the gee-whiz factor of 3D for some kinds of movies, but it's really unnecessary for movies where spectacle is not the primary consideration.
Plus, 3D is really immersion-breaking. When you are watching a 2D movie, directors force you to look at a certain part of a shot by using a shallow depth of field. In a 3D movie, this results in a really disorienting effect where something in the foreground can be out of focus, and despite trying to focus your eyes on it, it will continue to be out of focus, while the part of the image that is further away is still in focus.
Realism isn't the goal of movies, telling a story is, and 3D gets in the way of telling that story.
Mind you, for movies where the majority of many shots are CGI-composited (or for works that are just plain-old digital animation in their entirety), the 3D is "free": you have a 3D master whether you want one or not, and any 2D release is a post-processed edit.
> In a 3D movie, this results in a really disorienting effect where something in the foreground can be out of focus, and despite trying to focus your eyes on it, it will continue to be out of focus, while the part of the image that is further away is still in focus.
I've always wondered whether this problem could be "solved" with eye-tracking in VR. The gear would project a ray from your pupil to the image, hit a pixel, map it back through the projection matrix to the surface of an object in the scene, and then dynamically adjust the depth-of-field so that that part of that object (and everything else at the same depth) was in focus.
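A minimal sketch of what that could look like, assuming the eye tracker hands you a gaze pixel and you can read back a depth buffer (everything here, names and numbers alike, is hypothetical):

    # Gaze-driven depth of field: sample the depth buffer at the tracked
    # gaze point, recover eye-space depth, and use it as the focal plane
    # for the DoF post-process each frame.

    def linearize_depth(z_buf, near, far):
        """Convert a [0,1] depth-buffer value back to eye-space meters
        (standard perspective projection)."""
        return near * far / (far - z_buf * (far - near))

    def focus_distance(depth_buffer, gaze_px, near=0.1, far=100.0):
        """Depth (meters) of whatever the user is looking at."""
        x, y = gaze_px  # pixel hit by the gaze ray
        return linearize_depth(depth_buffer[y][x], near, far)

    # Fake 4x4 depth buffer: a near object (~1 m) in the top-left,
    # far background (~20 m) elsewhere. Perspective depth buffers are
    # nonlinear, hence the large raw values.
    depth = [[0.901, 0.901, 0.996, 0.996],
             [0.901, 0.901, 0.996, 0.996],
             [0.996, 0.996, 0.996, 0.996],
             [0.996, 0.996, 0.996, 0.996]]

    print(focus_distance(depth, gaze_px=(0, 0)))  # near object: ~1.0 m
    print(focus_distance(depth, gaze_px=(3, 3)))  # background: ~20 m

Each frame you'd feed that distance to the DoF shader, so the plane the viewer fixates on is the one in focus.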
It's not free because although the assets are in 3D, the final render still needs to be done, and for stereo you need to render every frame twice, once for each eye. When I was working on animated features the extra production cost for stereo was ~10% of the total production budget.
3D is a good differentiator, and allows for a price raise that otherwise doesn't appear justified.
That's because large numbers of people like to watch 3D versions in preference to 2D, especially for movies like those in the Star Wars series (less so for, say, romcoms, which is why you don't see them in 3D at all).
Depending on where you live, not really.
What we're really seeing is that people don't like to lose something they had previously, regardless of how small. People who prefer 2D really are having fewer showings available, but the trade-off is choice, and allowing other people to actually experience their preference, which is a good thing.
Very often the only 2D option is the dubbed version—because kids, I guess—if any, because the other rooms are already split between subtitled and dubbed.
My experience was that 3D IMAX is way worse than regular 3D (which I'd also rather not watch).
That said I'd still prefer 2D.
But I don't know your friends.
It depends on the personalities of everyone involved, the strengths of their feelings on the matter, etc. There's a lot of variability involved.
I'll do something I fear will be boring or go to a restaurant I don't love for my friends. But I won't take a 3D headache or eat at a restaurant that has nothing but stuff I'm allergic to, because headaches suck. (Turns out both my clauses converge on that "because".) At the risk of channeling $STEREOTYPICAL_MOM, are people who'd ask that of you really your friends?
> At the risk of channeling $STEREOTYPICAL_MOM, are people who'd ask that of you really your friends?
They don't know, because I don't tell them. Otherwise, they'd bend over backwards to accommodate me, which would make me more uncomfortable than the physical pain does.
Source: Myself, a dad that only sees movies a couple times a month during a very limited time period when the babysitter is available
This is not true. Every 3D movie takes up IMAX screens that could be used to show 2D versions of the movie. The 3D IMAX version has less visual fidelity than the 2D version and actively makes people feel sick.
Because movie theaters believe they can charge more for 3D, 3D movies often take up every single IMAX screen and don't allow people to enjoy the best possible experience.
3D is a stupid gimmick and ruins movies.
Some 3D movies are great, like Avatar, which actually shot things using 3D cameras. In other movies, like Captain America, they add the 3D in post and it looks like garbage. I appreciate a 3D movie done well, but too many are half-assed.
The real problem for me is if I can't see it in IMAX I'd rather see a film in not-3D but because the theaters have a hefty 3D surcharge, they're not interested in running the 2D version. I'm paying a ridiculous tax for something I don't even want, and I have no options other than going way out of my way to avoid it.
Maybe you like everything in 3D. That's fine. There are a lot of people that don't care for it and yet are forced into paying for it for lack of options.
It'd be great if they had both!
So yeah, go away you stupid gimmick.
[EDIT] Damn, those early digital projectors were cool: basically huge light bulbs wired up to a CentOS box with some weirdo DRM shit baked in. We would receive movies on ordinary hard drives and plug them into a central server rack to be ingested and then distributed out to each of the projectors. Pretty often in those days it would simply take too long to ingest and transfer to a screen, so we'd have to cancel early showings of new movies.
Here's a recent example: http://www.cinemablend.com/news/1600740/to-3d-or-not-to-3d-b...
Instead of different polarisation in each lens, like normal 3D glasses, it just uses identical polarisation in both lenses.
Friend: "I can only make the movie after 8"
Me: "The 2D one is at 7:40, but there's a 3D showing at 8:10"
And thus, 8:10 it is. Then I have to take the glasses off for 30 seconds every 5-10 minutes because the image makes me dizzy as all hell.
As an aside, this reminds me of the prime example of "people are so stupid you can't even imagine": I checked some epilepsy forum when Avatar came out because I suspected it was dangerous, and yes, people said it was not advisable; some got auras and some even got seizures from the movie, so I decided 3D movies are not for me. Not everyone decided that, though -- there was someone who posted "yeah, I saw the warnings here and went to see Avatar in 3D and got a seizure" and I was like "you can't be real".
Edit: to whoever downvotes this: I had seizures when I was 20 years old. Despite numerous examinations, including sleep-deprived EEG, CT and MRI, they never found out what causes them. So if you are downvoting this because I sound like we do not know what causes seizures, then you are either ignorant (because we indeed do not know, and I am the damned walking example) or you have knowledge I really badly need. If the latter, please share links to new neuro research which will help me and my neurologist decide whether I need to stay on medication or not. I would be glad to stop after two decades.
Each of your eyes gets half of the light available to it, both before and after.
3D is, and has always been, a gimmick.
It doesn't improve the storytelling. It doesn't contribute to the plot in any way.
But then, some moviemakers remark: 'story'? 'plot'? What strange words you use.
I think using 3D effectively is hard, and most films being made don't have a good reason to use it. It also makes my eyes feel a little uncomfortable, so I'm not crazy about it getting widespread use. Most filmmakers don't have the visual sense (and budget/tech crew) of Baz Luhrmann or James Cameron.
The Wizard of Oz was a really masterful tech demo for color, because color was a pivotal part of the story. 3D's problem is that it has been totally driven by technologists and financiers rather than storytellers. If the first big 3D film were The Great Gatsby rather than Avatar I think we'd have a very different sense of what 3D is. Although The Wizard of Oz may be unique in the way the tech manages to serve the story just as much as the story serves the tech.
3D as we know it is probably never going to happen. Most movies don't gain much from more immersiveness, and the tech may always be clunky. I've seen some VR movies lately, and I predict they will supersede 3D entirely. Whether they'll end up being the next fad is an open question.
And presence is characterized by lack of nausea.
If your only experience with VR is cell phones or the PlayStation 4, then you probably have never experienced presence.
I just got my third sensor a couple days ago, and it has made a night and day difference (there is also an option for 360 tracking with two sensors, but I didn't set that up). But even when I had two sensors the tracking was fine except under certain situations where my body would occlude the controllers. Not a problem anymore after the third sensor.
Article about it: http://www.thisisinsider.com/doctor-strange-how-you-should-s...
When new movies come out, google "should I see it in 3D $moviename" type articles. Unfortunately it's January now; not many good movies coming out.
Obviously, that doesn't help very much since it isn't in cinemas.
Not an absolute rejection, but a concern that it took focus away from other parts of the story-telling and wasn't necessary.
He supposedly sabotaged attempts to add speech to The Gold Rush (1925) after the actors pushed for it.
City Lights (1931) was basically his ode to silent movies, famous for its endless retakes of the initial meeting with the blind flower girl, while he tried to figure out how she could mistake the Tramp for a rich man without words. In the end the solution was simple, and both funny and a source of motivation for the Tramp's desire to help her: the Tramp walks through the car of a rich man to avoid passing a police officer, and so when he stands before her she has just heard him slam the car door. He accidentally knocks the flower he's purchased out of her hand and realises she can't see that he has picked it up. As he hands her his money, the owner of the car comes back and drives off, so the girl thinks he has left without his change; not wanting to break the illusion, he walks off without money he badly needed for himself.
It's one of my favourite movies because it masterfully made his point that you can tell a complex story without it feeling like you're missing something because you can't hear dialogue.
He added speech in his next movie - Modern Times - but his character still didn't speak dialogue (but did sing).
It was first with The Great Dictator (1940) that Chaplin himself spoke on screen: finally he had something where speech added clear value by conveying more than he knew how to convey with just pictures.
In the same way, I think we will see more and more movies eventually come out in 3D as the industry gets enough experience with it to see where it adds clear value, keeping it low-key elsewhere rather than adding it to make a spectacle of it.
I think one of the first to do it well was Prometheus. A friend saw the 2D version and afterwards told me, elated, that it felt like it was made as a 2D movie - no weird camera work solely to make 3D effects stand out, etc. Meanwhile, I'd seen it in 3D and been blown away by how good it looked. The effects were clear and beautiful but not in your face. Crucially, they didn't noticeably alter the visual language.
Too bad they didn't put as much thought into the script.
Not in any large numbers. Besides, 3D cinema has been coming and going as a fad for six decades now.
But even if it was true that people said that about color movies, people can say the same thing about different inventions and be wrong in one case and right in another.
>A lot of people really enjoy 3D, trying to logically convince them that they actually shouldn't like it is not going to work.
Well, there aren't that many of them to sustain 3D TVs (as TFA tells us), and there have never been that many to make it more than a fad in the cinema either.
In which case, whether some enjoy it is a moot point.
This implies that "large numbers" of people complain about 3D. I suspect this isn't true, and that it's really a very loud very small minority.
> But even if it was true that people said that about color movies, people can say the same thing about different inventions and be wrong in one case and right in another.
Yes, but the point is that some evidence needs to be provided. Since some statements are wrong and some are right, the argument needs to be more than "it's true because I said it."
> and there have never been that many to make it more than a fad in the cinema either.
What does this mean? At what point do you concede that it's not a fad in the cinema?
No, it just implies that they don't care about it enough to e.g. sustain a 3D TV lineup.
>Yes, but the point is that some evidence needs to be provided. Since some statements are wrong and some are right, the argument needs to be more than "it's true because I said it."
The article is one piece of evidence, isn't it?
>At what point do you concede that it's not a fad in the cinema?
At the point where it surpasses regular viewing and studios don't stop making such movies 5-10 years down the road?
I don't really understand how enjoyment is a moot point. Enjoyment is pretty much all that matters for a consumer entertainment product.
By observing the typical behavior of one person using it?
Think of a Chinese finger trap. It's been around forever. And yet one person is not going to play with a Chinese finger trap every day. Once you experience it and figure it out there's not much to do with it. That's a novelty.
Then they had indoor shots that used depth of field while still using some 3D effects, and it totally ruined it for me.
The whole point of 3D is that you can choose what to focus on! The moment you ditch that, the effect is ruined.
But there's this thing called cinematography. And when it is done well, it adds another layer of enjoyment to the movie viewing experience. 3D is part of cinematography.
So, yes, plot is important. But if you are watching a movie instead of just reading a book, it is usually because the added elements are worth it.
And I say that as someone who usually reads the books as well.
3D is fun, and films like Avatar and Tron and Pacific Rim are totally worth it in 3D.
But Gravity was awesome, animated 3D movies from Disney/Pixar are generally OK, and I really enjoyed the 3D version of Titanic that James Cameron produced in 2012 for the 100th anniversary of the sinking. (It's the same movie that came out in '97 but converted to 3D. Most of the time, you don't even realize that it's 3D. It's quite subtle.)
Life of Pi
Pretty much every Marvel movie
The new Star Trek films
Star Wars: Episode VII
The Hobbit films
How to Train Your Dragon and sequel
The LEGO Movie
Why not just listen to radio plays?
Actually it won't just be diminishing returns; it will give movies the over-sharpened, soap-opera/sports-coverage style image (and even worse if paired with higher frame rates).
I never watched soap operas and very little sports coverage so I probably don't mentally associate clear images and high frame rates with those types of content.
On that note, to hell with motion blur in games. The first thing I look for when I notice a game has it enabled by default is a way to turn it off. I'm not watching a film, and responsiveness is much more important when I'm not sitting there mindlessly consuming content being played for me.
Regarding motion blur, I agree: it's way overdone in games, and I disable it as well.
Blurring is a hack to deal with the fact that higher res is expensive. Higher resolution is preferable to AA if you can afford the monitor and the GPUs to drive it.
The same reason we know what people would find too salty or too spicy. Because, outliers aside, humans tend to have the same psychology.
Not really. It's not just some historical accident that we ended up with 24fps (or close) and not 50fps or 5fps or such.
If you live with a technology for long enough, you can start to see its flaws as advantages instead.
Say your cousin has been cooking as a hobby for a while. They move to a bigger house with a new kitchen and can finally get a big spice rack and a deep fryer. They proceed to deep-fry everything and put every possible spice on it. It tastes bad and monotonous.
Your friend plays in a band. They've been gigging a while and got new PA equipment that they say sounds less muddy than their old cheap crap. You go to a concert. The treble is so harsh you can't stand it and have to leave.
These are examples where some new tool is, in theory, better, but it turns out to produce a worse end result because of a lack of skill in using it.
I find many technically oriented people don't understand this. They point to some number, saying "well, this makes it absolutely better." Sure, numbers might be easy to measure. But it's the end result that's important to most people.
Not every film has to be made with the latest technology. Sticking with analog or black-and-white or 24fps is a perfectly valid artistic choice. But it's an artistic choice made by the film creator, not some advantage to the older stuff. Cinemas should be using the most accurate reproduction they can. Filmmakers can then dial it back in the actual film if they think that's better.
The comment I was replying to wasn't saying 8k might produce worse results sometimes, it was arguing that 8k and higher fps would just be worse, period.
Then, let's look at the reproduction part only.
Technically, it might work. More choice in reproduction. But maybe in the real world, movie theaters (and certainly in the television world!) would crank up the brightness and edge sharpening and frame interpolation to make the director's artistic material look totally horrible.
If you think about the whole life cycle of any kind of art delivered to some audience, it's lined with these huge pitfalls at every point.
Maybe I'm obsessive compulsive about it.
Likely a lot would depend on good defaults and good training. If you were as skeptical about people and organizations as I am, you would assume it could on average worsen many movies. The original Murphy's Law and all that. :)
Some directors still shoot movies in black and white from time to time, not because they think color is inferior, but because they think black and white is better suited for the specific movie they want to shoot.
Only I'm someone working with 4K video and beyond professionally, not some grandpa tied to the ways of yore.
Maybe take the opposite reading of this observation? E.g. that those tech-savvy people know what they are talking about, and are not merely nostalgic or whatever?
First, vinyl vs mp3/WAV is not the same as 24fps vs higher frame rates, and conflating technical issues (each with their own characteristics and/or tradeoffs) is not really illuminating the subject matter.
Some things are not merely an issue of "technological capability" but tied to human physiology (the eye, etc).
The same way that a screen with 50,000 lumens is not naively "brighter = better" but blinding, or 150dB headphones are not "louder = better" but physically damaging to the ear.
Yeah, 150dB dynamic range is great, but nobody can hear it, and nobody can tolerate the upper range of volumes it takes to reproduce the full range on a speaker.
With color, on the other hand, it's not like that. The more, the better (24bit, 32bit, 64bit, etc).
With fps, again, there are issues related to the eye, how the "afterimage" works, when visual information becomes too distracting or overbearing, etc.
Note also that we, for example, have been technically lowering the resolution of photographs (smoothing the skin) to make portraits appear more pleasant since forever. It's one of the main staples in fashion and magazine portrait photography.
And that's not because "some old-fashioned people like smoother skin vs detailed skin". It's inherently better looking unless one likes wrinkles, pores and detailed nose hairs.
Reality is basically infinite fps. We all get by just fine, and the general consensus is that the effect is quite pleasing as long as the subject matter itself is.
Smoothing skin in photographs is entirely different. That's a selective effect applied before it reaches the display. Using higher resolution doesn't somehow force photographers not to airbrush.
As another commenter said, sometimes people deliberately decrease fidelity for effect, like filming in black and white after color was available. Doing it deliberately in a specific context for one work is totally different from a blanket declaration that 8k is fundamentally worse than 4k.
Extreme close ups, action sequences, camera panning and traveling are not part of reality. Nor do we change camera angles several times per minute in a discontinuous fashion. Cinema is not a full real world simulation, it is a technical way to tell stories.
I get that cinema isn't reality, but the more capable the medium is, the more choices the filmmaker has for telling their story. I can buy that 24fps might be the best choice sometimes. I don't buy that it's the best choice for everything. It would be a crazy coincidence if a framerate chosen a century ago due to technical limitations when dealing with cellulose just happens to be the perfect framerate for cinema.
The Hobbit (shot at 48fps, 5k, 3D) had mixed results using HFR. The scenes shot on location with less CGI looked awful (like low-budget early-80s BBC midday dramas), while the green-screen, heavy-CG scenes felt like being in a video game (in mostly a good way). Billy Lynn's Long Halftime Walk (shot at 120fps, 4k, 3D) looked absolutely horrible, like early HDCAM home movies. It was impossible to get swept up into the movie, and it made an OK script and good acting feel much worse than they actually were.
I do believe someone will crack the code on HFR, but it will first require the right source material (The Hobbit and Billy Lynn's Long Halftime Walk were not it). I suggest utopian sci-fi or something set in a sterile environment. But even beyond that, they have to figure out the lighting, makeup and set design (and the extra burden HFR and 8k put on post-production, especially for CGI/VFX).
One element that actually helps make a film feel cinematic is a slight softness to the image. The best looking digital cinema uses on camera filters and/or post-process to help achieve this look that comes naturally from film shot at 24fps.
Younger audiences who've grown up on HD and HFR video games are less bothered by the differences, but audiences usually don't really know what they want until they see it (one reason that early audience feedback is poison to the process).
Background note: 20 years of experience working in production and post-production.
As for the rest, I don't doubt that it's hard, requires new techniques, isn't always the best choice, etc. But I don't buy this idea that it's always worse. Which doesn't seem to be the argument you're making, but it is the one I was responding to.
I'm getting a lot of good arguments about why certain videos should be shot using less than the maximum possible. But that's quite different from saying 8k and HFR is just plain worse.
There is no physiological phenomenon that makes low frame rates more natural or appealing. The real world doesn't have a frame rate. The preference for low frame rate is learned. No one ever selected 24 fps because it was better. They selected it because it was technically feasible at a reasonable cost and crossed the line into acceptable fps.
> Note also that we, for example, have been technically lowering the resolution of photographs (smoothing the skin) to make portraits appear more pleasant since forever. It's one of the main staples in fashion and magazine portrait photography.
This is wildly different. Firstly because most things on screen are not human skin and secondly because we've been smoothing skin in real life for centuries with makeup. The desire to see skin as smooth and flawless doesn't mean that people generally want the world to be blurry.
Most, no, just the most important (actors' faces).
Douglas Trumbull, a pioneer in cinema techniques, is developing technology to allow mixed frame rates and resolutions. So those panorama shots could be 8k HFR, while the close-up shots of the actors might be 4k 24fps. It will be interesting to see if this actually works in a film that requires suspension of disbelief. I wouldn't hold my breath for this to reach cinemas in large numbers anytime soon.
All this is to say, it is much more complicated than you make it out to be.
Your list boils down to "use it appropriately and don't assume old techniques are appropriate". Obviously there is a lot of learning the industry would need to do to use 8k well.
> Douglas Trumbull, a pioneer in cinema techniques, is developing technology to allow mixed frame rates and resolutions. So those panorama shots could be 8k HFR, while the close-up shots of the actors might be 4k 24fps. It will be interesting to see if this actually works in a film that requires suspension of disbelief. I wouldn't hold my breath for this to reach cinemas in large numbers anytime soon.
Sounds interesting, and promising, though I agree that it seems unlikely to be widespread anytime soon, even if it works beautifully.
> All this is to say, it is much more complicated than you make it out to be.
That's an odd comment to end on. At no point did I ever say it was simple. I said it's absurd to treat 4k like it's the pinnacle and anything beyond that is somehow actually a loss.
Sorry, I think this was meant for another comment, not yours, as I don't see the sentence I thought I was responding to in yours.
>Your list boils down to "use it appropriately and don't assume old techniques are appropriate". Obviously there is a lot of learning the industry would need to do to use 8k well.
This takes more than a "competent film crew". A competent film crew should have no problem working within well-established techniques and workflows, but wouldn't necessarily be prepared to venture outside them. If I were directing a production in 8k, HDR, VR, 3D or any edge case, I'd want more than a competent film crew. I want creative thinkers and problem solvers. I want crew members who have experience on a wide range of projects, everything from digital video to IMAX (you might be surprised at how often even the crews of big-budget productions have limited experience outside the status quo).
In the early days of the RED camera, the best footage came from cinematographers who had worked in lower budget HD productions, not film cinematographers. The HD crews had already been working in similar workflows, but the competent film crews were flummoxed by this one piece of equipment and even though they could see the results on set, they would still send back footage that was way underexposed and often unusable (and this was often from very well respected and experienced cinematographers).
I believe few people in the industry think that 4k is the pinnacle. But most do believe that the technology to move to 8k is not even close to ready or worth the added cost, that workflows for 4k are just now becoming standard (the majority of projects are still finished in 2k, although that will change with distributors like Netflix now requiring 4k delivery), and that audiences won't care enough about anything beyond 4k to pay extra. Are you ready to pay extra for an 8k screening? Theaters have to recoup the cost of new projectors while they're still paying off the brand-new 4k installs. Oh, and there aren't many cinema lenses that can cover an 8k image (especially since many DPs prefer the quality of older lenses).
There are old timers that lament the loss of film and resist moving from 24fps. But they'll be replaced by the younger generation who will be more open to experimentation and pushing the medium beyond its limits. The industry is driven first and foremost by profits, so once the pencil pushers see profit in 8k and HDR, the whole industry will move in that direction.
But I wasn't saying it's trivial to leverage 8k well, just that if 8k is too much to deal with in some cases, effectively reducing the resolution seems a tractable problem.
Edit: since it's an ongoing theme in this discussion, I should point out that my "just record each scene" description is from a technical perspective only. Making it result in a nice-looking work of art is, of course, another matter entirely.
And it's only recently that mixing frame rates in the same timeline has worked well. 5-10 yrs ago, we would have to convert the footage, typically using hardware built specifically for conversion (Teranex or Alchemist). Then along came desktop software that could do a decent job converting. Now, if I'm cutting in Premiere or FCPX, I can just drop the footage in and the software will take care of it, usually without issue (Avid still has problems with non-native frame rates and it's recommended to convert before importing the media).
And it's been this way since playback frame rates were standardized (and automated). Projectors had no way of changing playback speed on the fly depending on what frames were projected. Television was locked into one broadcast frame rate spec (29.97 in N. America, 25 in Europe) and TVs were locked into one of those specs. Tapes and disc playback was typically locked into one in the early days, although DVD eventually allowed for multiple playback options, as does Bluray, but the hardware typically converted them for playback on 29.97 screens (pre-HD). We're still limited to what the screen can playback to some extent, the broadcast specs of 23.976, 24, 25, 29.97, 30, 50, 59.94 and 60.
With computers and monitors we have the capability to playback multiple frame rates, if the software allows it, and that is where the current issue is. I can playback different frame rate QTs on the same screen, at the same time without issue. But there is no software (that I know of) to create or playback videos consisting of multiple frame rate videos. Game engines might be able to change playback on the fly, but I have zero knowledge of that tech.
And it would be advantageous to have tech that allowed switch on the fly playback. I'm currently consulting on a documentary that uses source footage from at least 3 frame rates (24, 25 and 29.97). And the editor is cutting in Avid, so we have to convert before import, which slows down the creative process and adds complications to the finishing process.
It's not always a naive "the more the better" issue.
Too much resolution can produce unnatural and distracting results -- your eyes don't have that much resolving power when you look at a person, but an extreme closeup in 8K does. It's neither natural nor flattering to see an actor's individual skin pores, for example.
Similarly, faster frame rates capture motion more accurately (sharper under motion), but result in a movie that looks distracting and soap-opera-ish to the viewer's eyes.
Your eyes can't see human faces 8 feet tall either but no one seems to think that movies are better on 20 inch screens. To the extent that movies look unnatural in extremely high resolution, that's an issue where directors need to learn to use the medium effectively. It's certainly not more natural to see blur or pixelation instead of pores.
> faster frame-rates capture motion more accurately (sharper under motion), but result in a movie that looks distracting and soap-opera-ish to the viewers eyes.
Only because we've been trained to expect films to be a juddery 24 fps experience. If high frame rates become commonplace they'll look commonplace.
If I move close enough to your face that it fills my entire vision I can see it in glorious 16k, skin pores and all. It's the extreme close up that's unnatural, not the fact that you can see details in extreme close ups.
As with 3D, most directors are lagging behind in adapting their style to changes in the medium. Some things that were developed for low-fidelity grainy film don't work in 8k. Some fight scenes that look good at 24hz look like garbage at 60hz. But on the other hand, 60hz and 8k allow for speed and details that just produce blurs on old 24hz mediums.
Rather, I think the reason theatres don't benefit from 8k is that from a certain distance, increased resolution has no benefit. E.g. if you ever get close to a billboard, you may notice that the resolution of the ad is quite poor, but from street level it looks perfectly fine. We are sitting much farther than 10' from the theatre screen, so more resolution won't be noticeable for most of the audience.
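That intuition checks out on paper. A quick sanity check (my own numbers), assuming the usual ~1 arcminute of visual acuity:

    import math

    # Assumes ~1 arcminute of visual acuity (the typical 20/20 figure).
    ACUITY_RAD = math.radians(1 / 60)

    def max_useful_pixels(screen_width_m, viewing_distance_m):
        """Horizontal pixel count beyond which extra resolution is
        indistinguishable from this seat."""
        smallest_detail_m = viewing_distance_m * ACUITY_RAD
        return int(screen_width_m / smallest_detail_m)

    # A 20 m wide screen seen from different rows:
    for distance_m in (10, 20, 40):
        print(f"{distance_m} m away: ~{max_useful_pixels(20, distance_m)} px across")
    # ~6900 px at 10 m, ~3400 px at 20 m, ~1700 px at 40 m: past
    # mid-theater, even a 4k projector is already at the eye's limit.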
First is that the technology is just plain uncomfortable. The glasses feel bad, and my eyes hurt after a while.
Second is that filmmakers seem to have no clue how stereoscopic depth perception actually works. Objects only have perceptible parallax out to a couple dozen feet. Beyond that, depth is perceived purely by other means. But 3D movies keep applying parallax to objects much farther away. All this does is make them look closer and therefore smaller. The worst example I saw of this was an IMAX film at the Smithsonian about the development of the Boeing 777. There was a scene of a distant 777 in flight which popped the plane out of the screen. The result was that this 100-ton building-sized machine looked like a child's toy.
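For what it's worth, the falloff is easy to compute (my own sketch, using the 64 mm average IPD mentioned elsewhere in this thread):

    import math

    IPD_M = 0.064  # average human interpupillary distance

    def disparity_arcmin(distance_m):
        """Angular disparity between the two eyes' views of a point."""
        return math.degrees(2 * math.atan((IPD_M / 2) / distance_m)) * 60

    for d in (0.5, 2.0, 8.0, 30.0, 100.0):
        print(f"{d:6.1f} m: {disparity_arcmin(d):6.1f} arcmin of disparity")
    # ~440 arcmin at 0.5 m but only ~2 arcmin at 100 m. The cue shrinks
    # roughly as 1/distance, so cranking up parallax on a distant plane
    # just tells the brain "small and close" instead of "huge and far".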
If the technology can be improved so it doesn't hurt, and if filmmakers can figure out how to use it without it looking utterly stupid, I'll give it another shot. Until then, I'm sticking with 2D.
I generally am not a fan of 3D, but my theatre seems to light 3D movies perfectly, while their 2D showings often seem dark.
When it's badly done, it makes my eyes hurt after a while, and I find it tends to make everything look blurry.
Perhaps a broader reason is that 3D seems to encourage filmmakers to go for visual spectacle over good storytelling.
She loved the experience, so I sat there for three hours watching a blurry movie because the 3D glasses were awful and causing pain.
I can certainly see how too much early exposure would actually interfere with the development of the usual association of cues that support perception of distance.
They're usually released simultaneously for blockbusters, e.g., 1-2 IMAX screens showing 3D and 1-2 screens showing 2D. For others, they might start out only showing 3D on opening weekend and then transition to 2D as the movies age and move to smaller screens, e.g., Rogue One is still playing in my area but only available in 2D now.
In fact, you can test this yourself. Go look up "split-depth GIF", then compare the experience you get from those to a traditional 3D image (either in theaters or from a 3DS or something else that uses parallax to fake 3D). This was the first time in my life I've ever actually experienced the '3D' effect coming from any 2D plane.
Another fun thing to try: go try to hit a baseball, or catch a ball, whatever. You may notice you are not very good at this, even with a fair amount of practice. These small objects don't give off the depth cues your brain normally uses: since it can't rely on stereoscopic vision, the lack of shadows, the small change in relative size and the lack of other objects to use for depth positioning can make this a difficult, if not almost impossible, task for someone with amblyopia.
Want to know how I know all of this? Because I've done all of it. My left eye is horrifically bad and has been since the day I was born (20/80 in the left eye, 20/40 in the right), and even after TWO strabismus surgeries it decided to misalign itself again, and I've suffered from occasional double vision ever since.
If you're still relatively young (in your 20s or early 30s), you can check out vision therapy; there have been a lot of great success rates in retraining the brain to properly integrate the information from both eyes, which would allow you to see "normally" and likely remove these issues. Personally, I've opted not to, because at 25 years old I really have no desire to change how I perceive the world (it's not a comforting thought, especially since once you've retrained your brain you can't exactly undo it).
I went to see Rogue One at the only IMAX cinema in my city. It was the first time I'd been to an IMAX, and I didn't like it at all. This surprised me, because IMAX is supposed to be such a big deal, but the picture quality was awful: only the center of the screen was in focus; everything else was awfully blurry and dark. It really ruined the movie. Is this supposed to be a premium experience?
I can't wait for the 3D fad to die. It will probably outlast me, however.
It pains me that they aren't just focused on higher and higher quality projection instead of 3D.
3D at home... on a smaller screen... just doesn't work as well as a spectacle.
They had a preview for The Walk and that preview showed the true depth of 3D.
I imagine smaller chains opening new locations may pass on adding 3D capability to their theatres in future, however many may continue showing 3D films on the equipped screens if only to justify the extra $15k/cinema in equipment they were hustled into buying while making their mandatory 3D conversion.
The quality of projection in all the theaters I've visited gets worse year by year. Somehow they keep switching to worse technologies for 3D? I suspect the older systems, though available in only a few places, had higher FPS than the ones used now. Watching movement is annoying these days; I see objects move as if under a strobe light -- not smooth at all.
It was plain old 2D except for a handful of scenes where they flashed an icon in the corner to tell you to put on your glasses. You put them on for a minute, saw some cool 3D effect in an action scene, and then you took them off before you got a headache and before it affected the rest of the movie.