Netflix Originals: Production and Post-Production Requirements v2.1 (netflix.com)
399 points by Vagantem on June 22, 2017 | 239 comments



James Cameron ("Avatar", "Titanic", etc.) used to argue that high frame rate was more important than higher resolution. If you're not in the first few rows of the theater, he once pointed out, you can't tell if it's 4K anyway. Everyone in the theater benefits from high frame rate. This may be less of an issue now that more people are watching on high-resolution screens at short range.

Cameron likes long pans over beautifully detailed backgrounds. Those will produce annoying strobing at 24FPS if the pan rate is faster than about 7 seconds per frame width. Slowing a pan to that rate makes a scene drag.
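A rough sketch of the arithmetic behind that rule of thumb (the frame width and pan speeds here are illustrative numbers of mine, not Cameron's):

    # How far the image jumps between consecutive frames during a pan, in pixels.
    def pan_jump_px(frame_width_px, seconds_per_frame_width, fps):
        frames_to_cross = seconds_per_frame_width * fps
        return frame_width_px / frames_to_cross

    print(pan_jump_px(4096, 7.0, 24))    # ~24 px/frame: about the tolerable limit
    print(pan_jump_px(4096, 3.5, 24))    # ~49 px/frame: visible strobing
    print(pan_jump_px(4096, 3.5, 120))   # ~10 px/frame: the same fast pan, smooth again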

Now, Cameron wants to go to 4K resolution and 120FPS.[1] Cameron can probably handle that well; he's produced most of the 3D films that don't suck. He's going to give us a really nice visual tour of the Avatar world. For other films, that may not help. "Billy Lynn's Long Halftime Walk" was recorded in 3D, 4K resolution and 120FPS. Reviews were terrible, because it's 1) far too much resolution for close-ups, and 2) too much realism for war scenes. Close-ups are a problem - do you really want to see people's faces at a level of detail useful only to a dermatologist? It also means prop and costume quality has to improve.

The other issue with all this resolution is that it's incompatible with the trend towards shorter shot lengths. There are action films with an average shot length below 1 second. For music videos, that's considered slow; many of those are around 600ms per shot.[2] They're just trying to leave an impression, not show details.

[1] https://www.polygon.com/2016/10/31/13479322/james-cameron-av...

[2] http://www.cinemetrics.lv/database.php?sort=asl


You neglect to mention the fact that we are so used to seeing 24 fps that anything above it doesn't look like a movie.

Why do home videos have that "home video" look? The biggest reason is the 60 fps frame rate - it just doesn't feel cinematic. Even the 48 fps of the Hobbit films felt too "lifelike" and not cinematic enough.

A lot of prominent directors, as you've mentioned, say they'd like to move towards a world with higher frame rates. But that'll be a bitter pill to swallow for a viewing public that unconsciously believes "cinema" means 24 fps.

3D in particular is very difficult to watch at frame rates as low as 24 fps - a big reason it makes so many people nauseous, and a big reason so many directors are saying we need higher frame rates.

High resolution may not be a huge positive but it is definitely not a negative. There's nothing inherently cinematic or better about low resolution like there is about 24 fps, and if excessive sharpness feels jarring in a scene, the cinematographer can elect to use a lens with a softer focus.

And the strobing effect you mention - unless we're talking 3D (where motion blur feels wrong), a slow shutter speed and the consequent motion blur easily avoid strobing.


Viewers will get over it. There were industry people opposed to sound, to color, to wide-screen movies, and to digital cameras. They're mostly over that. Adding grain in post-processing is on the way out.

(Film is really dead. There are some directors still making noise about shooting on film, but it's digitized for editing.)


we've had >24fps for a number of years.

It just doesn't look film-y. I suspect it won't be until VR that we'll see proper high framerate.

It's just such an oddity that I don't think people will take the risk, given the expense of retrofitting cinemas. (Plus virtually no TV is actually capable of properly doing 60fps.)

just one point:

>Adding grain in post-processing is on the way out.

That's here to stay. Most film grain from 2008 onwards is digital (yes, even on films shot on film) because most will have gone through a VFX pipeline with DI. Grain is stripped out and put back in afterwards.

Grain is a good visual cue for a number of things, just like desaturation of colour. It's a tool that's not going to go away.


"It doesn't look film-y" is exactly the kind of excuse we used to hear in the past. The reality is that once you've watched movies in 48/60fps you can't really go back to low-framerate movies, because you see them as blurry and stuttering. I personally can't wait for 24fps to be a thing of the past. Especially for action movies.


At which point it will look like a youtube video to many, not like a movie in the cinema. High frame rates haven't been successful for several years; I don't see why this should change. Same for 3D. Maybe there will be another trend that enables it, but as of now 3D wasn't a great success.


> but as of now 3D wasn't a great success

I too often find 3D gimmicky, but probably because the technology varies greatly between productions, cameras and theater displays. On the other hand, there are 3D showings every day in every big cinema, and 3D TVs as well. So I'm not sure we can say that it wasn't a great success.

> High frame rates haven't been successful for several years

The number of cinemas that can display 48fps is not great, and the number of cinemas that can display 60fps may well be zero. So I don't know how you can say that "high frame rates haven't been successful for several years".

Actually if you look on Youtube, high FPS videos are successful.

> At which point it will look like a youtube video to many, not like a movie in the cinema

There are some great youtube videos out there, don't know why you're saying this. Cinema is what you're defining as 24fps, sure because you're used to it. If tomorrow we start watching a lot of 60fps movies then you will define it as Cinema. Objectively 60fps is better for action movies anyway, the rest will follow.


> The number of cinemas that can display 48fps is not great, the number of cinemas that can display 60fps is zero?

any cinema that can do 3D can do 48FPS, at least.

RealD uses a single projector, with an electrically controlled circular polarising filter on the front.

This is why 3D in cinema for any kind of action is terrible, because you get juddering nastiness.

To get round this, some places project at 96 FPS (well, I thought it was 144, but that might be the limit of the projector where I worked).


> any cinema that can do 3D can do 48FPS, at least.

You'll have to explain to me why they weren't showing The Hobbit in 48fps then. You sometimes had to go to a different country to see it in 48fps.


because buying the film in HFR costs extra...


AFAIU 3D TVs are not a thing anymore

http://www.businessinsider.com/3d-tv-is-dead-2017-1


I think it will change because high frame rates look much better. It's not what people are used to, but what people are used to changes over time.

3D has two major problems. First is that the technology sucks. The glasses are heavy, bulky, and don't do a particularly good job of filtering out the opposite eye's channel. Second is that filmmakers don't understand how to do 3D at all. Every 3D film I've seen loves to add parallax where there should not be parallax. They don't understand that binocular depth perception only works out to a few dozen feet, which causes anything with observable parallax to be perceived to be nearby, and that in turn causes large objects to look tiny. Seeing a spaceship or airplane or mountain that looks like a toy because the filmmakers decided to "pop" it out of the screen is the exact opposite of a cinematic experience.

High frame rate doesn't have this problem. The technology is good, and using it properly in films doesn't appear to be a failing.


Exactly!

Fake parallax that just turns epic scenes and scenery into tabletop models.

It's obvious, so why do they ruin their efforts like that? Don't they watch their own movies after the 3D post editing?

I think even Avatar took it too far. I think I've watched some animated films that didn't totally blow it, but almost every other film that I have seen in 3D was a disappointment.


This is complete speculation, but my guess is something like: the people who might understand this (skilled directors and such) are used to 2D and don't much care for 3D, and the people who push 3D (executives) are too obsessed with making things "pop" to realize what they're doing.


True 3D cameras are a massive pain in the arse

Either they are huge, to get two cameras side by side, or they have a half-silvered mirror arrangement (with colour disparity)

Add to that the rigs wobble (vomit inducing) and the distance between the cameras is far too wide, and it all looks a bit poop, or requires a huge amount of post work to make fly.

So the normal thing to do is manually cut out each object (rotoscope) and adjust the divergence to place it in 3d.

every object, every frame.

it mostly looks a bit poop.

Not to mention it's normally done quickly; Clash of the Titans was converted in ~1 month.


Having worked in the industry, I've seen UltraHD with a proper setup. (with the 192 channel sound)

For documentaries and sports, yeah, it's brilliant.

But for "film"? it sucks.

why?

Because it looks like a play, but with bad acting. Everything that we have learnt, subconsciously, about film, is based on 24 FPS. Any action of any narrative substance in a modern film is linked to a slowmo. This relies on 24FPS. Because things are smooth, we register it as different.

Now, I suspect where high framerate will be a thing is in VR. But that's a new medium, in the same way the talkies were.


Watch a bunch of 60fps movies and I assure you that you will look back at 24fps movies and think those look weird.


... I have, I do, I see lots of them.

ultraHD is 8k @ 60fps.

It's great for sports and nature documentaries. Films look like plays; actors look stilted and wooden.


In the 2D animation industry it's mostly irrelevant since we are animating on ones, twos, and fours (every single, second, or fourth frame at 24fps) depending on what is happening in the shot.

There is also not nearly as much tweening as you might expect. Sometimes animating on ones just cannot accurately give you the same effect as letting a person's brain fill in the missing info. Which is why watching The Hobbit in 48fps pulled me out of the movie at some points; I appreciated the extra clarity, but there were details that would have otherwise just been a blur that became distracting.


> there were details that would have otherwise just been a blur that became distracting

That's a good explanation which fits with my experience. I wonder, could that aspect of 48fps be mitigated by bumping up motion blur (i.e. lowering shutter speed) during such moments?


When I first watched the Hobbit, I didn't really notice any difference other than pan-intensive scenes didn't look as washed out. I don't recall thinking it was too lifelike or realistic. When I heard that it was because it had a higher framerate, I decided to start using frame interpolation via SVP[0] on my video player to artificially create more frames between each original frame, and I'm really happy. It isn't perfect, in some scenes there can be artifacts, but it mostly looks great. In action-packed fighting scenes you can finally see what's happening, and not just one big mush of colored abstracts.

I liked it so much that I even went out of my way to buy a TV that had a built-in frame interpolation that's been said to be better than most other high-end TVs.

[0]: https://www.svp-team.com/wiki/Main_Page


I finally understand people who don't like 3d movies. Your comment makes me feel ill! I hate high framerates in movies. Just ruins it imo.


Is this a real concern of ordinary people? I am not a movie expert, just somebody who watches movies a lot and I have never thought to myself "this new movie with 48fps doesn't look like a movie, it's too lifelike, not cinematic enough".

I assume people who think like that would be a very marginal minority of movie experts. I don't believe the average viewer would even think of such an argument.


It's the opposite, actually. "Experts" are pushing for higher framerates, but "average viewers" complain that something feels wrong, home-video-y, and just plain weird on high-FPS movies. They can't put their finger on it, they certainly won't mention the framerate, but they tend not to like it.


Because ordinary people don't spend the time to appreciate it, it just looks too different to them. Exactly how color movies looked too different at the time.


You're pretty wrong here. Critics panned color movies; consumers liked them. This is the inverse of that.


Not true. This is the same pattern you saw with the iPhone ("nobody would use that"), or CDs, or iPads, or video games, or... there are always people who are against progress.


It's not a technical argument they make, it's just a gut reaction to 48fps looking different. The fact that it's more lifelike can actually make it feel fake, because it can feel like you're watching actors on a set.

Personally I do think people will get over it eventually, especially for less "cinematic" stuff. Or possibly variable framerate will become a thing, and directors will choose different framerates for different effects.


I see. I remember seeing Hobbit movies for the first time and they looked slightly different from other movies. But to me it seemed like a better quality so I didn't complain.

Though I heard some people say that scenes in the old LOTR movies looked more realistic. That was mostly because The Hobbit used more CGI, though; for example, the orcs in the Hobbit movies were all CGI, whereas LOTR had real actors playing orcs.


> The fact that it's more lifelike can actually make it feel fake, because it can feel like you're watching actors on a set.

I personally didn't get that feeling either. Just got a feeling of "it looks better". Especially that dragon scene.

> variable framerate will become a thing

It already is a thing with slow motions, but it is not what you're thinking of.

Have you ever tried to do slow motion with a 30fps video? It doesn't look good. So your idea will probably look bad: inserting 24fps sequences into a 60fps movie will just look laggy.


4k is less resolution than 35mm film, which was used very successfully for most of the history of cinema. Chris Nolan shot Interstellar & his Batman films on 70mm, which exceeds 8k resolution.

So I think you're quite wrong about 4k being "too much resolution for closeups" or being "incompatible with... shorter shot lengths".


4K is about 25MP. That's around the resolution limit of 35mm film in still photography. The exposed area is considerably smaller in movies as the film is run vertically rather than horizontally.


Without getting into the old and terribly complicated film vs digital argument, I think it's widely agreed that 35mm and 4k are at least in the same ballpark of resolution, and that IMAX exceeds 4k, so I maintain that shooting 4k does not present any new issues around closeups or quick cuts.


How is it 25 MP? 4096 × 2160 pixels = 8.8 MP.


Oh, oops, I thought 4k referred to the vertical resolution. In that case, yeah, film possibly has a bit more resolution still.


If you do that for three colors separately that is around 25MP.
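For concreteness, the two numbers being argued over just come from counting pixels vs. counting the three color channels separately (a quick sketch, nothing camera-specific):

    width, height = 4096, 2160        # DCI 4K
    pixels = width * height           # ~8.8 million pixels
    subpixels = pixels * 3            # R, G, B counted separately: ~26.5 million
    print(pixels / 1e6, subpixels / 1e6)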


24 frames/second is a minimum. Their FAQ says you can shoot at a high framerate.

Therefore, the best way to interpret "Bitrate of at least 240 Mbps (at 23.98/24 fps) recording" is probably that if you shoot at 24 frames per second, then you must have a bitrate of 240 Mbps. Not that 24 frames/second is the only framerate allowed.
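A back-of-the-envelope reading of that requirement (the proportional scaling at higher frame rates is my assumption, not a rule stated in the Netflix document):

    # "240 Mbps at 24 fps" pins down a per-frame budget; scaling it with fps is an assumption.
    base_bitrate_mbps, base_fps = 240, 24
    mbit_per_frame = base_bitrate_mbps / base_fps        # 10 Mbit per frame
    for fps in (24, 30, 60):
        print(fps, "fps ->", mbit_per_frame * fps, "Mbps")   # 240, 300, 600 Mbps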


This likely gives them confidence that if they were to remaster for a different color-space or higher resolution, that they could. For a 4K original shot in 8K, Netflix could send it back through the production process for a more reasonable cost and be able to launch the title quickly.

I'm surprised they don't ask for VFX sources to be archived though. ST:TNG and Babylon 5 both suffered badly from loss of the original VFX.


Most VFX shots loop through not one but several software packages. Sometimes even through multiple VFX vendors. Project files are proprietary and become inaccessible over time. And different VFX houses write their own add-ons that are not shared with a vendor like Netflix.

So collecting VFX shots in a pre-rendered state is notoriously difficult. Doesn't mean it's not worth trying. But you'll probably end up with various decomposed elements (models, rigs) and not something you can easily and quickly re-output in 4k, HDR, etc.


Often you can't reliably re-render shots from a few months earlier on the same project due to the pipeline changing so rapidly. Nevermind going back to the tapes for files that probably came off Irix workstations originally. The file formats for assets (models and textures) are backwards compatible enough that you can usually reconstruct models and maybe animation. So having an archive of the original files doesn't mean that you can re-render at higher resolutions without a lot of effort. The originals would need to be redone with more detail in any case to benefit from the higher resolution.


I'm not familiar with modern VFX software. But couldn't you just take a snapshot of the server and then load that onto a box in the future if you wanted to re-render it?


Archive-by-VM has been attempted from time to time. In the case of VFX there isn't a single server, but many different workstations, servers, render blades, etc. So preserving the entire pipeline in amber is a challenge.


And, somebody has to pay for it. The producers of the current project won't allocate any of their budget to preserve something for a (hypothetical) future sequel/re-release - that is somebody else's problem. The only places that I know of that do a decent job with archiving are animation houses like Pixar that own their own IP.


It's a question of cost. Sure you can keep a pipeline frozen, but that's very expensive, and there is little need to do it.

It's just cheaper and simpler to re-do it from the clean footage, or perhaps get some of the original models.


Possibly, but I believe some software packages use a render server to farm the heavy lifting out to.


The VFX elements for TNG were mostly kept, so it was mostly new post-production - it was DS9 that got bit hard, since there was much more CGI in the later seasons making an HD remaster too expensive.


I'll bet we see neural network-based upsampling (aka super resolution) being used for HD remasters in the near future.


Nobody's done this yet?


It's an area of active research: https://github.com/nagadomi/waifu2x


Maybe they could find the guy that put this together:

https://www.youtube.com/watch?v=Lymh5p6UbbU


They may have gone back to them for the Blu-ray release of ST:TNG. If you haven't seen them, the picture quality is amazing - far better than the DVD releases. Word is that CBS/Paramount said the cost was such that they'll never do it for the other Star Trek TV properties.


Never is a long time. I still have hope.

For DS9/VOY and for Babylon 5.


>I'm surprised they don't ask for VFX sources to be archived though. ST:TNG and Babylon 5 both suffered badly from loss of the original VFX.

What do you mean by this? Is it common practice to "remix" a show when it's rebroadcast? I just assumed they more or less press play on a .mp4.


I think it's not about re-broadcasting, but about providing the show in a better format. The originals were shot on 35mm film, and then that was "mixed down" with the visual effects to some lower-definition format (maybe SVHS, but I'm just guessing) and the result sent to TV stations to broadcast.

To make a high-definition format (e.g., blu-ray or HD streaming) they'd go back to the original 35mm masters and re-convert them. But then all of the visual effects had to be re-added on top.

There are various articles that mention quality issues with the remastering:

http://trekcore.com/blog/2015/08/netflix-brings-vfx-fixes-to...

http://trekcore.com/blog/2012/11/review-star-trek-the-next-g...


I'm not sure about Babylon 5, but I believe TNG was shot on film, but the effects were all done at broadcast resolution due to cost. This meant that when they released the DVD version, it was not difficult to create HD versions of all the "real" footage, but the effects were all in standard definition, so they looked awful. For the Blu-ray remaster, I believe they redid all the effects with modern CG, so they no longer look like crap.


Both ST:TNG and B5 took their original source material and brought it down to SD and added the visual effects in at that level. So, we still had higher resolution original source material, but it had no effects in it. And these shows have a LOT of effects. When it came time to release ST:TNG on Blu-Ray, they had to recreate the effects and apply them back to the original material, which was a large and costly undertaking. Since it didn't make enough money, they have no plans to do this for things like ST:DS9, unfortunately.


Wikipedia has more details about Babylon 5's transfer issues. They shot the show on Super 35mm film so they would have a widescreen source, which they telecined down to 4:3 for broadcasting. But when the show was remastered for widescreen in the 2000s, the production company did not return to the original widescreen film. They converted a 4:3 PAL copy to NTSC and then upscaled that to widescreen!

https://en.wikipedia.org/wiki/Babylon_5#Mastering_problems


That conversion path hurts to even read :(.


> ST:TNG and Babylon 5 both suffered badly from loss of the original VFX.

Yeah, Star Trek (VOY/DS9) really suffers on Netflix - I wonder what they chose as the source - straight DVD rips? The intro of VOY, for example, has weird color artefacts when the deflector dish of Voyager flies by the camera.


They do say it isn't a full list of requirements and to talk to your content specialist. I imagine they'll want those files too.


rec 709 isn't a different colour space...


That's beautifully clear - I wish I worked with specifications so lucid. I've got almost no real knowledge of the field it's governing, but I believe I would know how to successfully shoot some footage that Netflix would accept off the back of reading it.

One thing intrigues me though - albeit likely a function of my lack of knowledge on the matter - do these requirements implicitly rule out shooting on film for Netflix?

(I mean, I'm sure that ${hollywood-bigshot} could negotiate, but for Joe Public..?)


> One thing intrigues me though - albeit likely a function of my lack of knowledge on the matter - do these requirements implicitly rule out shooting on film for Netflix?

In terms of resolution, no. Standard 35mm can produce a good quality 4K transfer (depending on the speed of the film you're using), and if you shoot on VistaVision (which is where you use 35mm film but rotate it 90 degrees) you can go even higher.

Of course, the cost to do so is massive, and it's getting larger every year. The only TV show I know of this year that was shot on 35mm was Westworld.

Joe Public can't afford to shoot on 35mm for his/her Netflix original series even if they wanted to. Netflix would almost certainly only let an extremely experienced A-list director shoot on film, both because they'd be able to negotiate it but also because on film you can only usually afford a 3x overage (that is, you shoot three times what you end up using), whereas with digital that can go as high as 10x. It takes a really disciplined director and DP to manage that.


3:1 would be an extremely low shooting ratio -- I've never seen one that low in my life. Shooting just 4.5 hours of footage for a 1.5 hour film, for example, is unheard of.

Even a 10:1 shooting ratio was pretty low for indie features in the 35mm days. Nowadays it's not uncommon to see shooting ratios upwards of 30:1 on digitally acquired productions. (Although that number varies a lot with the director.)


Even ${hollywood-bigshot} = Bong Joon-Ho, who made a movie for Netflix that premiered at the Cannes film festival, wasn't allowed to shoot on film:

http://variety.com/2017/film/news/bong-joon-ho-working-with-...

"At first, Darius Khondji, my cinematographer, and I wanted to shoot 'Okja' on 35mm, but Netflix insisted that all Netflix originals be shot and archived in 4K," Bong said in an interview with Variety.


>Even ${hollywood-bigshot} = Bong Joon-Ho who made a movie for Netflix that premiered at Cannes film festival wasn't allowed shoot on film

That's just BS on Netflix's side though. They still feature all kind of non-Netflix movies that are shot on film.


Netflix has even added amateur youtube series to their collection. With low resolution and terrible audio quality etc. They have no control over content they don't produce. Doesn't mean they can't set standards for the things they pay to produce.


For a film they green light, probably not. But if the film is already shot and they want it, I don't see that getting in the way.


I don't know for real, since it's not my field. But generally any field that you start digging into turns out to have its complexities, that you may not have expected. While this looks pretty decent to the layman's eye, if I got this level of detail for the software projects I work on, it would also be much better than what I normally get to work off of, but I still would have plenty of room for interpretation.


To paraphrase Bill Gates (who never actually said the original, but anyway) 4K should be enough for everybody.

Having seen 1080p stretch and play nicely on a 30-foot cinema screen, and not looking much worse than regular Hollywood titles even for front-seat viewing, I don't see the allure of 8K even for "future-proofing".

Sure, monitors and TVs might improve their resolution in the future. But I don't see human eyes improving much (regarding angular resolution vs distance) or houses getting any bigger to fit a 30ft TV.

4K is good for reframing (cropping) and higher detail, but after some point enough is enough.


I spoke to a director friend of mine a few years back about shooting in 4K. He says that when he works (whenever budget allows), he would always shoot in 5K, not because he wanted the extra resolution for the full frame but because he wanted the ability to crop the shot down without losing resolution. Some shots would be scaled down from 5K to 4K, but others would be cropped to 'zoom in', or allow for minor panning that wasn't present in the original camera work.

8K presumably provides the same benefits but to a greater scale; you can scale it down to 4K, you can 'zoom in' on parts of the shot (such as a subtle detail, a pickpocket stealing a wallet for example) without having to use a second camera or record the shot twice, and so on.
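As a rough illustration of how much reframing room that buys (assuming a 5120 px wide 5K capture, an 8192 px wide 8K capture, and 4096 px wide DCI 4K delivery):

    # Extra width available for cropping/panning in post, as a fraction of delivery width.
    def crop_headroom(capture_width, delivery_width):
        return capture_width / delivery_width - 1

    print(crop_headroom(5120, 4096))   # 0.25 -> ~25% extra width to reframe within
    print(crop_headroom(8192, 4096))   # 1.0  -> a full 2x "zoom" with no upscaling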


My phone (Google Pixel) does something like this for its image stabilization.

When you film in 1080p it's actually a bit more zoomed in than you are expecting because it's really using the whole 4k+ sensor and applying some fancy coding to that to pan and zoom around in that space to provide some stability.

It works incredibly well!


Yeah, but what about "zoom and enhance"? :)

Seriously, sure, 4K is enough for output but who says it's enough for input? As long as sensors keep getting better, the industry will keep finding ways to take advantage of it until it's essentially required.

Imagine a future where you can zoom in on any detail as well as you could with a high-res sensor at capture time?

It's not necessary for today's viewing experiences, but we know little enough about what is going to become popular that I wouldn't put ANY bets on "4K" being enough forever.


I sometimes help film live events for a friend when he needs extra hands. Getting the framing perfect is really hard unless you're very good with the camera. Panning to follow a moving subject while simultaneously making sure everything else in the shot is desired is... really hard. Imagine being able to simply film a "general area" in 8K and the post-production team could handle all the framing/panning and still get 1080 output. That would be an amazing feature of filming in higher resolution. So the higher the better... let's get there, and make everyone's life easier.


>So the higher the better... let's get there, and make everyone's life easier.

Not really. For one, 4K already makes life more difficult, adding huge transcoded files, slow renders, proxy workflows, etc. That's on a pure technical level.

On an aesthetic level, it's a bad habit too, and cropping in post is a lazy cop-out or an emergency kludge. Setting your frame is more than just "getting what you want in the final output"; it affects parallax, compression, etc. Cropping from 2x the frame is not the same optically. And deciding in post means a less well thought out frame in the first place. So, it's nice as an emergency kludge to save one's ass, for documentary, ad work or news, but not so good for movies.


You are correct it would not work for movies. For my use case, it sure would!


"zoom and enhance" is one of the ways 4K is used today in Sports replay to see details that might not be visible in a 1080p recording.


Enhance!


I use my 40" 1080p tv as a second monitor from time to time, and it's way too ugly to be deemed as 'good enough' IMO.

Perhaps action movies won't be much improved with better resolution, but I'll want screencasts, browser windows, nature documentaries or really any detailed image to be shown in 4K at least.

Actually I'd want screens to stop being "TVs" or "computer monitors"; they should just be big high-def rectangles that can be used to display anything. For that we'd need to get at least to retina-level text rendering from any comfortable distance. That should mean 8K as standard def in my book.


>Actually I'd want screens to stop being "TVs" or "computer monitors"; they should just be big high-def rectangles that can be used to display anything.

The problem with this is that the things you want in TVs are different from the things you want in monitors. It's very rare that you need a 120+Hz refresh rate, 1ms response time television, but you do often want that in a computer monitor, and response time especially is very, very expensive to reduce (modern monitors often have <5ms response time; televisions will often have 10x that).


> Actually I'd want screens to stop being "TVs" or "computer monitors"; they should just be big high-def rectangles that can be used to display anything.

I would love to see this as well.

My ideal "television" has at least 8 HDMI inputs plus a couple of composite or component video inputs. There would be no cable/antenna/tuner input. There would be buttons on the remote for power, volume, input, plus menu and enter to access the settings. There would be a handful of audio outputs, and a switch to disable the internal speakers, which would allow it to feed into an audio system for those that like to have that separate. It would come in a variety of sizes, but only in 4K or 8K resolution.

This doesn't have a tuner or any TV-specific features, nor does it have any of the Smart TV stuff. People who need those are not the target market for this TV.

I wish this was a thing that existed.


Amen. My current TV has the Android TV stuff so tightly integrated that when you're making the CPU work hard, there's huge delays switching inputs or settings.


Buying a smart TV is like buying an all-in-one PC. Much easier to replace the Fire Stick if it's outside the box.


Unfortunately buying a TV without the smarts also means sacrificing the DVB-T tuner, as well as the price being significantly higher. It looks like the commercial version of a ~2012 panel (http://pro.sony.com.au/pro/product/professional-displays-bra...) retains at least some of the smarts. I like the idea of driving a TV via RS232 but the replacement for the tuner - a HDHomeRun - is sold at ridiculous prices in Australia.

I bought my 4K TV at $800 - at RRP of $1800 I would have been really angry but I can buy a 4K HDMI switch and get over it.


Actually, a lot of the lower end Vizios act like this now. They are essentially a monitor with chromecast. And people seem to rip them in reviews for that...


>I use my 40" 1080p tv as a second monitor from time to time, and it's way too ugly to be deemed as 'good enough' IMO.

That's because monitors are mostly about static images, and are viewed at a much shorter distance than TVs.


For traditional 2D screens, I tend to agree with you - that 4K is quite enough. But for Virtual Reality, these higher definitions are a must. 8K per eye will be much better than 4K per eye - at least that's the thought at the moment.


Mostly agree - but even with a screen displaying 2D, it's different. I get the whole "your eyes can only see n pixels" argument, but what in the world difference does that make? You can't see the individual bits of matter that photons are reflected off of, but you do see the collection of matter that builds up that "image" to your eye.

Sure the increase in resolution is "imperceptible" in terms of individual elements building up a particular frame, but to assume a visual cortex can't discern increased detail is just arrogant and borderline laughable


If resolution were the only metric, that would make sense.

But reality is that we're seeing higher color depth ("HDR"), higher framerates, higher quality sound streams (e.g. Dolby Atmos and beyond), and even technologies we aren't envisioning today because the hardware isn't "there" yet.

If 8K is only a resolution bump, then it's boring, but it may be much more than that. 4K is.


Actually it's the reverse situation.

Shooting at lower resolution gives you better color depth/accuracy AND higher framerates to choose from.

It gives better color because (all other things being the same) signal and color accuracy degrade with higher resolution -- less photons "hit" each diode on the sensor.

And there's literally no digital cinema or amateur camera that doesn't allow for much higher framerates in lower resolutions than it does on higher resolutions. That's because the higher the resolution the more taxing it is for the sensor and processor in the camera to keep up with high frame rates. 4K x N fps means 4 times more volume than 1080p x N fps.
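The "4 times more volume" bit is just pixel arithmetic; a minimal sketch comparing UHD and 1080p at the same frame rate:

    # Raw pixel throughput the sensor/processor has to sustain per second.
    def pixels_per_second(width, height, fps):
        return width * height * fps

    uhd = pixels_per_second(3840, 2160, 30)
    fhd = pixels_per_second(1920, 1080, 30)
    print(uhd / fhd)   # 4.0 -- 4K moves 4x the pixels of 1080p at the same frame rate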

Lastly, sound is totally orthogonal, because that's recorded off camera with special recorders.

So we might be getting better color depth and sound quality lately, but that's not because we moved to higher resolution.


4k and 8k is going to be more useful with large and extremely large screen sizes. HD is perfectly designed for traditional TV use cases - with a screen on a table in a living room.

4k and 8k means a completely different use model, like a giant wall-sized screen in the same living room. It's the difference between a movie theatre and an IMAX movie theatre.

When that happens, cinematography can expect to be different, sorta like IMAX movies have different cinematography. Sometimes they even composite multiple HD films together on screen without losing resolution. This would actually be extremely useful for sports, where the background shows the entire field at 8k, with windows showing stats or replays, etc..


Have you actually seen 4K in the cinema?


Yes. So?


It was not clear from your comment, as movies are commonly mastered in 1080p these days.


...And that, my friends, is how you lay out specs. Simple enough for anyone to read and understand, yet concrete enough to minimize interpretation variances. Love that change log.


No. This is exactly not how to lay out specs. Specs should limit themselves in scope to the deliverables, not the mechanisms that produced those deliverables.

It is reasonable to say you want a painting that is 4 x 8 and fits on your wall. But dictating that all paintings must be painted with Kolinsky Sable brushes is unreasonable.

4k 10-bit log EXR ACES deliverables, sure thing.

But please don't tell Roger Deakins he can't use that camera[1].

http://nofilmschool.com/2012/09/roger-deakins-talking-about-...


You were modded down to goatse.cx territory, but I agree. This spec is a recipe for conformity, not creativity. Never mind Deakins, can you imagine handing it to Lars von Trier or Robert Rodriguez?


Not really. This is how you do not do it. I'm particularly bugged by their list of approved cameras. It's silly. It excludes most of the interesting cameras on the market for 4K right now.

My bet is that they are receiving so many submissions, and actually buying all of them, that the company, or a few clever individuals, saw the "consulting" or equipment rental/financing potential and will try to monetize it by selling cookie-cutter packages.


What cameras would you prefer shooting on that aren't listed? I'm not seeing anything that I'd prefer to shoot on (other than film).

Cameras like the Sony A7s or the Panasonic GH5 are great for the low budget film, but if Netflix green lights your project, you can afford much better cameras. Unless the project calls for really small and unobtrusive cameras, in which case Netflix would most likely approve the use of whatever camera best fits the project.


I would say that the Arri Alexa SXT is a better camera than most of the cameras listed there.


I suspect the Arri Alexas are not on the list because they do not have "a true 4K sensor (equal to or greater than 4096 photosites wide)." The SXT has a 3424 x 2202 resolution when used in open-gate format.



The Alexa SXT doesn't match the first requirement - a true 4K sensor. It has a 3424x2202 sensor.


That's valid. And I'm kinda surprised it's not listed. Any other cameras that would make sense to be on the list?


The projects I am involved in are only being paid $1M by Netflix for a feature-length film. It is hardly money that takes you out of the low-budget range. Granted, it is a "free" $1M since they still keep the box office and rights...

If you are not filming in a place like the US where equipment rental is free, GH5s become very inviting as your second (and third) camera.


Is this for projects funded (at least partially) by Netflix before production? Would love to hear more about your experiences (both with Netflix and general filmmaking outside the US).


> I guess the company, or a few clever individuals, saw the "consulting" or equipment renting/financing potential

Not really, at that level everyone in Hollywood is renting their cameras anyway. Indeed, one of the cameras on that list (the Panavision DXL) is only available to rent; you can't buy it.


They did say that additional cameras could be approved on a case-by-case basis. And anyway, professional shoots and productions aren't hobbyists. They don't want "interesting" cameras, they want proven, reliable cameras.


They said this is how you lay out specs, not which cameras to use.


Ah, that Canon EOS C700 is a symphony of light capturing technology, though.

Here's a sample "A Day in Kyoto" shot at 4K 120fps raw:

https://www.youtube.com/watch?v=_MeEKCYvApM


I always find the idea of future proofing interesting here. Like the way Seinfeld reruns are in HD even though the technology didn't really exist at the time--because they shot the TV show on actual film and then could re-scan it later to keep up with modern tech.

Crazy expensive but obviously given the value of those reruns the cost made sense.


I don't think they shot Seinfeld in 35mm to be future proof. It was the best way to film at the time.


Friends reruns also got the HDTV rescans. In some shows you can see things that weren't supposed to be on-camera in the 4:3 format.

It's just a matter of which shows are worth the cost and time to do it.


They did this to some of the Star Trek series. In some scenes you can see things that were not meant to be seen. E.g. here Kirk and Khan are arguing, and then suddenly two people dressed like them start fighting: http://www.youtube.com/watch?feature=player_detailpage&v=B_c...

I can't find it now, but I believe they even redid some of the early CG effects by just rerendering the old models with modern software. It's worth preserving that kind of stuff for future generations. One of my favorite video games was rereleased over a decade later with updated graphics. Sadly they had to recreate everything from scratch because the original files were lost. And the original models were very high detail. They just converted to 2d and down sampled a ton because of the limitations of PCs at the time.


What is the video you sent supposed to show?


I think the point the parent was trying to make was that 4K resolution can be detrimental. You can plainly see that the two actors that play the original characters didn't actually perform the fight scene. In the original broadcast the resolution was so low that makeup and effects didn't really have to be that great in order to work.


Agree with this statement. Future-proofing was, however, a happy side-effect.


For movies yes, but for TV at that time a lot was shot with video cameras.


Nitpick: HDTV tech came about, depending how you count, in ~1980. There were lobbying efforts to bring it to the US in 1987. Seinfeld came out 1989.

References:

http://ecee.colorado.edu/~ecen4242/marko/TV_History/related%...

http://electronics.howstuffworks.com/first-hdtv.htm


Wow, this is pretty scanty. For comparison, the BBC's technical requirements:

http://dpp-assets.s3.amazonaws.com/wp-content/uploads/specs/...


These two specifications are for different use-cases. Netflix's is about having consistent quality of ordered works and preserving the ability to do lossless-ish conversions, while the BBC's is a technical specification of what can be broadcast without excessive additional conversion steps.


True - the equivalent Netflix spec is this one:

https://drive.google.com/file/d/0B9DJydDVOVKKLVdCdlF2cFVDVEE...

...again, a bit less detailed.


I'd like another post going into more depth on audio (sound design, mixing, mastering...). I've seen a few things on Netflix where the bad audio engineering totally ruined the experience.


Best pricing info I can find quickly:

* C300 Mk II: $6,998

* C500: $18,995

* C700: $28,000

* Varicam 35: $12,949

* Varicam LT: $16,500

* RED Dragon: $32,520

* RED Weapon: $49,955

* RED Helium: $49,500

* Panavision DXL: can't find

* Sony F55: $28,990

* Sony F65: can't find

* Sony FS7: $12,949

* ARRI Alexa 65: can't find

* URSA Mini 4.6K: $5,995

* URSA 4.6K: $4,995


ARRI Alexa 65 and Panavision DXL cameras are not for sale. They are only available for rental from ARRI Rental and Panavision respectively.

Sony don't manufacture the F65 anymore, so you won't find a current list price. It originally retailed for $65k for the camera body, and $20k for the SR-R4 recorder, so basically $85k. [1]

[1] http://nofilmschool.com/2011/09/sony-officially-prices-f65-t...


I'm curious why they didn't qualify the FS700 (considering they include the FS7) or any external recorders. Being able to record straight to a pair of SSD's in an Odyssey would seem preferable to dealing with Sony's bolt-on recorder + proprietary cards.


It's amazing that a show that'll be watched by millions can be produced on a few thousand dollars worth of equipment. It brings the barrier to entry way down. It's still gonna cost upwards of $100,000 per episode, even for a very low budget small-scale thing, due to crew, locations, sets, insurance, etc. But, it's a really interesting time for television and film production.

It's also interesting how in charge of this trend Netflix is; they're right out front (Amazon is doing interesting things too, with their pilot voting feature). They're providing the formula for how to make shows for Netflix (which would probably be acceptable across the board at Amazon and whoever else is doing original programming at this level and on this scale).

I really like this kind of openness about technical requirements, because it feels like an open invitation to make things. This, more than any hand-wavey "There's more opportunity for small productions than ever before!" kind of statements feels like an actionable piece of knowledge and a part of any business plan that involves making content for this new market. Democratizing that knowledge by posting it on the web is super cool. It seems small, I guess, but it's a cool indicator of where the market is going.

If I were fresh out of film school, or currently working at a production company, I'd be looking at that and thinking, "I bet I could raise the money I need to put together a pilot to these specifications." You'd still have to put all the pieces together to make a good show/film like you always have, but the technical side of it looks really achievable today in a way that it never has.


You're not counting the cost of glass for these cameras. Glass can be insanely expensive.


Yeah, I looked up some of the cameras, and it looks like the Sony FS7 can be had in a bundle with a 28-135mm lens for around $10k. I wasn't leaving it out...I think $10k is in the category of "amazingly cheap for a camera suitable for professional film and television production".

Sure, if you wanted to shoot beautiful cinematography, you'd want more lens options. But, if you're running on a budget, you could shoot a whole film with that one lens, and it'd still look better than most television shot in the 90s-00s.


My impression is that people generally don't even buy these cameras - they make a shooting schedule and then rent a camera (and associated lenses, mounts, dollies, etc)


Yeah, that's usually true. But, initial purchase price dictates rental price in a competitive market, which most of the hubs for filmmaking and TV production are. So, a $10k camera rental will cost an order of magnitude less than a $100k camera rental, all other things being equal (all other things aren't exactly equal, but still, it'll cost a lot less).


The RED cameras are pretty much unusable without a lot of very expensive add-ons. The real cost comes out to 100K+.

Source: https://www.youtube.com/watch?v=3t1PQJmM8P4


Yeah -- pretty much all of them are like that

I was putting some quick foundational numbers in there... I would expect the real cost to be 5X or more than those prices once you add all the extra stuff one would need for a complete setup...

let alone time, training, education, talent, etc etc etc...


They come with a lot of "emotional" baggage

Well, it's died down now, but it used to be a proper cult. Old $work used to make software that was a key step in transcoding from their shitty JPEG 2000 "redfile" to something more usable.

yes they had high resolution, but they had rolling shutter and were still quite expensive. Not to mention poor colour reproduction.


... and crashing. I hear a lot of complaints about crashing.


I wish they'd push for 4K 60FPS so they could upgrade their releases to this in the future.


60 FPS is undesirable for most movies/shows, as opposed to 24 FPS which creates a certain feel that most people prefer.


I think that's why he said "in the future", when people might eventually realize that 60FPS is good.


24fps is a form of impressionism, plain and simple. An engineer claiming that 60fps is better simply because it offers realism is as naive as claiming that a hi-res photo is obviously better than a Van Gogh because it "has more detail". Can you imagine commenting on an impressionistic painting and remarking, "well it's nice, but if the painter had used finer brush strokes we'd see more detail and it'd be more life-like"? The lack of detail is precisely the goal of the artist.

Films at 24fps are precisely the same thing. Their lack of detail and realism is precisely what has made the medium so successful over the last 100 years.


Actually, the comparisons between simple and double framerate are not "true" one factor at a time comparisons because the motion blur is not the same (linked to the camera shutter).

I love watching 48/50/60 FPS for psychedelic things or music visualization (and games of course, like most gamers!) but it should be used carefully and intentionally by the content director / producer. Some scenes/content would call for heavy motion blur and others for high framerate...


It's not nearly as naive as somebody claiming that a specific pattern of impressionism is the ultimately superior representation for everything.


It's also a way of making me feel like I'm going to have a seizure every time they do a horizontal pan.


It's like the old doctor joke, "then don't do that".


No. A better comparison would be to keep taking pictures with an old camera and refusing to use the latest technology because it doesn't have that feeling. Sure some people do it but don't you prefer the nice stuff?


If directors were meaningfully able to choose between 24fps and higher framerates, then your argument would be a lot more convincing. Since in practice only well respected auteurs can get away with the extra cost and weirdness, I don't know if you can really call it an artistic choice.

I can believe that some scenes might benefit from the visual effect of a lower framerate, but I can also imagine other scenes, such as complex action and long horizontal pans might benefit from higher framerates. Also I think the "too real" feeling might work quite well to convey grittier settings.


Hogwash. Films are 24fps because that's how fast the cameras and projectors run and that's what people are used to. If what you were saying was true it would be by far the most unlikely coincidence that has ever occurred in the universe that almost every movie ever made chose precisely the same impressionist style.


Actually 24fps was chosen as a balance between audio fidelity and trying to save on film cost. I never said it was designed from conception to create that look.

It's a great example of a happy accident, or serendipity.


Cool :) can we have 60fps movies now that the technology is here?


Yes! The Digital Cinema standards support: 24, 25, 30, 48, 50, 60, and 120fps. The distribution channels exist for these frame rates, the challenge is finding a way to use them in a way that doesn't look displeasing.

Ang Lee shot Billy Lynn at 120fps and Sony Pictures had every intention of distributing it in 120fps. In fact they created a 24fps version for legacy sites, as well as 60fps and 120fps versions. At the last minute the studio pulled the two HFR versions and released only the 24fps version (note: not pulled for technical reasons). I've seen 20 minutes of the 120fps version and personally did not like the look.

It remains to be seen if James Cameron can utilize HFR in a manner that the general public finds appealing. He has stated that the Avatar features will utilize high framerates in some way.


All 3 Hobbit movies were released in the high frame rate format (albeit 48 frames, not 60) and the general complaint was that they felt "soap opera"-ish. https://www.forbes.com/sites/johnarcher/2014/12/16/hate-the-... is a decent commentary on the issue.


One of the reasons for higher FPS is that it gives artistic freedom to make panning shots etc. High speed panning in 24fps is horrible.

Maybe it would be possible to shoot at 60fps or even higher and then dynamically adjust the frame rate to keep the 24fps but smooth some scenes as required?


High speed panning is terrible in general.

Making it look good is akin to polishing a turd.


The issue is simply motion blur. You can just shoot at 60fps with the same shutter you'd use for 24fps; you get all the feeling, all the motion blur, and none of the choppiness.


That is simply not possible. While it is possible to render a digital scene at 60fps with any arbitrary shutter speed (e.g. 1/48s, the default 180° shutter used in 24fps film productions), it is not possible in real life with a real camera. There are only 48 48ths in each second, and you will not be able to start a new frame while you are still capturing the one before it. 60fps limits your available shutter speeds to a theoretical maximum of 1/60s or faster. At 24fps you are theoretically limited to 1/24s. In practice there will be a delay between frames, so both numbers will be slightly shorter.

Film or video shot at high framerates will necessarily have to be shot at higher shutter speeds than films shot at a lower fps. This has an obvious visual impact on the image.
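A minimal sketch of that constraint in numbers, using the 180° shutter convention mentioned above (pure arithmetic, nothing camera-specific):

    # Longest possible exposure per frame vs. the conventional 180-degree shutter.
    def max_exposure(fps):
        return 1.0 / fps              # the whole frame interval

    def shutter_180(fps):
        return 1.0 / (2 * fps)        # half the frame interval

    for fps in (24, 48, 60):
        print("%d fps: max 1/%d s, 180-degree 1/%d s"
              % (fps, round(1 / max_exposure(fps)), round(1 / shutter_180(fps))))
    # 24 fps: max 1/24 s, 180-degree 1/48 s
    # 60 fps: max 1/60 s, 180-degree 1/120 s (less motion blur per frame is unavoidable in-camera)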


It’s correct that you can’t directly reach the same result, but all media currently produced, for TV and film, is shot at different shutter speeds than it is shown.

As you need to add VFX, basically all content is shot with as fast a shutter as possible, and you add the motion blur back in post. And, as you rightly mentioned, when you do that you can choose any shutter speed in post.

Dealing with motion blur when rotoscoping is a bitch, btw.


My complaint about the 48fps (after going in hyped to see this Spiffy New Format) was that everyone looked like they were speed-walking all the time, which was really distracting. It was like a sped-up silent film. Nothing else I've seen in higher framerates has had that problem, though—just The Hobbit.


I didn't feel like that at all and it surprises me anyone did. Do you have the same problem when you see people walk in real life?


No, and other high-fps video I've watched (all on much smaller screens, though) hasn't had that effect. I've wondered if the sensation was generated by some combination of 3D + 48fps + viewing angle. Or maybe something to do with the post-processing particular to that film. My wife reported the same effect—people walking with a casual gait but seemingly slightly sped up. It also affected other movements of people and objects, but walking was the most noticeable. The level of distraction it induced was similar to when audio and video are slightly out of sync (though, to be clear, they were not; that's just the kind of constant low-level distraction it caused).

We saw the second one in 24fps 2d (we mostly avoid 3D because we rarely care enough about it to pay extra, but it was the only way to see the first one in 48fps) because we both hated the first one's presentation so much, so I don't know whether we'd have experienced it in that one.


I had the same impression of "speed-walking", and I saw it in 48fps 2D. It almost felt as if the projection was lagging sometimes and then sped up the film to catch up or something, it was very distracting.

I've never had the same impression from TV or home movies which is 50/60 fps.


It's a widely discussed effect that tons of people have complained about, including famous directors -- so one can hardly be surprised others find it as such too.

I for one, don't like that kind of high frame rate motion -- it looks as fake as bad CGI graphics.


Was that the frame rate or just hobbits trying to keep up with Gandalf?


> My complaint about the 48fps (after going in hyped to see this Spiffy New Format) was that everyone looked like they were speed-walking all the time,

I saw all the Hobbit movies in 24fps, and—especially (at least, this is where it first really stuck firmly in my mind as wrong) in the underground scenes in the first, but also in parts of the other two—had a similar impression.

I think it had to do with some other element of cinematography particular to that film series, though it's quite plausible that 48fps exacerbated it.


I agree. The orc chase across the rock-studded field did not look good. I found it distracting, as the higher frame rate didn't allow my eye/brain to predict their motion.


Hence "in the future" when people feel less strongly about that.


They looked amazing to me and I had the reverse feeling afterwards when watching a 24fps movie. For some time they looked slow and stuttering/laggy until I got used to them again. Can't wait for more 48/60 fps movies ^^


60fps is not yet good. The Hobbit at 48fps did not look as good as 24fps and Billy Lynn's long Halftime Walk at 120fps 3D was one of the worst looking films I've ever seen.

I do believe someone will crack this nut (as Cameron did with 3D). But it's going to take the right project and very creative filmmaking techniques. Personally, I think the first one that works will be a sci-fi in a sterile setting, so the HFR will work with the narrative, not be a distraction.


I understand 60 fps is more 'realistic', but I hate it. It completely ruins a show/movie for me because I can only think about how weird it looks, without being able to determine exactly why.


Sorry, hate 60 fps. Everything just looks fake.


I would argue that 60FPS will never be "good" for visual media like film. Video games benefit greatly from 60FPS, but when I'm on YouTube and I see something in 60FPS that has a person in it, I generally turn it off unless I'm really incentivized to watch it, because it looks terrible (the exception is soap operas, which I don't watch).


Most people prefer it because they're used to it; that's what feels "movie like" to them.


Is 24FPS the video equivalent of vinyl's so-called "warm" sound?


To some extent this analogy works for me, but I'd argue 24FPS can still be a legitimate artistic decision - it can lend a film a more 'dream' like quality, for want of a better description, that the hyper-realism of high frame rate recording can lose. I think this was a large part of the reason so many people didn't like the "feel" of the Hobbit when it was screened in some cinemas at 48FPS. Rather than just assume that more is better, hopefully the future will see frame rate as just one more artistic choice in the same way focal length or aperture is used today to convey different feelings or effects.

The warm sound of vinyl on the other hand is just poor sound quality masquerading as something desirable.


That's a pretty good analogy. Film is attempting to portray a 3D world in 2D. Even if everything looks really good, it's still fake and 4K at 60fps just draws attention to that fakeness. For sports, news, even documentaries this isn't a huge deal because you are not trying to get people to suspend their disbelief. But for narrative cinema, it can take you right out of the picture.


I don't understand your argument… if the analogy were true, listening to a CD or streaming from the web would make most people think the recording is "fake" or artificial and leave them longing to listen to it on vinyl. Which, I'd guess, most people haven't even had the chance to compare with CD yet.


Well, lots of people do prefer vinyl. One of the things that soured me on the whole "audiophile" thing that I did for a while was the fact that some of my favorite music sounds worse due to the high quality of the playback. Prince's 1999 does not sound good on high quality headphones from a digital source. That and the money and snake-oil.

I think some of this will change with time as people get more used to the look and the production gets better (effects, costumes, etc.). Music is also an artificial construct from the get-go, even live, but most narrative film is meant to reflect real life in some way, shape, or form. That means that anything out of place is caught immediately by our expert eyes.

All that said, when I was in the industry, one thing I noticed was how easily you could gloss over minor visual details like set dressings or small props (think food on plates), but even the slightest error in sound production was enough to destroy the effect.


Thanks for sharing that insight. I guess that in terms of perception and experience – at least when it comes to what consumers are used to – sound is a different beast altogether. And then you have to differentiate music recordings from sound production for cinematic content, where the latter is of course more about recreating real life.

It's funny that 2D stereo still works so well. Perhaps that will also change again, with binaural recordings and object-based audio, but that's something I don't see the music industry using anytime soon.


2D stereo is all about the equipment. We have two ears, headphones have two speakers, and cars have two sides (I realize there are often four speakers in a car, but the layout works against the sound). Considering how few people consume music at home on a high-quality setup, and how many listen on the go with low-quality equipment, I don't see that changing. It's interesting that the long-running debate about the Beatles in stereo vs. mono centers largely on the fact that the music was mixed to be listened to on the mono phonograph speaker most people had. As far as we've come, low quality is still how music is delivered.

There are some good DVD-As or SACDs with 5.1 audio, but they are niche and expensive.


>if the analogy were true, listening to a CD or streaming from the web would make most people think the recording is "fake" or artificial, longing to listen to it on vinyl.

Well, some people do. But unlike 24 fps vs. 60 fps movies, MP3/CD has huge usability improvements over vinyl (not to mention it's free if you pirate or stream), so there's no real competition.

But take musicians, for example: even most young electronic musicians, born entirely in the digital era, prefer analog synths for their warmth and bite, or try to emulate that sound on the PC with external processing or deliberate degradation of the signal.

Or take guitars. Who likes a clean, undistorted signal? Even for jazz, people still prefer a coloured, saturated tone.


Regardless of whether or not the analogy is good, the point still stands. 48 fps looks fake and totally destroys immersion.


No, I don't think so, not even in terms of being purist arguments that people make.

The long version, if you're interested in why some people may think vinyl sounds better than CD: http://wiki.hydrogenaud.io/index.php?title=Myths_(Vinyl)

It's hard to compare the two arguments, since "CD sound" is the standard these days, whereas the "technically inferior" 24fps is still the cinematic standard. So, while people may label 60fps video as "unnatural", nobody would label a CD recording as such. Also, there are people who've never listened to vinyl to this day, whereas almost everybody is used to the "film" look of 24fps, which has been the de facto standard for decades.


Music also changed to leverage the recording mediums' "artificiality". There's lots of music that simply could not exist live. Creating a sound that sounds like a live performance (i.e. "natural") is no longer necessarily the goal.


I don't know if I would say it's just because people are used to it. 24 fps is highly stylized. Movies are shot at 24 fps and in wide aspect ratios (often much wider than 16:9), heavily color graded, and often have film grain applied to them. It makes them feel quite distinct from normal digital video.

I'd argue it is color grading and wide aspect ratio more than anything that make movies feel like movies, but 24 FPS and its less lifelike movement also make them feel distinct.

On the other hand, no one disputes that sports look a lot better at 60 FPS. For home movies I take, I also want at least 60 FPS, because it preserves reality better.

But movies aren't about preserving reality.


Honestly, having been immersed in a lot of 60fps content for a while, going to a movie theater and having it be 24fps was rather jarring. It strikes me as odd that they heavily advertise high-quality surround sound and 4K laser projectors while leaving the low-hanging fruit of frame rate on the table.


Color is undesirable for most movies, as opposed to black&white which creates a certain feel that most people prefer.


That's an opinion.


Increases costs dramatically. 2.5x the data.


Video size != costs! Movie production costs are mostly people.


Video size absolutely correlates with cost. It's not 1:1 but there's a relationship. The entire pipeline masters from uncompressed sources. When you need to color, crop, re-frame, or add VFX, you need to read/render those files at original resolution. Render time (and equipment that can do it) becomes a factor, particularly for post-production vendors, who pass on the cost (and mark it up).

True, Post Production is not the largest piece of the pie, but it still adds up to millions of dollars.
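
To put rough numbers on the data part, here's a back-of-envelope sketch in Python (assuming uncompressed 10-bit RGB frames at DCI 4K; real camera raw formats and intermediate codecs will differ):

    # Back-of-envelope: uncompressed data rates for DCI 4K footage.
    # Assumptions (mine, not from the spec): 10-bit RGB, no compression.
    WIDTH, HEIGHT = 4096, 2160      # DCI 4K
    BITS_PER_PIXEL = 3 * 10         # 3 channels x 10 bits (assumed)

    def gb_per_second(fps):
        bits = WIDTH * HEIGHT * BITS_PER_PIXEL * fps
        return bits / 8 / 1e9

    for fps in (24, 60):
        rate = gb_per_second(fps)
        print(f"{fps} fps: {rate:.2f} GB/s, {rate * 3600 / 1000:.1f} TB/hour")

    # Prints roughly 0.80 GB/s (~2.9 TB/hour) at 24 fps and
    # 1.99 GB/s (~7.2 TB/hour) at 60 fps: every uncompressed step
    # (storage, reads for color/VFX, render time) scales by 60/24 = 2.5x.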


Only uncompressed. Especially when there's not a ton of motion.


Not necessarily 2.5x. Video compression is a thing.


Presumably they're storing the raw, uncompressed video? At the same time, I would imagine storing a couple of extra terabytes of data isn't a significant cost in the scheme of a production.


I'm surprised they don't allow UHD resolution (3840×2160). There are probably only a few people in the world who can reliably tell the difference from true 4K.

An average person can't even tell the difference between 720p and 1080p.


When you step back, it reveals an almost silly obsession with video resolution as a proxy for quality and future-proofing. It feels like it's driven more by marketing than by genuine quality goals (just look at the audio standards...).

On the other hand I definitely applaud their focus on having good backups and preserving the original source material.


UHD isn't technically 4K; it's quad Full HD (four 1920x1080 frames), hence the odd resolution.

Cinema 4K is 4096 wide by 2160 (https://en.wikipedia.org/wiki/4K_resolution#Digital_cinema) and predates UHD by quite a way.

Firstly, the better cameras are proper cinema 4K; UHD cameras tend to be a bit rubbish. Secondly, there are many more tools for dealing with cinema 4K than with UHD.
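
For reference, a quick side-by-side of the two resolutions (a tiny Python sketch; the numbers match the Wikipedia article above):

    # "Cinema 4K" (DCI) vs UHD (four 1920x1080 frames).
    for name, (w, h) in (("DCI 4K", (4096, 2160)), ("UHD", (3840, 2160))):
        print(f"{name}: {w}x{h} = {w * h / 1e6:.2f} MP, aspect {w / h:.3f}:1")

    # DCI 4K: 4096x2160 = 8.85 MP, aspect 1.896:1
    # UHD:    3840x2160 = 8.29 MP, aspect 1.778:1 (16:9)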


They do allow it.


    Camera Requirements

    4K Resolution:
    • Camera must have a true 4K sensor 
      (equal to or greater than 4096 photosites wide).

Also notice that not a single UHD camera is on the approved list.


The cameras have to be 4K+; the final product can be UHD. See the "IMF Master" section.


I was talking about the source material from the beginning.

The fact that they allow UHD in the IMF master makes it even less reasonable, because downscaled 4K-> UHD footage will have lower resolution than the native UHD footage.


Can someone explain the 24fps to me? It seems out of place and old-school in light of the 4K, 240 Mbps, etc.


24fps - these days - is less about technology and more about standards and aesthetic. There are plenty of reasons why films have been shot in 24fps for the past handful of decades.

When you start increasing that in YOUR film, you lose a sense of familiarity in your audience, and their (subconscious) willingness to suspend disbelief probably goes away with it.


> There are plenty of reasons why films have been shot in 24fps for the past handful of decades.

Really? I don't see any tangible reason why a low frame rate would be superior to a higher one. Even on a very basic level a high frame rate is superior to a low one, and 24 is, well... the absolute minimum.


> Really? I don't see any tangible reason why a low framerate would be superior to a higher one.

Bandwidth and data transfer, for example - also many devices eat through battery like nothing else when you feed them with high-framerate stuff, and old-ish TVs may not support high-framerate material properly.


>Bandwidth and data transfer

The bitrate is specified in the spec, so that's not it.

>old-ish TVs may not support high-framerate material properly

These are the specs for how it's shot, not how it's broadcast.


So were black&white movies.


I have never watched a TV show and thought "You know what this needs? Not better writing, plotting, story arcs, acting, or directing, but more frames per second or a higher resolution".


You may not have noticed it consciously, but I bet your subconscious noticed it. Once I started to notice motion blur in movies, I couldn't stop noticing it. Oftentimes it's really hard to tell what is going on in a scene with lots of motion; all objects become unrecognizable blurry smears across the screen. Of course it affects the way scenes are filmed - directors take it into account and film things differently than they otherwise might have at a higher fps, avoiding shots with too much movement or camera panning that might have been far better.

Conversely, high-frame-rate footage "feels" so much more real. It's hard to explain, but if you look at side-by-side comparisons, the higher frame rate is definitely preferable. The guy who did the tech demo of full-motion video on a 1981 PC had some interesting points about this: he did experiments to find the optimal trade-off between frame rate and resolution, found that a higher frame rate with lower resolution was better than the reverse, and had some nice examples of it. PC gamers have known this forever and are obsessive about high frame rates.


Maybe I'm an outlier, but there have definitely been times (long camera movements/rotations in particular) where I thought "jeez, that's a low framerate".

Some movies have it worse than others, and maybe it's my own fault for training myself to look for it with interactive games, but there are definitely times where it stands out as an outdated technical limitation.



Really? Every time I watch an action movie and can't understand anything, I wish it had more fps.


Bit of a false dichotomy but OK. I want my movies to have all of the above, including more frames.


It's not a requirement, just a specification of the minimum frame rate.

Most movies are still presented at 24fps because higher frame rates apparently look different. The Hobbit was released in 60 fps and the reception by critics was mixed. Personally I don't see it but I'm no expert.


> The Hobbit was released in 60 fps and the reception by critics was mixed.

48fps, not 60fps.


>It's not a requirement, just a specification of the minimum frame rate.

Sure. Why not specify a progressive 60fps though? Every second fkin phone can do 60fps these days... spec'ing the future Netflix catalogue at 24fps seems very odd.


People are used to 24fps and don't want to change. The same thing happened when we transitioned to sound or to color films. Fortunately some directors don't care about this and will probably pave the way for the next generations (James Cameron, Peter Jackson).


This is really tragic to watch. I like Netflix in many respects, but this is just incompetent. Not one Best Picture Academy Award winner would qualify under these specs. Not one. I expect very few, if any, nominees would either.

The Arri Alexa is eliminated by these specs, for crying out loud. The single most popular camera amongst high-end feature cinematographers.

This is driven by some misguided belief that input resolution == output resolution AND that resolution is the measure of quality.

I really hope they get their head out of their asses on this at some point.

It's good to have quality standards, and thank god they aren't Turner Classic Movies (the fuck was that all about??). But these specs are as arbitrary as saying all of your food must be cooked in copper cookware.

We tell stories, not pixels.


The Arri Alexa 65 is on their list.


Yes, and not the camera I'm referring to.


Pretend most of HN has no idea what you are talking about, especially which n is "THE n".


Good point. And I suppose I didn't do a good job making my point clear.

It isn't a matter of which camera is better, but that the camera is a part of the creative process of filmmaking and is not strictly tied to the quality of deliverables. It is a brush with which an artist chooses to paint, for creative and practical reasons.

For a company like Netflix to control quality, the goal must be to capture the creative intent with the highest fidelity[0], keeping in mind that the pixels serve the content, not the other way around, and that pixels are just one of very many things that serve the content. This can and should be done by specifying the quality of the result, the deliverables, not the tools used in the creation process.

[0] Fidelity, as opposed to quality, is an important distinction (one that seems to be lost on Netflix). "Pi", "Saving Private Ryan", "No Country for Old Men", and "Deadpool" have different objective measurements of image quality, but they can all be presented to an audience at high fidelity.


Thank you, that makes much more sense to me. Otherwise it is on par with a developer complaining about their workload running on equivalent AMD processors vs Intel.

It sounds like Netflix could use someone like you.


Netflix sells pixels. Not stories.

They are the McDonalds of movies, and a McDonalds is all about standardization and efficiency.

If only allowing cameras that deliver a narrow and predictable band of technologies lets them work more efficiently, and if they are able to make and save money that way, it makes perfect sense for someone who works at scale to do exactly what they are doing. They are a market maker now, and thus can afford to play around with previously established norms.


They may be marketing pixels ("more is better"), but they are selling content. Customers come, and remain, because of the content. Bad content at 4K is still bad content.

It's true that good image quality helps, but my point is that they already get both simply by specifying the delivery requirements.

What they are doing is more like a company that says they won't buy software that's written on anything other than emacs, or Notepad.


I don't and will never own a 4K TV, but having seen it at friends' houses, I think image quality can go too far. It is so uncannily vivid and sharp that it's distracting. Maybe it's because I grew up with NTSC, when a 20" TV was a "big" television, but I can't watch the 4K stuff without getting queasy.


I'm so, so glad I don't have to care about deliverables requirements anymore. Every studio has a totally different set of requirements, each as complex as this, and it's a real bear to make sure you're fully in compliance with all of them.


I wonder if Netflix is experiencing quality problems? Sometimes the lower-budget content that's a few years old looks pixelated or over-compressed. In these cases, it's somewhat obvious that whoever produced it just didn't know better.


If it's third-party content, that's not Netflix's fault: they have to use whatever footage the studios give them, and sometimes what the studios give them is junk, either accidentally or deliberately.


Do they have a similar Pre-Production requirements doc?


These are the kinds of high-quality production standards that all companies should employ. Technical excellence above all else


I assume this doesn't particularly apply to documentaries and such?


The Canon C series (100/300/500/700) is popular among documentary filmmakers. And this is mainly for projects produced by Netflix from the pre-production stage, not for projects that have already been shot (but not yet distributed).

I'm in post on a project shot mainly on the Canon C100 (not on the approved list). We will be talking to Netflix at some point about picking up the project, and I have no worries about it not being in 4K or shot on an approved camera. If they like the project and want it, the camera format won't matter.


I would assume that the document only applies to productions that they have a hand in during production (self-produced).


Did you read the change log? The first item there implies the answer is yes:

>Unified Spec: With version 2.1, we made the decision to unify the two previously separate Features/Series and Docs/Coms documents into one overall set of requirements.


Now if they would stream UHD content at 240 Mbps, they would comfortably exceed the bitrate of UHD Blu-ray (144 Mbps). Or, failing that, any increase over the 15.6 Mbps they are using now would be welcome.


It's probably too early in HEVC's life to know how much of a real quality increase that would bring - and you'd exclude most of Netflix's customers (my connection is exceptionally fast by Australian standards and I couldn't stream that), not to mention blowing their storage costs out the window. Have you noticed their audio bitrates, btw?
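
To make those bitrates concrete, here's the per-hour data they imply (plain arithmetic in Python, not anything from the spec):

    # One hour of video at the bitrates mentioned above.
    for label, mbps in (("Netflix UHD stream", 15.6),
                        ("UHD Blu-ray max", 144),
                        ("Hypothetical 240 Mbps stream", 240)):
        gb_per_hour = mbps * 1e6 / 8 * 3600 / 1e9
        print(f"{label} ({mbps} Mbps): ~{gb_per_hour:.0f} GB/hour")

    # ~7 GB/hour vs ~65 GB/hour vs ~108 GB/hour -- roughly a 15x jump
    # over what Netflix streams (and most customers' connections) handle today.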


Only 24fps? Wow


I guess this is old news, but apparently if you get hired by Netflix to shoot anything and you want to use an Alexa.. good luck. You should get in line and wait for an Alexa65 when it's available.


Can anyone break this down a bit for the layman? What's the ACES pipeline? Frame chart, power windows... and more?


This is Netflix setting the standard for the whole industry.


How so? There's an entire film and television distribution industry that (currently) dwarfs Netflix in scale. They all have their own requirements, though usually not written out with such clarity.

And, these are not onerous requirements. They ensure good quality but don't impose significant costs. There are several cameras here for under $10k, which was unheard of even a decade or two ago; when I worked in television in college (20 years ago), the cameras in the studio cost ~$80,000 each, and the decks we edited on cost ~$25,000 each.

If you're making a show specifically for Netflix, you follow these guidelines; if not, you can make it however you want, but these aren't terrible or adding significantly to the cost of a production. I mean, there are several line items on most productions that'll cost more than the camera. And, in post-production, resolution and file size barely matter today; a computer can chew through 4K almost as easily as 1080p, and multi-terabyte hard disks are cheap as free.



