> "There's a magic to the grain and the color quality that you get with film."
Couldn't that be simulated by shooting at the same frame rate (24 fps) plus some post-processing? It probably wouldn't be simple to write an accurate simulation, and it could be quite GPU-intensive to process, but I imagine it's possible to get results from a digital camera that are indistinguishable from those shot on film.
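As a toy illustration of the kind of post-processing I mean (the tone curve and the grain model here are made-up placeholders, nothing like a calibrated film simulation):

```python
import numpy as np

def film_look(frame, grain_strength=0.03, seed=None):
    """Toy film-look pass: an S-shaped tone curve plus random grain.

    `frame` is a float array scaled to [0, 1]. Both the curve and the
    grain model are illustrative guesses, not measured film behavior.
    """
    rng = np.random.default_rng(seed)
    # Smoothstep tone curve: more midtone contrast, softer highlight roll-off.
    curved = 3 * frame**2 - 2 * frame**3
    # Grain weighted toward the midtones, where film grain is most visible.
    weight = 4 * curved * (1 - curved)
    grain = rng.normal(0.0, grain_strength, frame.shape) * weight
    return np.clip(curved + grain, 0.0, 1.0)
```

A real simulation would also have to model things like dye-cloud structure, halation, and per-stock color response, which is presumably where the GPU cost would come in.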
I find it's still really noticeable in the skin tones and highlights. You can find lots of comparisons online; once you see film beside digital, the difference can be pretty huge. Some people in Hollywood still really care about it... one recent example: http://motion.kodak.com/motion/Publications/InCamera/Creatin...
“It’s like we’ve forgotten how great film looks when you see it in comparison,” Alsobrook remarks. “We looked at each other, and it was a done deal. There was no question we were going to shoot film. It has a rich, creamy look to it that you just can’t get any other way.”
Aren't scenes shot on film digitised for editing and colour graded just like digital ones? I think a lot of the difference between movies is probably down to the processing rather than the camera. No matter how they shoot the movie they all seem to end up blue and orange.
Darn, looks like I can't edit my post anymore, and HN formatting screwed it up lol. First link is for 300, second one is for the sequel.
I guess it is quite subjective, but what makes the second one look better to you? For me the first one, whilst covered in noise/grain, has a much better rendition of the skin tones, along with detail in the highlights and the depth of the reflection in her eyes. It's hard to describe really, but that's why I'd side with the first image.
The main thing is the noise on her skin - it makes it look very artificial. Like it's not an actual person. On her cheek it almost looks like she has beard stubble.
The background of the first one also looks like a pointillist painting instead of the real world. (Although I might not notice it in motion.)
I'm not a fan of the skin color either since real people don't look like that. To me it looks cold and artificial, like I'm looking at a painted robot instead of a person.
I do wonder how much of this is film vs digital, as opposed to post-processing. Film might not be able to avoid grain, but I don't think the color needs to look like that. I suspect it was over-sharpened and over-contrasted in post.
I don't know how much has changed since I was doing it, but in the past there was a noticeable difference in dynamic range between digital sensors and film. This was less of an issue for stills, since you compose each shot, but with film, especially if you are shooting cinéma vérité style, you have to be especially careful about shadows and light sources in uncontrolled environments to make sure you don't lose information.
I'll add that I think convenience and workflow are the #1 reason for the change. The cost of film and processing is, in all but small independent productions, a minuscule part of the budget.
They're distinguishable. They've been shooting at 24FPS digital for a long time now, and digital post processing has been standard for decades.
It really is difficult to say what sets film apart, but it's generally agreed among cinematographers, digital and analog alike, that the filmic quality is pretty unique to the film itself, though of course it can be faked fairly well.
It used to be latitude / dynamic range, but digital sensors have already matched and surpassed film on that. I'm sure what the previous commenter said can be done (simulating film to a point where a movie director wouldn't be able to tell). It's different from analog vs digital audio, video has always been time-sliced so it's perfectly possible to recreate the exact same light.
With digital capture, color response is predictable, so you get accurate but boring images straight out of camera. Highly saturated colors, highlights and shadows can also clip because you don't have enough bits, so there's less room for error.
With film, color response varies by film stock, ISO rating, light source and exposure, so each one has a very particular character ("Kodachrome look", "Provia look", etc.). Also, analog media aren't prone to hard clipping, so you can get deep reds, deep blues, detailed highlights/shadows and still have the rest of the scene perfectly exposed without a lot of work.
It's hard to achieve the same characteristics with digital in post-processing because the limitation is at the capture step. It's like trying to fix a dull audio recording - no amount of work in post can bring up a frequency if the mic hasn't captured it.
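To make the analogy concrete, here's a toy comparison of a hard-clipping sensor against a film-like "shoulder" curve (both transfer functions are invented for illustration, not measured responses):

```python
import numpy as np

def digital_response(exposure):
    # Hard clip: anything above full scale maps to the same code value.
    return np.clip(exposure, 0.0, 1.0)

def film_response(exposure):
    # Soft shoulder: highlights compress gradually and never quite saturate.
    return 1.0 - np.exp(-exposure)

over = np.array([1.5, 2.0, 3.0])    # progressively overexposed highlights
print(digital_response(over))       # all map to 1.0 -- the detail is gone
print(film_response(over))          # three distinct values below 1.0
```

Once the sensor has clipped, no amount of grading can separate those highlights again, which is exactly the "mic didn't capture it" problem.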
If you could get indistinguishable results, studios would have done it to many of their movies already.
I haven't been to a movie theater for the past 4 years or so (kids), so I don't know what it's like now, but back then you could definitely tell. Film was still superior, and if they could simulate it I'm pretty sure they would.
There are important differences in color response, although many of the most modern film stocks have very similar color response to digital (because the one was designed to mimic the other, obviously). The grain is also difficult to mimic precisely; film, being an analog medium, sometimes has superficially higher resolution than a lot of digital sensors, but this is because it degrades continuously (whereas digital tends to be superb until you reach the handful-of-pixels range).
This is all, of course, in the context of color films. Black and white films are particularly difficult to mimic with digital, partly due to spectrum rendition issues that are difficult to recreate in post processing, partly due to very different grain structure (B&W grain is generally sharper than the dye clouds you get with color films), and it must be admitted partly due to romanticism on the part of film partisans.
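The spectral-rendition point is roughly what channel-mixer B&W conversions try to fake: a standard digital grayscale uses fixed Rec.709-style luma weights, while different B&W stocks weight the spectrum differently. A sketch (the "film-ish" weights below are made up for illustration, not measured from any stock):

```python
import numpy as np

# Rec.709 luma weights (standard digital grayscale)...
REC709 = np.array([0.2126, 0.7152, 0.0722])
# ...vs. an invented blue/green-heavy mix, loosely in the spirit of
# orthochromatic stocks that render reds darker.
ORTHO_ISH = np.array([0.10, 0.50, 0.40])

def to_gray(rgb, weights):
    return rgb @ weights

red_patch = np.array([0.9, 0.1, 0.1])
print(to_gray(red_patch, REC709))     # ~0.27, fairly bright
print(to_gray(red_patch, ORTHO_ISH))  # ~0.18, noticeably darker
```

A channel mixer only gets you part of the way, though: it can't recover spectral information the sensor's filters never separated, which is why the mimicry stays approximate.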
"Digital files need to be regularly transferred, putting them at greater risk of being damaged."
Wait, this doesn't make any sense to me. Isn't it much easier to make a backup of a digital file than of an analog movie? The digital file can be replicated any number of times with no loss of fidelity, and can be transported over the internet. How is film a better preservation strategy?
IIRC there's actually quite a bit of overhead and risk in digital storage on spinning disks or tape. Properly prepared and stored film will last quite a long time, and what degradation there is often follows a known pattern. Replicating film many times can be destructive, but if archival is your goal, you do it a limited number of times and almost certainly keep digital backups as well.
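To be fair to the digital side, silent corruption is at least detectable if someone is actually running fixity checks - store a checksum when the file is ingested and re-hash it on a schedule. A minimal sketch:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in chunks, so even a multi-terabyte master fits in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

# Archival workflow: record sha256_of(master) at ingest, then periodically
# re-run it; any mismatch means that copy has rotted and must be restored
# from another replica.
```

The catch, of course, is that this only works for as long as someone is still running the checks.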
Except when ten years have passed and everyone's forgotten about the movie and tending its backups. Corruption happens, then it's too late. If you have a film print in a vault, it can be ignored or forgotten for decades and there's still hope.
Oh yes they will - not least because digital archival is quite a bit more expensive than print archival, but doesn't generate much revenue, so guess who gets the shitty end of the stick in times of squeezed budgets? Also, while big studios can afford to invest in that sort of thing, lots of high-quality films originate outside big studios, where there's even less awareness of or capital for proper digital archival. In quite a few smaller studios the archive consists of disconnected disk arrays sitting in a box, a collection of LTO tapes, and a print master.
Anyone have any ballpark numbers on how the finished products are saved? A quick estimate for RED 4K puts a _minute_ at about 2.5 TB. The file vaults of studios will eventually rival the NSA's data collection warehouse!
From a recent IEEE Spectrum article: "If you were to make a 2-hour motion picture with an extended color range in 4K and at 48 frames per second, the raw (uncompressed) movie file would occupy more than 15 terabytes. For comparison, the total amount of data in all of the e-mails sent in the United States in one year has been estimated to be 10.6 TB."
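The movie half of that claim roughly checks out if you assume 4K DCI resolution and 16 bits per RGB channel (the article doesn't state the bit depth, so that part is a guess):

```python
# Back-of-the-envelope check of the "more than 15 terabytes" figure.
width, height = 4096, 2160        # 4K DCI frame
bytes_per_pixel = 3 * 2           # RGB at 16 bits per channel (assumed)
fps = 48
seconds = 2 * 60 * 60             # 2-hour feature

total_bytes = width * height * bytes_per_pixel * fps * seconds
print(total_bytes / 1e12)         # ~18.3 TB, i.e. "more than 15"
```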
The email estimation seems off by several orders of magnitude. If you follow the link in the IEEE article, you get this:
> It's estimated that the average size of an email is 75 kilobytes. Say we decide that 75 kb average is way off the mark and bump it up to 200 kilobytes per email times 144.8 billion emails per day times 365 days per year and were at just under 10.6 terabytes per year (3 of those Seagate drives will do the trick).
But doing that math myself results in:
200 000 * 144 800 000 000 * 365 = 1.05 * 10^19
In other words, 10 exabytes. 1,000,000 times more than 10 TB. So, 3,000,000 Seagate drives, not 3.
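Redoing the quoted arithmetic directly:

```python
avg_email_bytes = 200_000          # the bumped-up 200 KB per email
emails_per_day = 144.8e9
total_bytes = avg_email_bytes * emails_per_day * 365

print(total_bytes)                 # ~1.06e19 bytes
print(total_bytes / 1e18)          # ~10.6 exabytes, not 10.6 terabytes
```

So the per-email and per-day figures may well be fine; the article just slipped six orders of magnitude in the unit.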
Well, the data for the final master copy is measured in terabytes, but when you add up all the ingredients (unused footage, VFX intermediates, safety copies, multi-layered VFX and audio stuff) a sci-fi blockbuster can hit a petabyte, easily.
I don't really get the every bit is precious thing when archiving film. Take the movie Casablanca, for example. Why would anyone care about more than the final cut, plus a few outtakes and screen tests? What's so precious about a blizzard of minute detail?
Well, suppose the master is damaged, and you need to go back to the original footage and reconstruct the scene using the same post-production methods that were available then? Alternatively, suppose business conditions or personal conflict led to a truncated version of the original film being released, but the source materials allow the reconstruction of the director's/screenwriter's/producer's original vision?
The former has been the case with Fritz Lang's Metropolis, with the complete version available today being built (IIRC) from a mix of different masters and some raw footage, and the latter has been the case for another silent classic, Erich von Stroheim's Greed, which is a historical masterpiece but was heavily cut at release for both content and runtime. More recently, the version of The Good, the Bad and the Ugly you get on DVDs now restores 19 minutes that were cut from the original... and I could cite a bunch of other important films (at least, if you're a film nerd) that only exist in their current form because someone knew the release version was compromised and went looking for the original production footage.
Not necessarily. Long term archival of data is hard and expensive to do... Storing stuff is easy, retrieval not so much.
I've heard of cases where 25-year-old data stored on disk was corrupted over the years and had to be retrieved. In one case, somebody "lost" records of a few million bucks of receivables due to a botched system migration.
Sure it is, but try opening a word processor or spreadsheet file last saved in 1991 - it might not be corrupt, but there's a good chance it won't come in properly. Likewise, if you have a floppy disk, hard disk, or CD-ROM from the same period, you might have difficulty hooking it up - the CD-ROM is your best shot, but that assumes it was correctly encoded in the first place.
As in every other industry, film people tend to resist new technology, then stampede into it, then do the same thing with some other technology a few years later - we don't do backwards compatibility all that well (because there's not a whole lot of money in it) and while there are technical standards for everything from color space to perceived audio loudness, they are frequently slightly abused or subverted for artistic or budgetary reasons.
We didn't. Digital movies have been around longer than you think (e.g. Collateral was shot on a Viper more than 10 years ago), and film is still the only viable choice if you are shooting 70mm or for IMAX delivery.
However, I'm being pernickety because film is my industry (and because I'm in the middle of checking off a list of deliverables for a feature right now, which requires me to be extra-pedantic). It has been a rapid and near-total shift. To shoot on film nowadays you need to be working with a very large budget (where your unfashionable preference will not have a substantial impact on production costs).
It's not 100% digital. I don't recall what the exact number is but it also depends on what you mean by "movie". Many Hollywood movies and TV shows are still done on film but an indie guy with little to no money certainly won't do that nowadays.
He's talking about the number of movies that are made, not sales of film. Among Hollywood movies and TV shows with a reasonable budget, I wouldn't be surprised if 20% or more use film (as a very wild guess).
"Kodak's motion-picture film sales have plummeted 96% since 2006, from 12.4 billion linear feet to an estimated 449 million this year. With the exit of competitor Fujifilm Corp. last year, Kodak is the only major company left producing motion-picture film."
I didn't realize Fuji had already discontinued manufacturing motion-picture film. I keep thinking the film business will hit rock bottom and the trickle of remaining demand will be stable enough to maintain what little is left of the industry. Sigh.