Couldn't it be simulated by shooting at the same frame rate (24 fps) plus some post-processing? It probably wouldn't be simple to write an accurate simulation, and it could be quite GPU-intensive to process, but I imagine it's possible to get results from a digital camera that are indistinguishable from those shot on film.
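As a rough sketch of the kind of post-processing I mean (Python/NumPy, all names mine), something like overlaying synthetic grain per frame; real grain is signal-dependent and spatially correlated, so a convincing emulation would be far more involved than this:

    import numpy as np

    def add_film_grain(frame, strength=0.04, seed=None):
        """Overlay pseudo-random grain on one frame (float array in [0, 1])."""
        rng = np.random.default_rng(seed)
        # Plain Gaussian noise is only a crude stand-in for real grain,
        # which varies with exposure and has spatial structure.
        grain = rng.normal(0.0, strength, size=frame.shape)
        return np.clip(frame + grain, 0.0, 1.0)

    # Example: a flat mid-grey 1080p frame with synthetic grain applied.
    frame = np.full((1080, 1920, 3), 0.5)
    grainy = add_film_grain(frame, strength=0.04, seed=42)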
One instance where it's very obvious to me was 300 vs its sequel 300: Rise of an Empire.
I tried to find 2 similar images:
So basically I think digital is getting there but it still has a little ways to go in matching the perceived quality.
Although high-def video most often is used for blue screen shoots, director Zack Snyder and Fong shot on film. "We wanted the film grain to show," Fong said.
I guess it is quite subjective, but what makes the second one look better to you? For me the first one, whilst chock-full of noise/grain, has a much better rendition of the skin tone, along with detail in the highlights and the depth of the reflection in her eyes. It's hard to describe really, but that's why I would side towards the first image.
The main thing is the noise on her skin - it makes it look very artificial. Like it's not an actual person. On her cheek it almost looks like she has beard stubble.
The background of the first one also looks like a pointillist painting instead of the real world. (Although I might not notice it in motion.)
I'm not a fan of the skin color either since real people don't look like that. To me it looks cold and artificial, like I'm looking at a painted robot instead of a person.
I do wonder how much of this is film vs digital, as opposed to post-processing. Film might not be able to avoid grain, but I don't think the color needs to look like that. I suspect it was over-sharpened and over-contrasted in post.
I'll add that I think convenience and workflow are the #1 reason for the change. The cost of film and processing in all but small independent productions is a minuscule part of the budget.
It really is difficult to say what sets film apart, but it's generally agreed among cinematographers, digital and analog alike, that the filmic quality is pretty unique to the film itself, though of course it can be faked fairly well.
With film, color response varies by film stock, ISO rating, light source and exposure, so each one has a very particular character ("Kodachrome look", "Provia look", etc.). Also, analog media isn't prone to clipping, so you can get deep reds, deep blues, detailed highlights/shadows and still have the rest of the scene perfectly exposed without a lot of work.
It's hard to achieve the same characteristics with digital in post-processing because the limitation is at the capture step. It's like trying to fix a dull audio recording - no amount of work in post can bring up a frequency if the mic hasn't captured it.
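A toy numeric example of that capture-step limitation (Python, purely illustrative values): once the sensor saturates, two very different highlight values become the same pixel, and nothing applied afterwards can tell them apart.

    import numpy as np

    scene = np.array([0.8, 1.5, 3.0])    # relative scene luminance
    captured = np.clip(scene, 0.0, 1.0)  # sensor saturates at 1.0
    # captured == [0.8, 1.0, 1.0] -- the 1.5 and 3.0 highlights are now identical.
    graded = captured * 3.0              # any later grade/curve operates on clipped data
    # graded == [2.4, 3.0, 3.0] -- the highlight detail is simply gone.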
This is all, of course, in the context of color films. Black and white films are particularly difficult to mimic with digital, partly due to spectrum rendition issues that are difficult to recreate in post processing, partly due to very different grain structure (B&W grain is generally sharper than the dye clouds you get with color films), and it must be admitted partly due to romanticism on the part of film partisans.
Wait, this doesn't make any sense to me. Isn't it much easier to make a backup of a digital file compared to an analog movie? The digital file can be replicated any number of times with no loss of fidelity, and can be transported over the internet. How is film a better preservation strategy?
> It's estimated that the average size of an email is 75 kilobytes. Say we decide that 75 kb average is way off the mark and bump it up to 200 kilobytes per email times 144.8 billion emails per day times 365 days per year and we're at just under 10.6 terabytes per year (3 of those Seagate drives will do the trick).
But doing that math myself results in:
200 000 * 144 800 000 000 * 365 ≈ 1.06 * 10^19 bytes
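Checking that with a quick script (Python), using the figures from the quote, the total comes out in exabytes, not terabytes:

    # bytes per email * emails per day * days per year
    avg_email_bytes = 200_000
    emails_per_day = 144_800_000_000
    total_bytes = avg_email_bytes * emails_per_day * 365

    print(total_bytes)         # -> 10570400000000000000 (~1.06e19 bytes)
    print(total_bytes / 1e18)  # -> ~10.57, i.e. about 10.6 exabytes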
The former has been the case with Fritz Lang's film Metropolis, with the complete version available today being built (IIRC) from a mix of different masters and some raw footage, and the latter has been the case for another silent classic, Erich von Stroheim's Greed, which is a historical masterpiece but was heavily cut at release for both content and runtime. More recently, the version of The Good The Bad and The Ugly you get on DVDs now restores 19 minutes that were cut from the original...and I could cite a bunch of other important films (at least, if you're a film nerd) that only exist in their current form because someone knew the release version was compromised and went looking for the original production footage.
Edit: more about that http://arstechnica.com/gadgets/2014/05/could-disney-finally-...
I've heard of cases where 25-year-old data stored on disk was corrupted over the years and had to be retrieved. In one case, somebody "lost" records of a few million bucks of receivables due to a botched system migration.
As in every other industry, film people tend to resist new technology, then stampede into it, then do the same thing with some other technology a few years later - we don't do backwards compatibility all that well (because there's not a whole lot of money in it) and while there are technical standards for everything from color space to perceived audio loudness, they are frequently slightly abused or subverted for artistic or budgetary reasons.
However I'm being pernickety because film is my industry (and because I'm in the middle of checking off a list of deliverables for a feature right now, which requires me to be extra-pedantic). It has been a rapid and near-total shift. To shoot on film nowadays you need to be working with a very large budget (where your unfashionable preference will not have a substantial impact on production costs).
I didn't realize Fuji had already discontinued manufacturing motion-picture film. I keep thinking the film business will hit rock bottom and the trickle of remaining demand will be stable enough to maintain what little is left of the industry. Sigh.
It is missing the up-arrow because you have already up-voted it.