
A digital raw file is not an original image - gorm
http://kenrockwell.com/tech/real-raw.htm
======
lutorm
These are all valid points, but it's not like film is without flaws either.
Film may not have pixels, but it has grains. The flipside of "always being
able to pull something out" of film is that it's outrageously nonlinear. And
film is not particularly light sensitive.

Astronomers used to use huge glass plates to capture telescope images, but
when CCDs became available there was a revolution in data quality. It's true
that astronomers probably don't care about the same things photographers do
wrt capture medium, but in astronomy, CCDs wiped film off the map.

The final killer for film: _You can't back it up._ Your slide may last
forever, but if you lose it, or your house burns, game over.

~~~
mustpax
Not only that, but to last a long time, film has to have been processed
absolutely correctly (developed, fixed, and fix-washed) and stored absolutely
correctly. Any mistakes, and you are a long way away from an archival copy.
This takes experience, skill, and then more experience. For a hobbyist,
correctly operating a digital camera is much easier.

Also, while individual digital recording devices have short lifespans, it is
trivial to make completely lossless, identical copies of digital information.
Making a good copy of an analog recording, however, is notoriously difficult.
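
The "lossless identical copies" point is easy to demonstrate. The sketch below (plain Python standard library; the file names are made up for illustration) copies a file byte-for-byte and verifies the copy by hashing both:

```python
import hashlib
import os
import shutil
import tempfile

# Demonstration: a byte-for-byte copy of a digital file is verifiably
# identical to the original -- something analog duplication can't offer.

def sha256(path):
    """Hash a file incrementally so large files don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

tmp = tempfile.mkdtemp()
original = os.path.join(tmp, "photo.raw")       # stand-in name
backup = os.path.join(tmp, "photo_backup.raw")  # stand-in name

with open(original, "wb") as f:
    f.write(os.urandom(1_000_000))  # 1 MB of stand-in "image" data

shutil.copyfile(original, backup)
print(sha256(original) == sha256(backup))  # True: a perfect copy
```

Every generation of copies stays bit-identical, which is exactly what a contact print or a tape dub cannot do.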

------
hristov
Ken Rockwell is once again being Ken Rockwell. He is just so full of
confidence and spews such large overgeneralizations that any little kernel of
truth he may have started with soon disappears in a pile of bullshit.

No, film is not forever; film stock goes bad too. One very good example is the
dozens of famous classic movies for which no good-quality print remains (e.g.,
many of the Hitchcock films). Digital formats may change often, but digital
data is much easier to copy and transmit between formats which makes it much
easier to protect and store without losing quality.

And film is not inherently more truthful than digital data. Film also depends
on sensing and is subject to sensory errors. For example, different types of
color film can render the same scene very differently. Currently, digital
sensors are not as accurate as film in terms of detail, but this may change.

One ironic thing is that one of the main reasons raw formats are not "truthful"
as Ken Rockwell says is actually partially Ken Rockwell's fault. See, reviewers
like Ken have been so obsessed with noise in digital cameras that camera
manufacturers have started to use noise reduction algorithms before the raw
stage so that their images will show up better in online reviews (such as the
ones that appear on Ken Rockwell's site). This of course is not good news
because almost all noise reduction results in loss of detail, and a raw image
should be free of noise reduction because it should be the image provided by
the camera sensor as faithfully as possible. A user can always apply noise
reduction later.

------
frankus
How is this different from the old "analog is better than digital because it
has infinite resolution" argument?

Any recording medium has limitations, and whether it's digital or analog is
neither here nor there.

If you rescan your 35mm slides at high resolution in 10 years, you'll get a
really nice picture of the individual grains in the emulsion, not a better
picture.

~~~
mr_dbr
Might be different for stills, but nearly all 35mm motion picture film that
goes through DI is scanned at either 2K or 4K... 4K is about 8.5 megapixels,
not _quite_ the 175 megapixels the article claims.

This post [http://forum.blu-
ray.com/showpost.php?s=1e1304228adc42f80162...](http://forum.blu-
ray.com/showpost.php?s=1e1304228adc42f80162b53a1e0d4b5a&p=1256004&postcount=111)
explains the resolution of film well (quoting here for posterity),

> Film resolution is specified in resolving power in c/mm (cycles per
> millimeter) or lp/mm (line pairs per millimeter). Different stocks have
> different resolving power. Normally finer grained (slow) film has more
> resolving power, higher sensitivity (faster) film has less resolving power,
> etc, all else being equal.

> A simplified triangle of image quality capability is made by grain-speed-
> resolution. If you try to get more speed, graininess usually increases
> and/or resolution decreases, etc. With advances in film emulsion technology
> the triangle gets bigger. You get higher speed with the same fine grain,
> equivalent sharpness, etc.

> If you have coarse big grain you get more speed (sensitivity) but the
> resolving power is decreased, while if you have finer smaller grain,
> packaged in a more uniform way into a thinner emulsion layer, you get better
> sharpness and the ability to record finer detail per millimeter, but less
> sensitivity (you need more light) somewhat similar to having more pixels
> packed into a sensor.

> The lens on the camera also has a resolution limit and the combined
> resolution of the film emulsion and the lens resolution that ends up on the
> final image on the negative is less than either alone.

> So having the resolving power of the final image (c/mm) and the size of the
> image (mm) you can multiply both and get what the resolution of the
> film/camera/lens system is capable of.

> Also what we perceive as grain on photographic images is actually grain
> clumps as the grains are randomly distributed in irregular patterns within
> the film emulsion. (The smoothing and more uniform distribution of grain in
> film emulsions is one of the ways film quality has improved over the years)
> We're not looking at the individual grains themselves when we look at images
> in normal picture and movie viewing magnifications. To see the real
> individual grains you have to use those microscopic enlargements where the
> image is blown up so much you can barely make out any of it.
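
The arithmetic the quote describes (multiply resolving power by frame size) is easy to sketch. All the numbers below are illustrative assumptions, not measurements; real stocks and lenses vary widely:

```python
# Sketch of the film/lens resolution arithmetic from the quote above.
# All numbers here are illustrative assumptions, not measurements.

def combined_lp_per_mm(lens_lp_mm, film_lp_mm):
    """Combined lens + emulsion resolution via the common reciprocal
    approximation 1/r_total = 1/r_lens + 1/r_film; always less than
    either component alone, as the quote notes."""
    return 1.0 / (1.0 / lens_lp_mm + 1.0 / film_lp_mm)

def frame_megapixels(lp_mm, width_mm=36.0, height_mm=24.0):
    """Resolving power times frame size. One line pair needs at least
    two pixels (Nyquist), so r lp/mm maps to 2*r pixels per mm."""
    px_per_mm = 2.0 * lp_mm
    return (width_mm * px_per_mm) * (height_mm * px_per_mm) / 1e6

lens, film = 100.0, 100.0                 # assumed lp/mm figures
system = combined_lp_per_mm(lens, film)   # -> 50 lp/mm
print(f"system resolution: {system:.0f} lp/mm")
print(f"35mm frame equivalent: {frame_megapixels(system):.1f} MP")  # 8.6 MP
```

With these assumed figures the answer lands near the ~8.5 MP of a 4K scan; only by plugging in optimistic lab numbers for both lens and emulsion do you approach the article's 175-megapixel claim.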

~~~
frankus
Good information.

It's worth noting that the sort of optics common on a 35mm pro cinema camera
are a bit large for a hobbyist still photographer to be toting around.

~~~
mr_dbr
Not at all... <http://dslrfilm.com/?p=340>

------
coderdude
"Film images last forever, versus memory cards and hard drives which we rarely
are able to read after more than 10 years."

Film stock does not last forever either, it degrades with time. New film
stock, according to Wikipedia, can last "hundreds of years" however.
<http://en.wikipedia.org/wiki/Film_stock#Deterioration>

Interesting article, but more interesting is his usage of the trademark
symbol. 28 times within 175 words. Calm down buddy.

~~~
skwiddor
My friend works digitising film stock.

He told me that a Portuguese TV station's film archive is deteriorating, and
the fumes it gives off spread the decay to the rest of the stock. Even if they
had all the 35mm film digitisers in the world running 24/7 they would still
lose a significant portion, so they are having to make some tough decisions
about what to save.

Oh, and it was Bell Labs that invented pixels.

------
andrewvc
This is all well and good, but to get that kind of quality out of film you
have to shell out for a drum scan or a professional C or R print with an
enlarger.

You'll probably spend about $10 / roll for a contact sheet, and $40 / frame to
get a print / scan that shows off this quality. Plus, your time, going down to
a lab, dropping off, and picking up film. Then, you'll spend more $ on top of
that archiving your film (and that expensive scan) perfectly for 20 years when
ultra-super-drum scans are available at $2 / frame. In this fictional world
20 years from now, if you're a pro, your client no longer cares whether your
image looks better now; they're done with it. If you're a stock photographer,
you'll have to forgo the $0.001 iStockphoto is now paying out for full rights,
and if you're an amateur your old photos are good enough.

Worse is better: digital is better at the things almost everyone (including
99.9% of pros) cares about.

------
blasdel
Christ what an asshole!

He refers to exposed film as if it hadn't been fucked with -- you can't go
back and reprocess it differently (like you can with RAW).

He describes RAW files as being a fixed array of pixels with lossy compression
-- that's _anything but_ RAW -- the good formats are the un-demosaiced
elements straight from the sensor, and that means that you can do things like
rotate the image at subpixel resolution (especially for non-Bayer-filter
sensors like the Foveon).
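
To make that concrete: a raw file from a Bayer-sensor camera stores one color sample per photosite, and everything else is interpolated during demosaicing. A toy sketch of the RGGB layout (a deliberate simplification; real raw decoders like dcraw do far more):

```python
# Toy illustration of un-demosaiced Bayer data; real raw decoders are
# far more sophisticated.  This only shows the mosaic layout itself.

def bayer_channel(row, col):
    """Return which single color an RGGB photosite at (row, col) samples."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# Each sensor site records ONE channel -- that is the "raw" data.
mosaic = [[bayer_channel(r, c) for c in range(4)] for r in range(4)]
for line in mosaic:
    print(" ".join(line))
# R G R G
# G B G B
# R G R G
# G B G B

# Demosaicing later interpolates the two missing channels at every
# site; a JPEG bakes that guess in, a raw file keeps the pre-guess data.
```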

~~~
lurkinggrue
Rockwell is a known troll in photography circles.

I don't shoot jpeg since I don't always trust how the camera is going to boil
the image down, and I would like the option of going a different direction
later.

Raw just gives you a bit more latitude than jpeg and a hell of a lot less
hassle than film.

------
ars
It seems he has never heard of film grain.

Film has a definite resolution. It's not as easy to measure as digital, but
it's there.

I looked at film under a microscope, and the grain looks like colored clouds,
i.e. instead of square pixels, you have roundish blobs, but they are certainly
there.

------
mattmaroon
The ease of backups more than makes up for the tendency of digital media to go
bad after 10 years. I've got virtually none of the photos I took as a kid on
film; I have 100% of the digital photos I've ever taken and didn't purposely
discard.

Film has some great qualities, but durability is not one of them.

------
loumf
"light is analog" -- Einstein is rolling over in his grave.

------
moron4hire
I print upwards of 20"x30" with my 8MP DSLR. That's significantly larger than
most people ever get prints done in their home. Shooting digital also greatly
reduces my packing load when travelling. I have 16GB of CF cards, taking up
less space than a pack of cards. I struggle to fill 8GB in an entire week of
shooting while travelling.

At a certain point "more detail" is just wasted effort. You're not going to
stand right on top of a 15' tall blown up photo, counting film grain OR
pixels.

------
zokier
The problem does not lie only in the capturing medium, but also in the display
medium. I mean like, with film when you want to show a picture to someone, you
craft a print from it, which usually is quite high quality in terms of
resolution and color. But with digital, you create a lowres web version of the
image, send it via email or put it on a webpage, and then it's viewed on some
cheapo, poorly configured, low-res display.

I mean, where is my HDR 64bpp 300 PPI display? I remember hearing about HDR
displays some 5 years ago; current bit depth is from the '90s, and an almost
10-year-old CRT can pull more pixels per inch than today's "highdef" screens.

edit: throw "factory calibrated" into my "requirements" too

If our displays were better, it probably would expose the limitations of
current digital capturing, and therefore push capturing tech forward, so that
remaining gaps between film and digital could be closed.

------
sp332
The latest ep of HD Nation over at Revision3 shows off the new transfer of The
Wizard of Oz to Blu-Ray. It's amazing how much visual data is captured on film
that can be recovered years later. <http://revision3.com/hdnation/wizardofhd>

------
jrwoodruff
I worked in a film lab in college, around 2002. We had one of the first
machines capable of printing digital images to photographic paper in the
state, because the lab owner had a hankering for tech. The first week the
machine fired up, we compared a digital file brought in from a pro studio and
printed at a larger portrait size, 16x20 I think, with the same image taken
with a medium format camera. We all agreed the digital image was at least on
par with, if not better than, the medium format print. The digital image was
sharper and every bit as colorful, while the film image's grain was clearly
visible.

I don't know what camera was used to take it, certainly not a D1H, but I do
know it was of the studio, tethered-to-a-computer type, so I want to say we
were looking at about a 10 megapixel image.

------
yu
I'm continuing to learn about both digital and film. I bought a Canon Elan 20?
years ago. To point out what I picked up in reading, chasing the links...
Zeiss.com: "35mm ZM lenses achieved 400 lp/mm on film; ZF ~300 lp/mm, medium
format lenses up to 280." I looked up average grain size, resolving power, the
sampling theorem... For me, it remains: choose the tool for the task; consumer
DSC JPEG has done that in recent years.

------
pyre
> _I'm consistently amused by innocent hobbyists who go through the
> aggravation of shooting digital camera raw files just to get what they think
> is marginally better technical quality, or the ability to go back and do it
> right a second time, but who completely forget that if you're willing to
> shell out this extra effort, you could shoot film and get better results
> today, and even better results tomorrow._

If the 'cost' of this is 'oh noes I need to buy Adobe Lightroom or Apple
Aperture as well as my camera,' then so be it. The majority of people --
amateur or not -- that are shooting in RAW are doing it with DSLRs. _They have
already shelled out hundreds of dollars for the camera and possibly more for
the lenses, who cares if they find out they need to purchase a ~$100 software
program to fool around with their raw images._

{edit} Even if people don't know that they need to buy a separate program to
process RAW image files, they get what they deserve. If you're on that much of
a budget, why are you spending hundreds of dollars on a DSLR without doing
even the basics of research into what you will need? Not only that, but you
can do RAW editing in free apps like GIMP, you just won't have all the bells
and whistles that the higher-grade apps have. {/edit}

> _20 years from now we can re-scan our film and get 2029-level image
> quality._

Unless your storage location burns down. Assuming that the film still exists
-- and isn't lost in storage -- in 2029, blowing it up to 100 megapixels is
just going to show you 100MP of grain and the imperfections of the lens you
were using... Or maybe he's advocating that amateur photographers should
spend thousands of dollars on the absolute highest quality of lenses to take
photos of their cats to upload to Flickr? Remember he started out this 'rant'
talking about being 'consistently amused by innocent hobbyists who go through
the aggravation of shooting digital camera raw files.'

> _The Wizard of Oz was shot on film in 1939. Today it looks great on Blu-Ray
> High-Definition DVD._

Could it be because The Wizard of Oz was stored in a high-quality vault,
temperature- and humidity-controlled, possibly in an old salt mine to minimize
the possibility of bacteria as well? What about that 'cost' to amateur
photographers? How many amateur photographers from 1939 -- or even
professional photographers from 1939 -- have 100% of their collection sitting
around today in the same pristine quality as the Wizard of Oz cans?

{edit} Also, the Wizard of Oz wasn't shot by 'innocent hobbyists.' {/edit}

> _All we had was radio in 1939, so how did they know how to make High-
> Definition DVDs back then? They didn't, but by having the forethought to
> shoot on film, they knew they always could scan the original raw film images
> with better equipment in the future._

Huh? So the film-maker said to themselves, "Hmmmm I know that we don't have
the ability to shoot HD video presently, but in 2009 the ability to scan this
back into these 'computer' things will make it possible to relive this film in
HD quality on a 60" plasma screen!" _They used film because it was all they
had available._

> _With film, we always have our original images._

Unless we don't (see: acts of god, etc). Digital images can be backed up.

> _At best, digital raw files are a close approximation of what came out of a
> sensor, not what went in. Digital raw files have already folded, spindled
> and mutilated the original image with the limitations of that era's
> technology. Digital camera raw files have already limited the original image
> in at least three different ways._

The point of a _RAW_ file is that it is the _RAW_ data that came off the
sensors. Not a 'close approximation of what came out of the sensor.' That's
why there are a million different RAW formats, because the internals of the
cameras are different camera-to-camera let alone manufacturer-to-manufacturer.
That's why there _can't_ be a standard unless we standardize on the way the
sensors are set up, and I would argue that would slow innovation.

> _Archiving film always allows you direct access to your original living,
> breathing live images at any time in the future._

Unless you stored it wrong. And what's up with the 'living, breathing'
imagery? Are you saying that the photo would be less 'living or breathing' if
I had used a high-quality printer to put my RAW images onto archival-quality
photo paper?

> _If you demand the best quality for serious subjects, and don't mind
> investing a lot of time to get it, then step all the way up to film._

Wait a minute. I thought we were talking about hobbyists here. Have we now
crossed over into the realm of the 'professional?' Wtf are you talking about
with 'serious subjects?'

> _Pixels were invented in the American space program of the 1950s and 1960s
> to allow computers to process images. The concept behind using pixels to
> approximate an image is that so long as you have enough of them, you'll have
> a close-enough approximation of the original image._

There's also mathematics behind all of this. Look up the DFT (Discrete
Fourier Transform) and the sampling theorem.
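
The "enough pixels" idea is exactly the sampling theorem: sample above twice the highest frequency present and the detail is recoverable; sample below it and it aliases into false detail. A quick numerical check (numpy assumed available; the 5 Hz figure is arbitrary):

```python
import numpy as np

F_SIGNAL = 5.0  # arbitrary test frequency; Nyquist rate is 10 samples/s

def dominant_frequency(sample_rate):
    """Sample a 5 Hz sine for one second at sample_rate and return the
    frequency of the largest DFT peak."""
    n = int(sample_rate)
    t = np.arange(n) / sample_rate
    x = np.sin(2 * np.pi * F_SIGNAL * t)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

print(dominant_frequency(100.0))  # above Nyquist: recovers 5.0 Hz
print(dominant_frequency(8.0))    # below Nyquist: aliases to 3.0 Hz
```

The same math applies whether the samples are audio values or photosites on a sensor.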

> _Most digital shooters are wary of this, knowing that whatever they shoot
> today in digital may or may not be good enough to sell to tomorrow's market.
> Got raw files shot in 2002 on your then state-of-the-art $5,000 Nikon D1H?
> Enjoy going back to your 2.7 megapixel files! You may as well delete them
> now._

Might as well make fun of early adopters of film technology, when people
always had stern expressions because they had to hold still for a minute or so
for the film to capture enough light to produce a photo.

 _Look at those losers! They couldn't even take a picture with a smile on
their face! Now they're locked in time forever, frowning! Not like me! I
waited until I could smile in my photos!_

> _You can't go back to a raw file and get more resolution. With film, you
> don't have to make a resolution decision until you scan it._

You don't have infinite resolution with film. This is a fallacy.

> _[...] because you still have your original raw image captured alive and
> well._

> _With film, you always have access to your original live image._

> _[...] or just look at the naked film in all its original glory._

He keeps using these terms like 'alive' and 'breathing' to describe analog
film and prints. I'm beginning to sense a bias.

> _I support my growing family through this website._

Ha! I've found your bias. Your _family_ is analog!

------
moron4hire
Here's another question. What does Ken Rockwell have to fear from digital
photography? Seriously, why else would he expend so much effort debunking this
"conspiracy" (his words "(digital) raw is actually a conspiracy to addict you
to having to buy software the rest of your life")?

~~~
duskwuff
>> What does Ken Rockwell have to fear from digital photography?

The terrifying possibility that a potential client will pass over Ken
Rockwell's RealRAW EX PRO™ and opt for a less expensive photographer who uses
a digital (gasp!) camera.

~~~
moron4hire
Oh no! Someone call Bernanke! We need to stimulate the film photography
industry!

------
clistctrl
You could knock out an entire room of girls in a matter of seconds with the
first photo on this guy's page: <http://www.kenrockwell.com/katie/2008.htm>

------
cakesy
Maybe this guy has a point, but he starts off talking about absolute trash.
Yes, I can and do play VHS videos today. Yes, camera cards do die out after 20
or 30 years, but what has this to do with the fact that I can make backups, or
just keep copying them around forever? Try doing that with film. And The
Wizard of Oz looks so good because of digital remastering, and they didn't
have any choice but to use film. When you start off your argument with this
bullshit, I stop reading.

