
Microfilm Lasts Half a Millennium - adrian_mrd
https://www.theatlantic.com/technology/archive/2018/07/microfilm-lasts-half-a-millennium/565643/?single_page=true
======
freedomben
As a kid I got to play with microfilm and microfiche at the library while my
mom did research. I don't have a lot of pleasant memories from childhood, but
this was an exception. It intrigued me and blew my mind.

Thinking back on the experience, it was incredible. I could go grab a copy of
a newspaper from any particular day in any year. I remember reading the actual
newspapers from the early 20th century. Big historical events that are in the
history books really take on a different meaning when you read them as they
were reported in near-real-time. The ability of microfilm to snapshot history
in a way different from, say, viewing a JPG or PNG is fascinating.

If you are ever bored, and can find a library that will give you access, you
should definitely check it out. It's an experience I think all hackers should
have. Such simple yet powerful machines, built on a highly non-technical
system of convention. Very cool. Sorry for the long walk down nostalgia lane.

~~~
jdietrich
If it ever feels like the news is getting you down, I strongly recommend
swapping your normal news habit for watching old news bulletins on
YouTube. Try it for a week or two - if you feel the urge to check the news, go
and watch an NBC news bulletin from 1988 or a BBC news bulletin from 1991
instead. Lots of people have uploaded complete bulletins, which I'd recommend
over isolated clips.

I found it to be a revelatory experience. The news, ultimately, is just
stories. It's not a definitive or objective account of what's happening in the
world, it's just industrialized gossip with a veneer of legitimacy. Without
the sense of urgency that comes with newness, it's often remarkably dull. The
news shows you the variance, but not the trend; someone who caught up on
current events only once a year would probably have a better sense of where
the world is headed than someone who checks the news every day. Some stories
that were huge dramas at the time are now long forgotten; some stories that
barely registered at the time eventually turned out to be part of a crucial
change in society.

[https://www.youtube.com/watch?v=P4-_85x5k3s](https://www.youtube.com/watch?v=P4-_85x5k3s)

~~~
MichaelMoser123
> The news, ultimately, is just stories. It's not a definitive or objective
> account of what's happening in the world, it's just industrialized gossip
> with a veneer of legitimacy

Or it may be that people have a hard time figuring out what matters at any
given moment in time - you never quite know how a news item will play out.
Will it become part of history, or will it turn out to be just another detail?
It's hard to tell while it's happening...

Maybe the grandparent is right: while it is all happening, people may be more
honest than after the fact, when positions are clear and everyone has to
justify their own stake in the grand puzzle.

~~~
jdietrich
_> Or it may be that people have a hard time figuring out what matters at any
given moment in time_

It's called "the news", not "the importants". Unusual, shocking and lurid
stories are much more likely to be newsworthy, but by definition they aren't
representative of what is actually happening in the world. Our perception of
the world is much more strongly influenced by emotive anecdotes than cold hard
facts. Violent crime rates in the western world have been significantly
declining for decades, but most people believe that the opposite is true;
multiple studies have found a correlation between news consumption and the
perception of high crime. It's very difficult for the news media to usefully
communicate important but gradual trends - rising incomes across the
developing world, wage stagnation in the developed world, the impact of
climate change, demographic shifts, the erosion of civil society etc.

I think that most journalists are well-intentioned and genuinely believe that
they are seekers of truth, but they operate in a system that reflects our
innate perceptual biases, our acquired cultural biases and the commercial
exigencies of media organisations.

Another useful experiment is to exclusively consume another country's news
media for a week or two, preferably a country you know very little about. A
week of reading a Ugandan or Nigerian newspaper provides a very different
perspective on world events.

------
nayuki
I appreciate how the article highlights the longevity of polyester-based
microfilm and the simple technology (lamp and magnifier) needed to read it.

But as a digital native myself, I'm disappointed that they failed to emphasize
the biggest advantage of digital information storage - the fact that data can
be copied _perfectly_, as many times as you want. Analog microfilm cannot be
copied perfectly - the original is always better than the copy. This means the
microfilm is a unique artifact that needs to be preserved and taken care of.
One disaster, one fire, or one careless handler, and the object is damaged.

With digital technology, we pay a high price in lines of code to build an
application and interface, to specify seemingly opaque binary file formats, to
have physical media (floppy, CD, etc.) go obsolete in a decade, and to build
complex CPUs - but never forget that what we gain is the tremendous ability to
copy perfectly. This is what allows us to share a file with someone and know
that their copy is a perfect backup for ours in case ours is lost. This is
what allows us to transfer files from old, slow, unreliable media (e.g. floppy
disks) to new, fast media (e.g. SSDs) without missing a beat. This is why we
can take a sizable collection of a thousand audio CDs and cram them into one
convenient hard disk drive (as WAV, not MP3), with the exact same audio
quality. This is why we can transmit files over the Internet - we surely can't
transmit analog data over a distance without a high risk of degradation.

If we collectively accept that storing information digitally with the ability
to make perfect copies is worth the effort, then there are things we can do to
improve the status quo. We need to make it easier to find and manage file
duplicates - otherwise it's hard to know whether a copied file has been
modified or not. We need to emphasize building applications on open standards.
We need to make file format specifications accessible, well-defined, and
easier to understand. We need to make systems that have low implementation
complexity (e.g. a CPU with few transistors, an OS with few lines of code) to
show the minimum knowledge needed to reboot the technology.
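
For illustration, finding byte-identical duplicates is a small script once you
hash by content; here is a rough sketch (names and layout are just for
illustration) that groups files by SHA-256 digest, so identical copies are
spotted regardless of filename or timestamp:

    import hashlib
    import sys
    from collections import defaultdict
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def find_duplicates(root: Path) -> dict:
        """Group regular files under `root` by content hash; keep groups > 1."""
        groups = defaultdict(list)
        for p in root.rglob("*"):
            if p.is_file():
                groups[sha256_of(p)].append(p)
        return {d: ps for d, ps in groups.items() if len(ps) > 1}

    if __name__ == "__main__":
        for digest, paths in find_duplicates(Path(sys.argv[1])).items():
            print(digest)
            for p in paths:
                print("   ", p)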

While microfilm may last 500 years, digital data could truly be forever - as
long as you copy it periodically.

~~~
imhoguy
But digital copies ain't always perfect - you forget about bit rot [0],
unfortunately. One bit flip in the stream of a compressed picture or archive
distorts the rest of the result. The entire digital world is founded on the
analog physical world with all its imperfections. Of course there are
solutions to mitigate this: checksumming, redundancy, error correction. But
how often does one check the checksum of a copy? Edit: CRC doesn't count, as
it is too weak.
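
To be fair, the check itself is trivial; a rough sketch of a manifest-based
fixity check (file layout and names are only illustrative) - write a manifest
of SHA-256 hashes once, then re-verify it on a schedule:

    import hashlib
    import json
    from pathlib import Path

    def file_hash(path: Path) -> str:
        # Stream the file so large archives don't have to fit in memory.
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def write_manifest(root: Path, manifest: Path) -> None:
        # Run once when the archive is created (or last known to be good).
        hashes = {str(p.relative_to(root)): file_hash(p)
                  for p in root.rglob("*") if p.is_file()}
        manifest.write_text(json.dumps(hashes, indent=2))

    def verify_manifest(root: Path, manifest: Path) -> list:
        # Return files whose current hash no longer matches the manifest,
        # i.e. candidates for bit rot or silent modification.
        recorded = json.loads(manifest.read_text())
        return [name for name, digest in recorded.items()
                if file_hash(root / name) != digest]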

I agree on open standards and deduplication. I hope content-addressable
archiving will become more mainstream; IPFS brings big hopes.

[0]
[https://en.m.wikipedia.org/wiki/Data_degradation](https://en.m.wikipedia.org/wiki/Data_degradation)

~~~
altcognito
> But how often does one check the checksum of a copy?

Your operating system does this.

~~~
imhoguy
Only if you use ECC RAM and ZFS/Btrfs/ReFS. A CRC, e.g. in TCP, is too weak.

~~~
zamadatix
Yes, if one were going to reliably archive information they wouldn't use FAT32
on a single spinning disk. Does this really need to be pointed out? The point
is that digital gives you ways to make perfect copies, correct loss, and
transfer to the next storage medium with no decay - things analog could never
achieve.
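
To make "correct loss" concrete, here is a toy sketch (real archives use far
stronger codes such as Reed-Solomon or RAID parity; this only shows the idea):
a three-way repetition code with a bitwise majority vote survives arbitrary
corruption of any single stored copy of a byte:

    def encode_3x(data: bytes) -> bytes:
        # Toy repetition code: store three copies of every byte.
        return bytes(b for byte in data for b in (byte, byte, byte))

    def decode_3x(coded: bytes) -> bytes:
        # Bitwise majority vote per byte recovers the data even if any one
        # of the three stored copies has been corrupted.
        out = bytearray()
        for i in range(0, len(coded), 3):
            a, b, c = coded[i], coded[i + 1], coded[i + 2]
            out.append((a & b) | (a & c) | (b & c))  # majority of each bit
        return bytes(out)

    original = b"microfilm"
    damaged = bytearray(encode_3x(original))
    damaged[4] ^= 0xFF  # completely flip one stored copy of one byte
    assert decode_3x(bytes(damaged)) == original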

~~~
jacobush
Hybrid analog could achieve this. Film could have a picture printed on it, but
instead of going for maximum resolution (down to grain level) it could have
tiny tiny pixels, not visible to the eye, but readable by a computer. A kind
of steganography, if you will.

The data encoded in the steganography could be any data, of course, but for
our purposes it would be the image itself.

Such a picture could be copied, with loss, in an analog fashion, but could
also be copied perfectly in a digital fashion, by a digital reader capable of
reading out the steganography information. Even the analog copy could be
digitally read if it was of high enough quality.

Another, more straightforward, way would be to print on the film, first an
analog rendition of the picture, then a digitally encoded pattern of the same
image, rinse repeat.

But from a future-proofing perspective, it could be smart to encode the data
itself inside the image, as per the first paragraph. Someone copying the
images could unwittingly copy the digital data too!
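
As a rough sketch of the machine-readable half of that idea (the format choice
is just for illustration; a serious scheme would add sync marks and error
correction), the payload's bits can be laid out as a grid of cells and written
as a plain PBM bitmap that could be photographed onto film:

    def bytes_to_pbm(data: bytes, width_cells: int = 64) -> str:
        # Lay the payload's bits out as a rectangular grid of cells;
        # in PBM "P1" format a 1 is a black cell, a 0 is white.
        bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
        while len(bits) % width_cells:  # pad the last row to full width
            bits.append(0)
        rows = [bits[i:i + width_cells]
                for i in range(0, len(bits), width_cells)]
        header = "P1\n%d %d\n" % (width_cells, len(rows))
        body = "\n".join(" ".join(str(b) for b in row) for row in rows)
        return header + body

    # Any image viewer can display this, and a scanner plus the inverse
    # function could read the bytes back out of a photographed copy.
    with open("payload.pbm", "w") as f:
        f.write(bytes_to_pbm(b"Hello, microfilm!"))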

~~~
adrianN
You might be interested in Optar
[http://ronja.twibright.com/optar/](http://ronja.twibright.com/optar/)

~~~
jacobush
Indeed, thanks.

------
hjnilsson
Well, there is something to be said for simpler formats that can be read with
basic engineering skills, compared to, for example, storing something on a
flash drive that requires millions of lines of code to read data off of.

For the same reason, I print all important photos for posterity. Who knows if
the Apple or Google cloud photo libraries will be around in 50 years when I
grow old? I can make sure my printed material survives much more easily.

~~~
derefr
I don’t disagree with your general point, but for this specific assertion:

> storing something on a flash drive that requires millions of lines of code
> to read data off of

Does it really require _millions_ of lines? Even if you’re just targeting that
one model of drive, rather than writing a general purpose SATA/USB kernel
stack? If what’s on there is, say, JPEG images, libjpeg (plus its dependencies
in glibc, I guess) isn’t “millions of lines” either. Where are these millions
of lines?

(I’m speaking as someone who has written a unikernel with disk IO and graphics
support. It was far less than a million lines!)

~~~
HarryHirsch
Compare the BBC Domesday Project to the original Domesday Book from 1086. One
can be read today, the other not so much. You have to wonder what the folks at
the BBC were thinking - did they never consult an archival librarian?

~~~
jacquesm
I played with that when it first came out. We had the discs and the player, it
was very impressive but even then I knew that there was no way that it would
last for even a decade.

It was a puny bit of data by today's standards (about 500 MB in total), but
for its day it was incredible. Back then a mainframe Winchester pack with the
same capacity would have been roughly the size of a washer-dryer combination.

The data not being publicly accessible at the moment is more of a copyright
issue than a technical one.

------
fanf2
A super interesting 1960s intersection between microfiche and digital
technology: the IBM 1360 photostore:
[http://www.computer-history.info/Page4.dir/pages/Photostore.dir/index.html](http://www.computer-history.info/Page4.dir/pages/Photostore.dir/index.html)
[https://en.m.wikipedia.org/wiki/IBM_1360](https://en.m.wikipedia.org/wiki/IBM_1360)

This was a machine capable of storing a terabit of data on write-once
photographic film. It wrote to the film using a scanning electron beam, and
then developed the film using a fully automated film processing lab, and
finally stored the film cartridges in a robotic storage library. Amazingly
cool, but required a lot of expert maintenance and diligent cleaning.

~~~
jhbadger
And the early versions of the PLATO educational computing system used a
computer-controlled microfiche reader to display graphics beyond what was
possible at the time.

[https://en.wikipedia.org/wiki/PLATO_(computer_system)](https://en.wikipedia.org/wiki/PLATO_\(computer_system\))

~~~
8bitsrule
It's as though we got so busy using all of this stuff to do ordinary things
that we kept making the same things (GUI, net) better and better, but forgot
to keep trying new, richer ideas - as PLATO attempted to do. That came out of
the U. of Illinois. Where'd the visionary academics go?

~~~
canhascodez
I believe that the idea that the end of some avenue of human ingenuity has
been reached, and your implicit claim of differing values between the past and
present, are both perennially recurrent throughout history. It's not
necessarily a bad thing if today's academics are pursuing things that are not
among your interests. More likely than not, the speed at which ideas are
disseminated, their value recognized, and their applications spread has not
appreciably changed in recent decades (assuming it's more meaningful to talk
about that as a "March of Progress" than a "Random Walk of Progress"). I would
think it most reasonable to assume that the academics are where they have
always been, doing the same arcane and incomprehensible things that academics
have always done, and that any appearance to the contrary is a fault of one's
own perspective.

------
eesmith
A few tidbits I know about the topic. One of the organizations which helped
push microfilm was the Documentation Institute, founded by Watson Davis in
1935, which became the American Documentation Institute in 1937.

(As an aside, "documentation" was the hot term in library science in the
1930s. "Documentation" meaning "collection of informational papers" is only
from 1927, says
[https://www.etymonline.com/word/documentation](https://www.etymonline.com/word/documentation)
. As a further aside for those involved in science fairs, Watson Davis also
founded "the Science Clubs of America, reaching at one point roughly a million
school-age children across the United States; he also was one of the
originators of the Westinghouse Science Talent Search".)

He participated in the World Congress of Universal Documentation. Quoting from
[https://en.wikipedia.org/wiki/World_Congress_of_Universal_Do...](https://en.wikipedia.org/wiki/World_Congress_of_Universal_Documentation)

> The World Congress of Universal Documentation was held from 16 to 21 August
> 1937 in Paris, France. Delegates from 45 countries met to discuss means by
> which all of the world's information, in print, in manuscript, and in other
> forms, could be efficiently organized and made accessible. ...

> The main resolution adopted by the congress proposed that microfilm be used
> to make information universally available.[8] Watson Davis, chairman of the
> American delegation and president of the ADI, stated that the volume of
> information being produced created difficult problems of access and
> preservation, but that these could be solved by the use of microfilm. ...

> In his address to the Congress, H. G. Wells said that he thought that his
> idea of the "world brain" was a precursor to the ideas other delegates were
> proposing, and explicitly linked the projects being discussed to the work of
> the encyclopédistes:

These are all ideas which percolated through Vannevar Bush's Memex and many
others to become the modern internet.

Also, one of the co-founders of the ADI was Atherton Seidell. He published a
series of articles in _Science_ promoting the use of microfilm. E.g., from 24
August 1934:
[http://science.sciencemag.org/content/80/2069/184](http://science.sciencemag.org/content/80/2069/184)

------
DoctorOetker
I've often hesitated about getting a microfilm or microfiche reader, but what
would really push me over the edge is some kind of companion equipment for
making microfilm or microfiche to use with the reader. Did such "consumer"
equipment ever exist, or did all the libraries and institutions buy premade
microfiche or microfilm, with the exception of a few dedicated conversion
companies?

Are there any combined devices/stations that can be used to both read AND
create either microfilm or microfiche?

~~~
VLM
Used Kodak archive writers cost about $30K (not a typo) and are quite large,
the modular cassettes for film cost about $1K each, film reloads for each
cassette cost about $10 each, a roll of film holds something like "ten
thousand" images, and the machine can burn an image in about one second. All
of this varies by factors of perhaps four depending on the individual model.
They eat PDF and TIFF (fax) files, among others. Fundamentally it's "just a
printer". Theoretically it's a completely hands-off, lights-out machine, much
like theoretically a laser printer never needs hand-holding... of course laser
printers are manufactured "fire and forget", whereas the Kodak archive
writer's service contract was something like $5K/yr.

Part of the problem is the target market; this was at a financial services
company where the supplier knew they could charge pretty much anything they
wanted and it would get passed along as a tiny fraction of the end-user
customer contract cost. If a $100M financial services contract involves a $5K
service contract for some obscure accounting thingy, no one really cares.
Given that the situation is similar to pre-consumer-era 3D printing, I figure
a really smart hacker who doesn't care about reproducible results or speed or
reliability could squirt something out for a tenth of the cost; but even at a
tenth, that's still going to be "laser cutter"-like costs. Film is never
cheap; high-res film is gonna be worse, that's just how it is.

------
adrian_mrd
“Microfilm machines trained people’s eyes to read differently: A blur of
rapidly advancing images replaced flipping through pages, a precursor to the
transition from reading books to surfing the web. Once we adjusted to the
nonlinear reading devices, we wanted to jump around instead of advance through
page after page.”

I often find it interesting how a medium (and/or technology) can fundamentally
alter the basic experience of reading.

~~~
evgen
It was definitely a different way to interact with the medium. Back in the
early 80s I helped my mother do some genealogy research and spent many
weekends in various libraries going through obituary sections of a lot of
major metro papers from 1920-1940. You quickly learned to recognize the
'shape' of the sort of page you were looking for and then would hit the
advance button on the machine to spin forward until you hit something that
looked like the right section. It was the sort of pattern recognition you
probably do today when scrubbing through a video looking for a particular type
of scene, but with pages of text. Every paper had its own style of page layout
so it took an hour or two to learn what you were looking for when you switched
to a different paper, but the process became fairly automatic with a bit of
practice.

~~~
VLM
It's a human binary-search algorithm for page numbers: you meld with the
machine, and after a couple dozen hours it's creepy how I could advance
exactly 143 pages if I so desired by turning the dial just so; not any more
surprising than video game accomplishments after a couple dozen hours.

I'm Gen X, so old that my youth predated even CD-ROMs; in fact I got an
off-brand early CD-ROM reader in high school as a very expensive birthday
gift. So in early high school I spent a couple weeks reading microfilm of
Scientific American magazine, searching out all the "Amateur Scientist"
columns from the 50s and 60s - some real gold there. I printed out some
interesting columns and still have some of those printouts. The technology to
"photocopy from microfilm" is pretty obvious optically: the image is projected
onto what amounts to a Xerox photocopier. Of course kids these days may not
believe fully analog photocopiers ever existed, but they certainly did, until
relatively recently when digital multifunction devices took over. Around the
turn of the century all the Amateur Scientist columns were combined onto a
multimedia CD-ROM using turn-of-the-century ancient Java to navigate tens of
thousands of one-page-per-PDF files. I no longer have infrastructure that can
run that old a Java version, although hand navigation of an image of the
CD-ROM filesystem is possible (the file names aren't too insane). I also no
longer own a desktop or laptop with an optical drive to read the legacy disc,
although I can, for now, scare up a USB CD-ROM reader if needed.

I lived near a depository library; did you? If you can get access to a
federal depository library then you'll have access to microfilm versions of
complete ancient census records; I spent a long time looking at those and
printing out 1800s-era census records. My future wife bought "Family Tree
Maker" custom-format CD-ROMs of census data (this is all pre-internet,
pre-ancestry.com era by at least a decade). Obviously those custom CD-ROMs of
census records are useless today unless you have a complete early-90s PC
emulator; searching online works better, and there have been several decades
of census data released since then, LOL.

I find it fascinating that I grew up before CD-ROMs, then in my youth CD-ROMs
were going to replace "everything", and now CD-ROMs are a dead medium. I'm not
really all that decrepit and old outside of SV hiring standards and ageism; I
scarcely have a gray hair, maybe one or two, and don't even have any grandkids
yet. CD-ROMs may have been born and died in my young adult years, but there's
still microfilm out there.

------
Animats
Newer microfilm may last 500 years, but the older stuff, especially Kalvar
film, has serious lifespan problems. Kalvar film gives off volatiles which
corrode everything around it except Kalvar film itself.

------
ggm
VAX/VMS source, on microfiche. Hunting the bugs in BLISS-32. Happy days.

------
consumer451
Are there any earlier examples of data compression than microfilm?

~~~
paulmd
The earliest example of that general technology would probably be the Leica
camera, as this was the first time a negative had been designed for
enlargement rather than contact printing (or at least, for high degrees of
enlargement). This is only really _physically reducing the size of the storage
device_ though, not really compression.

[https://gmpphoto.blogspot.com/2018/04/ur-leica-most-influential-photographic.html](https://gmpphoto.blogspot.com/2018/04/ur-leica-most-influential-photographic.html)

In terms of compression as a whole, the first example would probably be
military/commercial codebooks. A relatively modern example would be Q codes,
where a predetermined phrase is encoded as a three-letter sequence ("QRA"
stands for "What is the name of your vessel/station?"), but I'm sure there are
prior examples.

[https://en.wikipedia.org/wiki/Q_code](https://en.wikipedia.org/wiki/Q_code)
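
The mechanism is just dictionary substitution agreed on in advance, which is
why it counts as compression: both sides share the table, so only the short
codes cross the wire. A toy sketch (the QRA expansion is the one above; the
other entries are paraphrased Q-code meanings):

    # The codebook is shared ahead of time; only short codes are transmitted.
    CODEBOOK = {
        "QRA": "What is the name of your vessel/station?",
        "QTH": "What is your location?",
        "QSL": "I acknowledge receipt.",
    }
    REVERSE = {phrase: code for code, phrase in CODEBOOK.items()}

    def compress(phrases):
        # Replace each known phrase with its three-letter code.
        return [REVERSE.get(p, p) for p in phrases]

    def expand(tokens):
        # The receiver looks each code back up in the same codebook.
        return [CODEBOOK.get(t, t) for t in tokens]

    sent = compress(["What is your location?", "I acknowledge receipt."])
    print(sent)          # ['QTH', 'QSL']
    print(expand(sent))  # back to the original phrases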

~~~
paulmd
(or, as the kids would say, 'new telegraph who dis?')

Here's a link I found on some early codebooks:
[http://www.cryptomuseum.com/crypto/codebook/index.htm](http://www.cryptomuseum.com/crypto/codebook/index.htm)

------
Something1234
Anybody know why it says that no connection is available, yet the page still
loads? It seems like a new way of defeating ad blockers.

Edit: Even after disabling the ad blocker, it still says there is no
connection. Even worse, this is just after I actually bought a copy of their
magazine.

~~~
adrian_mrd
Hi, I’m using Purify ‘content blocker’ on iOS (11.4, Safari) and I can view
the article without any issues.

