
Software That Lasts 200 Years (2004) - joeyespo
http://www.bricklin.com/200yearsoftware.htm
======
hoorayimhelping
This doesn't make much sense to me. Bridges, roads, sewers, pipes, etc. are
still around because the problems they solve haven't changed. And in the
thousands of years we've been building bridges, we only see the ones that
survived and completely miss the ones that crumbled. It seems like
survivorship bias.

Software is evolving because we have new problems to solve. I'd point out that
Unix is 50+ years old and still going strong and used daily by millions of
people.

I don't get where the assumption that these are timeless structures comes from
- the New York City Subway system is falling apart and in need of continual
maintenance. Roads constantly fall apart and need to be re-surfaced - it's not
like we use incredibly elegant long-term materials; we use blacktop because
it's cheap _now_. Bridges and tunnels are being maintained all the time.

~~~
informatimago
Indeed; comparisons of software engineering with architecture (or other
classes of engineering), notably when taking history into account, are always
misleading.

Take, for example, the assumption that old bridges still solve modern
problems. Compare:

[https://www.youtube.com/watch?v=o4eM0qoUhaE](https://www.youtube.com/watch?v=o4eM0qoUhaE)
[https://en.wikipedia.org/wiki/Puente_Romano_%28M%C3%A9rida%2...](https://en.wikipedia.org/wiki/Puente_Romano_%28M%C3%A9rida%29)

On Mérida's bridge, now close to 2000 years old, only pedestrians are allowed
nowadays. If you tried to run traffic of hundreds of 30-tonne trucks per hour
over it, it would soon fall to pieces. You also have to take into account the
maintenance that has gone into it over the ages (for example, the banks have
grown closer and the outermost piers are now underground).

------
acqq
Knuth actually had this "it should last" goal when he designed TeX:

[http://scicomp.stackexchange.com/questions/3462/increasing-t...](http://scicomp.stackexchange.com/questions/3462/increasing-the-archival-longevity-of-code)

“Ever since those beginnings in 1977, the TeX research project that I embarked
on was driven by two major goals. The first goal was quality: we wanted to
produce documents that were not just nice, but actually the best. (…) The
second major goal was _archival: to create systems that would be independent
of changes in printing technology as much as possible. When the next
generation of printing devices came along, I wanted to be able to retain the
same quality already achieved, instead of having to solve all the problems
anew. I wanted to design something that would be still usable in 100 years._ ”
– Donald E. Knuth: Digital Typography, p. 559 (quoted from
[http://de.wikipedia.org/wiki/TeX](http://de.wikipedia.org/wiki/TeX) )

[https://en.wikipedia.org/wiki/TeX](https://en.wikipedia.org/wiki/TeX)

"Even though Donald Knuth himself has suggested a few areas in which TeX could
have been improved, he indicated that he firmly believes that having an
unchanged system that will produce the same output now and in the future is
more important than introducing new features. For this reason, he has stated
that the "absolutely final change (to be made after my death)" will be to
change the version number to π, at which point all remaining bugs will become
features.[11] "

~~~
effie
Having a system that produces the same visual output in 100 years is nice, but
to achieve that it is sufficient to maintain the old TeX installation in
usable form. We have release versions for this. Why he thinks his software
should approach some state of immutability is incomprehensible to me. The
evolution of civilization will bring new requirements: perhaps a better way to
deal with color, better vector image notation, better font handling, better
web/HTML integration, ...

~~~
acqq
Knuth answered your worries in 1990:

[http://www.ntg.nl/maps/05/34.pdf](http://www.ntg.nl/maps/05/34.pdf)

"I have put these systems into the public domain so that people everywhere can
use the ideas freely if they wish. I have also spent thousands of hours trying
to ensure that the systems produce essentially identical results on all
computers. I strongly believe that an unchanging system has great value, even
though it is axiomatic that any complex system can be improved."

"Of course I do not claim to have found the best solution to every problem. I
simply claim that it is a great advantage to have a fixed point as a building
block. Improved macro packages can be added on the input side; improved device
drivers can be added on the output side. I welcome continued research that
will lead to alternative systems that can typeset documents better than TEX is
able to do. But the authors of such systems must think of another name.

That is all I ask, after devoting a substantial portion of my life to the
creation of these systems and making them available to everybody in the world.
I sincerely hope that the members of TUG will help me to enforce these wishes,
by putting severe pressure on any person or group who produces any
incompatible system and calls it TEX or METAFONT or Computer Modern — no
matter how slight the incompatibility might seem."

------
anuraj
As long as the underlying data is preserved, software is better treated as
fungible. Data formats change, processing models and architectures change,
underlying hardware changes - allowing us to do more with less. In my career I
have found there are no stable architectures - systems require refactoring
every couple of years to stay manageable, and full rewrites perhaps once every
technology cycle to stay in tune with the times. This is not a limitation but
an opportunity, as software is constantly evolving and is more like a
living, breathing organism than a permanent structure.

~~~
rwinn
Sorry for the downvote; I meant to hit up, and I can't seem to change it.

------
perlgeek
Most of the software I write is based on business processes, which surely
won't even last 20 years (more likely 2 to 5 years), and/or technology (I work
at an ISP and data center provider, so we manage stuff like IPs, VLANs,
Hardware, Racks) which likely also won't last 200 years.

The article's point about accounting software is another good example: legal
requirements change, which means that the software must also change;
bookkeeping software from 30 years ago is utterly useless today, not because
it wasn't done well, but because too much has changed.

Another example from the article: parking tickets. There were no parking
tickets 100 years ago (80 years ago according to
[http://theexpiredmeter.com/2009/08/first-parking-ticket-issu...](http://theexpiredmeter.com/2009/08/first-parking-ticket-issued-in-1935/)),
and I suspect that however traffic looks in another 100 years, it will not
have much in common with our current structure.

My takeaway is that we shouldn't worry too much about the longevity of the
software; the more important aspect is the data formats. They should be open
and well-documented, so that folks in the future have less trouble dealing
with our data footprint. Maybe our data will still be of value in 200 years;
at least some of it, and at least a bit. But I doubt our software will be,
regardless of how we write it. I do agree with his points about the need for
better review processes and oversight.
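
To make "open and well-documented" concrete, here is a minimal sketch (in
Python, with made-up parking-ticket fields purely for illustration) of the
idea: keep the records in a plain-text, open format and ship the field
documentation alongside the data, so nobody needs the original software to
make sense of it later.

    # A minimal sketch, not any particular standard: records in an open,
    # plain-text format (JSON here), with a human-readable description of
    # the fields embedded next to the data. Field names and units are
    # invented for illustration.
    import json

    archive = {
        "format_notes": {
            "description": "Parking tickets issued, one record per ticket.",
            "fields": {
                "issued_at": "UTC timestamp, ISO 8601",
                "location": "free-text street address",
                "fine": "amount in euro cents (integer)",
            },
        },
        "records": [
            {"issued_at": "2004-07-14T09:32:00Z",
             "location": "Main St 12",
             "fine": 1500},
        ],
    }

    # Write the archive as indented UTF-8 text, readable without any tooling.
    with open("tickets.json", "w", encoding="utf-8") as f:
        json.dump(archive, f, indent=2, ensure_ascii=False)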

Update: Forgot one important point: input methods and user input devices
change a lot. Who wants to use software today that assumed a maximum screen
resolution of 600x400 or even lower? And no mouse, no touch screen, no audio,
image, or video input, even when that would be highly desirable for the
problem domain?

~~~
bithead
Interestingly, nearly all airline reservations are made on software written in
the '60s. Still.

------
reilly3000
This is a fantastic article, and it confirmed what I've learned from the Rich
Hickey talks: software isn't about software. It's a medium for information.
Information is light, and it deserves a place to live that doesn't create
uncertainty. The world will be shaped by software, and the root of software
that deserves to last is lasting respect for the people who use it...

------
dchambers
This argument is specious. Bridges only continue to be relevant a thousand
years on because humans have barely evolved in that time frame, not because
bridge builders put any effort into ensuring their bridges would still be
relevant to future mankind.

Computer programs, on the other hand, are driven by fashion and evolve
rapidly. Digital documents created by programs will cease to be readable when
the programs that could read those documents fall out of fashion, but this is
not due to any short-sightedness by the developers of those programs; rather,
it is due to users who prefer new (more fashionable) programs even when those
are incapable of reading the documents they previously wrote.

~~~
InclinedPlane
You've explained partly why the phenomenon exists, you haven't explained why
it's not a problem.

~~~
falsedan
The onus isn't on commenters to provide a complete, valid alternative to back
up every criticism.

------
memracom
Some folks are working on a standard curriculum for software engineering and a
standard process to follow that incorporates both waterfall and agile
methodologies. In fact, it also includes having an engineering team think
about what they are doing and evaluate the quality of their process using a
number of standard dimensions and metrics.

Read more about SEMAT here [http://semat.org/](http://semat.org/)

Stuff like this is a prerequisite for creating software that is done and can
be used for 200 years without any substantial change.

Could Java ever get to that stage? What about some of the JVM ecosystem such
as Tomcat or Camel?

