Summed up: for 50 years, programmer time has been getting more expensive relative to machine time, so optimizing for programmer time makes sense (in broad terms - clearly there are plenty of cases where speed matters).
It's interesting that FORTRAN will probably never go away.
First mover advantage case study.
What is also interesting is that in LISP and FORTRAN we seem to have the two lineages that between them cover all the programming mechanisms we are aware of.
I know what you mean, and agree on that.
But LISP and FORTRAN are two different things as I see it.
LISP is a bridge between the lambda calculus and the actual machine.
FORTRAN is not a bridge between the machine and a fundamental theoretical construct like, say, the Turing machine; it is just an attempt at abstracting away from the machine. You might say LISP is an implementation where FORTRAN is a construction.
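To make the "bridge to the lambda calculus" point concrete, here is a minimal sketch of Church numerals, a pure lambda-calculus encoding of numbers, written as host-language closures. Python is used for brevity (it is one of the languages mentioned later in the thread); the same terms translate one-to-one into Lisp's `lambda`. The names `zero`, `succ`, `add`, and `to_int` are illustrative, not from any particular library.

```python
# Church numerals: numbers encoded purely as functions, straight out of
# the lambda calculus, runnable on an actual machine.

zero = lambda f: lambda x: x                      # λf.λx. x
succ = lambda n: lambda f: lambda x: f(n(f)(x))   # λn.λf.λx. f (n f x)
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Collapse a Church numeral back to a machine integer."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # -> 5
```

That a 1950s language could express these terms almost verbatim is exactly what makes LISP an "implementation" of a theoretical construct rather than merely an abstraction over hardware.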
Programming is a way to get a machine to do stuff. You can approach that problem in different ways.
Whatever 'method' you use, the most important metrics are 'are the results good', 'is the program maintainable', and 'is it reliable'.
Seeing the 'lessons' from both sides put into practice in (relatively) new languages (such as Erlang or Python) is where the intermediate future lies.
What I'm really curious about, though, is not so much which of the LISP or FORTRAN descendants is 'best' for any given purpose; give it enough time and we'll find that out one way or the other. My real question about all this is: are there other paths besides these two that have not yet been explored?
Logic programming is sometimes seen as a 'third branch', but it is possible that there are ways of achieving results out there that are not as obvious as any of these, and that yet hold within them a kernel of what we could do to get out of the absolutely unmaintainable mess that software is becoming.
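For a feel of what the 'third branch' looks like, here is a hedged toy sketch of the logic-programming style: facts as tuples, variables as strings starting with '?', and queries answered by unifying against every fact. This is an illustration of the idea, not Prolog; the `parent` relation and all names are invented for the example.

```python
# A toy logic-programming core: you state WHAT you want (a pattern with
# variables), not HOW to find it; the engine does the search.

facts = [
    ("parent", "alice", "bob"),
    ("parent", "bob", "carol"),
]

def unify(pattern, fact, bindings):
    """Return extended bindings if pattern matches fact, else None."""
    if len(pattern) != len(fact):
        return None
    env = dict(bindings)
    for p, f in zip(pattern, fact):
        if isinstance(p, str) and p.startswith("?"):
            if p in env and env[p] != f:
                return None  # variable already bound to something else
            env[p] = f
        elif p != f:
            return None      # constants must match exactly
    return env

def query(pattern):
    """Yield every set of variable bindings that satisfies the pattern."""
    for fact in facts:
        env = unify(pattern, fact, {})
        if env is not None:
            yield env

# "Who is Bob a parent of?" -- declarative: no loops in the caller's code.
print(list(query(("parent", "bob", "?child"))))  # -> [{'?child': 'carol'}]
```

The point of the sketch is the inversion of control: the caller describes a goal and the runtime searches for bindings, which is a genuinely different mechanism from both the LISP and FORTRAN lineages.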
It is not an 'engineering' discipline by any standard yet, and if all this is to 'end well' I think it should be.
http://journal.dedasys.com/2008/12/04/the-economics-of-progr...