

Lisp as an alternative to Java (2000) [pdf] - wtbob
http://www.flownet.com/gat/papers/lisp-java.pdf

======
lisper
Just to put this paper in an appropriate historical context, I wrote it back
when I was trying to convince NASA that flying Lisp on a spacecraft would not
be a completely insane thing to do. The Powers That Be had already mostly
accepted that it would be OK to fly Java, but they thought Lisp was too big
and slow. This study and paper were designed specifically to address those
concerns.

~~~
swartkrans
Do you still work for NASA, and did your paper succeed?

~~~
e40
No and no.

~~~
lisper
Yeah, what e40 said. :-(

------
adwf
Yet to this day, I still find people claiming that Lisp is an interpreted
language and that the garbage collector introduces disastrous pauses in
runtime. As if it was still the same Lisp people were using back in the
1980's.

~~~
hga
The 1980's state of the art was a lot better, depending on how long a pause
you could afford for incremental GC (fractions of a second: OK for humans, but
not for a variety of applications; OK for piloting bulldozers, as one company
did).

Interpreted only ... that's a narrow, 4-year window; the first compiler was
developed in 19 _62_.

Probably the key thing in this is that most any good programmer could hack out
a Lisp implementation in a week or two. Making one perform as well as
contemporary languages, strangely enough, took a similar amount of effort, and
most Lisps back then didn't.

~~~
adwf
Well I was born in '84 so it's all history to me ;) I've only ever known
compiled, high-performing Lisps with unnoticeable GC. So it's weird when I
still hear people trotting out the old "bloated, interpreted, slow" line.

I sometimes think that the AI winter took computing, not just Lisp but
computing in general, on a weird turn. It feels like we stopped trying to
make computers do all the work for _us_ and decided that humans should be
doing the work for the _computer_. So we ended up heading down the path of
manual memory management in the C-style languages, verbose, constrained
programming in Java, and static typing instead of the easier dynamic kind.
Whilst I agree that at one point hardware performance was a serious
consideration and we needed to squeeze everything we could out of computers,
that is no longer the case. We're overwhelmed by hardware performance in most
cases. Perhaps that's why we're seeing a resurgence in Lispy languages
nowadays?

~~~
hga
Ah, some more history:

In the '80s, a common startup setup that was (for a while) more powerful than
a PC or ASCII terminals to a UNIX(TM) box was three or so el-cheapo Sun
workstations sharing one disk over an Ethernet (or so I've read; I might well
prefer the ASCII terminal to a UNIX box, which is what I tended to use).

Also think about the PC, starting with the ones using the 386 (true 32 bits;
the 286 was misbegotten and all but useless except as a fast 8086): both of
these configurations were cheap, but not _all_ that capable, and memory was at
a premium, as was virtual memory in the workstation case.

So using machine-efficient but not programmer-efficient C and the like was the
zeitgeist. Then, about or a bit before the time engineering workstations based
on more capable microprocessors became able to run Lisp quickly (although with
hardly as much error checking), the expert system bubble collapsed. Those
promoting it had to find something to blame, and Lisp became it. Or so goes
one narrative of it.

There was certainly a ... Slough of Despond in programming languages we
mortals could use to earn a living about the time you would have started
programming seriously (before college, I assume). For a Lisper and
programming-language guy like myself it was very depressing. I think what got
us out of that, besides Moore's Law, was the dot-com bust: it upset a lot of
the conventional wisdom, coupled with a paucity of resources for development.
No longer could you, at the extreme, do a wild IPO with an idea and a plan to
hire a bunch of blub programmers using a consensus language running on
expensive Sun or whatever hardware. You probably get the idea of what lost
favor (ask if not).

Also, as Paul Graham pointed out in essays whose influence I can't judge, the
above will at best get you average results, and the average result of a
startup is failure. He also noted that technical failure is a frequent cause
of corporate failure. To take one stark example, compare Facebook with its
competitor Friendster. Whatever promise Friendster had, which the company's
board used to wheel and deal, didn't matter when the site was just too slow to
use. Of course (if I remember correctly), its technical founder was purged by
that board.

Which also leads into issues of control. We technical people don't need so
much money, and the all too common downside of losing control is awful for us,
the company, and of course the people we hired for it.

So all in all we more tend to call the shots technically, and are able to pull
it off. And wild successes like Facebook with PHP and Twitter with Ruby on
Rails didn't hurt.

------
wtbob
Now that the dominant languages seem to be dynamic languages like Python, Ruby
& JavaScript, I wonder if Lisp's edge in development speed is as pronounced;
OTOH, I wonder if it has an increased edge in stability and/or security.

~~~
agumonkey
It seems that Python's development speed comes down to well-rounded libraries,
and IIRC Lisp is still not on par with that (Quicklisp users, feel free to
correct me). Other than that, I still think that linguistically, Lisps have an
edge (and a sharp one).

~~~
mackwic
But Ruby is an acceptable Lisp[0], and also has a lot of libraries. ;)

So, how will Lisp compete?

Maybe if we look at it pragmatically, the solution is not in the performance
nor in the time to develop, but in the ease of maintaining the software in the
long term. A field where Lisp is not particularly brilliant.

[0]: http://www.randomhacks.net/2005/12/03/why-ruby-is-an-acceptable-lisp/

~~~
agumonkey
I forgot to mention that the main Lisp implementations (SBCL) are fast.
Compared to languages like Ruby and Python, it's a done deal.

Good point about long-term maintenance. Ruby and Python aren't that great
either; people say static typing is the way to go. Lisps are weird because
they allow you to hot-swap most things, which makes changing a system easy,
but at the same time it leads to spaghetti images.
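A minimal sketch of that hot-swapping at a Common Lisp REPL (SBCL or any
conforming implementation assumed; `greet` is a made-up function for
illustration):

```lisp
;; Define a function and call it.
(defun greet (name)
  (format nil "Hello, ~a." name))

(greet "world") ; => "Hello, world."

;; Redefine it in the running image: no restart needed, and existing
;; callers pick up the new definition on their next call.
(defun greet (name)
  (format nil "Hi there, ~a!" name))

(greet "world") ; => "Hi there, world!"
```

The flip side is exactly the spaghetti-image problem: after enough in-place
redefinitions, the live image no longer matches any source file.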

~~~
_delirium
My experience is that long-term image-based development isn't really that
common in Lisp, maybe outside of some specialized use-cases. You _can_ develop
by mutating/saving/reloading an image over an extended period of time, but in
practice people mostly develop the "usual" way, with source code that's loaded
into a fresh image periodically. The use of image-oriented development then
gets limited to prototyping and debugging. Being able to redefine
functions/etc. on the fly (even during restarts and the like) is useful while
experimenting and debugging. But then the "real" code gets written into a
source file, which is reloaded into a fresh image to make sure it works. For
modern multi-developer Lisp this is pretty much the only way things are done,
since the expectation is that you check Lisp source into svn or git, not
images.

~~~
adwf
That's pretty much exactly how I develop. I rarely type anything directly into
the REPL, instead typing into the source file in Emacs and then C-c C-c-ing it
over to the REPL. The advantage is that I get a canonical copy of the
entire source code, whilst still having the ability to inspect and hotfix
things through the REPL. If I then need to reboot the image or deploy to a new
server, the entire source base should be completely stable and consistent.

I often connect to production servers using Slime and fix little things like
typos this way - then check the source into Github and have it permanently
saved. It removes a lot of the stress of doing a full deployment each time you
need to patch something!

------
zak_mc_kracken
Even for 2000, this paper is hilariously biased, starting with the observation
that Lisp wins over Java in every single respect. If I order an evaluation
paper and the paper comes back with 100% for choice B and 0% for choice A, I'm
going to disregard it completely because the author is obviously too biased.

One of the strengths that Java has over Lisp is that it's statically typed. At
the time, the direction of the industry wasn't too clear-cut, but this is
clearly the trend today.

Another example of the bias is that the benchmarks compare development times
(and Lisp wins, surprise surprise) but not maintenance time, which is
admittedly harder to measure but a big liability that dynamically typed
languages have.

~~~
cbd1984
> One of the strengths that Java has over Lisp is that it's statically typed.

With the caveat that Java's type system can be subverted and isn't expressive
enough to catch any more than the most trivial kinds of errors a type system
can catch.

~~~
zak_mc_kracken
That's a straw man, we're comparing Java to Lisp, not to another statically
typed language.

Because Java is statically typed, it catches a lot more errors at compile time
than Lisp can.

~~~
wtbob
Lisp is optionally statically-typed. I am not aware of any type error Java can
catch at compile time which Lisp cannot catch at compile time.

Now, whether or not a particular implementation _does_ catch it is an entirely
different affair.
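A sketch of what that optional typing looks like, assuming SBCL (whose
compiler checks type declarations it can see; `add2` and `oops` are made-up
names for illustration):

```lisp
;; Declare that ADD2 maps two fixnums to a fixnum.
(declaim (ftype (function (fixnum fixnum) fixnum) add2))

(defun add2 (x y)
  (+ x y))

;; Compiling a call whose constant arguments conflict with the declared
;; type now draws a compile-time warning in SBCL, much as a Java
;; compiler would reject it:
(defun oops ()
  (add2 "one" "two"))
```

Whether an implementation warns, signals a full error, or stays silent is, as
noted above, an entirely different affair.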

------
melling
It's unfortunate that languages can't gain much traction in the mainstream.
Rust, Haskell, Julia, Go, Clojure, Scheme... I remember the fight it took
before Python and Ruby were accepted. There's the big catch-22, of course, in
obtaining critical mass.

Maybe if we all adopted a language for our weekend projects, we'd contribute
enough back through blogging and Stack Overflow questions, adding enough
knowledge to the net that it'll become increasingly easier for the next
person.

Btw, I'm using Go for my weekend web site project:
[http://thespanishsite.com](http://thespanishsite.com)

~~~
swartkrans
I'd say Go is becoming mainstream pretty quickly. I think Clojure isn't far
away and Rust is showing big promise. I doubt based on probably under-informed
personal observation that Haskell, Scheme or Julia will ever become widely
adopted.

~~~
akurilin
The thing with niche languages such as Haskell is that not everybody has to
use them, but ideally we would get to enough critical mass that the ecosystem
is sustainable, new and better tools are continuously being developed, and
it's pretty beginner-friendly. Clojure is probably at that stage right about
now; Haskell doesn't quite feel that way, but it's getting better (more docs,
guides, books, services like Stackage, etc.).

Basically it's very nice when a language gets enough traction that the lack of
users is no longer a concern for its long term prospects.

------
anubiann00b
The run time seems to be outdated... Hasn't the JVM been updated and optimized
since 2000?

~~~
wtbob
Yup. I'd really like to see the same comparison done today.

~~~
ivanche
Ditto. If the JVM is good enough for, say, Twitter, it's certain that the
performance of Java programs got better and that the "Run time" and "Final
image size" from Figure 1 in the OP pdf would look different. OTOH I'm not in
Lisp land, but I guess there was probably some improvement there too.

------
snoopybbt
That thing is fucking old.

The JVM has been improving a lot since then.

In order for that content to be taken seriously, the tests should be re-run
with modern runtimes for C/C++, Java, and Lisp.

~~~
rjsw
Has the JVM got a smaller memory footprint since 2000? Lisp runtimes have not
got bigger.

~~~
jerven
Standard JVM: slightly smaller, much faster. This is comparing to Java 1.2,
before JIT/HotSpot and a decade of performance work. (Also, I doubt anyone had
7.7 years of experience with Java in 2000, so that figure must be general
programming experience.)

If we consider JavaCard or Java ME a Java dialect then yes, it has gotten a
lot smaller! I think neither of these was available in 2000.

Running Java also does not need a JVM; there are AOT compilers for Java. Like
Lisp, there are a lot of different implementations with lots of differing
trade-offs.

In any case, it's a very flawed comparison, like all programming-language
comparisons of this kind, as it is so hard/expensive to do correctly.

But an interesting read anyway.

I think of languages as equivalent (they are all Turing complete, right?), so
something buildable in one must be buildable in the other. The question is
what tradeoffs the languages make for the humans in the loop.

Lisp and Java make different tradeoffs there, and C makes yet another set of
tradeoffs. And which languages we use is more often dependent on historical
accidents than technical excellence (like most human languages); see JS as an
example.

I believe dynamic languages are better for small teams and short project
lifetimes. I believe languages with automatic memory management by default are
better for the vast majority of projects than manual-memory-management
options. I think a written language like Traditional Chinese (Java) is better
for running an empire than one like Latin (Lisp). But a more modern idea like
Hangul (Clojure?) might do even better.

In the end, to get the best CPU performance one needs to know the CPU and
architecture, as well as an ability to think in bits. And, like in human
language, one can write poetry in all of them; it's just a bit more difficult
than the office memos we tend to write and read in our jobs ;)

