
The Lisp Machine: Noble Experiment or Fabulous Failure? (1991) [pdf] - lelf
http://pt.withy.org/publications/LispM.pdf
======
lispm
I would not call it a complete failure. Symbolics, for example, generated
revenue of around 1 billion USD during the years it sold the machines in the
80s. That's quite impressive, and a lot of impressive technology was developed.

Eventually the time was over and the technology was replaced. That happened to
Apollo, DEC, SUN, Cray, Thinking Machines, SGI, Amiga, ... and a bunch of
other tech companies.

It was great while it lasted. With the cold war ending (and with it the
government spending for military and civilian high-tech shifting) and with
more versatile/cheaper technology capturing the market (C++, RISC computers,
...) the time was over.

Basically the Lisp Machine tech was ripe for a full reboot (like Steve Jobs
did with NeXT, but there was no money for that in the Lisp market anymore).
The existing Lisp Machine software was not flexible enough and too limited in
its capabilities. Lisp developers might still have liked its capabilities, but
for the average user/developer it would have needed to be slimmed down. The
emulator needed a DEC Alpha - and not a PC, where it would have found more
users and cheaper hardware.

Apple tried it with Dylan for mobile and application development, but that
never entered the market. The Newton wasn't shipped with Dylan. Lucid tried it
with C++/Energize. Sank the company. Harlequin tried it with DylanWorks. CMU
tried to develop a Dylan environment. More was tried, but most failed.

~~~
hga
_Lucid tried it with C++ /Energize. Sank the company._

Nope. It turned out the "business" person/people who arranged the contracts
for OEM sales of Lucid's Lisp structured them essentially as loans against the
OEMs' actual sales, as I recall, and when those came due, Lucid's investors
felt the success of Energize was sufficiently iffy that the extra investment
wasn't worth it. Details can be found on Richard Gabriel's web site:
[http://dreamsongs.com/](http://dreamsongs.com/)

In general my take on what happened is somewhat different. It starts from the
observation that when the Lisp Machine was developed, there was a great deal
to be said for a custom TTL processor laser focused on Lisp. It was a good way
to squeeze a lot of performance out of the available technology, pretty much
the only way unless you were crazy enough to try to do it in ECL. And people
could see the utility of what became the engineering workstation from the
success of Xerox's Alto (which added graphics to NLS).

As it became practical to put enough gates on a single chip for a Lisp
Machine, it would have taken a great deal of adroitness to make the
transition, and that wasn't in the cards. LMI didn't have the capital to do
so, in fact, TI bailed them out and invested in them just to keep the company
alive so they could do their own gate array Lisp Machine, the TI Explorer.
Don't know what happened with TI, but LMI was eventually killed by Canadian
politics. Symbolics was _very_ badly managed by then, and continued to make a
number of very bad decisions.

And then, going back to Gabriel, there's the whole worse is better thesis.
Worse is better seems to have fantastically greater survival characteristics,
in fact, nowadays the only wildly successful The Right Thing software
ecosystem is Java.... If there's such a strong ... force, if you will, against
the approach of Lisp and therefore Lisp Machines, then it's hard to see how
they could have survived.

~~~
lispm
Not really. Lucid took the money from Lisp sales and invested it into C++.
Read it in Gabriel's book.

> As it became practical to put enough gates on a single chip for a Lisp
> Machine, it would have taken a great deal of adroitness to make the
> transition, and that wasn't in the cards.

TI developed a single-chip microprocessor. The later TI Explorer and the
MicroExplorer were based on microprocessor CPUs. DARPA financed it for the
'Compact Lisp Machine'. Symbolics did the same - they redid the CPU as a
microprocessor, the Symbolics Ivory. Many Symbolics machines were based on
that chip: XL400, XL1200, XL1201, UX400, UX1200, MacIvory 1/2/3 and the
NXP1000. Symbolics did three iterations of the CPU and had a full design of a
RISC cpu ready.

------
Animats
LISP machines died because they came out near the end of the "expert systems"
false boom, and the beginning of the "AI Winter". In the mid-1980s, the expert
systems crowd were claiming strong AI was coming Real Soon Now. (I was at
Stanford at the time; I met most of that crowd.) Didn't happen. When AI did
start to work, it was statistical, neural net, and number-crunching based,
using completely different technology.

Tagged machines are interesting. There have been some good ones, most notably
the Burroughs machines. What killed them was the triumph of C, which wants a
big flat address space, and the triumph of UNIX/Linux, which wants a vanilla
CPU. A tagged machine works best with languages, compilers, and operating
systems which use the tags properly. A whole specialized ecosystem is needed,
and the motivation for it is weak.

Even the segmentation features in IA-32 were never used much. There's hardware
intended to allow calls across protection boundaries in a controlled way. That
stuff was left out of AMD-64, and is now almost forgotten.

------
frik
The Lisp Machine was not alone; similar products exist or existed: the Intel
iAPX 432 ran Ada, the Pascal MicroEngine ran Pascal, European smart cards run
Java Card bytecode, the ECOMP ran Erlang, Burroughs Medium Systems ran COBOL,
etc.

High-level language computer architecture: [http://en.wikipedia.org/wiki/High-
level_language_computer_ar...](http://en.wikipedia.org/wiki/High-
level_language_computer_architecture)

I could imagine a modern high-level computer architecture based on the LLVM
intermediate representation (or GNU GCC's RTL). A lot of transistors in modern
CPUs could be saved if we removed all the legacy x86 instruction decoding.
Especially as modern x86/x64 CPUs only emulate a CISC and are internally RISC
architectures.

~~~
lispm
None of those were in any way 'similar' to Lisp Machines from Xerox, TI, LMI
or Symbolics. Those were high-end workstations with a Lisp OS, coming in a lot
of different form factors. They were developed from 1981 to 1992.

The main similarities were that they all used electricity. Some of the
computers with Java-processor were similar in that they provided an object-
oriented garbage-collected language and OS.

~~~
frik
> None of those were in any way 'similar' to Lisp Machines

It seems you misinterpreted my sentences. Read the linked Wikipedia article.
The basic idea is that the CPU executes not assembler/machine code (the 0-and-1
form) but a specific higher-level language like Lisp/Ada/Pascal/Java/etc.
directly.

~~~
lispm
Lisp Machines did not run Lisp 'directly'. Lisp code was either interpreted
or, more usually, compiled to machine code.

This is an example from a Symbolics Lisp Machine with an Ivory CPU.

    
    
        (defun example-count (predicate list)
          (let ((count 0))
            (dolist (i list count)
              (when (funcall predicate i)
                (incf count)))))
    

The disassembled machine code for the above function (for the Ivory
microprocessor from Symbolics):

    
    
        Command: (disassemble (compile #'example-count))
    
          0  ENTRY: 2 REQUIRED, 0 OPTIONAL      ;Creating PREDICATE and LIST
          2  PUSH 0                             ;Creating COUNT
          3  PUSH FP|3                          ;LIST 
          4  PUSH NIL                           ;Creating I
          5  BRANCH 15
          6  SET-TO-CDR-PUSH-CAR FP|5
          7  SET-SP-TO-ADDRESS-SAVE-TOS SP|-1
         10  START-CALL FP|2                    ;PREDICATE 
         11  PUSH FP|6                          ;I 
         12  FINISH-CALL-1-VALUE
         13  BRANCH-FALSE 15
         14  INCREMENT FP|4                     ;COUNT 
         15  ENDP FP|5
         16  BRANCH-FALSE 6
         17  SET-SP-TO-ADDRESS SP|-2
         20  RETURN-SINGLE-STACK
    

This machine also had C, Pascal, Fortran, Ada and Prolog compilers.

~~~
psandersen
You sound knowledgable so thought I'd ask here :) ... Would there be any
logic/benefit in a modern lisp machine? e.g. if intel decided they'd put the
same engineering effort into a 'lisp machine' as they did Haswell, would there
be any inherent advantages or things that could be done differently?

~~~
lispm
Actually, a Lisp Machine is much more than just a CPU: it's a whole computer
architecture, language implementation, operating system, user interface and
hardware design. You think of it as if it were just a CPU, but it was much
more.

Does a Lisp CPU make sense? Economically, no: there is no market for it and no
innovation driver. You have seen what happened to Java CPUs...

Technically? Could be. Instructions would probably be a lot smaller. Memory
would be tagged. The CPU would know more about data structures (which in many
cases would prevent buffer overflow exploits).

Generally, compiled Lisp already runs quite nicely on 64-bit Haswell machines.

~~~
cmrdporcupine
I'm no CPU designer, but I gotta think there would be some advantage to
designing a CPU architecture explicitly around optimizing garbage collection,
at least: keeping the tree of references in cache at all times to minimize
collection scan times, and avoiding thrashing the cache during marking phases.
And dynamic languages would benefit from having tagged pointers.

These days I really feel like I/O is most often my bottleneck, not CPU
execution. That said, when I was doing things in Java a lot that I should have
been doing in C/C++ (or something like Rust now) I fought with the garbage
collector and the latencies it introduced.

~~~
gumby
> I gotta think there would be some advantage to designing a CPU architecture
> explicitly around optimizing garbage collection at least.

That's what we all thought in those days, but Pat Sobalvarro killed it by
using the MMU on a Sun machine to implement a write barrier.

I don't know how well that holds up these days (since you take a fault and a
cache miss), but the principle was that conventional hardware could get better
faster than specialized hardware could, and so the general would overwhelm any
local advantages of the specific.

------
mark_l_watson
I owned a Xerox 1108 Lisp Machine from about 1982 to 1986. It was an amazing
software development platform, and as a business rode the wave of defense
industry investment in AI technologies.

I paid for my 1108 by selling a simple expert system tool for $5000 per
machine license. While I felt good about making my toy more or less cost-free
for my company, I didn't feel great about selling something so simple for
$5000.

I continue to be surprised how much of my consulting work in the last 15 years
has used a Lisp (either Common Lisp or Clojure). From my perspective, Lisp
languages are thriving.

------
strlen
There's an interesting conversation between several Lisp machine folks
(including the late Dan Weinreb) and Azul's Cliff Click. This was at a time
when Azul shipped custom hardware. Very interesting, re: hardware support for
garbage collection.

~~~
WallWextra
It is here: [http://www.azulsystems.com/blog/cliff/2008-11-18-brief-
conve...](http://www.azulsystems.com/blog/cliff/2008-11-18-brief-conversation-
david-moon)

It looks like Azul has basically given up on their custom hardware, I guess?
They found a way to make their GC work on commodity x86.

~~~
cmrdporcupine
I haven't looked in a really long time but wasn't Azul's focus running the JVM
directly on the hypervisor? That seemed like an interesting approach vs using
custom hardware.

~~~
WallWextra
A lot less "interesting," in a very good way.

------
cmrdporcupine
Like the Symbolics machines, the lowRISC variant of RISC-V has tagged memory:

[http://www.lowrisc.org/downloads/lowRISC-
memo-2014-001.pdf](http://www.lowrisc.org/downloads/lowRISC-memo-2014-001.pdf)

As I understand it, its proposed use is for security reasons - preventing
hostile memory corruption. But I imagine it could also be used for the kind of
type and GC purposes that tagging in the Lisp machines was used for?

~~~
leoc
Apple's already using the ARMv8 tagged memory to speed up Objective C on the
iPhone, right? [https://www.mikeash.com/pyblog/friday-
qa-2013-09-27-arm64-an...](https://www.mikeash.com/pyblog/friday-
qa-2013-09-27-arm64-and-you.html)

~~~
hga
Apple is using some of the bits in a normal 64-bit pointer for other purposes,
something that's been done in production Lisps for a long time, since no later
than when the VAX appeared with its byte-addressed pointers. E.g. a valid 32-
bit pointer to an aligned object has its 2 LSBs as zero, allowing for 3 other
values and all sorts of tricks with them.

By comparison, the CADR Lisp machine had a 32-bit word size, with 8 bits used
for tags and the remaining 24 bits used for immediate data or as a word-
addressed pointer. The lowRISC project proposes to add tag bits, stored in a
separate region of memory, that are fetched into a tag cache and carried as
additional bits through the L2 and L1 caches and above. So starting with the
L2 cache, words will internally be 66 bits long.

~~~
cmrdporcupine
I'm baffled that Apple's refcounting implementation doesn't do cycle
collection. There have been good algorithms for this for many years now. I
implemented this one in C++ once:
[http://researcher.watson.ibm.com/researcher/files/us-
bacon/B...](http://researcher.watson.ibm.com/researcher/files/us-
bacon/Bacon01Concurrent.pdf)

------
dschiptsov
Excellence in engineering, unattainable for modern punks.

~~~
stplsd
Yeah, "it even had 16-bit digital stereo sound".

------
amelius
I wonder what the "lisp machine" would look like today (i.e., given today's
knowledge of the execution of rewriting in the lambda calculus).

Would it actually still make sense to make a dedicated design?

------
faragon
Noble experiment and fabulous failure.

~~~
agumonkey
Something I just heard in a documentary:

"A new paradigm is always a tough sell, and ... would not prove a financial
success"

[https://www.youtube.com/watch?v=qxM9pMEnJQ0&index=3&list=PLO...](https://www.youtube.com/watch?v=qxM9pMEnJQ0&index=3&list=PLOQZmjD6P2HlOoEVKOPaCFvLnjP865X1f)

To me, Lisp still holds the number 1.5 (sic) spot in my list of top
programming languages. It is still a magnificent thing, combining minimal and
abstract traits so well.

------
enupten
I wish the CL standard was evolving today...

~~~
mark_l_watson
With the Quicklisp package manager, I would argue that the Common Lisp
platform is evolving and thriving. Even though my (consulting) customers seem
to favor Clojure now, Common Lisp is still a solid choice for some types of
projects that benefit from rapid prototyping, image-based development, and
very fast runtimes.

~~~
enupten
Agreed. I think CL is still an awesome platform.

There are, however, things that can't be changed by packages without a total
rewrite (like Shen). Things like parametric typing, and synergistic mechanisms
between define-compiler-macro and built-in optimizations, would be extremely
useful.

------
popvol
I see the horse is not dead enough, we should continue beating it some more.

The Lisp Machine is what happens when a group of engineers cannot distinguish
vision from tunnel-vision. Probably because they have lived far too long in
the self-congratulatory universe of Lisp, and believe in the divinity of
S-expressions and CONS cells, never once questioning the foundations of their
faith.

~~~
PhantomGremlin
I briefly met the Symbolics engineers, and I didn't get the feeling that they
were the zealots you suggest.

This all happened over 30 years ago. Intel didn't have a hegemony in CPUs.
There was funding available to try new things in hardware. Some of it failed,
some of it succeeded.

Thirty years from now we will be rolling in laughter at the absurdity of so
many of today's startups.

