
Why Did Symbolics Fail? (2009) - pjmlp
http://danweinreb.org/blog/why-did-symbolics-fail/
======
mjn
Good article I hadn't run across before; thanks for linking it.

One interesting side story is the odd Symbolics foray into animation and video
through the Symbolics Graphics Division, which afaict was reasonably
successful for a few years. It seemed like the sort of application-oriented
business that might let the Symbolics machines break out of single-purpose "AI
machines" into being seen as general workstations, or at least dual-purpose
AI/graphics workstations. But that business seems to have sort of evaporated
along with the rest of their business, despite not being AI and there being no
"graphics winter". Perhaps steamrollered by Silicon Graphics, and the same
trends towards Unix and commodity CPUs? Or not marketed well enough, by a
company whose public materials focused too exclusively on Lisp+AI?

Some bits from that era: the S-Package 3d modeling/animation system
([http://www.youtube.com/watch?v=gV5obrYaogU](http://www.youtube.com/watch?v=gV5obrYaogU)),
the PaintAmation package
([http://www.youtube.com/watch?v=Cwer_xKrmI4](http://www.youtube.com/watch?v=Cwer_xKrmI4)),
and the first HDTV processing on a workstation
([http://www.youtube.com/watch?v=KppVP8PiZag](http://www.youtube.com/watch?v=KppVP8PiZag)).

~~~
agumonkey
Sorry for the small plug: I gathered some videos and links about S-Graphics
offsprings, mostly Mirai, done at Nichimen in
[http://www.reddit.com/r/nichimen](http://www.reddit.com/r/nichimen)

Some of these videos are already in, but if anyone has other articles or
papers about anything S-Graphics/Mirai, like news about the izware guys, feel
free to post.

ps: in case the subreddit is locked, I'll take links here.

------
Zigurd
I knew several of the Symbolics founding group, especially the engineers, but
several of the management team as well. The tl;dr in the article comes here:

 _Meanwhile, back at Symbolics, there were huge internal management conflicts,
leading to the resignation of much of top management, who were replaced by the
board of directors with new CEO’s who did not do a good job, and did not have
the vision to see what was happening._

Symbolics was supposed to be the consensus Next Big Thing. A rival to Sun. A
gravy train.

There were titanic egos and divas among the engineers, too. A physical
expression of which here: [http://en.wikipedia.org/wiki/Space-
cadet_keyboard](http://en.wikipedia.org/wiki/Space-cadet_keyboard) But several
of them are also remembered for big contributions to the industry.

~~~
mzs
Regarding the graphics: what I think happened is that the whole Lisp thing
just did not take off. When Symbolics began there was a lot of excitement -
them, MIT, LMI, Xerox, even VAX early on. But it never really caught on, and
some of that was because the fighting over the code between Symbolics, MIT,
and LMI held it back. So by the time Symbolics had good stuff, it was pretty
much just them left - they had won the fights - and the Unix vendors were
coming along.

So for the price of one of their machines you could get a few Sun boxes, say,
and each could support several users. Genera was terrible about that: only one
user could be logged in at a time, and that user could change anything in the
OS or the programs however he felt, so the next guy had no idea why everything
was different.

Basically the graphics stuff was for effects and animation, and that did well
for a while, but it was so alien to what everyone else was doing by that point
that no other vendors could easily make software portable between Genera and
other hardware/OSes. So they did not, and the S- software was a really
expensive route; in the end it made the most sense to run it in VMs on OSF/1,
of all things!

The other angle they could have exploited was CAD, but again Genera was the
thorn. There was no way a Pro/E or anything of that sort would make it over.
The C on Symbolics was some strange half-dynamic thing with GC implemented in
Lisp - again, just so alien. Essentially Dassault made a CAD system (ICAD?)
for it and that was it. But it was somewhat ahead of its time (there are now
add-ons to all the mainstream CAD software that take a similar approach, but
back then everything else was essentially just drafting on a computer) and it
was viewed as odd. It was also very expensive and had all the single-user
drawbacks, so AutoCAD on a PC or Pro/E on a Sun were much more appealing. And
because Lisp was abandoned, even Dassault could never really port it to modern
hardware; it was a big, expensive golden goose, so they tried for a while but
eventually killed it and created/bought other products.

------
tsmith5432
I worked in an expert systems company in the 80s (one mentioned in the
Phillips thesis referenced in the danweinreb article). Take all this with a
dose of IIRC.

As part of our work, we evaluated and benchmarked Xerox Interlisp machines,
Symbolics systems, VAXen, later Gold Hill etc. to find a cost-effective
delivery platform. We even eventually funded the development of a delivery-
focused subset of Common Lisp.

One aspect that Symbolics didn't seem to understand back then was cost of
entry and deployment: the Xerox D-machines were (IIRC) around 1/3 the cost of
the Symbolics. Perhaps not as speedy, but adequate for our day-to-day
development work as well as for the end customer's needs.

Symbolics had great development systems, but the delivery answers were late in
coming; too late to help us.

There's lots more to be said about the late 80s collapse of AI (ES)
applications and expectations, but the margins here are too small to contain
it....

~~~
jff
You did benchmarking so you may be able to confirm/deny this: I've heard that
Lisp machines went out of favor because Lisp just ran faster on a VAX. Was
this the case?

~~~
hga
Besides horrible management---echoing Zigurd, a friend and contemporary, I was
at one point reliably told that manufacturing, R&D and marketing were paying
no attention to each other, such that manufacturing had built a factory that
couldn't make the latest hardware R&D had developed, which was done completely
independently of what marketing thought was needed---they were killed by non-
recurring engineering (NRE) costs.

Basically they couldn't amortize the NRE for their custom hardware - and
later, most especially, chips (from memory: first a chipset that spread the
CPU across several chips, rather like the one Western Digital did that among
other things enabled the LSI-11, then of course an all-in-one chip) - across
the huge number of units that Intel and Motorola sold. They also canceled the
RISC rethink of their basic low-level architecture on the day it was supposed
to tape out; I don't know if it had a low enough gate count to be like the
first SPARC processor, which was implemented on two 20,000-gate gate arrays,
one for the CPU and one for the floating-point unit (gate arrays are all alike
until a few layers of metal are put on top).

So soon enough you could run a full Common Lisp, almost certainly without as
much run-time error checking, faster on cheap commodity hardware than on a
Lisp Machine.

Something like that seems to have happened to Azul Systems, which apparently
isn't developing any new hardware but is selling their version of the HotSpot
JVM to run on x86_64 hardware. A prior generation of their pauseless GC (vs.
pauses on the order of 1 second per GiB for conventional collectors, a big
deal if your heap is 100s of GiB) required a software write or read barrier
that cost ~20% of performance (all this from memory). It's likely that soon
enough, even if it perhaps ran slower than on their custom hardware, it was a
_lot_ cheaper to run it on commodity Intel/AMD hardware.

~~~
aidenn0
I can't find a reference for this, but I seem to recall that Azul uses
virtualization features of modern CPUs to decrease the read-barrier overhead;
if that is correct, then that's a case of the general-purpose hardware
fortuitously getting features to out-compete special-purpose hardware.

~~~
hga
No, the older version of their GC used bulk VM operations, but not
virtualization features, and there was still a penalty reported ... errr, I
can't find it now. Probably in a Clifford Click blog posting. I just skimmed
the new edition of Jones' GC book
([http://www.amazon.com/gp/product/1420082795/](http://www.amazon.com/gp/product/1420082795/)),
published before it could consider the newer one; it talks about the changes
needed on stock hardware but I didn't see any estimation of costs while
glancing through it. (I'm not searching any more right now because it's
obsolete.)

The base papers are:

Pauseless GC, which uses a read-barrier instruction in their custom 64-bit
RISC chips:
[https://www.usenix.org/legacy/events/vee05/full_papers/p46-c...](https://www.usenix.org/legacy/events/vee05/full_papers/p46-click.pdf)

And the newer one that they're using in that old hardware and the software on
commodity hardware Zing JVM, the Continuously Concurrent Compacting Collector
(C4), which I have _not_ studied (the paper was published 2 weeks after the
Joplin tornado trashed my apartment and rather disrupted my life):
[http://www.azulsystems.com/sites/default/files/images/c4_pap...](http://www.azulsystems.com/sites/default/files/images/c4_paper_acm.pdf)

It's possible they figured out how to minimize or eliminate the penalties of
the original software read barrier they applied to the Pauseless system in C4
(or perhaps in relation to their custom vs. newer commodity hardware); I just
did a quick skim of the relevant part of the C4 paper and a few keywords and
couldn't tell.

This is all great stuff that I hope to get back to soon....

------
nathell
The dates on Dan Weinreb's blog are all wrong for some reason. This post
likely dates back to 2011 or before. (EDIT: Ah, corrected by the moderators,
thanks.)

I had the pleasure of meeting Dan at ECLM 2008, four years before his death.
He was hacking away on his XO-1 and showing around fancy things. A very
memorable man. He's missed.

~~~
avodonosov
Corrected by moderators? How corrected? The dates are still wrong.

~~~
nathell
The original title didn't mention the date at all.

------
dschiptsov
Exactly why Java succeeded - the world is dominated by mediocrity, while
idiots mostly occupy the management positions. This is why all we got nowadays
is Windows PCs with Java. It is "worthy of us", or "suits us well".

Designers were too smart (consider David Moon) while management was "as
usual".

~~~
arethuza
I think that's a bit harsh - I actually moved from mostly doing Common Lisp
development (on DEC Alpha workstations) to Java development in early '95 and I
_really_ liked Java back then - I was as passionate about it as people are
today about Go/CoffeeScript/Haskell. Java, at least at the start, really was
something fresh and good.

Sure, it became a horrible bloated mess - but isn't that the doom that faces
all successful software eventually?

~~~
pjmlp
Same here. I remember how fresh it felt, bringing GC to the masses and
providing a more consistent experience across platforms than C or C++ were
capable of, with their compiler-specific behaviours, extensions, and slow
catching-up with the ongoing standardization efforts.

~~~
SixSigma
Sorry but it was Visual Basic that brought GC to the masses.

------
CurtMonash
Dan was the guy at Symbolics most commonly tasked with making me believe
Symbolics would succeed. He succeeded at the task. I forgave him.

------
rurounijones
For those who had no idea what the hell Symbolics is (as I didn't), I think it
is referring to the company: [http://smbx.org/](http://smbx.org/) has info.

~~~
herokusaki
It's sad what's become of [http://symbolics.com/](http://symbolics.com/). I
thought it would go to some Lisp-related project eventually.

------
cafard
I highly recommend the book _Patterns of Software_ for which the link is given
here. It includes the story of Lucid's troubles, which were similar to
Symbolics', though Lucid was not in the hardware business. Quite apart from
that, it is well worth reading.

