
Why Lisp Failed (2010) - tosh
https://locklessinc.com/articles/why_lisp_failed/
======
FigBug
I think Lisp failed because it had no killer app. Most developers don't pick a
language; they pick a project and then select the most appropriate language.

Web frontend -> Javascript

Unix / Linux -> C

Wordpress plugins -> PHP

Windows apps -> .Net

iOS -> Objective C / Swift

Android -> Java

In my entire career (25 years), I've never had a project that directed me
towards learning Lisp. This pretty much leaves Lisp to the type of developer
that seeks out new languages and is willing to spend the extra effort
integrating, and that's a pretty small number of developers.

If Lisp were in the browser instead of Javascript, it would be popular no
matter the complaints about the language.

~~~
dehrmann
That still doesn't explain Java's popularity as a backend language. Unless
J2ME or Applets were the killer app.

~~~
callmeal
Java features that have yet to show up in other languages:

\- Scalability, both horizontal and vertical.

\- Memory utilization: you can have humongous heaps without having to worry
about managing all that memory.

\- Manageability: easy to "divide and conquer" - you can plop a jar and start
using it right away, something that's practically impossible in other
languages.

\- A third-party ecosystem (both free/OSS and paid).

\- Speed: Java is one of the fastest managed languages.

Yes, it is verbose; yes, inexperienced developers do the wrong things. But it
does a great job of preventing you from shooting yourself in the foot, and
most importantly, once you have a stable codebase, it can keep running for
months/years.

~~~
dehrmann
> you can plop a jar and start using it right away

I'm mostly a Java developer, so I think this explains why Docker always looked
a little silly to me. I also write some Python, and once I have to set up a
virtualenv, that's when Docker's existence makes more sense.

------
JulianMorrison
Lisp has never been noticeably better overall.

\- In the past, it was better at abstraction, but slow and niche and you
needed to shell out for hardware.

\- In the present it is _not better_. Common Lisp is about on a par with
Python in what it has built in, inferior in its ecosystem, and for some modern
stuff (threads, sockets) you will have to go outside the language standard
into the wild west of implementation extensions or third party libraries.

Lisp's one big selling point is macros. Macros are magic that lets you
reprogram the language. Reprogrammed languages break the ability of a
programmer to drop in and read the code. Languages that need reprogramming to
get basic shit done are unfinished or academic (Scheme). Languages that can
get stuff done, but tempt you to reprogram them are attractive nuisances
(Common Lisp, Ruby). In use, they create incompatible tangles that don't mix
well with other people's incompatible tangles. I have been bitten by this
repeatedly in Ruby. But Ruby is still easier to get work done in than Common
Lisp.

Basically it died of being meh with a side of footgun.

~~~
flavio81
>Common Lisp is about on a par with Python in what it has built in, inferior
in its ecosystem

about 20x+ faster than Python, too; speed in the same order of magnitude as
C.

>and for some modern stuff (threads, sockets)

pretty amazing that you bring up threads in this comparison, when Python
doesn't even support true multithreading.

>Macros are magic that lets you reprogram the language. Reprogrammed languages
break the ability of a programmer to drop in and read the code.

It's the other way around. Lisp macros allow you to eliminate boilerplate and
produce clearer code. This is obvious to any lisper.
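
Here is a minimal sketch of what that boilerplate elimination looks like (the
WITH-TIMING macro and its name are illustrative, not from any standard
library): instead of repeating clock-reading code at every call site, the
pattern is written once.

```lisp
;; Hypothetical example: fold timing boilerplate into one form.
(defmacro with-timing ((&optional (stream '*standard-output*)) &body body)
  (let ((start (gensym "START")))
    `(let ((,start (get-internal-real-time)))
       ;; Return BODY's values, printing the elapsed time as a side effect.
       (multiple-value-prog1 (progn ,@body)
         (format ,stream "~&elapsed: ~,3f s~%"
                 (/ (- (get-internal-real-time) ,start)
                    internal-time-units-per-second))))))
```

Call sites then read as plain code, e.g. (with-timing () (expensive-
computation)), with no timing noise inline.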

~~~
JulianMorrison
For reference, I'm not holding up Python as a good language, I'm holding it up
as a _comparable_ one. Python is warty and creaky, but more mainstream than
CL, so it's a fair comparison.

A _good_ language would be Go.

~~~
ACow_Adonis
As a Common Lisper (among others) interested in language design, I've just
started reading up on Go, after I've been told it's "simple, robust, safe,
well-designed".

Purely subjectively, a lot of what I'm seeing in Go so far looks to me like a
"2nd system design" for C programmers. I'm hoping that the concurrency model
is where it shines through with some good new ideas and implementations
because I've been relatively unimpressed with the rest of it so far. That's
not to say it's bad per se, but I've seen nothing in it that makes it stand
out from other languages design-wise, and it feels rather warty to me so
far...

Purely my own subjective opinion.

~~~
shanemhansen
> looks to me like a "2nd system design" for C programmers

That seems fairly accurate.

> I'm hoping that the concurrency model is where it shines

I'd say that channels aren't quite the magic thing they are made out to be and
goroutines aren't that amazing. The real win of Go for me is three things:

1\. Evented runtime with a synchronous interface. "Go is my favorite epoll
library"

2\. A standard lib that has great http support (things like keepalive and
connection pooling and http2)

3\. A build and test toolchain that makes coverage and race detection and
memory and CPU and lock contention profiling trivial.

------
tgbugs
Most of my day to day work is in python, but I have a number of projects in
Racket and have been dabbling with common lisp. I think the author is simply
incorrect here. Other Algol-like languages offer the illusion of
understanding, but then you get things like 'patterns' where even though in
principle you know what the atomic elements of the language mean you are
confronted with an undocumented 'pattern' that you have to puzzle out to
understand what the code is actually trying to do. Lisp provides structure for
these patterns in the form of DSLs, and if that means someone has to confront
that they don't know something, at least it whacks them in the face rather
than giving them the illusion of understanding. More importantly, Lisps provide
a way to formalize, specify, check, and document the pattern.
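
As a hypothetical sketch of that formalization (WITH-RETRIES is my own name,
not a library's): the retry idiom, which in most languages exists only as an
unwritten convention, becomes a construct that is specified, checked, and
documented in one place.

```lisp
(defmacro with-retries ((n &key (delay 0)) &body body)
  "Run BODY, retrying up to N times on error, sleeping DELAY seconds
between attempts."
  ;; The pattern is checked at macroexpansion time (this sketch
  ;; assumes a literal N; a real version would defer the check).
  (check-type n (integer 1))
  (let ((i (gensym)) (blk (gensym)))
    `(block ,blk
       (dotimes (,i ,n)
         (handler-case (return-from ,blk (progn ,@body))
           (error (e)
             (when (= ,i (1- ,n)) (error e))  ; out of retries: re-signal
             (sleep ,delay)))))))
```

A reader who doesn't know the construct gets whacked in the face by the name,
looks it up once, and finds the whole pattern in one documented place.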

So if lisp failed for some internal reason then I don't think it was for the
reason the author specifies. What should worry people more is that a language
that is better in 99% of cases can fail for reasons that have nothing to do
with its technical merits, but purely by accident of history or as a result of
politics unrelated to those technical merits.

Trilobites were the most diverse clade in the history of the earth. Trying to
rationalize why they went extinct by studying their anatomy is going to be
fruitless. Same with LISP.

------
overgard
This is a really loose definition of “failed”. His definition of failed seems
to be “didn’t take over the world”... but most things don’t. It’d be like
saying “John Smith failed at life because he wasn’t a billionaire by age 25”.
As a language it was (and is) used in a lot of places, had a lot of offspring,
and brought a lot of influential ideas into the world. That doesn’t seem like
a failure!

~~~
ajkjk
Considering the number of people who have bemoaned over the years how Lisp
never became mainstream, I think it's fair to say that it failed compared to
what its enthusiasts wanted.

~~~
xhgdvjky
I think enthusiasts are separated from the mainstream by definition

------
caiocaiocaio
Did it? People seem to use Clojure a lot, other dialects are popular with
hobbyists, and some of the most enduring computer science books of all time
use it. How many other languages from the 50s can claim that kind of wide use?

Aside from that, as the author points out, ideas from Lisp have made their way
into almost every widely-used language today. Its fingerprints are
everywhere. That doesn't sound like a failure to me.

~~~
logicprog
I often daydream about an alternate universe where the entire Lisp community
unified behind Clojure, built other runtimes for it, etc. I wish we could do
that, because it would become a force to be reckoned with. Clojure is
perfectly designed and positioned for success in the industry, the real
problem for lisp is that the community is so fragmented.

~~~
slifin
I too have a similar dream. Clojure already has a few good runtimes, but the
more the merrier; compiling to native via GraalVM recently was very cool.

~~~
logicprog
Wow wait that's a thing? I haven't kept up with Clojure for about a year. What
runtimes are available? Is there one that doesn't run on a VM?

~~~
slifin
GraalVM can compile a native binary for all major OSes that does not contain
the JVM but does run SubstrateVM, which generates binaries on the order of
tens of megabytes (2 in my case). It eradicates JVM startup time, so now
things like scripting are viable without using JS.

Runtimes for Clojure include the JVM, JavaScript engines, and .NET; then there
are less popular runtimes like Go, Erlang, and Python. Realistically I'd use
one of the first three till the community is bigger.

I don't know another language with as much practical reach as Clojure has

~~~
flavio81
>Runtimes for Clojure include JVM, JavaScript engines, .NET

Except that they don't support the same Clojure language.

ClojureScript is a success, and code written for the JVM Clojure won't
necessarily run on ClojureCLR.

Clojure's overreliance on the JVM at the very core was a bad decision.

Take Common Lisp: the exact same language runs on the JVM, native machine
code (many platforms), and LLVM, with zero code changes required.

~~~
flavio81
* typo: ClojureScript is a _subset_

It is also a success, on the other hand. It's a good alternative to
Javascript.

We also have a subset of Common Lisp that can be statically compiled to
javascript: Parenscript. And a subset of CL that runs in the browser as a Lisp
implementation: JSCL.

------
karmakaze
> So what is the problem with creating domain-specific languages as a problem
> solving technique? The results are very efficient. [...] It results in many
> sub-languages all slightly different. This is the true reason why Lisp code
> is unreadable to others.

> The reason Lisp failed was because it fragmented, and it fragmented because
> that was the nature of the language and its domain-specific solution style.

The main points above can be solved with two things:

1\. A more opinionated base language with batteries (e.g. Clojure)

2\. A package manager shared by users of the language

Ruby is expressive and has the same DSL failure modes as Lisp (though usually
not as low-level), but it starts with a more batteries-included base. If it
weren't for Rails and gems leading the way to normalized usage, Ruby would not
be as successful as it is today.

~~~
janoc
Sorry but that's just wrong. He is complaining about DSLs making things hard
to read. That's not fixed by switching to another language or a package
manager. That's purely a software engineering problem - nobody forces you to
write code in that style.

And LISP code being unreadable to others - sorry, that's bull. Seen, e.g.,
some "modern" Javascript stuff recently? Good luck making heads or tails of
some of the frameworks - it often doesn't even look like Javascript anymore!
And nobody seems to be claiming that it leads to the demise of the language.
Or numerical calculations in Fortran - still a gold standard for scientific
stuff.

To me the article very much reads as the output of someone who couldn't be
bothered to learn the language, griping about things he doesn't really
understand well and making huge generalizations based on that.

Why LISP isn't popular today is simply that, unlike for other languages,
there hasn't been a free/cheap compiler and IDE for the PC for a very long
time (except for the crippled LispWorks). LISP has always been a
university/research thing, running either on specialized machines or later on
Unix, not something "mere mortals" had access to.

Also most programmers have been taught languages like Pascal/C/Java, maybe
Scheme in their introductory courses and have never been exposed to LISP, so
they have no way to know about it unless they are themselves curious about it.

~~~
dreamcompiler
> there hasn't been a free/cheap compiler and IDE for a PC for a very long
> time

CCL is a free Common Lisp compiler that runs on Mac, Windows, and Linux. On
the Mac, it comes with an IDE. It's been around for about 20 years.

SBCL is another free compiler that uses Emacs/Slime as an IDE.

~~~
AgentOrange1234
I agree that CCL and SBCL are superb on Linux and Mac. But until at least a
few years ago, they were both poorly supported on Windows (requiring something
like Cygwin). That may have changed now. Also, AFAIK, CCL is dying now. :(

~~~
flavio81
>AFAIK, CCL is dying now

CCL is the 2nd most downloaded Lisp implementation, so it can't be called
"dying."

You'll find that new lisp projects are almost invariably tested on CCL too.

~~~
AgentOrange1234
The latest commit is 2 months ago.[1] I don’t know how to define “dying,” but
it doesn’t seem healthy.

It saddens me, as it is/was a great tool for me for many years. I hope SBCL
can take it from here.

1\. [https://github.com/Clozure/ccl](https://github.com/Clozure/ccl)

------
abrax3141
Lisp hasn't failed. Software engineering has failed to reach the conceptual
level that Lisp operates at.

~~~
dreamcompiler
This is precisely correct IMHO. Lisp was designed for single programmers
managing their entire stack (including the compiler itself) rather than for
teams mostly gluing libraries together. Modern software engineering practices
are focused on lower-level languages that don't give programmers the power of
Lisp.

[Herein I mostly mean Common Lisp when I talk about Lisp. I have no experience
with large projects in Clojure or Scheme.]

Most modern software -- Lisp programs included -- is written by multiple
programmers in both coordinated and uncoordinated teams. Lisp can easily adapt
to team programming but SWE needs to be augmented to handle programming
projects that operate at a meta-level, which is natural with Lisp. If you
write code at a meta level with ordinary SWE it can lead to disaster.

SWE needs tooling, procedures, and policies to deal with macros, package
naming, bootstrapping, multiple-inheritance classes and first-class methods,
etc. before it can really support Lisp. Lisp projects that have added such SWE
features manually tend to succeed; those that don't tend to have difficulty
with Lisp. My point is the team has to do the SWE work; SWE tools operating in
default mode tend not to be good enough for Lisp.

~~~
lispm
> Lisp was designed for single programmers managing their entire stack

Funny, McCarthy had a whole team working on and with Lisp.

~~~
dreamcompiler
So did Symbolics. And most of those programmers had full-stack mastery. That's
not the case with modern software engineering projects. Nor did McCarthy or
Symbolics have to deal with the manager-friendly buzzword salad that permeates
most software projects today.

~~~
lispm
> those programmers had full-stack mastery.

After many years working in that environment...

------
garren
It's difficult to agree with the idea that VB displaced COBOL. Even if this
were the case, it certainly wasn't because COBOL compilers were expensive
compared to "a cheap interpreter" that came with the machine. COBOL compilers
surely were expensive, but they were also intended for mainframes rather than
PCs. Also, I don't remember VB ever coming packaged with Windows for free. VB
has been a part of Visual Studio for as long as I can remember, and I don't
believe there was a free version of it until around VS2005.

Lisp, it turns out, is a weird and challenging language. It takes a little
extra before you start to see the exciting bits. Unfortunately, until you
reach that point, it's a matter of struggling with the parts around it -
namely the editor. Lisp and Emacs go hand in hand, and while today you can
rely on other editors to competently work with Lisp code, 20 years ago I don't
think this was the case.

So now a user has to pick up Lisp (challenging when viewed through the lens
of either VB or COBOL) and Emacs (positively baroque if your only experience
is within a GUI and IDE). I'd venture to guess that getting a functional Lisp
environment running on a 286/386 era PC was probably a challenge in itself for
most people.

Now consider VB or Java, significantly more familiar languages that didn't
effectively depend (at the time) on the features of their editor or IDE. Both
were a mouse-click away as far as installation, and Java didn't cost anything.
Both were backed by large organizations that had a lot of incentive to invest
and convince people to use their particular thing.

Lisp was this weird, mostly academic, language intended to tackle relatively
esoteric rather than business problems, that was Unix only at a time when PCs
and Windows were taking off. It doesn't seem unusual to me that it didn't take
over the world. Frankly, I'm surprised and thankful that it's still around,
actively used, and even sporting a significant community.

~~~
kazinator
> _while today you can rely on other editors to competently work with Lisp
> code, 20 years ago I don't think this was the case._

I've been doing Lisp for just about that many years and have used nothing but
Vim. Its Lisp support is built off :set lisp mode which is in the original
Unix vi. It does a good job of syntax coloring, indentation and all that.

To reindent a block of code in Vim, I just put the cursor on one of the
parentheses, then hit =%. Only the odd time is there a stylistic disagreement
(like Vim wanting to align the arguments of something as if it were a
function, rather than let-like).

My Lisp implementation, TXR Lisp, comes with a comprehensive Vim syntax file.

When you browse the sources using CGIT, all the syntax coloring you see is
performed by Vim, being used on-the-fly as a back end for HTML generation.

E.g.:
[http://www.kylheku.com/cgit/txr/tree/share/txr/stdlib/defset...](http://www.kylheku.com/cgit/txr/tree/share/txr/stdlib/defset.tl)

~~~
ksaj
> =%

Yikes. I've been using vi forever and that never occurred to me. I just tried
it on my own config and it works just as you say. I hope it is configurable,
'cos it doesn't fully agree with my habits (although maybe that's a fault of
my own, and not vi(m)'s!)

Best thing about HN comment threads is that these gems keep happening.

~~~
kazinator
I used that on C code before Lisp: jump from { to } or _vice versa_, while
reindenting.

Whether you get this alignment:

    
    
      (oper args
        blah blah)
    

or this one:

      (fun arg arg
           arg arg)

depends on one very simple configuration: the _lispwords_ variable. All the
identifiers listed in _lispwords_ are given the former treatment.

Then there are minor squabbles like what happens with the _t_ clause in
_cond_.

A lot of the time I just use visual select and =.

Also, in Vim, when you reindent multiple lines, it basically assumes that the
first one is indented right; it will not move it relative to it predecessors.

You often have to hit == on the first line to get it into alignment, and then
%= (or whatever) to reindent the range below it. E.g.:

       (let ((x y))
       (blah        ;; here we type ==
         blah blah
         (foo bar
              boo))
    
       (let ((x y))
         (blah      ;; now %=
         blah blah
         (foo bar
              boo))
    
       (let ((x y))
         (blah
           blah blah
           (foo bar
                boo))

------
sametmax
The crux of the article is:

> The reason Lisp failed was that it was too successful at what it was
> designed for. Lisp, alone amongst the early languages was flexible enough
> that the language itself could be remade into whatever the user required.
> [...] However, the process causes Balkanization. It results in many sub-
> languages all slightly different. This is the true reason why Lisp code is
> unreadable to others. In most other languages it is relatively simple to
> work out what a given line of code does. Lisp, with its extreme
> expressibility, causes problems as a given symbol could be a variable,
> function or operator, and a large amount of code may need to be read to find
> out which.

And I perfectly agree. You can see it in tutorials too. You can even see it in
any Lisp-lover comments showing you a proud snippet of how "it can be solved
better with lisp": it's too clever.

A little bit of clever is fun, it's good for the soul, it can even be
productive. But Lisp is just a big pile of cleverness.

Being clever is so appealing to my geek nature. It's so cool. Yet experience
(in coding and life) taught me it is not a good property to build a community
around. Or a project for that matter.

At best, cleverness is something that can emerge from a battle against a long
and arduous stream of problems. And then, when you can rest, you try to factor
it away. But not as a goal. Not as a basic proposition.

~~~
reikonomusha
Do you have examples of otherwise well-regarded Common Lisp software that
fits the criteria you state?

People have used Lisp cleverly, but most modern libraries are a bunch of
functions and classes. Sometimes more foreign features are used, like macros
or even reader macros, but they do so to help one express him/herself. Even
the open source Common Lisp compilers, written by arguably the lispiest of
Lispers, don’t have a lot of “cleverness”. Check out SBCL, CCL, or ECL.

Some of the prolific library writers—Weitz, Fukamachi, Hafner (aka Shinmera),
Strandh, Beane, etc.—all produce good libraries and good code, and aren’t at
all as you describe. Check out Hunchentoot (a web server), Clasp (a new CL
compiler), CL-PPCRE (regex library), Radiance (a web framework), Quicklisp
(package manager), ... These are all easy to jump into and relatively easy to
understand if you know the domain.

Even if you go back and look at the Symbolics sources, there really isn’t a
lot of cleverness.

If the complaint is that some lone wolf programmers write obscure code, then
I’m not sure what’s to worry about. Somebody’s lone wolf code isn’t what
you’re going to be importing or depending on.

I hear the opinion you state somewhat often in drive-by comments, but I never
see examples, just anecdotes.

~~~
ksaj
Don't write clever. Write clear.

------
mark_l_watson
I use Common Lisp for most of my development now that I am (mostly) retired,
so I admit some bias. Given that:

Common Lisp tooling is very good and there is an active community of users. I
argue that Haskell is really a Lisp language, at least it seems so to me.
Also, so many good Lisp-y things are in Ruby, Python, modern JavaScript, etc.

I think language selection for a project is a process of making trade offs. I
am happy that it is a free world to develop using whatever languages and tools
we want. I try very hard to never judge anyone on their politics and I try to
do the same in programming language wars.

------
ddragon
Are there examples of languages that were not popular for a long while before
suddenly going mainstream (the definition of success from the article)? I feel
like even a potential killer app is not enough: there were some interesting
web frameworks for Smalltalk (like Seaside), but they weren't enough to shake
in any way the feeling that the language was a dead end for most people (and
they can just wait for a reimplementation in a popular or newly hyped language
if it's really a good idea).

Phoenix/Ecto for example would probably not get the attention it gets (and
deserves) if it focused on Erlang instead of a new language with some
momentum. And Lisp seems to get it even worse when all of them (old and new,
with as many programming paradigms as there can be) get grouped together as
one 60 year old language family. Hopefully clojure, racket and other newcomers
can turn this around.

~~~
jenshk
Ruby was created in the mid-90s, long before Ruby on Rails became the killer
app.

~~~
ddragon
Yes, I was thinking about that case. While Ruby was comparatively still a
modern language at that point, Rails used the metaprogramming of the language
to effectively create a new unique language on top of it with its opinionated
design. It might even be a point in favor of Lisp, since even in an older
language like Common Lisp it can still be possible to build a new language on
top that feels modern, maybe some kind of tidyverse environment for handling
the next AI wave paradigm whatever it may be.

------
chicagoscott
Paul Graham suggested in 2004 that not having an O’Reilly book was a real
factor:

[http://www.andrewlangman.com/articles/the-missing-lisp-
book....](http://www.andrewlangman.com/articles/the-missing-lisp-book.html)

~~~
DonHopkins
Can you guess the animal that should have been on the cover?

[http://www.maclisp.info/animal/](http://www.maclisp.info/animal/)

[https://www.donhopkins.com/home/archive/lisp/animal.l](https://www.donhopkins.com/home/archive/lisp/animal.l)

~~~
chicagoscott
Maybe given the age and longevity of the language and the creature, the
tuatara would be the appropriate cover animal. (It is in the animal.l source.)

------
namelosw
1\. The more I dig into Lisps, the more I find Lisp is not a language. 'Lisp'
is more like the term 'C-like'.

Common Lisp, Scheme, and Emacs Lisp are way too different from each other.
It's more like a comparison with C++, JavaScript, and Bash. Common Lisp is
hardly a functional programming language. Emacs has no lexical scope (This
really made me freak out). Clojure is a more qualified functional programming
language.

2\. Lisps, for example Scheme, were superior to other languages decades ago.
But now, as you can see, JavaScript is basically a Lisp without macros (yeah,
I know HN guys don't like JavaScript, but I chose JavaScript because it's a
footgun language like Common Lisp and Emacs Lisp). Or just Elixir, basically a
Lisp without paren syntax.

3\. Lisp never dies, there will be more. It will march forward as the industry
moves. The Lisp syntax means the ease of parsing and macro design, and people
will combine this feature with other features. There are already:

a) Functional Lisps like Clojure

b) Carp, a Lisp without GC (there was also Linear Lisp)

c) WASM, an assembly with S-expressions

d) I have already seen other prototypes with static types/dependent
types/monadic macro systems, etc.

~~~
lispm
> Emacs has no lexical scope

it has now

[https://www.gnu.org/software/emacs/manual/html_node/elisp/Le...](https://www.gnu.org/software/emacs/manual/html_node/elisp/Lexical-
Binding.html)

------
Quequau
I've been using Lisp off and on for pretty much my entire adult life... at
least since the mid eighties. It doesn't seem like a failure to me.

------
TheOtherHobbes
LISP didn't fail. It released plenty of spores which modified the chromosomes
of the rest of the language ecosystem.

The original LISP body then went into suspended animation, and may reappear in
an evolved form at some point in the future.

------
runn1ng
Add (2009) to this article.

[https://locklessinc.com/articles/](https://locklessinc.com/articles/)

------
thsealienbstrds
Right now I'm using Lisp for a hobby project. I find it actually really
enjoyable. Lisp's metaprogramming support is super powerful. I'm the kind of
developer who likes to explore different styles of programming. If you're like
me, Lisp will not fail you. It really is a "programmable programming
language".

For example, Lisp is not purely functional, but if you want to force yourself
to program in a bit more of a functional style, you can make a macro that
disallows the use of 'setf' (Lisp's assignment operator) in a function body,
and it's actually quite easy to do that!

Maybe Lisp 'failed' because it's too flexible. You can solve your problem in a
million different ways. Arguably, that is not a good thing in industry where
it is preferred to have a language that enforces a single style on your code
base for consistency across developers. However, I'd argue that it is very
good thing for a creative individual. You want to have the freedom to do
things your way.

From what I've experienced so far, Common Lisp offers a unique combination of
customizability, utility, and interactivity. It's been around long enough
for it to have a significant amount of libraries, and it has a multi-threaded
REPL (so you can run your main loop and hot-reload your code-updates at the
same time, how cool is that?!).

As for the parentheses... yes, I can see why Lisp code would be hard to read.
On the other hand, with the right editor extensions (parinfer) it's really
easy to edit, and you don't even need to think about the parentheses when
doing it. The documentation is alright I'd say, you can find plenty of stuff
online.

Not sure what else I can say. I'm having a good time with CL... I'll try not
to turn into a 'smug lisp weenie'?

------
jodrellblank
Elephant in the room: the hostility of the LISP community. Is there any
language community with quite the reputation for unwelcoming unpleasantness,
combined with the opposite - no counterbalancing reputation for welcoming
friendliness?

That people feel bad when they try to enter the LISP world is much
documented[1]. That Python and, say, Julia managed to get reputations as
welcoming communities long after Eternal September suggests the problem isn't
"everyone" invading Usenet.

I'm expecting immediate downvotes for this comment, but what I really hope for
is "yes I have that feeling about LISP world compared to other languages" or
"no I feel LISP world is represented approximately the same as other
languages".

[1] e.g. [https://eli.thegreenplace.net/2006/10/27/the-sad-state-of-
th...](https://eli.thegreenplace.net/2006/10/27/the-sad-state-of-the-lisp-
user-community) from which comes this

> " _And yes, I think it still represents the comp.lang.lisp group, even in
> the post-Naggum area. Just look up the latest debate of the shortcomings of
> Common Lisp that was incited by Steve Yegge. Lisp is as close to a perfect
> programming language as any, and its zealots guard it rabidly. Criticism isn
> 't tolerated. Ignorance isn't tolerated - which drives many newbies away
> quite quickly. Lisp has been attacked so much before, that any post by a
> newbie meekly seeking an advice on some arcane feature gets immediately
> interpreted as criticism and turns into a flame war. True, Erik Naggum was
> the main cause of such flame wars in the past, but he wasn't the only one.
> Wake up, Common Lisp gurus - your language is so unpopular because of you -
> there is no one else to blame. If you ever asked yourself why is CL much
> less popular than the obviously "brain dead" Perl (paraphrasing Naggum's
> favorite Perl quote), the answer is the community, that is you._"

~~~
jodrellblank
A fun read about Common LISP community [https://github.com/exercism/common-
lisp/issues/206](https://github.com/exercism/common-lisp/issues/206)

~~~
flavio81
That "Hexstream" guy is a known mentally ill person who harasses other Lisp
community members in various ways (github, twitter, his personal website,
exposing other lispers' personal information, etc.)

He is alone and doesn't represent the community in any way.

------
yogthos
Typically what ends up in wide use is historical in nature. C became dominant
because it closely maps to the hardware that was available when personal
computing went mainstream. A whole generation of programmers learned it, and
that influenced language design for decades to come. When everybody is used to
doing things a certain way, it can take a long time for new ideas to gain
popularity. People often get set in their ways and have a hard time adapting
to new ideas.

A lot of the popular languages are fungible because they come from the same
family. For example, Ruby, Perl and Python are all extremely similar in nature
and offer no fundamental advantages over each other. Then we have system
languages like C, C++, and Rust which offer better control over resource
utilization.

As a long-term trend, what drives language popularity is how well the
language is suited to tackling popular problems. Functional languages are a
perfect example of this: they have been around for many years, but they
simply didn't fit well with the available hardware.

Back in the 70s you had single core CPUs coupled with very limited memory.
Things like GC were simply not an option and you needed languages that were
good at squeezing out every bit of performance possible. Imperative languages
addressed this problem extremely well.

However, today we have different kinds of problems. In many cases raw
performance is not the most important goal. We want to have things like
reliability, scalability, and maintainability for large code bases. We want to
be able to take advantage of multicore processors and network clusters.

The imperative paradigm is a very bad fit for these problems because it makes
it very difficult to localize change. On the other hand, the functional
paradigm is a great fit for these problems thanks to its focus on immutability and
composition. Today we're seeing a resurgence of interest in functional
programming, and Clojure is a perfect example of a Lisp that's quite
successful.

------
Tagbert
I learned Lisp in the 80’s and 90’s because AutoCAD used it as its internal
scripting language. It was fun to play around with, and I wrote code for a
paying project to do election redistricting after the 1990 census. I’m out of that
world now and have no idea what AutoCAD uses these days for scripting.

------
patrickxb
Lisp failed? There are multiple articles about it on the front page of HN.

~~~
lispm
HN is even written in a Lisp dialect...

~~~
krapp
...which arguably is the language's only real application.

------
ncmncm
Languages that have achieved any degree of success, as Lisp did, never fail
for just one reason, or two, or three. Lisp failed for many reasons; some are
listed in the other comments. The article suggests one, and discounts others,
but all contributed.

The hardest lesson to learn seems to be that GC is poison. Other languages
that have avoided Lisp's other faults succumb to that one, and lock themselves
out of the most demanding problems.

When embarking on a project, you do not often know everywhere it will take
you. If you start with a language that won't go some places, your project
can't go there. If it needs to go there to succeed, you have doomed it at the
outset.

How many times do you need the lesson? Some need it more than others. Many
never learn it.

~~~
pfdietz
> The hardest lesson to learn seems to be that GC is poison.

Perhaps that's a hard lesson to learn because it's obviously wrong.

~~~
jeffdavis
I wouldn't call GC "poison", but in a lot of domains it is a severe
limitation. I don't see any GC language unseating C/C++ very soon, and perhaps
never.

Clearly GC has achieved a ton. However, I am starting to wonder whether it
will hang on to that success, or whether new language designs that don't need
it will gain a lot of ground back.

GC, in a way, seems like a shortcut to avoid reasoning about lifetimes, and
sometimes taking that shortcut comes back to bite. If it becomes easy enough
to get help from the compiler, maybe we just don't need GC any more?

Obviously far-fetched, but interesting to think about.

~~~
pjmlp
Yet the C++ community has adopted RC in the standard library, COM/UWP uses RC,
and the most successful AAA game engine middleware uses a C++ tracing GC.

And yes, RC is a GC implementation algorithm as per CS reference papers and
notable books.

C might never come close to any form of automatic memory management, but in a
couple of decades it will be relegated to surviving UNIX clone deployments.

~~~
jeffdavis
C might be diminished a lot, but the beachhead is being made with Rust, not a
GC'd language.

~~~
pjmlp
The only holdouts left for C are UNIX clones and embedded.

C++ and managed languages have taken over everything else.

------
alde
For me Lisp was a lot of fun to learn and play with, but I don't think it had a
chance to survive in a modern production environment because: 1\. It's not
typed. 2\. Absence of a proper package mechanism and package managers like
cargo or npm.

~~~
flavio81
>because: 1. Not typed

Umm... the long section in "Common Lisp: The Language" describing the features
of the type system disagrees with you.

> 2\. Absence of a proper package mechanism and package managers like cargo or
> npm

Quicklisp.

~~~
alde
CL has some type-checking features in some of its compilers, but this doesn't
mean it is a typed language. Quicklisp is still in beta and is in no way even
close to package managers like cargo or npm.

~~~
flavio81
>but this doesn't mean it is a typed language

Custom data types can be defined

Data types can be checked,

Functions can be dispatched on the data type of the input parameters (see
CLOS),

Variable types can be declared.

... so, Lisp isn't a "typed language"?
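(For readers comparing with other languages: the CLOS-style dispatch on
argument type mentioned above has a rough runtime analogy in Python's stdlib
`functools.singledispatch`. This is only a sketch of the idea, not CLOS, which
can specialize on all arguments rather than just the first.)

```python
from functools import singledispatch

# Dispatch on the type of the first argument, loosely like a CLOS
# generic function with methods specialized per class.
@singledispatch
def describe(x):
    return "something"            # default method

@describe.register(int)
def _(x):
    return "an integer"

@describe.register(list)
def _(x):
    return f"a list of {len(x)} items"

assert describe(42) == "an integer"
assert describe([1, 2]) == "a list of 2 items"
assert describe(3.5) == "something"
```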

>Quicklisp is still in beta and is in no way even close to package managers
like cargo or npm.

QL is already better than NPM and PIP. It says "beta" only because the author
is modest; it already has years of releases and development.

Maven is a better package manager, though.

------
kragen
Lisp didn't fail. It went mainstream. Mostly.

(This article is full of errors; the author doesn't seem to know much about
FORTRAN, BASIC, or LISP.)

Paul Graham wrote an article about this in 2001, "What Made Lisp Different":
[http://www.paulgraham.com/diff.html](http://www.paulgraham.com/diff.html). He
lists nine features: conditionals, first-class functions (though not, at
first, closures), recursion, dynamic typing (and what I called the object-
graph memory model in [http://canonical.org/~kragen/memory-
models/](http://canonical.org/~kragen/memory-models/)), garbage collection, no
distinction between functions and expressions, a symbol type, a notation for
code using trees of symbols (and thus the ability to add macros), and the
whole language always available (no strong distinction between compile-time
and runtime, dramatically simplifying macros and other forms of
metaprogramming).

As Paul points out, these features got adopted by other languages gradually
over time, but in 1960 or 1970 or 1980 or even 1990, if you needed garbage
collection and dynamic typing, or to pass around functions as values, or to do
metaprogramming, your non-Lisp options were very limited. Prolog or Smalltalk
might be a possibility, but usually they weren't. So Lisp was extremely
popular. It was really the only reasonable candidate for an embedded scripting
language in the 1980s, so that's what Emacs and AutoCAD used.

By contrast, consider the currently-popular crop of languages: Java, C,
Python, C++, C#, VB.NET, JS, PHP, SQL, Objective-C, Ruby, assembly, Swift,
Matlab, and Groovy, say. Let's omit SQL and assembly from what follows. All of
them have conditionals; all of them have first-class functions (though in C,
closures are a nonstandard GNU extension); all of them have recursion; all
except C have some form of dynamic typing, and half of them are purely
dynamically typed (except C, C++, Objective-C, C#, Java, VB.NET, and Swift),
and even more of them use the Lisp object-graph memory model; all of them are
garbage-collected (except C, C++, and sometimes Objective-C); many of them
have a symbol or "atom" type (Python has intern(), Ruby has symbols,
Objective-C has SEL, Swift has Selector, and JS just acquired Symbols in ES6);
and most of them support Turing-complete metaprogramming in one way or
another: templates in C++, "eval" in Python and JS and PHP and Ruby and
Matlab, "Eval" in Groovy, and loading dynamically generated bytecode with a
fresh ClassLoader in Java.
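To make the metaprogramming point concrete, here is a minimal Python sketch of
the "whole language available at runtime" idea: source built as data,
evaluated, and returned as ordinary objects. The names (`add`, `Greeter`) are
just illustrative.

```python
# Build source code as a string, evaluate it, and get back an ordinary
# function object -- the "runtime" and the "compiler" share one world.
src = "def add(a, b):\n    return a + b"
namespace = {}
exec(src, namespace)
add = namespace["add"]
assert add(2, 3) == 5

# Classes can likewise be created without a class statement:
Greeter = type("Greeter", (), {"greet": lambda self, name: f"hello, {name}"})
assert Greeter().greet("lisp") == "hello, lisp"
```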

Metaprogramming merits special attention here; fully a third of Paul's items
(symbols, representing source code as a tree of symbols, and the lack of
compile-time–run-time distinction) are about metaprogramming, and those are
the items that are not widespread today. The main use of metaprogramming is
implementing embedded domain-specific languages, which you could reasonably
argue is the most important part of the Lisp approach. (Certainly the article
claims that it's what sunk Lisp.) But there are ways to implement EDSLs other
than compile-time code evaluation to modify your source code while represented
as a tree of symbols, and indeed the immense difficulty experienced in solving
the hygienic macro problem in Scheme (getting to Macros That Work) suggests
that it may not even be the best way. You can get a long way by using
reflection instead of macros, and in Python you can override __dunder__
methods, implement iterators, and write metaclasses; in Java, in addition to
firing up OW2 Asm and generating new classes, you have @annotations; in Ruby
and Objective-C, you have method_missing and -doesNotUnderstand:; in object-
oriented languages in general, you have virtual method dispatch (including but
not limited to the Interpreter pattern); and in Ruby you have block arguments,
and the ES6 => syntax is lightweight enough to be used in the same way. (I
don't know several of these languages well enough to comment on their
metaprogramming facilities.)
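As a concrete sketch of the "reflection instead of macros" route: in Python,
overriding __dunder__ methods lets ordinary syntax build an expression tree
(an EDSL) that you then evaluate yourself. The `Expr`/`Var`/`Add` classes here
are hypothetical, just to show the shape of the technique.

```python
# Operator overloading builds a small expression tree; evaluation walks it.
class Expr:
    def __add__(self, other):
        return Add(self, other)   # `x + y` constructs an AST node

class Var(Expr):
    def __init__(self, name):
        self.name = name
    def eval(self, env):
        return env[self.name]

class Add(Expr):
    def __init__(self, left, right):
        self.left, self.right = left, right
    def eval(self, env):
        return self.left.eval(env) + self.right.eval(env)

x, y = Var("x"), Var("y")
expr = x + y + x                  # ordinary syntax, but it built a tree
assert expr.eval({"x": 2, "y": 3}) == 7
```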

So the real story is that most of Lisp's features went mainstream, and every
popular language has them, so they are no longer a reason to choose Lisp
_stricto sensu_. They do differ in how to implement metaprogramming, Lisp's
most radical feature, as did Lisp — fexprs are nowhere to be found in Common
Lisp (or in McCarthy's 1959 Ur-Lisp), and Scheme hygienic macros are another
game again, one which also doesn't provide an S-expression API to the macro-
writer.

(The expression–statement dichotomy is an exception here. It's true that Lisp
doesn't have it and most modern languages do. I think this is an example of
the tradeoff between error detection and succinctness I described in
[http://www.paulgraham.com/redund.html](http://www.paulgraham.com/redund.html)
— the expression–statement dichotomy improves the reporting of parsing errors
considerably, and the compensating expansion of your code is almost
insignificant.)

FigBug argues in
[https://news.ycombinator.com/item?id=20375596](https://news.ycombinator.com/item?id=20375596)
that Lisp _stricto sensu_ failed because it had no killer app (other than, I
suppose, Emacs and AutoCAD), because most developers don't pick a language,
but are rather constrained to use the language demanded by their environment:
JS in the browser, C for Unix, Java for Android. But that just poses the
question of why Android uses Java instead of a purer Lisp, why the browser
uses JS instead of a purer Lisp, and so on. It just reduces the adoption
decision to a smaller group of programmers.

There were a couple of other historically contingent things that happened,
which don't have anything to do with the merits of the languages as such:
around 1988 the AI Winter and the workstation revolution wiped out the Lisp
companies; around 1995 the internet went mainstream and for a while all the
interesting development was in Perl 5, partly because of its Lispy qualities
but also because its attitude toward Unix was the extreme opposite of Lisp's;
and the microcomputer world developed its own programming traditions, despite
the noble efforts of magazines like BYTE to bridge the gap. Presumably
something similar is happening right now in Shenzhen.

~~~
vorg
> the currently-popular crop of languages: Java, C, Python, C++, C#, VB.NET,
> JS, PHP, SQL, Objective-C, Ruby, assembly, Swift, Matlab, and Groovy, say

Did you get this list of 15 languages from TIOBE's July 2019 top 15 rankings
at [https://www.tiobe.com/tiobe-index/](https://www.tiobe.com/tiobe-index/) ?
It also says Apache Groovy has risen from #81 in July 2018 to #15 today. I do
believe the #81 ranking but the rise to #15 only 12 months later is ludicrous.
Someone's obviously spamming a search engine to get that ranking up. I know
someone does the same thing with Groovy downloads from Bintray.

Now if Groovy's ranking has been fabricated over the past year, then surely
some others of those languages have also been similarly fabricated for much
longer, and their _popularity_ has been exaggerated.

~~~
zmmmmm
> Groovy has risen from #81 in July 2018 to #15 today

The rise to 15 is probably not correct, but 81 is probably as inaccurate the
other way as 15 is today, given how ubiquitous Gradle has become and that
Groovy is the basis of it. Especially when you consider the nature of TIOBE,
which is about search results: even people migrating away from Groovy will
generate "groovy" traffic as they try to figure out how to do equivalent
things in, say, Kotlin.

~~~
vorg
> 81 is probably as inaccurate the other way as 15 is today

I can understand both how and why someone would push Apache Groovy's ranking
higher up the Tiobe results, but I wouldn't know how you could push it down,
let alone why anyone would. #81 is probably as accurate nowadays as the
mid-40's has been for most of Groovy's life on Tiobe since 2006. Groovy's seen
a significant drop in use (outside Gradle) over the past few years.

------
bsder
Lisp failed because its community failed to accommodate the beginner trying to
do something _useful_.

Lisp has a zillion useful ways of doing things--that nobody ever taught.

Try finding a beginner book on Lisp (even today!) that actually teaches about
vectors and hash tables instead of trying to do everything with recursion,
lambda and singly-linked lists.

You don't attract a beginner trying to write a game, for example, that way.

~~~
lispm
> Try finding a beginner book on Lisp (even today!) that actually teaches
> about vectors and hash tables instead of trying to do everything with
> recursion, lambda and singly-linked lists.

I tried:

Practical Common Lisp by Peter Seibel

[http://www.gigamonkeys.com/book/collections.html](http://www.gigamonkeys.com/book/collections.html)

Also see Land of Lisp page 153ff: Arrays, Hash Tables, Structures, ...

[http://landoflisp.com](http://landoflisp.com)

Common Lisp Cookbook:

[http://cl-cookbook.sourceforge.net/hashes.html](http://cl-
cookbook.sourceforge.net/hashes.html)

COMMON LISP: A Gentle Introduction to Symbolic Computation, from 1990

[http://www.cs.cmu.edu/~dst/LispBook/book.pdf](http://www.cs.cmu.edu/~dst/LispBook/book.pdf)

Chapter 13: Arrays, Hash Tables, And Property Lists

~~~
bsder
"Practical Common Lisp"\--Copyright 2012--gets it right and deserves to be
called out for that. Hashes and Vectors appear _BEFORE_ lists in the book.
They don't get the emphasis they deserve, but they are on par with the list
coverage.

"COMMON LISP: A Gentle Introduction to Symbolic Computation, from 1990"\--Um,
wow, a whopping _17 PAGES out of 500+_ ( _5_ for hash tables) on nominally the
two most useful data structures in all computer programming. (Side note: I had
an actual dead tree edition of Touretzky from CMU in 1986?--I don't remember
hashes in it--I think that was an add-on later). And, even 1990 is after Lisp
starts getting displaced (Tcl, Perl, Python, etc. are all coming online).

The Perl4 Camel Book cookbook section was practically an existence proof for
"Show people how to do useful things in your language and they will flock to
you."

In addition, "Hash all the things!" was practically Perl's motto, and it wiped
the floor with everything else for a _VERY_ long time (something like
1994-2005--for 10+ years Perl was _dominant_ ).

~~~
lispm
> "Practical Common Lisp"\--Copyright 2012--gets it right and deserves to be
> called out for that. Hashes and Vectors appear BEFORE lists in the book

Did you read the book?

The book is from 2005. Lists are introduced in chapter 3 with a practical
example, a simple database.

The Touretzky edition I mentioned is from 1990.

Actually lists are the main fundamental data type in Lisp (the name is an
abbreviation of 'List Processor'), which sets it apart from many other
programming languages and is the reason for its existence. The basic idea of
Lisp is that one can do many practical things with lists.

Let's look further into your claim that it is hard to find introductory Lisp
books that mention arrays.

Paul Graham published the book ANSI Common Lisp in 1996. Arrays appear in
chapter 4 together with strings (1d arrays / vectors), sequences (a data type
joining lists and vectors), structures (records) and hash tables.

Winston&Horn, 3rd Edition of 1989. Arrays are in chapter 11. But from that
book we get a good idea through practical examples that one can do a lot of
interesting things with lists.

Another book, the 'Common Lisp Companion' from 1990. Vectors, strings, arrays,
sequences, hash-tables appear in chapter 4.

> Perl was dominant

for text processing and early web server scripting...

For example I can't remember much in NeXTStep or MacOS that made use of Perl
(or Lisp) in any crucial way. The OS and its applications were written largely
in C, C++ and Objective C.

------
jlg23
Oh the irony of finding this link and a discussion about it on a website
driven by a lisp dialect :)

------
molteanu
I'm using it.

I'm having fun with it.

I'm building useful stuff with it.

It has not failed.

------
carapace
I think Forth has a similar issue: despite the attempt to ANSI it, Forth is a
family of idiosyncratic individuals. Chuck Moore even says that standardizing
Forth is kind of doing it wrong, you're _supposed_ to custom build your own,
for each application even. (Like if a Jedi knight built a lightsaber for each
battle. (^_^) )

So on the one hand, Forth "failed", while on the other it's still used a lot.

------
agumonkey
lisp never failed, our idea of success is wrong

------
singularity2001
The OP makes a good case, but I don't agree with his dismissal of 'syntax' as
a major problem.

I found the comment introducing
[https://readable.sourceforge.io/](https://readable.sourceforge.io/) quite
interesting. It deserves a submission of its own.

------
smitty1e
The "lack of a killer app" point made in the comments below is interesting in
light of the Perl6 story today.

------
grumpy8
Unpopular opinion, but I find readability really hard on lisp projects.
Everything has to be read backward, and just because you can nest 10 lambdas
doesn't mean that you need to nest 10 lambdas all over the place. So the code
you end up with is write-only nested-over-nested-over-nested backward-reading
code.

Can clean lisp be written? For sure, and it's magnificent. But in practice,
it's a total shit show. Basically, what I'm saying is when it's great, it's
amazing, and when it sucks it's the worst, whereas other languages like
java/javascript/python have less variance in code quality and range from
"pretty-meh" to "good-enough".

And what's happening these days is the great features of lisp are integrated
with "good-enough" languages, and then they become "pretty-good" languages.

~~~
slifin
Threading macros -> and friends in Clojure change the ordering of reading and
they're idiomatic, is that what you mean?

~~~
pfdietz
Easy to add to Common Lisp. In fact, the CL code base I work on currently uses
these.

[https://github.com/hipeta/arrow-macros/](https://github.com/hipeta/arrow-
macros/)
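(For readers who haven't seen threading macros: `->` rewrites nested calls
into a left-to-right pipeline at macro-expansion time. A rough runtime analogy
in Python, using a hypothetical `thread_first` helper, since the real macro
rewrites code rather than chaining calls:)

```python
from functools import reduce

def thread_first(value, *fns):
    """Pipe value through fns left to right, like reading (-> v f g)."""
    return reduce(lambda acc, fn: fn(acc), fns, value)

# Instead of the inside-out str.upper(str.strip(" hello ")):
assert thread_first(" hello ", str.strip, str.upper) == "HELLO"
```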

------
derefr
Lisp didn’t fail. There are several successful modern Lisps, like Clojure.

But, moreover†, there are also successful modern languages that don’t look
like Lisps at first glance, but which certainly are Lisps by most
definitions—the best example I know of being Elixir.

Rather than a distinct compiler that does codegen to a distinct architecture
target, languages like Elixir (and Clojure!) consist of two distinct
components:

1\. A plain language grammar parser, which parses “syntactic literals” of the
language into a Lisp-alike AST. (Lisp had one of these as well, even in its
first incarnation; Lisp’s “syntactic literals” were called
[M-expressions](https://en.m.wikipedia.org/wiki/M-expression).)

2\. A homoiconic runtime interpreter of this AST, with hygienic macro support,
which evaluates macros at runtime with the _side-effect_ of producing a
program. This “runtime” is called “compile time”, but it’s really just the
same runtime that the compiled program runs in (and both have equal access to
the machinery that produces programs.)

In any Lisp, the definition of a function or a module isn’t a special form
that the compiler does something with; rather, `defn`-like clauses are just
references to an ordinary _macro_ , which the S-expression _interpreter_
invokes when attempting to reduce the AST that has `defn` as its head. The
`defn` macro consumes a parameter list and an expression body, and has the
_side effect_ of producing a compiled function in some way; and then the
`defclass`-alike macro has the side-effect of collecting those compiled
functions generated within its scope, and producing a compiled module from
them.

And, in any Lisp, the M-expressions (like those of Clojure, or those of
Elixir) are a convenient syntax to write code literals in, but you don’t have
to use them; you can just as well write “raw AST” as the M-expression
representation of the AST’s data-structure literals (since, in a Lisp, these
data-structures are always ordinary stdlib data structures like lists or
tuples, rather than compiler-specific data-structures); or, equally well, you
can _generate_ and _build_ these AST data structures by writing functions to
produce them, and then call these functions inside a macro body in place of
where you’d write a quoted M-expression literal or a plain S-expression ADT
literal. In essence, there’s no difference between the macros that a _user_ of
such a language writes, and the macros that _define_ the language; both are
just homoiconic AST->AST mapping functions. There are a few that do fancy
side-effects involving invoking an object-code compiler... but you can write
your own functions that do that as well. (This is why it’s so easy to build a
new language [like HN’s runtime Arc language] on top of an existing
Lisp—there’s nothing stopping you from writing macros that call out to your
own codegen machinery, and at that point you’ve bootstrapped your way out of
the original language.)
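The "build AST data structures with ordinary functions, then compile them"
idea can be sketched even in Python with the stdlib `ast` module, with the
caveat the comment above draws: Python's AST nodes are compiler-specific
classes rather than plain lists or tuples.

```python
import ast

# Build the AST for `x * 2 + 1` with ordinary function calls,
# then hand it to the compiler and run the result.
tree = ast.Expression(
    body=ast.BinOp(
        left=ast.BinOp(left=ast.Name(id="x", ctx=ast.Load()),
                       op=ast.Mult(),
                       right=ast.Constant(2)),
        op=ast.Add(),
        right=ast.Constant(1)))
ast.fix_missing_locations(tree)   # fill in required line/column info
code = compile(tree, "<generated>", "eval")
assert eval(code, {"x": 10}) == 21
```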

——

† Note that, by this definition, there really _isn’t_ any difference between
Clojure and Elixir; I didn’t need to distinguish “actual Lisps” and “languages
that act like Lisps.” Clojure and Elixir are both Lisps—both M-expression
glosses on top of a “compile time”-runtime S-expression evaluation engine with
compilation as a side-effect of certain “compile time”-runtime macro
invocations.

Certainly, Elixir has special M-expression syntax for calling various macros
(`if`, `for`, etc.), but adding such a thing to Clojure wouldn’t make it any
less Clojure.

The real contrast, if there is one, is that Elixir has no _homoiconic_
M-expression syntax for S-expressions (i.e. no equivalent to Lisp’s
parentheses), so you can’t just write a low-level S-expression equivalent of
an M-expression inline in a function in order to have that AST subtree be part
of the function. Instead, you have to define a separate macro to declare your
intent to inject a low-level S-expression, and call it at that point in the
function body; and you have to write such ASTs in an alternate M-expression
form (as tuple data-structure literals.)

This is less of a distinction between the languages _as languages_ , though,
and more a distinction between the languages _as M-expression syntaxes_. You
could define an alternate M-expression syntax for Elixir, which _did_ have
direct inline S-expression support, and the resulting language—at least in my
opinion—would still be Elixir.

------
iLemming
> Why Lisp Failed...

All Emacs users: "Are we a joke to you?"

------
lambdamore
I think Mathematica is kind of a killer app for lisp.

~~~
DonaldFisk
Mathematica wasn't written in Lisp. You might be thinking of Macsyma.

~~~
dreamcompiler
My understanding is that Wolfram rewrote Macsyma in C but that Macsyma was his
direct inspiration for Mathematica.

------
epr
I have a lot of gripes with the first part of this article regarding old
languages that "failed" (which I detail below), but what follows is a pretty
reasonable assessment of why Lisp never really "clicked" as an enterprise
language. My opinion on the subject overlaps, but I would say that the biggest
issue is not programmer understanding (it is easy to write unreadable code in
any language) but problems with the language from a business owner's or
manager's perspective. Lisp programmers are less replaceable, and the
trade-off between the power Lisp offers and what large businesses lose in
their ability to replace employees quickly was not worth it to them. I
honestly couldn't say whether their judgement was good or bad in this regard.

Starting at the top:

> Other languages of similar heritage (to Lisp) are still widely used.

Which ones? Unless I'm very mistaken, there were virtually no languages which
fit into a similar role as Lisps for at least 1-2 decades after Lisp (1958)
came onto the scene. The only language I can think of that would accurately
match this description is Prolog (1972), because although Basic (1964) was
also a dynamic, symbolic language, it was really meant to be an easier
alternative to Fortran II with instant feedback and simpler syntax for people
with little to no computer science or math background.

> Some of the above languages are no longer quite as popular as they once
> were. This will be the definition of "failure" we will use here. The
> question is why did they fail? The first stand-out is COBOL.

COBOL failed?!? It is possibly one of the most successful languages ever. From
a 2018 Reuters report on active COBOL use that continues today (and was even
more prevalent in 2009 when this article was written):

\- 43% of banking systems

\- 80% of in-person transactions

\- 95% of ATM swipes

\- 220 billion LOC

> as time progressed fixed sized arrays as data structures became obsolete

According to who? Vast amounts of numerical computation software from weather
and climate models to modern AI algorithms make use of fixed length arrays.

> The ALGOL language family succeeded. The reason for this is that these were
> the languages written by programmers for programmers.

There is no doubt that the Algol (C, really) family of languages has
absolutely dominated, but whether that was because they were well designed
seems a questionable notion. K&R were essentially looking for a portable
assembler with which to rewrite Unix, and many would argue that C's popularity
stemmed primarily from its use in Unixes, which spread to all mainstream
operating systems.

~~~
vorg
> active COBOL use that continues today [...] 220 Billion LOC

That 220b LOC is because most newly-written COBOL programs today are copied
and pasted from some other large existing program, then some small parts of it
changed. I know in one large corp I once worked at, a manager ordered that
instead of a new code being created for a new customer in the program suite as
was usually done, every program in the suite was cloned and the new customer's
name was hard-coded into the strings in the program text. That manager got the
new system up and running in record time and was well regarded by his peers.
The maintenance programmers and computer operators got some extra job security
in the years ahead too.

------
dventimi
If we're indulging in idle speculation, then here's my contribution.

I think Lisp failed for the same reason that ORMs exist, and to be even more
provocative, for the same reason that women are underrepresented in
programming: the Apple II.

Yes, yes, of course that isn't "the reason." "The reason", singular, is
nonsense, as any outcome is a consequence of many interrelated causal factors.
However, often a few or even one dominate, and in my view the Apple II[1] is a
dominant factor for these outcomes.

Hear me out.

Evidently, there once was a time when women had a healthier representation
within programming, and this roughly coincided with times both when there was
an efflorescence of different programming models, and when "computers" was
more a profession for adults. Then, the personal computer revolution occurred
in the 1970s, which introduced inexpensive, under-powered computers into
(upper) middle class homes, and this roughly occurred in tandem with the
introduction of video games both in arcades and then in homes. In my
unscientific observation, girls were shoved aside by their brothers, who
swiftly moved those video game consoles and computers into their bedrooms,
where just as swiftly we set about adapting games and writing our own games
using the only language offered: BASIC. And so a baby boom of baby programmers
occurred. Programming expanded beyond an adult profession and became also a
province of adolescent hobbyists, whose first and only (for a long time) view
of programming was: an imperative language, with subroutines, standard
control-flow operations, and mutable state. BASIC isn't usually put
in the Algol family of languages, but it's a better candidate for a foster
child of that family than the never-learned models of before: Prolog, SQL,
Forth, Lisp, etc. It wasn't difficult to make the transition from BASIC to C,
then to C++, and then to Java (picking up Perl and maybe Python along the
way). Without even trying, that generation of boys became a generation of
young men primed to program in a non-Lispy way, right around the time when the
internet was creating a demand for programmers. What fruit did this yield?

* Boys and then men became overly-represented in programming, relative to decades before.

* The database was held at arm's length, and SQL was tolerated only by creating ORMs that permitted retreating to the comforting familiarity of imperative, mutable state, Algol-like languages.

* Languages like Lisp and Prolog were forgotten about, until being "rediscovered" decades later.

Of course, if any of this is true, it raises other questions

* Why did the early, cheap computer makers choose BASIC instead of Lisp?

* Why did later, more powerful computer makers choose C?

* Why was there a gender disparity in affinity toward games in kids and adolescents?

I can speculate on some of the reasons for these, but I've indulged in enough
for now.

[1] "Apple II" is just a handy placeholder. Substitute Atari, Commodore 64,
"personal computers", whatever, if you like.

------
fit2rule
I wonder what is the most widely used, popular program, that has been written
in Lisp? Is it Emacs?

Every time I sit down to try to learn Lisp, I end up wondering what the hell
I'll use it for. Functional programming languages seem to have flipped a bit
in my mind that predisposes me to be prejudiced against Lisp .. I find it very
hard to do anything actually useful with it, whereas I can pick up c++ or Lua
or Python and immediately get something running.

I'm not saying this is Lisp's fault, but I've been programming for 30 years
and have tried many times to become a Lisp programmer. It's just never been
effective.

~~~
ben509
Speaking to why a LISP newbie finds it tough to get things done... I think
most people think in imperative terms. "I want to make a type. Now, given a
thing of that type, change a thing here, change a thing here, and return the
thing I want."

And most languages delineate a programmer's tasks clearly. For instance, in
Python and Ruby, logical structure is expressed in blocks, control (mostly)
transfers between statements, while calculation is performed in expressions,
and these are all visually distinct.

LISP's idea that all structures can be expressed as lists doesn't visually
demarcate the distinct "things" a programmer is trying to compose. You can
absolutely learn it, but it makes it harder.

Now, since LISP does basically use an AST directly, it makes metaprogramming
trivial. The beauty of extensive metaprogramming is that you can construct new
languages cheaply. The curse is that you get a lot of cheaply constructed new
languages.

If Python wants to add an "async" keyword, they go through weeks of debate and
do a long writeup on it.[1] In LISP, someone writes a few macros and, boom,
new syntax. The LISP community are smart people; they recognized this, and I
think the standardization efforts tried to mitigate it. A ton of thought
clearly went into Scheme.
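For a concrete (if hypothetical) sketch, a whole `while` construct in Common
Lisp is a few lines of macro -- real CL code would just use LOOP or DO, this
is purely illustrative:

```lisp
;; Hypothetical example: adding "while" syntax via a macro.
(defmacro while (test &body body)
  `(do ()                  ; no loop variables
       ((not ,test))       ; end-test: stop when TEST is false
     ,@body))              ; body runs each iteration

;; Usage: prints 0, 1, 2
(let ((i 0))
  (while (< i 3)
    (print i)
    (incf i)))
```

No PEP process, no parser changes -- which is both the beauty and the curse.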

But even this article doesn't get the problem. It's not "there are too many
parens" but more that everything in LISP looks the same when you're new, it's
a jumble of parens. Your brain isn't getting distinct markers to help learn
the structures.

And LISP shares a problem with most dynamic languages: anything can kind of
go anywhere. It's not as bad for others, though, because if you _look_ at
an example of Python code, you can generally _see_ the structure and that
narrows down what can fit there. Whereas LISP is always a mess of parens and
keywords; yes, it's usually obvious from looking at the docs, but it's just
more research you have to do to get something done.

~~~
flavio81
>But even this article doesn't get the problem. It's not "there are too many
parens" but more that everything in LISP looks the same when you're new, it's
a jumble of parens. Your brain isn't getting distinct markers to help learn
the structures.

In a typical language you have the "structures" you mention, like "if",
"for", and "switch". Lisp also has them, but they look more like function
calls. The syntax has a lot less noise.

Automatic indentation helps you visualize such "structures" in an easier way.
It's not any more difficult than using a language like Python or C.
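For instance (a trivial sketch), a `cond` indented the standard way reads
much like a chain of else-ifs:

```lisp
;; Standard indentation lines the branches up visually:
(cond ((< x 0) 'negative)
      ((= x 0) 'zero)
      (t       'positive))
```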

>And LISP shares a problem with most dynamic languages: anything can kind of
go anywhere.

Not really, because Common Lisp (the major Lisp dialect) is a strongly typed
language, unlike the most popular dynamic languages: JavaScript, PHP, Python,
and Ruby (Python and Ruby being 'duck typed' for the most part). And unlike
C, the famously weakly-typed statically-typed language.

So no, "anything" _can't go anywhere_, because a type mismatch will cause a
runtime error. And in CL you can correct runtime errors by modifying the
source code and resuming program execution (without having to restart the
complete program), so no big deal either.

The most common data construct, the Lisp list, accepts any kind of data type
inside, but Lisp also has structs, objects and arrays, all of which can be
defined for specific data types. Note that the ANSI Common Lisp standard
includes extensive support for type declarations.

Not to mention that Lisp, unlike other languages (Java and C++ -- I'm looking
at you guys), doesn't erase types at runtime. Lisp works with values (not
variables), and values are typed. Any value, in Lisp, carries its type
information at runtime.
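A quick sketch at the REPL (printed types are implementation-dependent; the
comments below reflect a typical SBCL session):

```lisp
(type-of 42)          ; an integer type, e.g. FIXNUM
(type-of "hi")        ; e.g. (SIMPLE-ARRAY CHARACTER (2))
(typep 42 'integer)   ; => T
(+ 1 "two")           ; signals a TYPE-ERROR at runtime, with restarts

;; And the ANSI-standard type declarations mentioned above:
(defun add2 (x y)
  (declare (type fixnum x y))
  (+ x y))
```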

~~~
ben509
> The syntax has a lot less noise.

Note here how you unconsciously use "has" instead of "have"; that kind of
grammatical noise exists because those small redundancies help us understand
each other.

I think the reason very few computer languages use parens is that they're
harder to read by virtue of having less grammatical redundancy.

(At the other end of the spectrum, languages like COBOL are hard to read
because they're swamped in redundancy.)

> So no, "anything" can't go anywhere, because a type mismatch will cause a
> runtime error.

I should have been clearer: it can go anywhere while you're writing it, so you
have to run it to find out. That's the problem people also have with other
dynamic languages when they're working on non-trivial codebases. Also, because
types are rarely declared up front, you're often paging through code to figure
out what type belongs in a given place.

> And in CL you can correct runtime errors by modifying the source code and
> resuming program execution (without having to restart the complete program),
> so no big deal either.

I agree that's an amazing feature few languages have been able to copy, but it
seems like it wasn't enough.

