
How Lisp Became God's Own Programming Language - chwolfe
https://twobithistory.org/2018/10/14/lisp.html
======
default-kramer
> McCarthy built Lisp out of parts so fundamental that it is hard to say
> whether he invented it or discovered it.

I loved that sentence! I'm guessing epistemology or a similar field has
pondered the "invented or discovered" question already, and if so, I want to
read about it.

> [on SICP:] Those concepts were general enough that any language could have
> been used

What?? In chapter 4, you write your own Lisp interpreter. If they had chosen
C++, would you be writing a C++ compiler? Or a Lisp interpreter in C++? Either
way, it would be ugly. And most languages would encounter problems even before
they got to chapter 4. What made SICP great was building abstractions out of
primitives. Most languages give you some abstractions, and others simply can't
be built (at least not elegantly). I can't imagine SICP using a language that
doesn't feel a lot like Lisp.

That said, count me among the people whose first Lisp exposure was SICP. And
yeah, it was really fun and really enlightening. I am loving Racket now, but
Racket is big and practical. SICP is small and beautiful - as I recall, the
authors deliberately avoid using most of the language. (They used Scheme, but
I think Lisp would have worked fine too, right?)
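
To make the chapter 4 point concrete: the evaluator's eval/apply core is
tiny. Here is a toy sketch of the idea, transliterated to Python purely for
illustration (the book of course does this in Scheme, and this handles only
a sliver of the language):

```python
# Toy sketch of the eval/apply loop at the heart of SICP chapter 4.
# Expressions are nested lists: numbers self-evaluate, strings are
# variables, and 'if' and 'lambda' are the only special forms.

def evaluate(expr, env):
    if isinstance(expr, (int, float)):          # self-evaluating
        return expr
    if isinstance(expr, str):                   # variable lookup
        return env[expr]
    op, *args = expr
    if op == 'if':                              # (if test then else)
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == 'lambda':                          # (lambda (params) body)
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    # application: evaluate the operator, evaluate the operands, apply
    fn = evaluate(op, env)
    return fn(*(evaluate(a, env) for a in args))

global_env = {'+': lambda a, b: a + b, '*': lambda a, b: a * b,
              '<': lambda a, b: a < b}

# ((lambda (x) (* x x)) 5)
result = evaluate([['lambda', ['x'], ['*', 'x', 'x']], 5], global_env)
```

Even this much makes the point: adding a new special form is a couple of
lines, which is why the book can explore lazy evaluation and nondeterminism
within the same chapter.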

~~~
yiyus
Chuck Moore always says he did not invent FORTH, but discovered it.

The relationship between Lisp and Forth is fascinating. I am quite sure
there is something about these two old, simple languages and the concept of
duality in mathematics that we are missing.

~~~
thaumasiotes
> The relationship between Lisp and Forth is fascinating.

Is there more to the relationship than "Lisp is written as a preorder tree,
and Forth is written as a postorder tree"?
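
To make the two orderings concrete, here is a toy illustration (Python
only for demonstration; `eval_prefix` and `eval_postfix` are hypothetical
names, not anything from either language's implementation):

```python
# The same arithmetic tree written two ways:
#   Lisp, preorder:    (+ 1 (* 2 3))
#   Forth, postorder:  1 2 3 * +

OPS = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}

def eval_prefix(tree):
    """Walk a nested preorder tree recursively, Lisp-style."""
    if isinstance(tree, int):
        return tree
    op, *args = tree
    return OPS[op](*map(eval_prefix, args))

def eval_postfix(tokens):
    """Consume a flat postorder token list with a stack, Forth-style."""
    stack = []
    for tok in tokens:
        if tok in OPS:
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[tok](a, b))
        else:
            stack.append(tok)
    return stack.pop()
```

One real asymmetry: the postorder form needs no parentheses, because by the
time an operator arrives its operands are already on the stack.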

~~~
samatman
There is an ancestral relationship. Chuck Moore studied under McCarthy
directly.

~~~
MycroftJones
Yep. And back in the 1970's Chuck Moore wrote a book describing the
implementation and design of FORTH, and it is pretty close to LISP. And I
asked Chuck Moore directly, in person, if there was a connection to LISP, and
he said "yes".

------
joe_the_user
Lisp is a powerful language that allows one to mix multiple logical levels
within a program (especially through its homoiconicity).

SHRDLU is perhaps the most classic illustration of Lisp's power[1]: a
classic program from 1968-70 that allowed natural language communication
about a micro world. It was written in a version of Lisp even more free-form
than today's. When I looked at the source code a while back, its parsing of
natural language involved a wide variety of on-the-fly fix-ups and such to
take into account the many irregularities of human language.

The thing about Lisp's power is that it allows the production of complex
programs quickly, but it doesn't particularly have a standard way of gluing
these programs together. The other classic Lisp essay, The Bipolar Lisp
Programmer[2], describes how Lisp has multiple partial implementations of
important libraries and programs simply because it allows a certain type of
personality to produce, by themselves, something that's remarkably good, but
doesn't encourage any particular group effort.

Lisp is certainly evidence that "language matters" but not evidence that Lisp
is where one should stop.

[1]
[https://en.wikipedia.org/wiki/SHRDLU](https://en.wikipedia.org/wiki/SHRDLU)

[2] [http://marktarver.com/bipolar.html](http://marktarver.com/bipolar.html)

~~~
tunesmith
> Lisp is a powerful language that allows one to mix multiple logical levels
> within a program (especially through its homoiconicity).

Statements like these are what make me suspect Lisp code would be a maintenance
nightmare. I am regularly refactoring code to separate multiple logical levels
so we can more easily maintain our codebases. If this is a misunderstanding,
can someone clarify?

~~~
d--b
Yes, this is why macros have been abandoned in most enterprise languages
(C#, Java).

~~~
coldtea
As if they've tried them and then abandoned them? Java, for one, didn't even
have generics and closures when it came out.

It didn't have macros because it was intended for the ho-hum enterprise
programmer of the time, who, the thought was, would not know what to do with
them.

Instead, they re-invented all those things badly (e.g. through gobs of XML and
ugly reflection based metaprogramming).

~~~
kamaal
Nobody uses them, even when they are available. In fact, anything that
doesn't look enterprisey enough never gets past code review.

The adoption cycle for even the simplest of syntactical features in
enterprise Java is 10+ years. In some companies it's never.

The saddest part isn't even that. The sad part is that a whole generation of
programmers has been raised and turned into architecture astronauts without
ever using something like a lambda or a closure.

The only hope for programming as a craft now is that Oracle kills Java (even
if by mistake), and then something like Perl 6 comes along to replace it.

~~~
AnimalMuppet
I'm not sure that will help. I see a kind of "family resemblance" between
COBOL and enterprise Java. I wonder if any language that is going to play in
this space is destined to become a monstrosity - destined by the nature of the
problem space rather than the nature of the language.

I am aware that when I say this, I am basing my opinion on a sample size of
two...

------
YeGoblynQueenne
>> Two decades after its creation, Lisp had become, according to the famous
Hacker’s Dictionary, the “mother tongue” of artificial intelligence research.

More precisely, Lisp became the "mother tongue" of AI research _in the United
States_. Europe and Japan, which at the time also produced a significant
amount of AI research, instead used Prolog as a lingua franca.

This is interesting to note, because a common use of Lisp in AI was (is?) to
write an interpreter for a logic programming language and then use that
interpreter to perform inference tasks as required (this approach is evident,
for example, in Structure and Interpretation of Computer Programs, which
devotes section 4.4 to the development of a logic programming language, the
_query language_, which is basically Prolog with Lisp parentheses).

Hence the common response, by Prolog programmers, to Greenspun's tenth rule,
that:

    
    
      Any sufficiently complicated Lisp program contains an ad-hoc,
      informally-specified, bug-ridden, slow implementation of Edinburgh Prolog.
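
The kernel of any such embedded logic language is unification. As a rough
illustration - a toy sketch, not SICP's query language or any real Prolog -
here it is in Python, with `?`-prefixed strings standing in for logic
variables (no occurs check, no resolution loop):

```python
# Syntactic unification, the heart of a toy Prolog. Variables are
# strings starting with '?'; compound terms are tuples; a substitution
# is a dict from variable names to terms.

def walk(term, subst):
    # Follow variable bindings until a non-variable or an unbound variable.
    while isinstance(term, str) and term.startswith('?') and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst):
    """Return an extended substitution making a and b equal, or None."""
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith('?'):
        return {**subst, a: b}
    if isinstance(b, str) and b.startswith('?'):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# unify parent(?X, bob) with parent(alice, ?Y)
bindings = unify(('parent', '?X', 'bob'), ('parent', 'alice', '?Y'), {})
```

Wrap that in a backtracking search over a database of clauses and you have
the "slow implementation of Edinburgh Prolog" the quip is about.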

~~~
varjag
> This is interesting to note, because a common use of Lisp in AI was (is?) to
> write an interpreter for a logic programming language and then use that
> interpreter to perform inference tasks as required

Not sure how you come to that conclusion. Literally none of the notable Lisp
AI programs (ELIZA, STUDENT, SHRDLU, AM, EURISKO, MYCIN) had anything to do
with reimplementing Prolog.

Implementing Prolog however is trivial in Lisp, so many textbooks used it as
an intermediate level exercise.

~~~
YeGoblynQueenne
It's the response to Greenspun's tenth rule that mentions Prolog explicitly.
My previous comment states that AI programs written in Lisp implemented _an
interpreter for a logic programming language_ , unspecified. Although to be
fair, that usually means "an informally-specified, ad-hoc, bug-ridden, slow
implementation of Prolog" (1).

Now, my knowledge of Eliza, Shrdlu, etc is a little limited (I've never read
the source, say; btw, have you?) but let's see. Wikipedia is my friend, below.

According to Wikipedia, MYCIN was a backward-chaining expert system. That
means a logic programming loop, very likely a resolution-based theorem
prover, or, really, (1).

ELIZA was first implemented in MAD-SLIP, which, despite the name was not a
Lisp variant (although it _was_ a list-processing language). STUDENT is
actually part of ELIZA- it's one of the two scripts that came with the
original implementation, by Weizenbaum. The other script is DOCTOR, which is
the one more often associated with ELIZA (it's the psychoanalyst). If I
understand correctly, STUDENT solves logic problems, so I'm guessing it
incorporates an automated theorem prover; so basically, (1).

Eurisko was written in a frame-based (think OOP) representation language
called RLL-1, the _interpreter_ for which was written in Lisp.

SHRDLU was written in Lisp _and MicroPlanner_, an early logic programming
language (with forward chaining, if memory serves). In the event, (1) was
not necessary, as MicroPlanner was already all that - and more!

The Automated Mathematician (AM) was indeed written in Lisp, but I don't know
anything about its implementation. However, from the description on its
wikipedia page it appears to have included a very basic inference procedure
and a rule-base of heuristics. Sounds like a case of (1) to me.

~~~
varjag
MYCIN is a fuzzy logic inference engine, among the first ones. Would love to
see that implemented in period-true Prolog, but tbh won't hold my breath.

RLL-1 and Microplanner are domain-specific languages, the traditional Lisp
approach to building complex applications. The fact is that both were
written in Lisp; the rest is just sophistry.

Neither AM nor EURISKO would map naturally to unification with their very
special heuristic-driven task prioritizing and scheduling algorithms.

~~~
YeGoblynQueenne
I don't know what "period-true Prolog" is. You can still run programs found
in Prolog textbooks from the 1970s in modern Prolog interpreters - just like
with Lisp.

>> Fact is both were written in Lisp, that's just sophistry.

I don't see the sophistry. If I implement Lisp in Prolog, the interpreter may
be Prolog, but the programs it then evaluates are Lisp.

I don't see why unification would be an impediment in implementing any
heuristics or scheduling algorithms.

~~~
varjag
Period-true as in from the 1970s. There were attempts to build fuzzy Prolog
dialects in the late 1980s, although they seem not to have been particularly
successful.

> If I implement Lisp in Prolog, the interpreter may be Prolog, but the
> programs it then evaluates are Lisp.

But that is not implementing Lisp in Prolog. It is implementing a
domain-specific language in Lisp, in a time-proven manner. Googling "DSL
Lisp" will probably give you hundreds of references. I have done it before
for my own projects, and my "languages" didn't even have names. I could call
one VARLANG-22 or whatever, but unless I told you, all you'd see is some
Lisp code.

> I don't see why unification would be an impediment in implementing any
> heuristics or scheduling algorithms.

It does not bring anything to the table there, because that's not how those
systems work. Not every project is a first order predicate logic expression,
as incredible as it sounds.

~~~
YeGoblynQueenne
We're in a weird situation here. In this thread, you're arguing that a
programming language implemented in Lisp, which retains the syntax of Lisp, is
a DSL, and so still just Lisp; but in the other thread, you're arguing that
Prolog implemented in Lisp, with the syntax of Lisp, is not a DSL, but Prolog.

I recognise that the above comes across as an attempt at sophistry, again, but
I sincerely think you are not being objective, with this line of thinking.

------
Phemist
[http://kingjamesprogramming.tumblr.com/](http://kingjamesprogramming.tumblr.com/)
contains a selection of "verses" generated by a Markov chain trained on
SICP, the King James Bible, and some other works. The results are oftentimes
hilarious, and some of the Lisp-related quotes are very thematic:

    
    
        13:32 And we declare unto you the power and elegance of Lisp and Algol.
    
        Lisp, whose name is Holy
    

(and one that doesn't necessarily mix in something from King James)

    
    
        A powerful programming language should be an effective program that can execute any Lisp program.
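
For anyone curious, the underlying technique is simple. Here is a minimal
word-level Markov chain of the kind that site presumably uses - a guess at
the method, not its actual code, and `build_chain`/`generate` are made-up
names:

```python
import random

def build_chain(text, order=1):
    """Map each n-gram of words to the words observed to follow it."""
    words = text.split()
    chain = {}
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain.setdefault(key, []).append(words[i + order])
    return chain

def generate(chain, length=12, seed=0):
    """Random-walk the chain from a random starting n-gram."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    for _ in range(length):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:
            break
        out.append(rng.choice(followers))
    return ' '.join(out)

corpus = "and the evaluator shall evaluate the expression and the environment"
chain = build_chain(corpus)
```

Training one chain on two stylistically distinct corpora is what produces
the site's collisions: any word sequence shared by both texts becomes a
bridge from one register to the other.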

------
loosetypes
For whatever reason I really enjoyed the Symbolics graphics and animations
demo[0] linked in the article.

I was born in the 90s and have constantly heard the narrative that
technological progress is getting faster and faster, and that we're
currently at the forefront. And then I see something like this. They
basically had Photoshop before I was even alive!?

[0]
[https://www.youtube.com/watch?v=gV5obrYaogU](https://www.youtube.com/watch?v=gV5obrYaogU)

~~~
richardjdare
Paint programs go back to Xerox PARC in the 70s[1], at the very least. In the
80s and 90s there were a bunch of high-end paint programs for Symbolics, SGI
workstations, the Quantel Paintbox [2], etc. They were more video-graphics
oriented than Photoshop, which, historically at least, had a strong print
emphasis.

I love computer graphics history, and loved the mystique of "high-end"
computing when I was making crap pixel art on my Amiga:)

Another cool thing is, in that video, the artist is basically using the same
box modelling techniques I did when I took a Maya course a couple of years
ago. In 1991 I was 3d modelling on the Amiga, and although it was awesome to
have access to that software at home, modelling was much more cumbersome and
limited.

Now, to blow your minds even more here [3] is a recent demonstration of that
same 3d modelling program. At 19 minutes in, he switches to the Lisp Listener
console, and uses Lisp to inspect and modify the data belonging to the 3d
model, before switching back to the 3d app to view the changes. Lisp machines
were incredibly well-integrated platforms for the expert user.

[1]
[https://creators.vice.com/en_uk/article/wnpqnm/the-1970s-gra...](https://creators.vice.com/en_uk/article/wnpqnm/the-1970s-graphics-program-that-spurred-space-exploration-computer-picassos-and-pixar)

[2]
[https://www.youtube.com/watch?v=BwO4LP0wLbY](https://www.youtube.com/watch?v=BwO4LP0wLbY)

[3] [https://vimeo.com/125771177](https://vimeo.com/125771177)

~~~
pjmlp
Caligari?

~~~
richardjdare
Imagine 2.0, from an Amiga Format coverdisk (like most of my application
software when I was a teenager). I remember reading about Caligari, but I
never used it.

~~~
pjmlp
Ah thanks. I only knew about Caligari, but never used it. Just saw it at trade
shows back in the day.

I was the PC guy on our computing parties. :/

------
YeGoblynQueenne
>> They do this even though Lisp is now the second-oldest programming language
in widespread use, younger only than Fortran, and even then by just one year.

And the third-oldest is COBOL (it's at least as widespread as FORTRAN;
arguably, it's even more widespread than both FORTRAN and LISP together,
considering that it's used by pretty much every financial org on the planet).

It seems that, from such an early time, the kinds of languages we would end
up creating were already pretty much set in stone: FORTRAN, as the granddaddy
of languages aimed at scientists and mathematicians, that modern-day R,
Python, Julia etc draw their heritage from; LISP as the grandmother of
languages aimed at computer scientists and AI researchers, still spawning an
unending multitude of LISP variants, including Scheme, ML and Haskell; and
COBOL, the amorphous blob sitting gibbering and spitting at the center of the
universe of enterprise programmers, that begat the Javas, VBs and Adas of
modern years.

(Note that I'm referring to language philosophy and intended uses- not syntax
or semantics).

(I'm also leaving out large swaths of the programming community: the Perl
users and the C hackers, etc. It's a limited simile, OK?)

~~~
truculent
I would say that R (and probably Julia as well, although I'm much less
familiar with it) draws its heritage from Lisp more than from FORTRAN.

~~~
bachmeier
R was originally a Scheme dialect, so that's a pretty safe assertion.

------
zakum1
In the history of Lisp, the paper by Richard Gabriel:
[https://www.dreamsongs.com/WIB.html](https://www.dreamsongs.com/WIB.html),
“Lisp: Good News, Bad News, How to Win Big” is insightful and beautifully
written.

I am surprised it is not mentioned in the article wrt the "winter period" in
which Lisp's popularity waned.

------
metonymy
If you want to read SICP, it is available in HTML, EPUB here:
[https://github.com/sarabander/sicp](https://github.com/sarabander/sicp) also
here:
[https://sicpebook.wordpress.com/ebook/](https://sicpebook.wordpress.com/ebook/).
Not available in EPUB on Amazon or Google.

------
zimablue
I think this article reverses cause and effect. It seems to start from the
assumption that there's nothing special about LISP and then points to big
cultural moments where programmers revered it, saying "this accounts for 20%
of the meme" etc.

I'd say those cultural moments exist because LISP /is/ something special. You
could write SICP in Java (someone probably has) but the code would be way
longer and less beautiful.

------
kbumsik
As a dev from a non-CS major, I personally haven't learned anything about
Lisp, but I would like to. Is Clojure (specifically ClojureScript) a good
place to start studying it?

~~~
knowingpark
Clojure is a modernised version of Lisp. It compiles to Java bytecode and
runs on the JVM. ClojureScript compiles to JavaScript for use in the
browser. Some amazing programming tools have been written for
ClojureScript. The community is very robust and opinionated, in a good way I
think.

Clojure is very modern with its vectors and maps. Lisp is more of an
antique - a very valuable antique, though. It's very interesting to learn
about both at the same time, as I did :)

~~~
pjmlp
Contrary to popular belief, Common Lisp has support for vectors, maps,
records, and both stack and heap allocation.

What Clojure has going for it is the wealth of Java libraries.

------
kidsnow
The principle of orthogonal design is something I learned in CS, but hardly
anyone mentions it any more. The idea boils down to building software parts
in a consistent way such that they can be combined and re-used to form new
things. The way you accomplish this is by having very few rules. The more
"syntaxy" a language is, the less orthogonal it is.

For more reading and discussion on this topic:
[https://softwareengineering.stackexchange.com/questions/1035...](https://softwareengineering.stackexchange.com/questions/103567/what-is-the-most-orthogonal-programming-language)

------
sn41
In addition to SICP, John Allen's "The Anatomy of Lisp" is also a great book
to learn computing concepts through Lisp.

~~~
mark_l_watson
It is especially good if you want to implement a Lisp.

~~~
raphlinus
I loved this book as a kid, but I'm really not sure I would recommend it
today. The Lisp of that era had "dynamic scope," and a great deal of Allen's
book is concerned with fancy data structures and techniques to implement it in
a reasonable fashion. But today I think we understand that stuff as basically
wrong, and that "lexical scope" works better (and is much closer to the
original lambda calculus that served as an inspiration). There are probably
some proponents of dynamic scoping, but I think it's a lost battle.

So definitely yes, if you want to implement a historic Lisp. Otherwise, not so
much.

~~~
lispm
A later text on the same topic (implementing Lisp/Scheme) is:

[https://en.wikipedia.org/wiki/Lisp_in_Small_Pieces](https://en.wikipedia.org/wiki/Lisp_in_Small_Pieces)

Highly recommended.

------
aswanson
Far too many intelligent programmers swear by lisp for there to be no "there"
there. I'm trying to learn it in my nonexistent spare time.

~~~
3pt14159
Look, here is the deal with Lisp. It shows you the data structure of your
computer program and lets you operate on it and that is neat. But it's not
useful for actual work because the way you manipulate it makes it hard for you
to understand what is happening without knowing _all_ the ways your code is
being manipulated. I have to write lisp for my editor (emacs) and I don't hate
it, but I don't love it either. The syntax is hard to read quickly because
it's cluttered and (usually) nest-y.

If you're willing to trade complete purity away, try Ruby. It's basically
everything you want from Lisp without the mess. Give up purity, get
comprehensibility.

Blocks are a really great way of doing things. You can even investigate the
block source code as a string if you really want to.

Dynamic method definition is well supported and predictable. Data structures
are easy to compose and operate on. It's basically all the power but in a more
comprehensible way. There's a reason why Rails came out of Ruby. It's
naturally powerful.

~~~
aidenn0
I don't get the whole "macros make your code incomprehensible" thing that is
always brought up. A macro is just another way to add abstraction to your
program.

You might as well say functions are bad because without reading the source
code for the function you don't know what the function does.

A macro that looks like one thing but does something else is a bad macro. We
don't throw away functions just because someone can write a function named
"sort" that actually randomizes its argument rather than sorting it.

~~~
marcosdumay
Macros are way more powerful than functions. They can hide way more side
effects, can fill a namespace, and can surprise you in many other ways.

This wouldn't be a problem if those surprises were rare and clearly marked,
but the entire reason for macros to exist is to carry the surprises. As a
consequence, having macros as the default (ok, second choice, not much better)
tool of your language is bad. It's not that macros are bad by themselves, but
they shouldn't be used often.

Besides, powerful tools do not go well together with dynamically typed
languages.

~~~
aidenn0
> Macros are way more powerful than functions. They can hide way more side
> effects, can fill a namespace, and can surprise you in many other ways.

Abstractions that surprise you are bad. That doesn't mean the tool used was
necessarily bad.

> This wouldn't be a problem if those surprises were rare and clearly marked,
> but the entire reason for macros to exist is to carry the surprises.

See above RE: surprises. Also, I can usually identify a macro from its
indentation, as most macros tend to have lambda lists similar to:

    
    
        ((FOO BAR &key BAZ) &body b)
    

which SLIME will pick up on and indent appropriately.

> It's not that macros are bad by themselves, but they shouldn't be used
> often.

If by "use" you mean "write" I agree. I do write macros far less often than I
write functions, and this is common advice for lisp programmers anyways.

> Besides, powerful tools do not go well together with dynamically typed
> languages.

Not all Lisps are dynamically typed; even Common Lisp has optional type
declarations, and Typed Racket takes this further. Also, the GP post
suggested Ruby, so that's not a great alternative by this argument.

------
ancarda
Having spent months learning Haskell, I'm interested in picking up another
mind-expanding language. If I read SICP (and also watch the MIT lectures), what
dialect should I follow along in? Ideally I would learn something people are
using today so there would be usable libraries. I mostly write website
backends and APIs.

Clojure? Common Lisp? Something else?

~~~
specializeded
DrRacket IDE [0] + the SICP compatibility language [1], and you can start
writing it instantly in a well-built and maintained environment that's
Racket-based and pretty fleshed out library-wise. Certainly nothing compared
to Clojure, but among the rest it's the best (imo). I recall Carmack writing
a server in Racket for fun and praising the experience a few years back.

[0] - [https://racket-lang.org](https://racket-lang.org)

[1] - [http://docs.racket-lang.org/sicp-manual/index.html?q=sicp#%2...](http://docs.racket-lang.org/sicp-manual/index.html?q=sicp#%28part._.Introduction_to_the__lang_sicp_language%29)

Additionally, if SICP proves too slow-going or difficult math-wise [3], you
can always use DrRacket for HtDP [4] and its corresponding misnamed edX
course(s) [5] and, later on, PLAI [6].

[3] -
[http://cs.brown.edu/~sk/Publications/Papers/Published/fffk-h...](http://cs.brown.edu/~sk/Publications/Papers/Published/fffk-htdp-vs-sicp-journal/paper.pdf)

[4] -
[https://htdp.org/2018-01-06/Book/part_preface.html](https://htdp.org/2018-01-06/Book/part_preface.html)

[5] - [https://www.edx.org/course/how-code-simple-data-ubcx-htc1x](https://www.edx.org/course/how-code-simple-data-ubcx-htc1x)

[6] -
[http://cs.brown.edu/courses/cs173/2012/book/](http://cs.brown.edu/courses/cs173/2012/book/)

------
sova
>nobody has or will make anything practical with Lisp

I made a website with Clojure (a Lisp) that is indeed practical:
[http://practicalhuman.org](http://practicalhuman.org)

~~~
auvi
Reddit was originally coded in Common Lisp

~~~
timbit42
So was Amazon.

~~~
ballpark
I've never heard that. I do remember seeing here on Hacker News an old job ad
for Amazon (close to founding) that was looking for C++ programmers.

[edit] Just found it:
[https://groups.google.com/forum/#!topic/ba.jobs.offered/-rvJ...](https://groups.google.com/forum/#!topic/ba.jobs.offered/-rvJUMBbZ18)

------
aportnoy
How do you build a simple beautiful website like this?

~~~
sharvil
It's a Jekyll [1] powered static site; here is the source code:
[https://github.com/sinclairtarget/sinclairtarget.github.io](https://github.com/sinclairtarget/sinclairtarget.github.io)

[1] [https://jekyllrb.com/](https://jekyllrb.com/)

~~~
aportnoy
Thank you!

------
collyw
Would it be worth going through SICP without prior knowledge of Lisp (i.e.
could I pick it up from the book), or is it better to have some knowledge
beforehand?

~~~
gumby
At the time the book was written, it was not uncommon for a new MIT student
to arrive without having used a computer at all. 6.001 was intended to be
the first introduction to computing, so not only is no Lisp assumed, but no
programming at all!

~~~
kamaal
It also depends on how motivated you are. Even as late as 2005, most of us
in India couldn't afford computers. I remember we wrote 8085 programs
entirely on paper. Limited time was available, with limited kits in the lab,
but we were motivated enough to do it on paper alone.

These days everything is supposed to be easy, newbie-friendly, accommodating
and all that. People have a tendency to quit early and expect the ecosystem
to make it easy for them.

These days you can get a decent computer for under $100 with a Raspberry Pi.
I would have done anything for something like that a decade back.

~~~
mangamadaiyan
I wouldn't quite say motivated - we had no choice. You either wrote and
debugged the 8085 assembly on paper _before_ you ran it on the board, or you
didn't do it at all :)

While I understand (and respect) the sentiment, having gone through much the
same, I wouldn't disparage something being newbie-friendly. That isn't
necessarily a bad thing.

Edit: s/play down/disparage

------
andrewstuart
Is assembly the devil's own programming language?

~~~
nickpsecurity
Assembly is too honest about what the program is doing to be the Devil's work.
That would be C with undefined behavior kicking in on malicious input from the
Devil's children.

~~~
qwerty456127
Indeed. Assembly is the language of the nameless ancient horror that was there
before the devil was born. The language of the devil is C and the languages of
his demons are JavaScript and PHP.

~~~
Annatar
Seriously? Wow. Assembler is the most beautiful, simplest language that I have
ever programmed in.

O tempora, o mores...

~~~
qwerty456127
Lisp is the most beautiful. Brainfuck is the simplest.

I miss punchcards so much BTW, I used to draw on them when I was a child...

~~~
Annatar
I find Lisp very beautiful and elegant but I find assembler even more so.

------
Shorel
I think the test of whether a programmer can eventually learn to work on a
Lisp codebase lies in his/her opinion of the conditional (ternary) operator.

If the programmer hates the C(T)O because it is too confusing, that
programmer is hopeless at Lisp.

If the programmer sees the C(T)O as trivial syntax that helps make the code
short and neat, then that programmer will love Lisp.
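
For the record, the point of the operator is that - like Lisp's `if` - it is
an _expression_ with a value, so it composes inside larger expressions. A
small illustration (Python's conditional expression standing in for the C
ternary; `clamp` is just a made-up example):

```python
# A conditional expression yields a value, so it nests inside a
# larger expression - the property the ternary and Lisp's `if` share.
def clamp(x, lo, hi):
    return lo if x < lo else (hi if x > hi else x)

# The statement form of the same logic needs a temporary and more lines:
def clamp_verbose(x, lo, hi):
    if x < lo:
        result = lo
    elif x > hi:
        result = hi
    else:
        result = x
    return result
```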

~~~
weliketocode
I haven't dived into lisp, but if you're telling me it'll make my code
shorter, I'm sure I'll love it.

------
pjmlp
> Ruby got… well, Ruby is a Lisp

If only it had the same level of AOT/JIT compilers that most Lisp variants
enjoy.

------
zeveb
Another two factors in the '00s re-rise of Lisp were Steve Yegge's Drunken
Blog Rants ([https://sites.google.com/site/steveyegge2/blog-rants](https://sites.google.com/site/steveyegge2/blog-rants)) & Practical
Common Lisp ([http://gigamonkeys.com/book](http://gigamonkeys.com/book)).

The former was, at the time, very influential (and deservedly so); the latter
was and remains the best reference for actually using Common Lisp to write
real software (Edi Weitz's Common Lisp Recipes is an excellent companion
volume).

------
robbrit
Back in 2007 I thought it was silly that SBCL (and Smalltalk too)
distributed its applications as "images". They seem to have been re-invented
today as "containers", which are suddenly an amazing idea.

------
zepto
Prolog is probably the language of god.

~~~
robotresearcher
No.

(note: this is a genuine Prolog joke)

~~~
smnplk
?- What

------
unixhero
... I thought that was HolyC :-)

------
decafbad
Lisp programmers and Lisp braggers are distinct groups.

------
etatoby
Lisp and Scheme are great. They would be my favorite programming languages,
_if they had a statically typed, Hindley-Milner type system._

As they stand, they are great learning tools, but I would never build
something serious with them - to say nothing of the questions about
parallelism, concurrency, available libraries, development tools, etc.

Any suggestions are welcome.

~~~
Royalaid
I think that part of the power of LISP is its dynamic nature, but if you
want typed options, they exist for Clojure and Racket. There is also a
language in development called Carp that aims to be a Clojure-style language
for C [https://github.com/carp-lang/Carp](https://github.com/carp-lang/Carp).

------
cztomsik
Everybody's praising Lisp like it's the best language ever, yet very few
people are actually using it - and I think that's because it's really easy
to write smart code that has to be explained over and over again to new
people (and to your future self).

I think this should be noted.

------
armitron
On SICP and Lisp, I was recently asked:

    
    
        "Thanks a lot for this insightful reply! I've read about how 
        powerful are Lisp languages (for example for AI), my question is: 
        does Emacs really use all this theoretically powerful functionality 
        of these languages? In what way is this metalinguistic abstraction 
        used? In the built-in functions of Emacs, the powerful packages 
        made by the community, or the Elisp tweaking of a casual Emacs user 
        to customize it (or all three of those).
    
        I've read a lot of people praising and a lot of people despising 
        Elisp. Do these people who dislike Elisp do it because they want a 
        yet more powerful Lisp dialect (like Scheme) or because they want 
        to use a completely different language?
    
        PD: Excuse my ignorance, I'm still learning about programming. As a 
        side note, would you recommend me to read SICP if I just have small 
        notions of OOP with Python and Java and I want to learn more about 
        these topics? Will I be able to follow it?
    

Let me start from the end: Reading SICP changed everything I thought I knew
about programming and shattered any sort of non-empirical foundation - that I
had built up to that point - regarding how my mind worked and how I interfaced
with reality. It's not just a book about programming, there are layers of
understanding in there that can blow your worldview apart. That said, you do
need to make an effort by paying attention when you go through the book and
(mostly) doing the exercises. The videos on YouTube are also worth watching
in-parallel with reading the book. The less you know about programming when
you go through SICP, the easier it will be for you to "get" it since you'll
have no hardwired - reinforced by the passage of time and investment of
personal effort - prior notions of what programming is and how it should be
done.

* Metalinguistic abstraction

Short answer: all three.

Long answer: The key idea behind SICP and the philosophy of Lisp is
metalinguistic abstraction which can be described as coming up with and
expressing new ideas by first creating a language that allows you to think
about said ideas. Think about that for a minute.

It follows, then, that the 'base' language [or interpreter in the classical
sense] that you use to do that should not get in your way and must be
primarily focused on facilitating that process. Lisp is geared towards you
building a new language on top of it, one that allows you to think about
certain ideas, and then solve your problems in that language. Do you need all
that power when you're making CRUD REST apps or working in a well-trodden
domain? Probably not. What happens when you're exploring ideas in your mind?
When you're thinking about problems that have no established solutions? When
you're trying to navigate domains that are fuzzy and confusing? Well, that's
when having Lisp around makes a big difference because the language will not
get in your way and it'll make it as easy as possible for you to craft tools
that let you reason effectively in said domains.
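
To make that concrete, here's a toy sketch (my own example, not from SICP):
in Lisp you can grow a new control construct out of the primitives you
already have, and it reads as if it had always been part of the language.

```lisp
;; Metalinguistic abstraction in miniature: WHILE is not a Common Lisp
;; primitive, but a macro lets us add it as if it were.
(defmacro while (test &body body)
  `(loop
     (unless ,test (return))
     ,@body))

;; The new construct now reads like a built-in:
(let ((n 0))
  (while (< n 3)
    (print n)
    (incf n)))
```

Scale that up from one construct to a whole vocabulary and you have a
language shaped around your problem domain.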

Let's use Python as an example since you mentioned it. Python is not that
language since it's very opinionated and constrained by its decisions in the
design space and, additionally, has been deliberately created with entirely
different considerations in mind (popular appeal). This is very well
illustrated by the idiotic Python motto "There's only one way to do it" which,
in practice, isn't even the case for Python itself. A perfect example of style
over substance, yet people lap it up. You can pick and choose a few features
that superficially seem similar to Lisp features but that does not make Python
a good language for metalinguistic abstraction. This is a classic example of
the whole of Lisp being much more than the sum of its parts, and in reality
languages like Python don't even do a good job of reimplementing some of these
parts. This is the reason I don't want to just list a bunch of Lisp features
that factor into metalinguistic abstraction (e.g. macros and symbols).

* Feedback loops

The other key part of Lisp and also something expressed fully by the Lisp
machines is the notion of a cybernetic feedback loop that you enter each time
you're programming. In crude, visual terms:

[Your mind - Ideas] <--> Programming Language <--> [Artifact-in-Reality]

You have certain ideas in your mind that you're trying to manipulate, mold and
express through a programming language that leads to the creation of an
artifact (your program) in reality. As you see from my diagram, this is a
bidirectional process. You act upon (or model) the artifact in reality but
you're also acted upon by it (iterative refinement). The medium is the
programming language itself. This process becomes much more effective the
shorter this feedback loop gets. Lisp allows you to deliberately shorten that
feedback loop so that you _mesh with your artifact in reality_. Cybernetic
entanglement if you will. Few other languages do that as well as Lisp
(Smalltalk and Forth come to mind). Notice that I emphasized your mind and
reality/artifact in that previous diagram, but not the medium, the programming
language. I did that in order to show that the ideal state is for that
programming language not to exist at all.

* Differences between Lisps

All Lisps allow you to express metalinguistic abstraction (they wouldn't be
Lisps otherwise). Not all Lisps allow you to shorten the feedback loop with
the same efficiency.

The Lisps that best do the latter come out of the tradition of the Lisp
machines. Today this means Common Lisp and Emacs Lisp (they're very similar
and you can get most of what Common Lisp offers on the language level in Emacs
Lisp today). For that reason, I don't think Scheme is more powerful than Emacs
Lisp, since Scheme lacks the focus on interactivity and is very different
from both CL and Emacs Lisp.

As far as other people's opinions go, my rule of thumb is that I'd rather
educate myself about the concepts and form my own opinions than blindly follow
the herd. Which is why I also think that people who are sufficiently impressed
by an introduction to Lisp (such as the OP article) to want to learn it and
ask "Which Lisp should I learn? I want something that is used a lot today" are
completely missing the point. You'll notice that most programming today is
done for money, in languages that are geared towards popularity and
commoditization. For me, programming is an Art or high philosophy if we take
the latter to stand for love of wisdom. And as someone has said, philosophical
truths are not passed around like pieces of eight, but are lived through
praxis.

P.S. The talk by Gerry Sussman
([https://www.infoq.com/presentations/We-Really-Dont-Know-How-To-Compute](https://www.infoq.com/presentations/We-Really-Dont-Know-How-To-Compute))
that I saw mentioned on HN yesterday provides an
excellent demonstration of metalinguistic abstraction and also serves as an
amalgamation of some of my other ideas about Lisp.

~~~
pjmlp
What I love about Lisp, ML and logic languages is how they come down to the
basics of CS, Algorithms + Data Structures.

Ideally the same solution can then be applied to a single core, hybrid CPU +
GPU, clustered environments, whatever.

Yes, abstractions do still leak, especially if optimization for the very last
byte/ms is required, but productivity is much higher than if that were a
concern for each line of code being produced.

And 90% of the time the solutions are good enough for the problem being
solved.

------
catacombs
Lisp is God's language. TempleOS is God's operating system.

------
vertline3
In the XKCD comic "Lisp", God also says the universe was mostly hacked
together with Perl.

Edit: I see I read the story but I didn't click the first comic, just the
second one. Sorry, disregard.

Another language that gets holy reverence is Smalltalk.

~~~
piinbinary
[https://xkcd.com/224/](https://xkcd.com/224/)

~~~
vertline3
Thank you!

------
ASipos
> a field as esoteric as “recursive function theory”

How can this kind of description persist when recursive functions are _the
very thing_ that a computer computes?

~~~
AnimalMuppet
Are they? They are the very thing that lambda calculus models computation on.
They are _not_ the very thing that Turing Machines model computation on. They
also are _not_ the very thing that CPUs use, or that assembly uses, or even
that most high-level languages use. (Unless you're going to say that most
high-level languages use a recursive descent parser to generate the binaries
that the CPUs run, and _that 's_ the basis for your statement...)

~~~
ASipos
Neither lambda calculus, nor Turing machines "model computation on" recursive
functions. Rather, these two, and all the other models of computation are
equivalent as they compute the same class of functions from the naturals to
the naturals. And we call that class the class of recursive functions.

That being said, a computer is a physical device that is able to compute this
entire class of recursive functions. It is what makes it a _computer_ and not
a pen or a chair. It's not some esoteric notion. It's the whole thing about
it.
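
To give one concrete member of that class (my example, not part of the
original point): the Ackermann function is total recursive but not primitive
recursive, and any computer worth the name can evaluate it.

```lisp
;; Ackermann's function: total recursive, not primitive recursive.
(defun ackermann (m n)
  (cond ((= m 0) (+ n 1))
        ((= n 0) (ackermann (- m 1) 1))
        (t (ackermann (- m 1) (ackermann m (- n 1))))))

;; (ackermann 2 3) => 9
```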

~~~
AnimalMuppet
That's like saying that the basis of computation is the standard model, since
all actual physical computers are made out of particles that are in the
standard model. It may be _true_ , but it's so far abstracted from what's
actually going on that it's not _useful_ at all.

------
keithnz
I thought god mucked around on quantum computers, and I think we can all feel
a little bit better that god can't seem to write bug free code either.

~~~
wwweston
Either that, or it's all a feature.

~~~
azernik
And on the seventh day, he set the version number to pi and declared all bugs
to be features.

------
leshow
I'm not familiar with any of these memes about Lisp. If anything, the stuff
you're claiming about Lisp I've heard about Haskell.

------
oldandtired
Lisp - oh what it could have been. It had such potential, but then it got
broken. I find it fascinating that those who are dedicated to the
proselytisation of Lisp don't see the brokenness of the language. For them,
all of the broken things are the features of the language.

Scheme was one attempt to fix some of those flaws.

In later times, we see the development of Kernel to fix other flaws.

So many second class citizens, so many exceptions to the rule.

I am going through the source code for Maxima CAS (written in Lisp) and in so
many ways, it's a mess. I am not at all disparaging those who have been
involved in writing the Maxima CAS system and its source. They have done an
incredible job and what they have achieved is remarkable.

However, like any software system of any complexity in any language, it has
lots of areas that are difficult to maintain, let alone advance. In that
regard, Lisp has not been as advantageous a language as it could have been.

Lisp (as in Common Lisp and its add-ons) is not a simple language and it is
not a consistent language (see CLHS - Common Lisp Hyper Spec docs).

When I first came across it in the late 1970s, I thought "wow". But its
flaws quickly came to the fore.

So, there is no way that it would ever be God's own programming language.
Especially since, God doesn't need to program, that's just for us very limited
mortals.

~~~
mncharity
> then it got broken

Got broken? I think of it more as having failed to obtain/coordinate the
resources needed to progress.

What it means to have a healthy language ecosystem has advanced. 1970s Prolog
implementations couldn't standardize on a way to read files. 1980s Common Lisp
did, but had no community repo. 1990s Perl did, but few languages then had a
good test suite, and they were commercial and $$$$. Later languages did, but
<insert-your-favorite-thing-that-we-still-suck-at>.

And it's not easy for a language to move on. Prolog was still struggling with
creating a standard library decades later. Common Lisp and Python had decade-
long struggles to create community repos. A story goes that node.js wasn't
planning on a community repo, until someone said "don't be python".

The magnitude of software engineering resources has so massively ramped that
old-time progress looks like sleep or death. Every phone now has a UI layout
constraint system. We knew it was the right thing, craved it, for years...
while the occasional person had it as an intermittent hobby project. That was
just the scale of things. Open source barely existed. "Will open source
survive"? was a completely unresolved question. Commercial was a much bigger
piece of a much smaller pie, but that wasn't sufficient to drive the
ecosystem.

The Haskell implementation of Perl 6 failed because the community couldn't
manage to fund the one critical person. It was circa 2005, and the social
infrastructure needed to fund someone simply wasn't the practiced thing it is
now.

And we're still bad at all this. The JavaScript community, for all its
massive size, never managed to pick up the prototype-based programming skills
of Self and Smalltalk. The... never mind.

It's the usual civilization bootstrap sad tale. Society, government, markets,
and our profession, are miserably poor at allocating resources and
coordinating effort. So societally-critical tech ends up on the multi-decade
glacial-creep hobby-project-and-graduate-student installment plan. Add in
pervasively dysfunctional incentives, and... it becomes amazing that we're
making such wonderful progress... even if it is so very wretchedly slow and poor
compared to what it might be.

So did CL get broken? Or mostly just got left behind? Or is that a kind of
broken?

~~~
oldandtired
You raise interesting history, and it's good to see the perspective you've
given.

I don't know if Common Lisp got left behind or just took a completely
different path. From my perspective, it got broken with its macro system
decisions, its dynamic/static environment decisions, and its namespace
decisions. It created too many second class citizens within the language,
which means that you have to know far more than you should to understand any
part of the programs you are looking at.

Every choice a language designer makes affects what the language will do in
terms of programmer productivity, not only for the original developers of
programs using that language, but also for all those who come later when
maintaining or extending those programs.

I have come to the conclusion that a language can be a help when writing the
original program and become a hindrance when you need to change that program
for any reason. It is here that the detailed documentation covering all the
design criteria and coding decisions, algorithm choices, etc, become more
important than the language you may choose.

Both together will enable future generations to build upon what has been done.

All the points that you have highlighted above are important, but the
underlying disincentive to provide full and adequately detailed documentation
will work against community growth. No less today than in centuries past,
knowledge is hidden away: individuals are unwilling to pass on the critical
pieces unless you are part of the pack, or do not think it important enough to
write down because it is obviously obvious.

To understand a piece of Lisp code, one has to know what the special forms
are and how they interact, what the macros being used are and what code they
are generating, and what SPECIALness the various symbols might be hiding.
These things may help in writing the code, but they work
against future programmers in modifying the code. Having had to maintain
various code bases that I did not write in quite a variety of different
languages, I have found that "trickily" written code can become a nightmare to
bring about required changes. I have found that Lisp code writers seem to like
writing "trickily" written code.

Now, that is only one person's perspective and someone else may find something
completely different. That is not a problem as there are many tens of ....
programmers in the world. Each one having a perspective on how to write good
code.

~~~
mncharity
> namespace

Nod. I fuzzily recall being told years ago of ITA Software struggling to even
build their own CL code. Reader-defined-symbol load-order conflict hell, as I
recall. And that was just a core engine, embedded in a sea of Java.

> second class citizens

I too wish something like Kernel[1] had been pursued. Kernel languages
continue to be explored, so perhaps someday. Someday capped by AI/VR/whatever
meaning "it might have been nice to have back then, but old-style languages
just aren't how we do 'software' anymore".

> detailed documentation covering all the design criteria and coding decisions

As in manufacturing, inadequate docs can have both short and long-term
catastrophic and drag impacts... but our tooling is really bad, high-burden,
so we've unhappy tradeoffs to make in practice.

Though, I just saw a pull request go by, adding a nice function to a popular
public api. The review requested 'please add a sentence saying what it does.'
:)

So, yeah. Capturing design motivation is a thing, and software doesn't seem a
leader among industries there.

> enable future generations to build upon what has been done.

Early python had a largely-unused abstraction available, of objects carrying C
pointers, so C programs/libraries could be pulled together at runtime. In an
alternate timeline, with only slightly different choices, instead of
monolithic C libraries, there might have been rich ecology. :/ The failure to
widely adopt multiple dispatch seems another one of these "and thus we doomed
those who followed us to pain and toil, and society to the loss of all they
might have contributed had they not been thus crippled".

> To understand a piece of Lisp code [...struggle]

This one I don't quite buy. Java's "better for industry to shackle developers
to keep them hot swappable", yes, regrettably. But an inherent struggle to
_read_? That's always seemed to me more an instance of the IDE/tooling-vs-
language-mismatch argument. "Your community uses too many little files
(because it's awkward in my favorite editor)." "Your language shouldn't have
permitted Unicode for identifiers (because I don't know how to type it, and my
email program doesn't like it)." CL in vi, yuck. CL in Lisp Machine Emacs...
was like vscode or eclipse, for in many ways a nicer language, that ran
everything down to metal. Though one can perhaps push this argument too far,
as with smalltalk image-based "we don't need no source files" culture. Or it
becomes a "with a sufficiently smart AI-complete refactoring IDE, even this
code base becomes maintainable".

But "trickily" written code, yes. Or more generally, just crufty. Perhaps
that's another of those historical shifts. More elbow room now to prioritize
maintenance: performance less of a dominating concern; more development not
having the flavor of small-team hackathon/death-march/spike-into-production.
And despite the "more eyeballs" open-source argument perhaps being over
stated, I'd guess the ratio of readers to writers has increased by an order of
magnitude or two or more, at least for popular open source. There are just so
very many more programmers. The idea that 'programming languages are for
communicating among humans as much as with computers' came from the lisp
community. But there's also "enough rope to hang yourself; enough power to
shoot yourself in the foot; some people just shouldn't be allowed firearms (or
pottery); safety interlocks and guards help you keep your fingers attached".

One perspective on T(est)DD I like, is it allows you to shift around ease of
change - to shape the 'change requires more overhead' vs 'change requires less
thinking to do safely' tradeoff over your code space. Things nailed down by
tests, are harder to change (the tests need updating too), but make surrounded
things easier to change, by reducing the need to maintain correctness of
transformation, and simplifying debugging of the inevitable failure to do so.
It's puzzled me that the TDD community hasn't talked more about test lifecycle
- the dance of adding, expanding, updating, and pruning tests. Much CL code
and culture predated testing culture. TDD (easy refactoring) plus insanely
rich and concise languages (plus powerful tooling) seems a largely unexplored
but intriguing area of language design space. Sort of haskell/idris T(ype)DD
and T(est)DD, with an IDE able to make even dense APL transparent, for some
language with richer type, runtime, and syntax systems.

Looking back at CL, and thinking "like <current language>, just a bit
different", one can miss how much has changed since. Which hides how much
change is available and incoming. 1950's programs each had their own
languages, because using a "high-level" language was implausibly heavy. No one
thinks of using assembly for web dev. Cloud has only started to impact
language design. And mostly in a "ok, we'd really have to deal with that, but
don't, because everyone has build farms". There's
[https://github.com/StanfordSNR/gg](https://github.com/StanfordSNR/gg)
'compile the Linux kernel cold-cache in a trice for a nickel'. Golang may be
the last major language where single-core cold-cache offline compilation
performance was a language design priority. Nix would be silly without having
internet, but we do, so we can have fun. What it means to have a language and
its ecosystem has looked very different in the past, and can look very
different in the future. Even before mixing in ML "please apply this behavior
spec to this language-or-dsl substrate, validated with this more-
conservatively-handled test suite, and keep it under a buck, and be done by
the time I finish sneezing". There's so much potential fun. And potential to
impact society. I just hope we don't piss away decades getting there.

[1]
[https://web.cs.wpi.edu/~jshutt/kernel.html](https://web.cs.wpi.edu/~jshutt/kernel.html)

~~~
oldandtired
My point about "understanding the code" and the burden of additional
information to retain is about the semantics applicable to the language
itself, not about the tooling that we have built around it for development.

Lisp started with some core simple ideas to which were added many others. For
some, like dynamic scoping, simple as the idea is, there are complex
interactions with the rest of the language. These interactions increase the
knowledge burden that must be retained at all times to be able to make sense
of what you are reading. This burden is on top of any knowledge burden you
need to carry in relation to the application you are modifying or maintaining.

This is about what are the things you design as part of your language, not the
things you do with your language. This was what I was trying to somewhat
humorously write in my first comment. As I look back over it, I failed to make
that clear.

Lisp had the beginnings of "wow", but then it took a wrong turn down into a
semantic quagmire. Scheme started to fix that and later Kernel was another
attempt.

~~~
mncharity
> failed to make that clear [...] the burden of additional information to
> retain is about the semantics applicable to the language itself, not about
> the tooling that we have build around it for development. [...] knowledge
> burden that must be retained at all times to be able to make sense of what
> you are reading

Not lack of clarity I think - it seems there's a real disagreement there. I
agree about the burden, and the role of complex semantics in increasing it.
But I think of _bearing_ the burden as more multifaceted than being solely
about the language. I think of it as a collaboration between language, and
tooling, and tasteful coding. For maintenance, the last is unavailable. But
there's still tooling. If the language design makes something unclear and
burdensome, it seems to me sufficient that language tooling clarifies it and
lifts the burden. That our tooling is often as poor as our languages perhaps
makes this distinction less interesting. But a shared attribution seems worth
keeping in mind - an extra point of leverage. Especially since folks so often
choose seriously suboptimal tooling, and tasteless engineering, and then
attribute their difficulties to the language. There's much truth to that
attribution, but also much left out.

Though as you pointed out, cognitive styles could play a role. I was at a
mathy talk with a math person, and we completely disagreed on the adequacy of
the talk. My best trick is "surfing" incompletely-described systems. His best
trick is precisely understanding systems. Faced with pairs of code and
tooling, I could see us repeatedly having divergent happiness. Except where
some future nonwretched language finally permits nice code.

