
The Roots of Lisp (2001) - afurrysolver
http://www.paulgraham.com/rootsoflisp.html
======
tromp
> In 1960, John McCarthy published a remarkable paper in which he did for
> programming something like what Euclid did for geometry.

One could argue that Church's (1930s) lambda calculus, which underlies LISP,
is a closer analogue to Euclid's distillation of the essence of geometry.

With the minimal addition of pure binary IO, the lambda calculus is easily
transformed into an untyped programming language [1]. On the other hand,
full-blown, richly typed pure functional programming languages like Haskell
remain semantically faithful to their lambda calculus underpinnings.

[1]
[https://tromp.github.io/cl/Binary_lambda_calculus.html](https://tromp.github.io/cl/Binary_lambda_calculus.html)

~~~
nikofeyn
I don't think it's accurate to say lambda calculus underlies Lisp, is it?

Certainly lambda calculus was used to define functions in Lisp, as seen in the
original Lisp paper (or at least the most famous one).

[http://jmc.stanford.edu/articles/recursive.html](http://jmc.stanford.edu/articles/recursive.html)

Lambda calculus was a tool, but it seems Gödel's and Turing's work on
computing machines and recursive functions was more fundamental to the
original Lisp.

~~~
curious_yogurt
The claim seems accurate to me, at least for Scheme: that is, lambda calculus
does underlie Scheme.

Lambda calculus, as articulated in Church's 1936 paper [1], gives us a way to
understand computability. But if we look specifically at Operation II in the
paper (p. 347), what we have is the lambda function in Scheme. Given that
every function in Scheme (including special forms) is equivalent to some
lambda function, it seems that in virtue of Operation II we can say that
lambda calculus underlies Scheme.

Whether we can push this further to say that lambda calculus underlies
McCarthy's initial conception of LISP, or some arbitrary version of Common
Lisp, I do not know for certain. But if all these versions treat lambda
functions the same way as Scheme does (in relevant respects), then the claim
holds for Lisp generally.

[1]: Church. An Unsolvable Problem of Elementary Number Theory. _American
Journal of Mathematics_, Vol. 58, No. 2. (Apr., 1936), pp. 345-363.

------
bsaul
I recently had a conversation about how functional programming and declarative
programming are somewhat linked, and how I've observed that languages that
start down the functional road slowly add more and more capabilities for
DSLs, and then slowly move toward becoming a Lisp. But that was just my
feeling.

Is this theorized somewhere? Or am I just dreaming?

~~~
dgb23
Anecdotal experience:

(dynamic) functional programming enables very 'easy', succinct abstractions
by factoring out concretions and composing functions out of a general
toolset. Think code reuse in the small.

With some discipline you naturally end up with a very data-oriented codebase,
where the specificity of your application/library ends up in your data
structures.

Now these data structures are really just plain data, without specific
behavior attached to them, so they embody the (perhaps almost) declarative
part of your program/system. The next step would be to polish them to make
them more human-readable.

An additional feature of Lisps is their homoiconicity and macros. These come
into play when plain data structures are no longer the right fit. Typically
you want to handle some user/client-defined behavior. You can now transform
the syntax of the language itself to clean up your API, reduce boilerplate,
and increase readability.

In a sense you have now programmed your own DSL.
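
To make that last step concrete, here is a minimal Clojure sketch. The
`defhandler` macro and the `handlers` registry are made up for illustration,
not a real library; the point is that the macro absorbs the registration
boilerplate that every call site would otherwise repeat:

```clojure
;; Hypothetical example: a tiny "DSL" for registering named handlers.
(def handlers (atom {}))

(defmacro defhandler
  "Registers the body as a handler fn under a keyword derived from name."
  [name args & body]
  `(swap! handlers assoc ~(keyword name) (fn ~args ~@body)))

;; Boilerplate-free call site:
(defhandler greet [user]
  (str "hello, " user))

;; ...which expands to roughly:
;; (swap! handlers assoc :greet (fn [user] (str "hello, " user)))

((:greet @handlers) "world") ;; => "hello, world"
```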

------
TruffleLabs
McCarthy wasn’t just interested in making Lisp for AI; he was thinking about
the possible applications of AI, with Lisp as a platform to get to more
advanced states of understanding. An example of this is his paper “Computer
Control of a Machine for Exploring Mars.” Stanford Artificial Intelligence
Project, Memo No. 14, June 15, 1964. Author: McCarthy, John, 1927-2011;
Stanford University Library collections.

[https://stacks.stanford.edu/file/druid:qh147kq8662/SC1041_SA...](https://stacks.stanford.edu/file/druid:qh147kq8662/SC1041_SAIL_AIM_014.pdf)

------
yters
If I look at the operon structure of a genome, and squint a bit, it looks very
similar to the LISP structure:

(func var0 var1 ... varn)

Could it be that the design of LISP is inspired by the structure of the
genetic code?

~~~
abhiyerra
I always thought this as well. In Lisp, data is the program, similar to how
DNA is data as well as a program.

~~~
tabtab
We are all just mutated Lisp...and mutts.

[https://xkcd.com/224/](https://xkcd.com/224/)

------
ctkrohn
Funny to see this after all these years. I read this paper just before
starting college in 2003 and it inspired me to spend most of the summer
writing a rudimentary Lisp interpreter in rudimentary C++. That was my first
piece of code that actually worked and was more than a dozen or so lines long.
Good memories.

------
blackrock
I get that Lisp is popular because of its meta-programming abilities, i.e. its
macro system. And the reasoning was that you didn’t have to wait for the
language writers to implement the feature you wanted, since you could
implement it yourself as a macro.

But this presented other problems, namely that you might end up coding your
libraries on an island, where no one else could understand your code. This is
probably why Lisp is most powerful when there are only one or a few
programmers working on the code base.

But with the increase in other programming languages, and the massive number
of libraries out there, is this macro system really all that necessary
anymore?

~~~
slifin
Clojure is a Lisp that runs on top of Java, JavaScript, and a couple of other
platforms, and has access to all of those libraries via npm and Maven.

Having more libraries than most languages doesn't nullify the use of macros.
Macros are usually discouraged in application code because they don't compose
well; don't make a macro if a function will do.

However, having macros meant that when Go popularised CSP concurrency at the
language level, we didn't need to change the language (for all variants of
Clojure); we could just make a library with macros to do CSP concurrency, and
it works in the browser, on the server, etc.

Completely opt-in, well documented as a library, and we didn't weigh down the
language with this feature. If or when the next big concurrency thing comes
along, we won't be looking to deprecate go channels from the language.

Macros are incredibly powerful, and you probably won't need them often, but
for those small cases of scratching a particular itch they're a
get-out-of-jail-free card.
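
For the curious, the library in question is core.async. A minimal sketch of
the opt-in CSP style (assuming core.async is on the classpath):

```clojure
(require '[clojure.core.async :refer [chan go >! <!!]])

;; go is a macro: it rewrites its body into a state machine, so channels
;; and lightweight processes arrive as a library, not a language change.
(def c (chan))

(go (>! c (* 6 7)))   ;; a lightweight process puts a value on the channel

(println (<!! c))     ;; blocking take on the JVM; prints 42
```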

------
coder1001
It amazes me how @pg managed to conquer the business side of startups and
start YC, given his very technical background!

To me, it is not very common for (very) technical people to be good at both
tech and business!

~~~
fouc
Yet some of the most successful business people seem to be good at tech. I
heard that half of the CEOs of Fortune 500 companies have a STEM degree.

~~~
Hendrikto
My guess: STEM degrees require and foster critical thinking, logic, and
abstraction skills, which are all very conducive to being a good business
person.

------
jammygit
Which is the better introduction to Lisp: On Lisp (by pg), SICP, or another
text (maybe on Common Lisp)?

Or is it better to just go straight to Clojure these days?

~~~
lerax
Clojure is always the wrong option for getting into the principles of Lisp:

* Clojure uses non-standard Lisp syntax like []{}, unusual function declarations, and different semantics for some forms (like its cond usage, a powerful species of Lisp's switch-case);

* Clojure doesn't have cons pairs, nor the car and cdr operators. Cons pairs are a fundamental data structure for building compound data, explored thoroughly in SICP and in any Lisp textbook.

* Clojure has the batteries of Java, so you get a giant ecosystem for building complex software, but the evil parts of the JVM are inherited too. For a first contact with Lisp, this can be an unnecessary pain.

I would recommend starting a Lisp journey with Land of Lisp or Practical
Common Lisp (PCL), both focused on Common Lisp. Land of Lisp covers a lot of
the history of Lisp's beginnings, and the author writes in a fun way, with
lots of cartoons and xkcd-like humor. The book builds a collection of game
projects, one per chapter, teaching principles of the language. PCL is very
useful for understanding specific parts of the language; I used it as a
complement when Land of Lisp was not sufficient (the loop chapter in PCL is
very good).

~~~
tosh
Another angle:

While Clojure is a Lisp with its own distinct flavor, it gives you …

* a large community of practitioners and professionals (great for asking questions, finding collaborators, …)

* easy access to libraries in the js/jvm/.net ecosystems

* a style that relies more on data and transformations of data (illuminating simplicity)

In any case: it is worth digging deeper. One thing that kept me away was not
knowing where to start (analysis paralysis). In hindsight, picking any Lisp
would have been great (instead of postponing).

Find a thread and start pulling :)

Rich Hickey’s talks were a great entry point for me
[https://github.com/tallesl/Rich-Hickey-
fanclub](https://github.com/tallesl/Rich-Hickey-fanclub)

e.g.
[https://www.youtube.com/watch?v=rI8tNMsozo0](https://www.youtube.com/watch?v=rI8tNMsozo0)
(the “Simplicity Matters” keynote at Rails Conf 2012)

edit: Land of Lisp is a great book as well

------
terminaljunkid
IMO Lisp is just overrated: it lacks visual cues, reads right to left, has
horrible nesting, etc.

Sure, it has some good concepts. But fanboys on the internet make it seem like
some God-tier thing.

~~~
coldtea
"Some good concepts"?

It was the first language to add "if/else" constructs, GC, closures,
first-class functions, reference semantics, and recursion.

It took between 5 and 40 years before these became available in mainstream
languages (conditionals like if/else were adopted early; GC not so much;
closures took even longer). Add to that macros and the flexibility of runtime
evaluation / code creation, which most mainstream languages still lack.
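
To illustrate that last point with Clojure (any Lisp would do): code is
ordinary list data that can be constructed and evaluated at runtime.

```clojure
;; Build code as a plain list, then run it -- no separate compile step needed.
(def expr (list '+ 1 2))  ;; => the list (+ 1 2), ordinary data
(eval expr)               ;; => 3
```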

And of course, Lisp's rules remain (and will always be) the most succinct way
to define a full-blown programming language with meta-programming facilities
to boot -- as opposed to a mere Turing-machine-like thing or an assembler.

> _IMO Lisp is just overrated, it lacks visual cues, reads right to left with
> horrible nesting etc.._

This doesn't make any sense...

The nesting is the same as in almost any language.

"reads right to left" \- huh?

You know that Lisp code can be indented right?

~~~
tabtab
As far as "Lisp lacks visual clues", after many heated debates on this, I've
concluded that the ugliness and rigidness of "production" languages enforces
certain visual and syntactic standards that makes reading others' code easier.
Indentation of Lisp won't "solve" this because production languages also can
be indented. It's not a difference maker.

In those languages, parameter lists are wrapped in parentheses and separated
with commas, while code blocks are wrapped in curly braces and separated with
semicolons.

Lisp won't do this, because it would then be harder to blur/merge/change the
distinction between code and data, which is the very power of Lisp.

Building architectural eyesores is probably not a rational plan for a city,
but dammit, those eyesores help you know and remember where you are.

Some people have a certain kind of eye and/or brain that allows them to read
Lisp quickly, but this may not be universal. Some say "with enough time you'll
get used to it", but the fact that Lisp has been around for 60-odd years
without catching on in the mainstream is evidence to the contrary. Somebody
would be rich by now off its alleged advantages if they were real (beyond its
use as a "write-only" startup language).

Functional languages are better at expressing ideas but slower at
communicating them to other readers, on average. I also personally find them
difficult to debug because they don't have enough "intermediate state" to
x-ray for debugging purposes. Here's a conceptual illustration:

    // imperative:
    a = af(param);
    b = bf(a);
    c = cf(b);

    // functional:
    c = cf(bf(af(param)));

In a debugger, and/or with write statements, I can readily examine the
intermediate values "a" and "b". Not so with the functional version. SQL
presents a similar issue: its new WITH statements help out, but only partly.
Divide-and-conquer works better if you can examine the divisions.

~~~
armitron
Your brain adapts so that you can read something more easily and quickly the
more you are exposed to it. For me, Lisp code couldn't be easier or faster to
read and comprehend.

I'm convinced most of the people making jokes about parentheses, or about
Lisp code being hard to read, are superficially dismissing it without putting
in even a minimal effort to work with it.

Things that don't immediately click are discarded. Individual curiosity
("Really smart people say great things about this, I wonder why that is..")
leading to individual effort leading to deep understanding is not the
prevailing attitude.

Sad state of affairs.

~~~
lispm
I think the syntax has some hurdles, but it's not the parentheses: it's
mentally understanding when lists and symbols are data and when they are code.
That's a problem not found in other programming languages, and at the same
time it is an interesting feature. Lisp is not alone in having such hurdles --
another example would be Haskell, which is also more difficult to learn than
the average programming language (lazy evaluation, the type system, monads,
...).

Lisp has often been used as a teaching language for computer science concepts
(recursion, evaluation models, algorithms, etc.), and thus students associated
it with novel concepts they struggled with rather than with solving practical
problems. A typical example is the SICP book. It's great, but mostly CS- and
mathematics-oriented, so the resulting feedback is mixed.

