
The Nature of Lisp - llambda
http://defmacro.org/ramblings/lisp.html
======
akkartik
Part of the problem is that lisp evangelism sets itself up to fail. An
instantaneous blinding moment of enlightenment, would you like fries with
that? Haven't they heard that you shouldn't start a joke with "This is the
most hilarious thing ever"?

I've been doing lisp for several years now. I've built several interpreters.
I've never had the enlightenment he describes. The minor epiphanies have been
on par with oo design and unit tests. I've travelled far over the months, but
it's closer to grok than zen.

~~~
ChuckMcM
Enlightenment epiphanies result in proselytizing. Can't be helped; it's like
how tapping your knee with a rubber mallet makes your leg kick out.

The 'secret', or the thing that most people don't get early on when
programming, is that code is data and data is code. A binary tree is data that
is carefully surrounded by the semantics of the data's relationship with its
peers. Reading the structure reads out the data in sorted order. Lisp just
makes that painfully clear, that there is no distinction between state and
semantics as far as computers are concerned and it allows you to move the
'computation' between data structures and algorithm at any point.
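That point can be sketched even outside Lisp. In the Python fragment below (a hypothetical mini-evaluator; `run` is a made-up name), the same nested list is data you can inspect and code you can execute:

```python
def run(tree):
    # A nested list like ["+", 2, ["*", 3, 4]] is plain data...
    if not isinstance(tree, list):
        return tree
    op, args = tree[0], [run(a) for a in tree[1:]]
    # ...until we choose to read it as a program and evaluate it.
    return {"+": lambda a, b: a + b, "*": lambda a, b: a * b}[op](*args)

program = ["+", 2, ["*", 3, 4]]
print(len(program))   # inspected as data: 3 elements
print(run(program))   # executed as code: 14
```

The 'computation' lives wherever you put it: in the structure of the list, or in the evaluator that walks it.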

A grad student at USC explained it well when he compared it to learning your
third or fourth spoken language: suddenly your brain "flips" from having
three or four different ways of naming 'milk' to a single concept of milk
with an infinite number of ways to identify it. The relationship between the
root concept and the expression of that concept changes precedence in your
thought process.

Once you have made that switch you can write code in any computer language.

~~~
kinleyd
Very nicely put. IMHO, it could serve as the tl;dr for the OP, which was also
excellent.

------
gliese1337
Reading the related article on writing a Lisp interpreter in Haskell
(<http://news.ycombinator.com/item?id=4764088>) reminded me of my _second_
blinding moment of enlightenment: understanding vau expressions. Things that
can't be implemented as functions are typically things that require
controlling the evaluation of arguments (conditionals, assignment, short-
circuiting boolean operators, etc.), and additional language features (built-
in special forms or macros for writing your own) are included to handle those.
But if you have something that allows you to control the evaluation of
arguments, simply choosing to evaluate all your arguments gives the equivalent
of a function. Implement that thing, and your compiler/interpreter no longer
needs to know about the difference between functions and macros and built-in
forms; they're all the same thing!
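A toy sketch of the idea in Python (all names are made up, and Kernel's actual semantics are much richer): every combiner receives its operands unevaluated, an operative like `$if` decides what to evaluate, and `wrap` turns an operative into an ordinary function by evaluating all operands first.

```python
def evaluate(expr, env):
    if isinstance(expr, str):          # a symbol: look it up
        return env[expr]
    if not isinstance(expr, list):     # a literal: self-evaluating
        return expr
    combiner = evaluate(expr[0], env)
    return combiner(expr[1:], env)     # operand trees passed UNevaluated

def wrap(f):
    # An applicative is just an operative that chooses to evaluate
    # all of its operands first -- the "function" case falls out for free.
    return lambda operands, env: f(*(evaluate(o, env) for o in operands))

def _if(operands, env):
    # An operative: it controls evaluation, so no special form is needed.
    test, then, alt = operands
    return evaluate(then if evaluate(test, env) else alt, env)

env = {"+": wrap(lambda a, b: a + b),
       "<": wrap(lambda a, b: a < b),
       "$if": _if}

# The alternative branch is an unbound symbol; it is never evaluated.
print(evaluate(["$if", ["<", 1, 2], ["+", 20, 22], "undefined-symbol"], env))  # 42
```

Note that the evaluator has no special cases at all: `$if` and `+` go through exactly the same code path.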

There's not a lot of practical use for that kind of thing that I am aware of
(implementing run-time macros is one, being able to pass short-circuiting
boolean operators to map, reduce, etc. is another), but I strongly suspect
that's just because we don't have 30 years of collective experience figuring
out all of the great things about vau expressions like we have with Lisp and
anonymous functions. The only language (discounting toy projects) I know of
that actually implements them is Kernel
(<http://web.cs.wpi.edu/~jshutt/kernel.html>).

~~~
Peaker
A language that is lazy by default lets you control the evaluation of
arguments. Ordinarily, they're not evaluated, and if you force them they are.

However, macros are not just about whether to evaluate -- but about exposing
the internal syntactic structure of the arguments.

In Haskell, using laziness you can implement control flow, short-circuiting,
etc. If you want functions that work with the syntactic structure of their
arguments, you need heavier machinery:

* Thick DSLs: Define explicit AST types and have a DSL that explicitly constructs those ASTs.

* Template Haskell (the arguments' syntax has to be in ordinary Haskell)

* Quasiquotes (Need to parse strings)

I think the need for exposed syntax is relatively rare (e.g. a function that
evaluates and shows a trace of the evaluation). In those cases, I think
explicit AST types work pretty well, as Haskell has extremely lightweight
syntax for constructing user-defined data types.
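In a strict language you can recover the control-flow half of this with explicit thunks. A rough Python analogue (`my_if` and `boom` are made-up names): the untaken branch is simply never forced.

```python
def my_if(cond, then_thunk, else_thunk):
    # Control flow as a plain function: callers pass thunks (zero-argument
    # lambdas) instead of values, so the untaken branch is never evaluated.
    return then_thunk() if cond else else_thunk()

def boom():
    raise RuntimeError("should never be evaluated")

print(my_if(True, lambda: "taken", boom))  # prints "taken"; boom never runs
```

This is exactly what laziness gives you for free: in Haskell the thunking is implicit, so `myIf` would look like an ordinary function call.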

~~~
gliese1337
Without access to vocal inflection, I'm not sure if you're intending to argue
or to expand. So, I'm gonna go with continuing to expand on the point.

Simple laziness does not allow you the same level of control over evaluation
as vau expressions do. A vau expression can choose to evaluate its arguments
exactly once (like call-by-value), exactly as many times as they're used
(like call-by-name), only if they are used (like laziness), as many times as
you feel like, in a different environment than the calling context, or not at
all, and it can make that decision independently for every argument.
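Two of those strategies can be sketched with explicit thunks in Python (hypothetical names; a real vau expression makes this choice per argument, at the callee):

```python
calls = {"n": 0}

def arg():
    # Stands in for an expensive (or effectful) argument expression.
    calls["n"] += 1
    return 21

def call_by_name(thunk):
    # The argument is re-evaluated every time it is used.
    return thunk() + thunk()

def call_by_need(thunk):
    # The argument is evaluated at most once and cached (laziness + memoization).
    cache = []
    def force():
        if not cache:
            cache.append(thunk())
        return cache[0]
    return force() + force()

print(call_by_name(arg), calls["n"])   # 42 2
calls["n"] = 0
print(call_by_need(arg), calls["n"])   # 42 1
```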

In Kernel's implementation at least, unevaluated operands are AST types that
can be poked and modified, not opaque values like lazily-evaluated operands.
As a result, vau expressions can be used to implement macros, both the hygienic
and non-hygienic variety, and the language need not define quoting or
quasiquoting because those features can also be implemented within vau
expressions.

Vau expressions seem to play havoc with static analysis, though, so there are
good arguments for actually having some of those things as built-in language
features rather than just building everything as a standard library.

~~~
Peaker
I was expanding (with a slight correction about macros doing more than just
controlling evaluation).

Haskell-style laziness comes with purity, where it does not matter much
whether something is evaluated once or many times. It does matter whether it
is evaluated zero times or more, though (due to non-termination and
exceptions).

The opacity of values is what I meant by macros also exposing the syntax as
opposed to just controlling evaluation.

------
sjmulder
It would seem to me that some variety of Lisp would be the ideal candidate as
a sort of runs-everywhere language, a thin portable base language that runs on
top of different runtimes, offering easy integration with whichever it is
running on.

Basically, something like a minimalist Clojure but not just for Java. It would
be able to run atop the CLR, JavaScript or the Objective-C runtime as well.
The interface with the host platform may be different, as long as the core
language works everywhere. Ideally the core would be tiny.

~~~
mischov
So something like Clojure, but that runs on the CLR and Javascript as well?

CLR - <https://github.com/richhickey/clojure-clr>
JS - <https://github.com/clojure/clojurescript>
Python - <https://github.com/halgari/clojure-py>
Lua - <https://github.com/raph-amiard/clojurescript-lua>
C - <https://github.com/schani/clojurec>

(Sorry, couldn't resist.)

~~~
netfeed
Ruby - <https://github.com/unnali/rouge>

------
bitops
Very good article, though I doubt it'll convince the usual mass of
unbelievers. (I love Lisp, for the record, though my primary exposure has been
through Emacs Lisp - so shoot me).

A really great book that helps you appreciate the concepts in Lisp,
without really talking about Lisp directly too much, is "Patterns of Software"
by Peter Gabriel. <http://amzn.to/TxDKGG>

I found it to be a very enlightening read. Definitely a book you have to sink
into with plenty of time and quiet.

~~~
blue1
_Richard_ Gabriel. Though Peter Gabriel would be good too :)

~~~
bitops
Haha, oops! Too late to edit. Good catch. : )

------
nnq
...a bit offtopic, but I was wondering while reading the example of using C
itself as the C preprocessor language: why don't languages provide the ability
to do this kind of thing automagically? I mean marking some code to be
executed at compile time to act as a code generation feature. (I know it's
easy enough to write a small preprocessor that does it, and it's just
primitive string-based macros, but having a standard way to do it baked into
the build tools, or into the interpreter for an interpreted language, seems
...neat ...even cool if some more "magic sauce" were added to make these
"macros" hygienic :) ).

~~~
merijnv
> why don't languages provide the ability to do this kind of thing
> automagically, I mean marking some code to be executed at compile time and
> act as a code generation feature?

There are certainly languages that already do this type of thing. Haskell has
Template Haskell, which lets you execute Haskell code at compile time to
generate code. I'm pretty sure multiple MLs also have similar meta-programming
features.

It works rather nicely, actually.
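Python has no compile phase in the Template Haskell sense, but decorator code does run once at class-definition time and can generate code. A rough analogue (`add_getters` is a hypothetical helper, not a library function):

```python
def add_getters(*fields):
    # Runs at class-definition time and synthesizes one method per field.
    def decorate(cls):
        for f in fields:
            # the lambda-factory pins down f for each generated method
            setattr(cls, "get_" + f, (lambda f: lambda self: getattr(self, f))(f))
        return cls
    return decorate

@add_getters("x", "y")
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
print(p.get_x(), p.get_y())  # 1 2
```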

~~~
bad_user
Scala 2.10 will have it too.

There is a distinction to be made: in non-homoiconic languages writing macros
takes a lot of effort, while in Lisp it's very natural.

On the other hand, I don't feel that's an advantage for Lisp, because macros
are not composable the way functions are, and you have to really grok Lisp in
order to write macros effectively and to recognize instances where they are
appropriate.

~~~
Peaker
I don't think writing Template Haskell macros takes a lot of effort. It is
probably harder than Lisp macros, but the main effort is studying the TH API
once.

------
jon6
To understand Lisp is to understand interpreters. With that understanding you
can create domain specific languages which is extremely powerful.

But I wouldn't recommend using Lisp itself.. macros in particular are
unhygienic.

~~~
p4bl0
Common Lisp != Lisp. You mean Common Lisp; Lisp is the family of languages
(which also includes the Scheme sub-family, Racket, Clojure, Arc, Kernel…).

~~~
jon6
It's disingenuous to call Scheme and Racket Lisps. They have parentheses and
first-class functions, but the similarities stop there.

So yes, I equate Lisp with Common Lisp. I didn't read the entire article (far
too long) but he does mention 'defmacro', which is in Common Lisp.

~~~
mattdeboard
So, homoiconicity is a trifling, meaningless similarity?

"Sure, it may be homoiconic, use prefix notation, have first-class functions
(in addition to all the other usual functional paradigms that aren't unique
to lisps) but it's not a lisp."

Big ok to that one. This must be pedantry of the highest caliber, not
ignorance.

~~~
jon6
They are not homoiconic. The underlying data structure for many Schemes, and
for Racket, is not a list; it is a syntax object. Of course you can still do
metaprogramming with syntax objects, but I wouldn't call it the same thing.

~~~
elibarzilay
You should be careful here, and not lump together "many Schemes" and "Racket"
(or other specific Scheme implementations). The thing is that Scheme standards
have traditionally avoided tying the language with a macro system that
requires some specific representation for syntax -- giving you only the simple
rewrite rules system means that you don't actually need to know that
representation.

In Racket, OTOH, there are definitely syntax objects with enough functionality
to write code that handles them, and I suspect that you know that. The
question is whether this should be considered "homoiconic" or not, but this is
a kind of a subjective issue, since at an extreme, I can say that all
languages that have strings are homoiconic. Perhaps you need more from the
language to make it so, maybe eval, or maybe actually require it to have
compile-time procedural macros? In any case, Racket will have all of the
features that CL does, so it is arguably at least "as homoiconic" as CL is.
But in fact, it has more than just s-expressions: these syntax objects are
basically sexprs + a bunch of stuff like source location and lexical context,
so in fact they represent _more_ than what lists in CL do. Should I then
conclude that Racket is more homoiconic than CL? And this is not a tongue-in-
cheek argument: in fact, many CL implementations are aware of the limits of
sexprs as good representation for code, and add things like source location
via a backdoor, like a hash table that maps pair objects to additional
properties. Racket does that in its basic syntax representation so IMO it's
fine to indeed consider it more homoiconic. And I also say that for the
addition of lexical context information -- that's something that is not only
included in the Racket syntax object, it's something that you just _cannot_
get in CL, so if homoiconicity is being able to have a high-level
representation of code (unlike raw strings), then this is another point where
Racket wins the pissing contest.

Finally, it's not that all of the "many Schemes" are limited as described
above -- many of them have their own macro systems with similar syntax
values, and that includes Schemes that follow R6RS, since it dictates syntax-
case, which comes with such values. It just happens that Racket has
traditionally been running at the front lines, so it's more advanced.

~~~
p4bl0
It's not really necessary to second you, but I'd like to add that "code as
data" is more real in Racket than in CL, since code is not just the AST: it's
also (as you point out) location and, more importantly, context. In this
setting, Racket's syntax objects are more "code as data" than CL's "code as
sexps" will ever be.

~~~
elibarzilay
Right. Perhaps a better way to summarize this is that:

* Lisp made the first giant step of having code representable as data for meta-programming, and chose sexprs to do so

* Common Lisp came later, and made the important step of _requiring_ this representation, which means that in every CL implementation you're required to have the code as data aspect

* But the flip side of this is that CL hard-wires _just_ sexprs, it forbids an extended type, which means that you can't get anything more than sexprs (without resorting to "extra properties" hash table tricks)

* Meanwhile, Scheme (R5 and others that have only `syntax-rules') took a step back by specifying only rewrite rules which can be implemented in any way an implementation chooses

* Some Scheme implementations _did_ use sexprs, but since they needed to encode more information (lexical context) they extended them into syntax values (note that some Scheme low-level macro systems try to present users with a simplified interface where user code sees just the sexprs)

* Later on, Racket took further steps and enriched its syntax values with "more stuff"

* R6RS got closer to this too, by adopting the syntax-case system (but some people had issues with "wrapping" symbols, since you can't do that with the hash table trick)

* And finally, R7RS (the "small" version) is going to take a step back into the R5RS days. (And in the "big" language it looks like they'll adopt one of these systems that try to keep the sexpr illusion.)

------
myoffe
Also, this is a very interesting article on why Lisp is unsuccessful in the
"real world": <http://www.winestockwebdesign.com/Essays/Lisp_Curse.html>

I like the original article a lot, but what it failed to do for me is convince
me why someone like me, a typical programmer, would want to choose Lisp over
Python/Ruby/etc to solve a real world problem. Both Ruby and Python have
powerful meta-programming abilities built into them. Lisp should be compared
with these, not with C.

I still think that functional programming is extremely interesting (I'm in the
long process of learning Haskell myself) and is useful in certain real-world
cases, but I was not convinced by this article. All the problems there are
easily solved in modern and dynamic languages.

------
secure
I found that a rather good introduction to code as data, but I am not sure
whether I am supposed to have been hit by the enlightenment he describes… :-)

~~~
S4M
Yes, for me the first time I read about the Lisp syntax I was thinking:

"oh cool, it makes (+ 2 2) exactly equivalent to the syntax tree

    
          +
         / \
        2   2
    

"

But I don't find it particularly enlightening and I still don't see what cool
stuff you can do with macros that you can't do elsewhere.

~~~
ced
Macros extend the power of the language way beyond its core primitives. For
instance, I wrote a macro, TEMPORARY-ASSIGN. I use it like this:

    
    
      (TEMPORARY-ASSIGN ((traversing obj) true)
         ... do stuff ...)
    

In Python, the equivalent code would be:

    
    
       old_trav = obj.traversing
       obj.traversing = True
       try:
          ... do stuff...
       finally:
          obj.traversing = old_trav
    

There's no way to abstract out that pattern in Python. Every time you want to
temporarily assign a field or variable, you're stuck writing the above code.
Another example:

    
    
       (defun foo (x y) ...)
    

is how you define a function in Common Lisp. I wrote a macro, DEFUN-CACHE

    
    
       (defun-cache foo (x y) ...)
    

which is the cached version. In Python, you can do the same with decorators,
but that's one more tacked-on feature. Lisp programmers have been writing
defun-cache for 40 years.
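For what it's worth, in modern Python (3.2+) the caching decorator is one line from the standard library:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # memoize: each distinct argument is computed once
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(100))  # returns instantly thanks to the cache
```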

If you want to learn more, Paul Graham's On Lisp is the definitive book on the
topic. You can download it for free <http://www.paulgraham.com/onlisp.html>,
and it's very readable, even if you're not a Lisper.

~~~
reinhardt
> There's no way to abstract out that pattern in Python.

I'm sure there are macros that can't be abstracted out in Python but this
isn't one of them:

    
    
        from contextlib import contextmanager
        
        @contextmanager
        def temp_assign(obj, attr, val):
            old_val = getattr(obj, attr)
            setattr(obj, attr, val)
            try:
                yield
            finally:
                # restore even if the body raises, like the Lisp version's finally
                setattr(obj, attr, old_val)
        
        class X:
            pass
        
        x = X()
        x.a = 1
        with temp_assign(x, "a", 2):
           print x.a # prints 2
        print x.a # prints 1

~~~
hythloday
I think it's an extraordinary strength of Python that I hadn't seen your code
when writing mine, yet other than two variable names they're _identical_. I'm
leaving my comment up as a demonstration of this.

~~~
reinhardt
Ha, awesome! I went for an exact transliteration although if I were to use
this idea for real I would probably do the assignment explicitly in the body.
I think this looks a bit more pythonic:

    
    
        @contextmanager
        def restoring(obj, attr):
            old_val = getattr(obj, attr)
            try:
                yield
            finally:
                setattr(obj, attr, old_val)
        
        x.a = 1
        with restoring(x, "a"):
           print x.a
           x.a = 2
           print x.a
        print x.a

------
agumonkey
Seems like Slava Akhmechet's worldwide celebration.

------
ekm2
Total ignoramus here: does this "profound enlightenment" actually lead to
profound execution?

~~~
aerique
As I've written in another comment, I have never experienced the profound
enlightenment, but what do you mean by "profound execution"?

I can tell you that Common Lisp gets me the quickest results going from idea
to prototype; it is a very practical language and doesn't get in the way.
However, a large part of this is experience. It was the most fun language for
me to learn and apply to projects, though.

------
klibertp
This is my third time reading this article; this time I stopped reading after
a few paragraphs, but still skimmed it to refresh some things in memory. This
is a very good article, and one I would recommend to anyone, were it not for
its length - these days I guess half of the responses would be "tl;dr", sadly.

It's one of the articles that convinced me to take a look at Lisp a few years
back, among others, which caused me to learn Scheme rather than Common Lisp or
Emacs Lisp (I think Clojure was not around then yet). I invested half a year
time to learn PLT Scheme/Racket and felt enlightened quite a few times along
the way. First-class continuations were the most mind-blowing thing, and I
spent a few weeks trying to understand them. To prove to myself that I knew
what call/cc (or rather its delimited brethren) is all about, I wrote
Python-style generators using them, and this was one of the most rewarding
experiences in programming for me.

Then I moved on, to Erlang IIRC, which was much easier to understand and use
after being exposed to Scheme. In the following years I learned many more
languages, all the while aiming for "purity" of the concepts and knowing full
well that I wouldn't be able to use any of them in the real world. Many programmers
would call Smalltalk a toy language - at best - but I had great time learning
it and expanding my views on OOP, for example. I thought that the compromises
that widely used languages make cause these languages to represent only a
piece of what is possible, even if they are called "multi-paradigm", and
wanted to explore more.

All this time I was writing commercial software in Python; I can't say if
other languages I learned made me a better programmer - from the business
perspective - but some really helped me expand my understanding of what I do.
Forth and Lisp and Smalltalk did this, and I was perfectly happy to stop using
any of them after gaining some "enlightenment". They were not practical, not
made for the real world; they were there just to prove and demonstrate some
kind of point, some perspective.

This past week I couldn't work due to health problems and suddenly, after a
few years of almost continuous work, I found myself bored. I thought, hell,
why not? and went to reimplement a tiny bit of what I was working on earlier.
I did this using Racket, my first "esoteric" language, so I had quite some
things to relearn (good thing, too, because the language evolved in the
meantime), but I finally (8 hours or so, in one go... they tell me it's _not_
healthy to do this when you're ill, but it was fun) did it.

And it worked. And looked great. It was much shorter, more elegant, and more
performant than the Python version. Certainly, half (or more) of this
improvement came from implementing the same thing a second time; but still, I
was amazed at how easy and fun it was.

So the next day I decided to create another piece of code in Racket, this time
a fresh one, whose output would go straight into the larger system at work.
It's good I had a task at hand which could be broken into pieces that small.
And again, it worked, I did it in about the same time I would do this in
Python, despite the lack of "concurrent.futures" or even thread-safe queue in
Racket. I didn't use continuations or any other obscure features; just higher
order functions and a few macros here and there to simplify error handling and
such and some conveniences over pairs and lists.

I'm not sure what I should think about this situation. It's not a "proof of
suitability" for the real world, of course - I'd need to write much more code
to even begin to be able to claim that Racket is OK to use at work. But on the
other hand, I felt bad for ignoring a really good language and environment for
such a long time. I should have been trying to use it more often, and I
didn't because I thought it wasn't meant for that.

But above all, it was fun. Not because I was learning new stuff, like the
first time, but because the language made it fun. And, what's almost as
important, it worked - I have the code that does what it should be doing.

Well, I plan to try using Racket much more often from now on... Maybe someone
else will give some Lisp a chance after reading this :)

------
hasenj
I passionately hate XML so this could not possibly resonate with me.

I never had the enlightenment he talks about. Actually I think that learning
Lisp/Scheme might have made me a bit of a worse programmer, in a way. It made
me "dread" repetitive code to the point that I almost could not do anything
with any language that's not highly dynamic.

Anyways.

I had 2 epiphanies with lisp.

1. Macros. A very powerful concept, but in practice difficult to use properly
in your code. It's too difficult to reason about what's going on if, say,
you're maintaining or modifying a set of macros. I think it's most useful not
as a construct that you would often use in your own code, but as a construct
for making libraries.

2. Continuations. This is not really related to Lisp itself, and can be done
in other languages, like JavaScript[0]. Understanding a continuation as an
even higher-level construct than closures, and the fact that Scheme had it
built in, was very mind-blowing for me.

It makes sense though that a lisp language _must_ have it built-in. It's a
concept that's very fundamental to the theory of computation, but in most
programming languages it's not explicit at all.

Before continuations, I thought no Lisp language could ever have equivalents
of "break", "return", or "continue". After understanding continuations, I see
that these constructs can be built using continuations as a basic building
block.

So this to me suggests that the concept of "continuation" is a very basic and
fundamental concept that all students of Computer Science should be familiar
with. Unfortunately I was never taught about it in University.

[0]: <https://github.com/laverdet/node-fibers>
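For what it's worth, the "escape" half of that idea shows up even in languages without call/cc: an exception used purely for control flow behaves like a one-shot escape continuation. A deliberately crude Python sketch (`Break` and `each` are made-up names):

```python
class Break(Exception):
    """Stand-in for invoking an escape continuation."""

def each(items, body):
    # A loop built from plain function calls; 'break' is not primitive here,
    # it's recovered by escaping past the rest of the iteration.
    try:
        for x in items:
            body(x)
    except Break:
        pass

found = []
def body(x):
    if x > 3:
        raise Break()   # discard the captured "rest of the loop"
    found.append(x)

each([1, 2, 3, 4, 5], body)
print(found)  # [1, 2, 3]
```

Full continuations are strictly more powerful (they can be resumed, stored, and invoked more than once), but the escape case is the one `break`/`return` need.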

~~~
pg
That "in practice" makes it sound like macros are so hard to understand that
they're not worth using in real applications, which is definitely not true.
Between the facts that (a) one uses them in deliberately restricted ways, (b)
one gets increasingly familiar with them, and (c) they are, token for token,
way more powerful than ordinary code, macros end up being used a lot.

~~~
hasenj
Certainly, having experience with Lisp/Scheme makes it easier to deal with
macros.

As a "newbie" to Scheme, (well, actually what I played with was Arc, but I
think it belongs to the Scheme family) I was able to write a few macros, and
seeing them work in action was very nice indeed.

The tricky part is maintaining the macros or changing their behavior. It's
like that saying goes: if you write code as cleverly as you can, you are not
smart enough to debug it, because debugging is twice as hard.

------
dschiptsov
Replacing the XML with YAML would make it much clearer and much shorter.

The concepts of bindings (of symbols to values) and lexical scope (frames of
the environment) must be described.

DSLs must be introduced to show how a list structure and uniform function
application syntax glue everything together.

Much better advice: _read SICP, for Christ's sake_.) The people who wrote it
spent much more time thinking about which ideas to illustrate, in which order,
and why.

Then watch the Lectures, to feel the bliss.)

The true peace of mind comes after you finish reading _On Lisp_ and then the
contents of _arc3.tar_.

Before that, it is still just being blinded and puzzled by a sudden flash of
premature enlightenment.)

~~~
brudgers
Graham's _On Lisp_ as free PDF:
<http://lib.store.yahoo.net/lib/paulgraham/onlisp.pdf>

_Structure and Interpretation of Computer Programs_ as free HTML:
<http://mitpress.mit.edu/sicp/full-text/book/book.html>

~~~
dschiptsov
Movies - <http://groups.csail.mit.edu/mac/classes/6.001/abelson-sussman-lectures/> ,)

