Ask PG: Lisp vs Python (2010)
232 points by kung-fu-master on Oct 18, 2010 | 192 comments
It seems that a lot of old-school Lispers are switching to Python (for example, Peter Norvig). What do you think about Lisp vs Python today?

Peter Norvig here. I came to Python not because I thought it was a better/acceptable/pragmatic Lisp, but because it was better pseudocode. Several students claimed that they had a hard time mapping from the pseudocode in my AI textbook to the Lisp code that Russell and I had online. So I looked for the language that was most like our pseudocode, and found that Python was the best match. Then I had to teach myself enough Python to implement the examples from the textbook. I found that Python was very nice for certain types of small problems, and had the libraries I needed to integrate with lots of other stuff, at Google and elsewhere on the net.

I think Lisp still has an edge for larger projects and for applications where the speed of the compiled code is important. But Python has the edge (with a large number of students) when the main goal is communication, not programming per se.

In terms of programming-in-the-large, at Google and elsewhere, I think that language choice is not as important as all the other choices: if you have the right overall architecture, the right team of programmers, the right development process that allows for rapid development with continuous improvement, then many languages will work for you; if you don't have those things you're in trouble regardless of your language choice.

> In terms of programming-in-the-large, at Google and elsewhere, I think that language choice is not as important as all the other choices: if you have the right overall architecture, the right team of programmers, the right development process that allows for rapid development with continuous improvement, then many languages will work for you; if you don't have those things you're in trouble regardless of your language choice.

Thank you, Peter. This is how I have felt for years, but could never find words that describe it as well as you just did.

Someone should write a program that automatically posts this paragraph at the top of every language war thread. I think they should write that program in php :-)

OTOH, it's not completely irrelevant. If you can launch 10% faster, that's a big advantage.

What makes it possible to launch 10% faster? Is it familiarity with the language, tools, libraries, and frameworks? Is it that the language is more dynamic and requires less to accomplish task X? I would say yes to both questions.

Now with the first question, familiarity with the language, let's discuss Python. Python is my language of choice. I know it, I use it, I pay attention to what is happening with the language, the community, etc. I organize a Python User Group. If I were to work on a new project, I would choose Python. I believe I could launch 10% if not 50% faster with Python than with Ruby. Does that make Python a better language? No. Just that I don't know Ruby, Rails, Sinatra, etc.

I think the second question, about how dynamic the language is, plays into this as well. Java is not very dynamic. Some tasks in Java are downright ugly to work with. If you want to prototype fast and spit out a product fast, you might not want to use Java. At that point, there are several great languages to choose from like Python, Ruby, Groovy, Scala, Lisp, Smalltalk, Javascript, Clojure, etc. The list just goes on. You could also argue that .NET is more dynamic. But Java has some good points to it as well. You have less risk of bugs due to it being statically and strictly typed. You can build off some great libraries. You get the benefit of the JVM and JIT, which are amazing. Maybe you can't launch faster with Java, but maybe you can launch with a more robust, stable product. First doesn't mean best; remember the article about Wasabi and Mint?

The way I read Peter's comment was that you don't want to spend your time finding the best language. Instead you want to just grab the most convenient language and spend your time creating the best product. Think through your architecture, work on your design, test the hell out of the product, study the market and your competitors, work on your marketing, and on and on. The list of things to do never ends, and if you're still debating which language to choose, you won't launch.

I agree. Language choice is definitely not the most important factor in success, but it's not as unimportant as some think it is.

Some languages make it easier to become familiar with the libraries. Compare the standard libraries of Ruby and Python with PHP. Even though I programmed in PHP for several years I always had to look up argument order, or whether the names of functions contained underscores between words or not.

What makes it possible to launch 10% faster? What about a better understanding of the problem you are solving? Just my 2 cents :p

Also, the language choice might lead one down a slippery slope of platform choice, which can easily trap you in an architecture you really did not want.

Or if one of the choices is Actionscript.

I've had a similar experience, lately while writing and editing pieces for Code Quarterly--I've written the same basic algorithms in Javascript, Python, and Common Lisp to play around with them. I find Python the best vehicle for conveying the algorithms despite being more fluent in Common Lisp.

But I've also been astounded at how slow CPython is compared to SBCL (the Common Lisp implementation I use) when I have to do long runs to gather data. (For the things I've been playing around with, my Common Lisp implementations have been something like 5 to 20 times faster.)

I'm not a Lisp expert, so I have another question. Is it possible to embed a DSL in Lisp that looks like pseudo-code?

For example:


    a = 0

    b = 100

    s = 0

    for (i from a to b)

        s = s + i


If such pseudo-code could be embedded in SBCL, it would generate fast machine code, and it would also be possible to easily modify the pseudo-code syntax.
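It's worth noting that the pseudo-code above is already almost valid Python; only the loop header changes. A minimal transcription, assuming "from a to b" is meant to be inclusive of both endpoints:

```python
# The pseudo-code above, transcribed into Python; "for i from a to b"
# is read as inclusive of both endpoints.
a = 0
b = 100
s = 0
for i in range(a, b + 1):
    s = s + i
print(s)  # 5050
```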


This lets you write things like:

(for var in form do ...


(for var in form collect ...

where form can be anything that has an iterator method defined on it.
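For comparison, Python gets the same kind of genericity not from macros but from its iterator protocol: anything defining `__iter__` works in a `for` loop. A small sketch (`Countdown` is a hypothetical example class):

```python
class Countdown:
    """A hypothetical iterable: any object defining __iter__ works in
    a for loop, much like 'form can be anything with an iterator
    method' in the macro above."""
    def __init__(self, n):
        self.n = n

    def __iter__(self):
        return iter(range(self.n, 0, -1))

print(list(Countdown(3)))  # [3, 2, 1]
```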

There's also a macro I use called BINDING-BLOCK (BB for short) which subsumes a bunch of binding forms. So you can write things like:


  x (...)

  y (...)

  :db (q r s) (...) ; Destructuring-bind

  :mv (p q) (...) ; Multiple-value-bind


It's extensible so you can add your own clauses. Code is here:


(It's a little out of date. If you want the latest let me know and I'll update it.)

that's bad assembly-style pseudo-code and thinking. Lisp favors higher-level, functional pseudo-code that readily runs as is!

like in scheme: (write (let for ((i 0) (s 0)) (if (> i 100) s (for (+ 1 i) (+ i s)))))

why would you care for those a, b, or s variables in the first place when all you want is the sum? The above expression writes the sum as computed by a recursive approach using lexical bindings.

you may of course abstract it away into a function: (define (for from to doit result) (if (> from to) result (for (+ 1 from) to doit (doit from result))))

and use it: (write (let ((a 0) (b 100)) (for a b + 0)))

I didn't even need macros yet!

sure, that's possible - just see the LOOP macro...

Then why has no one made such an eDSL yet? As I understand it, the only reason Peter Norvig chose Python was its similarity to pseudo-code. I think a hacker as great as Peter Norvig could easily develop a pseudo-code eDSL on top of the Lisp macro system.

As a school teacher of programming, I'm limited in my choice of programming languages. The only language I can teach is Pascal (for a lot of other reasons). The problem is that Pascal doesn't have any libraries (GUI, 2D/3D, game development engines, micro-controller programming, ...). Students can write only simple console applications. So it would be ideal to have a subset of Pascal as an eDSL on top of Common Lisp or Clojure. That way I could easily extend the original Pascal to access some real-world libraries.

I also think we could develop some simpler eDSLs on top of Common Lisp as pseudo-code for beginner students.

PS: gone to learn Common Lisp...

>The problem is that Pascal doesn't have any libraries (GUI, 2D/3D, game development engines, micro-controller programming, ...).

That's not true. Lazarus (http://www.lazarus.freepascal.org/) gives you pretty much everything Delphi has. I'm currently developing an OpenGL 2D analytical tool, and again, the GUI is on par with Delphi.

There are such DSLs on top of Common Lisp. But usually not on the scope of a full programming language like Pascal. It is not that typical anymore, but there are examples in that direction.

LOOP shows some of the practical problem. For example for LOOP one needs a custom highlighter in the IDE, because it has its own complex syntax, which does not follow the basic Lisp model. Same for indentation / code formatting. There maybe other problems.

There is another Iteration facility as a library for Common Lisp called ITERATE. It is very similar to LOOP, but the syntax is a bit more oriented towards Lisp.

> There are such DSLs on top of Common Lisp. But usually not on the scope of a full programming language like Pascal. It is not that typical anymore, but there are examples in that direction.


I have found two examples:

1. Python in Lisp: http://github.com/franzinc/cl-python

2. Ruby on Lisp: http://jng.imagine27.com/articles/2010-10-07-084756_ruby_sub...

I will try to write a little subset of Pascal as an eDSL on Lisp.

There is also cl-javascript: http://github.com/akapav/js

The advantage of doing a "language X to CL" translator is that it's going to be much simpler than doing it in another language (the number of features offered by CL is a huge superset of that available in any other language; the only exception is continuations), the result will be faster (the example Ruby translator is faster than Ruby, cl-javascript is faster than SpiderMonkey, etc, mostly due to SBCL being a reasonably good compiler), and you will have access to really good debugging tools (Clozure is really good at this - see http://openmcl.clozure.com/ccl-documentation.html#watched-ob... for example).

Usually languages like Python are implemented on top of Lisp using a special parser. There are many of those, from C and Python to special research languages. Then there are a lot of languages with Lispy syntax; typical examples were Prolog dialects with Lisp syntax (and optional Prolog syntax). Embedding a language into a single s-expression (as in the LOOP macro) is also possible, but different. It would not make sense to use Python that way, since the low-level syntax of Python is different from what Lisp s-expressions provide. For example, s-expressions do not preserve any textual layout (indentation) when read; a language that uses indentation in its syntax would need that, but would not get it from being embedded in a single s-expression.

Hmmm... it's interesting...

I've come to the conclusion that it is easier to do an external DSL, not an embedded one: parse the Pascal source code and then translate it into s-expressions. Am I right?

Btw, if I'm going to write an external DSL, I can do that in any language, for example Python. So what's the difference?
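As a sketch of the external-DSL route, a translator can pattern-match one statement form at a time and emit s-expression text. This is a purely hypothetical fragment handling only `x := y + z;` assignments, not a real Pascal parser:

```python
import re

def translate_assignment(line):
    """Translate a single 'x := y + z;' Pascal-style assignment into
    an s-expression string. Purely illustrative; real Pascal would
    need a proper grammar."""
    m = re.match(r'\s*(\w+)\s*:=\s*(\w+)\s*\+\s*(\w+)\s*;', line)
    if not m:
        raise ValueError('unsupported statement: ' + line)
    target, lhs, rhs = m.groups()
    return '(setf {} (+ {} {}))'.format(target, lhs, rhs)

print(translate_assignment('s := s + i;'))  # (setf s (+ s i))
```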

Once again the YC community surprises me in both its depth and its breadth. From someone who has found your (and Russell's) text invaluable, I say - thank you, sir.

And thank you PG/YC for giving me the opportunity to do so.

Just wanted to chime in briefly here and say thank-you for your example Lisp interpreter in Python. Made some things much clearer for me and because of it I now have the knowledge to be implementing R5RS over the top of Ruby! Thanks again!
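For readers who haven't seen that interpreter, its core is a tiny s-expression reader. An independent minimal sketch (this is not Norvig's actual code):

```python
def tokenize(src):
    """Split source into tokens by padding parentheses with spaces."""
    return src.replace('(', ' ( ').replace(')', ' ) ').split()

def read(tokens):
    """Read one form from the token list, consuming it destructively."""
    tok = tokens.pop(0)
    if tok == '(':
        form = []
        while tokens[0] != ')':
            form.append(read(tokens))
        tokens.pop(0)  # discard the closing ')'
        return form
    try:
        return int(tok)
    except ValueError:
        return tok  # a symbol, kept as a string

print(read(tokenize("(+ 1 (* 2 3))")))  # ['+', 1, ['*', 2, 3]]
```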

> But Python has the edge (with a large number of students) when the main goal is communication, not programming per se.

Communication is increasingly the more important part of programming. Engineering is only really useful if it's well communicated, or in a binary you trust.

Yeah, there'll come a time when, to build systems, all you need is a bunch of people talking to reach a consensus on how much to pay a third party to write it for them.

oops, it seems it already happens today...

Peter, you imply that Python is better for, or more like, pseudo-code, but why is it better? In my opinion, it is better for most people because they were taught syntaxes more like that of Python than that of Lisp. An important factor in ease of understanding is similarity to something you are already familiar with. To the extent that this is the reason, it's not intrinsic to Lisp vs. Python, but rather a question of which style of syntax is better known.

> Peter, you imply that Python is better for, or more like, pseudo-code, but why is it better?

Well from what he writes, he seems to say Python looks much closer to their actual pseudo-code, and it's therefore easier for students to translate pseudocode to Python than to Lisp.

> In my opinion, it is better for most people because they were taught syntaxes

Such as English. One of the original goals of Python (inherited from ABC[0]) was to be a good teaching language. I'd expect that when Peter talks about his students, he's mostly talking about students with little to no knowledge of programming. Those who are already knowledgeable probably don't have a hard time adapting.

[0] http://en.wikipedia.org/wiki/ABC_(programming_language)

When I took an AI class at my university with AI: A Modern Approach (the book was good, if a little difficult to understand at times), we had a couple projects that we had to write in lisp.

First project: Problem Solving Agent for the Traveling Salesman Problem.

1. Depth First Search (function argument: DFS)

2. Iterative Depth First Search (function argument: IDFS)

3. A* - Heuristic: Path Cost Incurred (function argument: PATH)

4. A* - Heuristic: Minimum Spanning Tree heuristic (function argument: MST)

5. (Extra credit, 25 points) Create and implement a heuristic equal to or better than MST

Second project: In this project we implement a decision tree induction algorithm in Lisp.

I had played around with lisp before this point and found it fascinating. I approached these projects with excitement. But even with 8 years of serious programming experience, I could not for the life of me solve these problems in lisp. My problems included:

1. Knowing exactly what methods I wanted to call and use, and either a. not being able to find them in any reference I found online, or b. finding out that they don't exist and you have to write them yourself, or c. finding them and shaking my head at how ridiculously they were named.

2. Not being able to read the code I had just written.

3. Not being able to debug.

4. Finding that manipulating common data structures like hash tables is a total chore.

Eventually I gave up. I had spent about two hours trying to implement the project I had already solved in my head into common lisp and was making little or no progress. So I fired up another vim terminal, solved the project in Python in about 30 minutes, including debugging, and then manually translated the code into lisp.

When project 2 rolled around, I decided to give it another go, but I quickly became frustrated again. Maybe my mind just isn't able to grok lisp? Maybe I'm just not smart enough?

All I'm claiming is that I am an example of a student who was already very knowledgeable about programming and completely unable to adapt to lisp.

Do you also find your pseudocode (and therefore python) clearer, or was it entirely pedagogical?

Redesigning a data format to match how I notate and think about it, to minimize my cognitive load, has been very helpful to me.

The long-term trend in computers seems to be to trade performance for helping the developer.

> when the main goal is communication, not programming per se

This reminds me of debates about the scientific method. Some say that you can test hypotheses etc alone without publishing and you are doing science; but others define science as a community activity, and so without publishing, it is not 'science'. While I love the idea of the lone researcher, and clever insights definitely come from individuals, without a community there is no SOTSOG.

About your last paragraph: I agree 100%. All of that is more important than choice of language.

(This is why I love HN)

Robert M. Lefkowitz's 2007 PyCon Keynote discusses the "main goal is communication, not programming" in great detail. Wish there was a text I could cite.

Hey, we used AIMA (which, oddly, is Greek for "blood") in my undergrad and I loved every page, great book! Thanks for writing it!

I'm sorry I missed your talk at UBC. Please come back again soon!

I'm a Clojure guy that just wrote my first Pylons app. Here's my impression:

1. Python doesn't suck. I was able to mix FP & OOP approaches to get to my goal fairly quickly.

2. iPython was fun to use, helped out a lot, but it's not SLIME.

3. Guido has an excellent goal with making code readable, and significant white space is not a bad choice. However, I find being able to analyze active data structures in a Clojure namespace to be a superior way to learn about a system.

4. Python's libraries are pretty good, and it's already been written. As a first impression, Python libs are much better to use than Clojure-wrapped Java libs. I'm going to look into porting SQLAlchemy to Clojure, it rocks.

5. Paster has a ton of functionality. I'd like to see a similar Clojure tool, maybe Lein can evolve into that.

6. I would like to see more FP constructs natively available in Python.

7. __method__ is an interesting convention. You can have an object implement the right method, and your object works with Python syntax. However, I find it to be a poor man's version of Clojure's protocols (Full Disclojure, I have a conflict of interests here).

8. Decorators are an interesting way to do functional composition, but I prefer comp and monads. Way more versatile.


That's all I've got for now. I'm sure I forgot something.


edit: grammar & spelling

Re: 8, you don't need decorators to compose functions:

    def comp(f, g):
        def h(*args, **kwargs):
            return g(f(*args, **kwargs))
        # fix up h.__doc__ and friends
        return h
or simply

    (lambda *args: g(f(*args)))(x, y)
(don't remember comp's semantics, is (comp f g) = f o g or g o f?)
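To answer the parenthetical: Clojure's (comp f g) applies g first and f last, i.e. f o g. A quick check of both orders, with inc and dbl as hypothetical helpers:

```python
def comp_lr(f, g):
    """Apply f first, then g (the order used in the comp snippet above)."""
    return lambda *args, **kwargs: g(f(*args, **kwargs))

def comp_rl(f, g):
    """Clojure-style comp: apply g first, then f."""
    return lambda *args, **kwargs: f(g(*args, **kwargs))

inc = lambda x: x + 1
dbl = lambda x: 2 * x
print(comp_lr(inc, dbl)(3))  # 8, i.e. dbl(inc(3))
print(comp_rl(inc, dbl)(3))  # 7, i.e. inc(dbl(3))
```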

too long; don't read:

Decorators are certainly cool, but semantically they represent something more like a pattern than a FP construct. A decorator represents something you might want to do to lots of functions, a property you want all instances of a function to have without writing it explicitly into each function. Function composition is more along the lines of having two functions which are interesting on their own, but which sometimes you want to compose.

With decorators, it would also be awkward to compose multiple functions. Observe:

    def compose_with(g):
        def decorator(f):
            def decorated_function(*args, **kwargs):
                return g(f(*args, **kwargs))
            return decorated_function
        return decorator

    import math

    def h(x): return math.sqrt(x)

    def g(x): return 2 * x

    def f(x): return x + 1
versus

    from functools import reduce  # needed on Python 3

    def compose(*fns):
        def composition(*args, **kwargs):
            first, rest = fns[0], fns[1:]
            return reduce(lambda computed, next_fn: next_fn(computed),
                          rest, first(*args, **kwargs))
        return composition

    # define fns as above without decorator
    hogof = compose(f, g, h)

To expand on what I was thinking with #8, here's how I'd implement a decorator in Clojure

(def my-decor (partial comp decor-behavior))


Well, what python calls a "decorator" is just a function that accepts a function and returns a function with roughly similar functionality. In python, the old way to decorate functions was

    def my_decorator(f): ...
    def my_function(...): ...
    my_function = my_decorator(my_function)
and the new "@my_decorator" syntax is just sugar for this.
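To illustrate that equivalence, here is a small hypothetical decorator applied both ways:

```python
def shout(f):
    """A hypothetical decorator: upper-case whatever f returns."""
    def wrapped(*args, **kwargs):
        return f(*args, **kwargs).upper()
    return wrapped

@shout
def greet(name):
    return 'hello ' + name

def greet2(name):
    return 'hello ' + name
greet2 = shout(greet2)  # the pre-2.4 spelling of the same thing

print(greet('world'))   # HELLO WORLD
print(greet2('world'))  # HELLO WORLD
```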

Function composition is only one of many things you can do with decorators. You could implement a K-combinator with them if you wanted to:

    def kestrel(x):
        "decorates a function to evaluate that function, but then return x"
        def decorator(f):
            def g(*args, **kwargs):
                f(*args, **kwargs)
                return x
            return g
        return decorator

    @kestrel(4)
    def foo(x):
        y = x + 1
        print(y)
        return y

    foo(5) # => 4, but prints 6


Macros are the main reason I decided to create Adder, a Lisp-on-Python with minimal impedance mismatch. Unfortunately, the first macro-heavy program I wrote turned out to be really slow, because macros engage the compiler, which, of course, is in Python.

When I first tried it, it took something like 50s at 2.4GHz, virtually all of which was the compiler. (The compiler runs at load time; obviously, saving the compiled code for the next run would help.) I got it down to...let me try it now...7s at 3GHz, but that's still too slow for a 200-line program.

If anybody's interested, the code's on Github [1]. To see the macro-heavy example, look at samples/html.+, which is an HTML generator. The framework takes 169 lines; the sample page starts at line 171. To see the output, run:

./adder.py samples/html.+

[1] http://github.com/metageek/adder

(Edit: it requires Python 3.x.)

> Unfortunately, the first macro-heavy program I wrote turned out to be really slow, because macros engage the compiler, which, of course, is in Python.

The Python byte-compiler, specifically, appears to be fairly slow. I hadn't really thought much about this, but while contributing to a benchmark yesterday (http://news.ycombinator.com/item?id=1800396), it wound up staring me in the face. There's actually surprisingly little difference performance-wise in running Lua from source vs. precompiled (both pretty fast), whereas the difference between Python and pyc in my benchmark was wider than every other possible pair in the chart except python interpreted vs. "echo Hello World". (I didn't have any JVM languages, though.)
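One rough way to see that cost, assuming CPython, is timing compile() on a generated module-sized source string (numbers will vary by machine):

```python
import time

# Generate a source string of 2000 trivial function definitions.
src = '\n'.join('def f{0}(x):\n    return x + {0}'.format(i)
                for i in range(2000))

t0 = time.time()
code = compile(src, '<generated>', 'exec')  # byte-compile only; no exec
elapsed = time.time() - t0
print('compiled {} chars in {:.3f}s'.format(len(src), elapsed))
```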

I don't really do eval-based metaprogramming in Python, but do so on occasion in Lua. I thought I felt better about doing so because Lua is syntactically much simpler (and has scoping rules that make avoiding unexpected variable capture easy), but the Lua compiler itself also appears to be substantially faster than Python's. (It doesn't do much analysis, but still usually runs faster than Python.)

And yes, it's not as good as straight-up Lisp macros, but Lua's reflection also covers a lot of low-hanging fruit that macros would otherwise handle. The biggest thing lacking in Lua compared to Lisp is an explicit compile-time phase for static metaprogramming. (Code generation is an inferior alternative.) Lisp macros win big in part because they can avoid the overhead of parsing, but parsing Lua is fairly cheap thanks to its small, LL(1) grammar (http://www.lua.org/manual/5.1/manual.html#8).

I quite like the approach taken in MetaLua (http://metalua.luaforge.net/) for allowing more advanced cases of meta-level programming. After some exposure, moving up and down levels using +{} and -{} becomes about as readable as meta-programming can be, IMHO. The explicit modification of the compiler is not so nice, but I'm guessing it is hard to do anything better in a programming language with some actual syntax.

Of course, what I really would like to use is MetaLuaJIT, to get the best of all worlds :)

Macros were also the reason I wrote Noodle, which was a similar effort (Lisp compiling to Python bytecode) back around 2004. The project died when I couldn't figure out sane, easy-to-learn rules about how to make macros live with modules and imports, and I found I didn't like the syntax concessions I had to allow to make attribute access less of a terrible pain.

I still think something like this would be worthwhile, and I wish you the best of luck! I particularly like your (.bar.baz foo) syntax. And good on you for moving away from bytecode generation; I also found that to be a dead end.

So what are your plans for macros+namespaces? Will macros from imports live in the same namespace?

And what did you mean by "Python supports only two levels of lexical scope" at http://www.thibault.org/adder/ ? I don't recall any constraints regarding lexical scope levels in the VM.

> and I wish you the best of luck! I particularly like your (.bar.baz foo) syntax.


> And good on you for moving away from bytecode generation; I also found that to be a dead end.

Glad to hear I'm not the only one. :-)

> So what are your plans for macros+namespaces? Will macros from imports live in the same namespace?

Actually, I hadn't thought about it--I haven't gotten to the point of being able to write modules in Adder. I think I see what you mean, though: any module has to be a Python module, so macros would have to be expressed as functions.

Strawman: the module could contain a variable listing the macros. When compiling an (import) form, the Adder compiler would check the imported module for that list, and update its internal structures accordingly.

> And what did you mean by "Python supports only two levels of lexical scope"

That was a mistake; I've removed it. I think maybe I just didn't figure out how to generate bytecode for a triply nested function.

Of course, there is the remaining problem that Python can't define a scope without defining a function. I didn't want to use the standard tactic for turning (let) into a nested function (I don't trust Python functions to be fast enough), so the current compiler uses name mangling, tagging each variable with the scope depth. (Global variables are untagged, so that they can be accessed cleanly by Python modules.)
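The mangling idea can be illustrated in plain Python: instead of lowering each (let) to a nested function, every binding gets a depth-tagged name in a single flat scope. The names below are hypothetical, not Adder's actual scheme:

```python
# (let ((x 1))        ; depth 1
#   (let ((x 2))      ; depth 2
#     (+ x x)))
# lowers, under depth-tagged name mangling, to flat assignments:
x_1 = 1
x_2 = 2
result = x_2 + x_2
print(result)  # 4
```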

It's funny, but the thing I miss most from Common Lisp when I write in other languages is the LOOP macro. It's ugly, and non-lispy, but most loops I have to write can be expressed clearly and concisely using LOOP, and writing the equivalent code in another language is annoying.

I'm tempted to create a LOOP clone for Clojure, then laugh villainously as I unleash it upon the world.

Loop, like structured editing (Paredit), is one of those Interlisp things that's very controversial and divisive.

There are tons of really wild ideas in Interlisp that seemed like the half-baked acid-trip ideas of West Coast hippies at the time, but are just starting to be rediscovered in the past couple of years (pervasive undo -> reversible debugging, DWIM-like autosuggestions in more places), and even the implementation techniques used are still innovative (for example, the error-trapping implementation of Conversational Lisp (http://docs.google.com/viewer?a=v&q=cache:4GnEnGS2XXkJ:c...) is quite similar to how Geoff Wozniak approached auto-defining functions (http://exploring-lisp.blogspot.com/2008/01/auto-defining-fun...)).

LOOP is not from Interlisp. It comes straight from Maclisp.

'LOOPS' from Interlisp is something entirely different: an object-oriented extension to Interlisp.

I'm going by the Hyperspec and what I remember from reading Kaisler's Interlisp. From the former:

"One of the Interlisp ideas that influenced Common Lisp was an iteration construct implemented by Warren Teitelman that inspired the loop macro used both on the Lisp Machines and in MacLisp, and now in Common Lisp."


It influenced the Maclisp LOOP. The idea, that's all. The CL LOOP macro, OTOH, is a straight version of the Maclisp version. The MIT version of LOOP came from the same sources, even.

The Interlisp iteration facility looks slightly different. There are Interlisp manuals as PDF at bitsavers...

LOOP is crack. Personally I love it.

Could you port CL's LOOP to Clojure easily? Or would its internal hyper-imperativeness make that tricky?

If you have to create a LOOP clone, why not create an ITERATE one instead? ITERATE is cleaner, and it would follow the Clojure trend of bringing out the most modern features of Lisp.


> I'm tempted to create a LOOP clone for Clojure, then laugh villainously as I unleash it upon the world.


> significant white space

Nit: significant indentation. Mostlanguageshavesignificantwhitespace, somemorethanothers (for instance, at least in 1.8, Ruby seems to have more whitespace issues than Python)

The fun alternative language example is of course Fortran, where at least early versions disregarded whitespace.

Point 6: we are lucky to keep the ones we have; Guido is not keen on map, reduce, etc.

While Guido is not keen on reduce, I don't think he has any desire to eliminate list comprehensions (which are equivalent to map/filter).

In other words, list comprehensions are the Pythonic version of same.

and Generator Expressions [1], a lazy version of list comprehensions.

[1]: http://www.python.org/dev/peps/pep-0289/
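For instance, a generator expression builds nothing up front; each value is produced only on demand:

```python
# No million-element list is ever materialized here; values are
# computed one at a time as the generator is advanced.
squares = (x * x for x in range(10**6))
print(next(squares))  # 0
print(next(squares))  # 1
```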

Yeah, but generators have been done much better. Look at Icon, Lua, or Prolog.

I wonder why. I never really use map over list comps, but I've used reduce in several situations where a comprehension wouldn't have been appropriate.

  6. I would like to see more FP constructs natively available in Python.
FP constructs such as? Map is there. Reduce is one import away. There is nice syntax for list comprehensions. What do you miss?
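For reference, the constructs mentioned here side by side (xs is just sample data):

```python
from functools import reduce  # built-in in Python 2, one import away in Python 3

xs = [1, 2, 3, 4]
# map/filter and the equivalent comprehensions agree:
assert list(map(lambda x: x * x, xs)) == [x * x for x in xs]
assert list(filter(lambda x: x % 2 == 0, xs)) == [x for x in xs if x % 2 == 0]
print(reduce(lambda a, b: a + b, xs))  # 10
```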

> 2. iPython was fun to use, helped out a lot, but it's not SLIME.

what exactly were you thinking about?

This question sounds like it's from 2005 rather than 2010. Lisp seems to have become fashionable again now, thanks to Clojure.

I'm sure Python has very good libraries, but I would find it constraining to program in a language without proper macros.

The problem with the Blub paradox is that there's no total ordering.

I do Common Lisp and C++ at my day job (ITA), and I do much of my personal hacking in Python. In Python and C++ I miss macros; in Lisp and Python I miss RAII and strong typing; in Lisp and C++ I miss dictionary literals.

And, in all of them, I miss algebraic datatypes.

Hello, my co-worker, whoever you are. Yes, Common Lisp should have dictionary literals. I don't know why there hasn't been a commonly-used reader macro for this. (Emacs Lisp mode and other tools would have to know about it, so it needs to be a widely-accepted convention.)

Common Lisp does have strong typing. What it does not have is static typing.

I am at the SPLASH conference, and the Dynamic Language Symposium is happening right now. There is controversy over whether we can find a way to have the benefits of both static and dynamic typing in the same language. The great advances in type inference make me hopeful. The keynote speaker, Allen Wirfs-Brock, replied to my question about this with more pessimism. It is not a simple question; for one thing, not everybody even agrees about which factors are "pro" or "con" for either static or dynamic. I am not doing programming languages these days (I'm doing databases) but I continue to be hopeful.

> Hello, my co-worker, whoever you are.

John Stracke. (I've been staying pseudonymous, but today I mentioned Adder, which is tied to my real identity.)

> Common Lisp does have strong typing.

True. I need to remember to be more precise; "doesn't have strong typing" just means "doesn't have type feature Blub". Common Lisp has runtime type safety, and type hints for efficiency; what it does not have is the pervasive typing that I'm used to from C++, which has a separate set of benefits. The most obvious is that, in C++, I can change the interface to a class and be certain that the compiler will catch any caller that uses it incorrectly. (Although I suppose it may be possible to do something like that with CLOS. I haven't used much CLOS, since ITA avoids it.)

There is controversy over whether we can find a way to have the benefits of both static and dynamic typing in the same language.

I'd say that type inference already brings us nearly there: the convenience of dynamic typing, with the rigor of static typing.

I may be wrong, though; I've used ML and Haskell, but not enough to really feel where the pain points of type inference are.

'True. I need to remember to be more precise; "doesn't have strong typing" just means "doesn't have type feature Blub". '

That might be less precise, but more correct :-P.

"Common Lisp has runtime type safety, and type hints for efficiency; what it does not have is the pervasive typing that I'm used to from C++, which has a separate set of benefits. "

The type declarations aren't just for efficiency (although they are frequently (ab)used for it).

"The most obvious is that, in C++, I can change the interface to a class and be certain that the compiler will catch any caller that uses it incorrectly."

I'm not sure what Common Lisp implementation you use, but wouldn't this be fixed by simply declaring the types of everything? You can declare the types of the slots of a struct, the arguments to functions, the results of functions, variables, slots of objects, contents of sequences... (I'm having trouble thinking of something you can't declare types on; maybe a hashtable? Although you could wrap the accessors in a function.)

Then SBCL (at least) yells at you when you go to recompile the project.

wouldn't this be fixed by simply declaring types of everything?

Yes, but I'm not so sure about the "simply". It's only marginally easier in C++; but at least you know that you haven't forgotten to declare anything.

> I haven't used much CLOS, since ITA avoids it.

Interesting. Is it for performance reasons?

...actually, I don't know. When I started, I was told we don't use it; I don't remember whether I was told a reason. If I was, it was performance; anything else would have been surprising enough to remember.

I wonder if it's similar to why Jane St doesn't make much use of the O in OCaml (from what I've gathered at least; I don't work there.....yet). Object systems can be nice, but they tend to make your code harder to understand, since you have all the dispatching. I don't know anything about CLOS, but this is pointed out as a reason against using Java, and against the object system in OCaml, in the Caml Trading video.

I did mention in the original essay that it was only a partial order, in footnote [4].

Ah, but metageek's point is that it's not an order at all. It contains cycles.

Sorry, I haven't read it in a while.

> in Lisp and C++ I miss dictionary literals

So roll your own. Or use mine:


Forgive me, but I think you mean in python you miss static typing.

Python is strongly typed.

Also, just to jab at C/C++, pointers to void... really? It all but makes C/C++ a weakly typed language.

http://wiki.python.org/moin/Why%20is%20Python%20a%20dynamic%... http://www.artima.com/weblogs/viewpost.jsp?thread=7590 http://en.wikipedia.org/wiki/Duck_typing http://articles.sitepoint.com/article/typing-versus-dynamic-...

I can't help you with strong typing, but I'm pretty sure a macro would go a long way towards implementing RAII in a Lisp.

> I'm pretty sure a macro would go a long way towards implementing RAII in a Lisp.

Well yeah, just build the macro you need on top of unwind-protect, that's pretty standard common lisp fare.

That's true; we've already got special cases like WITH-OPEN-FILE.

Actually, you already have something better than RAII in unwind-protect. Just create whatever macro you need on top of that (I'm pretty sure with-open-file is just a macro on top of unwind-protect; Seibel seems to agree in his chapter on files) and you're done.
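For comparison, a minimal sketch (not from this thread) of how the same unwind-protect pattern looks on the Python side: try/finally guarantees the cleanup form, and a context manager packages it up the way with-open-file does.

```python
import contextlib

@contextlib.contextmanager
def resource(name, log):
    # Acquire on entry; the finally clause runs on every exit path,
    # including a raised exception, just like an unwind-protect cleanup form.
    log.append("acquire " + name)
    try:
        yield name
    finally:
        log.append("release " + name)

log = []
try:
    with resource("db", log):
        raise RuntimeError("boom")
except RuntimeError:
    pass
print(log)  # ['acquire db', 'release db']
```

The `resource` name is just for illustration; `with open(...)` is the built-in equivalent of WITH-OPEN-FILE.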

Why do you miss dictionary literals? Syntax similar to arc's seems to do it for me:

  (obj a 1 b 2 c 3)

We do the same thing:

  (make :a 1 :b 2 :c 3)
which macroexpands into a plist in CL and {a:1, b:2, c:3} in JS. I've grown to like this idiom a lot. Yeah, you have to type OBJ or MAKE or whatever but in return you get something that integrates completely smoothly with the rest of the language.
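For what it's worth, Python gets the same surface syntax without a macro, since keyword arguments already collect into a dict (a hypothetical `make` shown only for the comparison):

```python
def make(**kwargs):
    # Hypothetical Python counterpart of the MAKE idiom above:
    # keyword arguments collect into a dict, so a plain function
    # (no macro needed) gives the same call shape.
    return dict(kwargs)

d = make(a=1, b=2, c=3)
print(d)  # {'a': 1, 'b': 2, 'c': 3}
```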

Y'know, I'm not sure. Next time I need to use a hash, I'll create something like that and see how it feels.

What about Qi? What about Typed Racket?

Typed Racket combines all the inconvenience of type declarations with all the performance provided by a dynamic language.

One question I've had about Typed Racket since I first saw it: why all the colons? Seems to mess with the elegance of Lisp, and especially Scheme.

There has to be another friendlier syntax for typing, no?

Why not? Colons are used to denote value types in ML.

The Python philosophy is that macros do more harm than good, making it harder for someone to read/understand your code in the long term. These and other "constraints" make the code more accessible to others and even yourself.

Macros are just another abstraction tool. If you use an abstraction tool poorly, it makes the code harder to read; if you use it properly, it makes the code easier to read.

Here's a function that makes code harder to read:

  def sumAList(aList): return 7

This doesn't mean that functions are bad.

Now there is an argument that macros make code harder to read in that I've yet to see a really good macro system that isn't dependent on the code having very little syntax (e.g. S expressions), since the more different the code is from the AST, the harder it is to manipulate the code successfully.

Combined with the fact that more syntax can make code much easier to read, there is a conflict here.

However, I don't think that's the argument you are making.

The argument is that macros have non-local effects; they interact with the code in which they are applied. This means that the macro definition and the code surrounding the macro invocation can't necessarily be understood in isolation.

This is also essentially the argument against global variables.

A function definition is, in general, far away from the function call. If you think this problem is more severe for macros than for functions, then you should articulate why. You may well have a very valid point in your mind but I think it needs to be expressed somewhat more specifically.

For non-hygienic macros, it's essentially the variable capture problem.

For hygienic macros, I don't know of a good argument that they are inherently more difficult to understand separately from their invocation than a function.

(I'm not personally arguing against macros - or global variables for that matter - just trying to state the argument).

I think it should be obvious that you use a programming construct only when the difficulty of understanding it is less than the difficulty of understanding the code without it (over the whole program). This applies to functions, classes, macros, frameworks, etc.

Full macros (like in CL, where they are just functions that don't evaluate their arguments) give the programmer the same power as a compiler writer or programming-language designer.

ps. To really get the benefit of Lisp macros, you would need to standardize a code walker. Without a code walker, macros can't reach their full potential.

What is a code walker?

If you want local reasoning you will have to reject Turing completeness, otherwise you could implement a language with global variables and eval a bunch of code in that language.

I think we shouldn't be limiting our tools. We should instead limit their use. Global variables can be nasty, but it's nice to have them when your code is best expressed with global variables. Same for macros.

Yes, but any given function is usually extremely easy to understand, because it's only, say, one level of abstraction, while a macro is a few levels higher than that. And as you go up in your levels of abstraction, it gets harder and harder to really understand what's going on. Sure, some macros are intuitive and easy to follow, but those are usually easily replicated with other things, especially in a dynamic language like Python.

I also don't mean to imply that Lisp is lesser for using macros. I love Lisp and any implementation clearly requires macros. But Lisp is also, undeniably, harder to read for this and other reasons.

Hmmm. I think your argument is roughly that there is a "sweet-spot" for abstraction when it comes to readability?

That is a point at which less abstraction makes the code harder to read, and more abstraction makes the code harder to read?

I will agree with this in specific cases (i.e. for any given solution, there is a point at which adding abstraction can't improve readability), but I'm not certain I agree in the general case (i.e. that using macros cannot improve readability).

I guess it also depends on what you mean by "really understand what's going on." I started out programming in C. Now in C, if you know your compiler well, you can predict fairly accurately what binary code will be generated when you compile with optimizations off. Moving to higher-level languages you lose this ability, and no longer "really understand what's going on."

For systems programming, I may still use C to get this advantage. For other problem domains, I sacrifice this knowledge because representing my problem more tersely in a higher level language makes the code more readable and easier to understand. Now I will never know exactly which instructions will be executed when I write in Python.

Similarly, with sufficiently fancy macros, I may not know what LISP code is generated, but if the macros do what they say they do, it can make my code less verbose, more understandable, and easier to maintain. There are times when really understanding what is going on trumps the terseness, and those times I don't use macros.

Also, I love Python. It embeds well in C (which is where my original background is), and it has very good portability, and a good set of libraries.

I also implied, but didn't say straight out, that Python has a good reason for not having macros: part of its design is to look like pseudo-code. See also Norvig's comment to the OP. Macros that operate on text rather than trees (C preprocessor, m4, etc.) are far more error-prone, and probably a Bad Idea. Therefore, if you want your language to look like something other than a tree, you have to forsake macros that operate on code as it is written. I have seen, for several languages (Python among them, I believe), Lisp-like macros that operate on the AST of the language. They have not caught on. I have several theories why this is so, but right now my preferred one is that it feels too much like you're hacking the compiler, and Compilers Are Scary.

right now my preferred one is that it feels too much like you're hacking the compiler

I think what you implied just before that is a stronger argument: it's way too distant from the base language, and this semantic distance is so costly that it's not worth the trouble.

"And as you go up in your levels of abstraction, it gets harder and harder to really understand what's going on."

There is a difference between abstraction and indirection. Just because you've added the latter doesn't mean you've gained any of the former.

"Sure, some macros are intuitive and easy to follow, but those are usually easily replicated with other things, especially in a dynamic language like Python."

How would you implement SETF in Python? Or how about compile-time link checking for a web application (http://carcaddar.blogspot.com/2008/11/compile-time-inter-app...)?
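Not an answer to SETF's full generality (Python can't define new syntactic "places" the way defsetf can), but Python's nearest analog is worth noting: the property protocol lets an ordinary attribute assignment run arbitrary code. A hypothetical sketch, with the `Pair` class invented for illustration:

```python
class Pair:
    """A cons-cell-like object whose 'car' acts as a settable place."""
    def __init__(self, car, cdr):
        self._cells = [car, cdr]

    @property
    def car(self):
        return self._cells[0]

    @car.setter
    def car(self, value):
        # Assignment to p.car runs this code, vaguely like a
        # setf-expander for the place (car p).
        self._cells[0] = value

p = Pair(1, 2)
p.car = 99
print(p.car)  # 99
```

The key limitation versus SETF: this only works for attribute and item assignment that the language grammar already allows, not for arbitrary new expression forms.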

At the International Lisp Conference last year, I put in an evening event called the "Great Macro Debate", in which these issues were discussed. (We encouraged humor, and flaming as long as it was witty, so it was a lot of fun.) What you say is true to an extent. Macros, like most things, can be abused. If you have a group of Lisp programmers, one thing you can do is have the junior ones request advice from the senior ones about what constitutes "tasteful and idiomatic" use of macros, and vet particular macros, since it is rather hard to crisply "pin down" just what those things mean.

I wish there were a language called harmless which common folk could use to express common thought without any fear. It would go like this:

  do this
  do this and that
  and that too
  do this
  amen

No branching nor decision trees as not to confuse common folk. Now programming is a socially acceptable activity!

Macros can be used in Python.

There are a few libraries that help make it easier (so you do not need to manipulate the AST yourself). For example:

  def macroname(arg1, arg2):
     ... macro contents ...
There's some current information for you old-time Lispers, so next time you don't sound so dated in your Battles with Trolls in the great never-ending language-war flames ;)

I like how you imply that you don't participate in the language wars. When you just did.

Link? The only "macro" library I've seen for python has been "MetaPython" (http://metapython.org/), which is more of a let down than it is useful, unfortunately.

Even if that happened to work correctly (which I don't believe), you would still be missing a bunch of macro-related stuff that makes CL the programmable programming language it is, and Python isn't:






EDIT: Fixed layout

Well, those kinds of claims are kind of par for the course for the Python community. I remember that it was commonly claimed that Python 1.5 had the "full power of the lambda calculus", when all it had was anonymous function definitions, and not true higher order functions.

An approximation of some of Norvig's recent thoughts (Feb 2010):

"(1) It just turned out that when Google was started, the core programmers were C++ programmers and they were very effective. Part of it is a little bit of culture. (2) Early Lisp programmers (Erann Gat) at Google actually noticed that other programmers were equally or more productive. It has more to do with the programmer; we're getting to the point where language choice is less important (as opposed to 20 years ago). (3) Lisp is optimized for a single programmer or a small group of programmers doing exploratory work... If I want to make a change in a weekend I'd rather do it in Lisp than anything else, but by the time you get up to hundreds of programmers, making changes is not a language problem but a social one. (4) Libraries."

Paraphrased from: http://www.youtube.com/watch?v=hE7k0_9k0VA#t=03m20s.

That reminds me of a cool story, in Norvig's talk about Python...

When he finished Peter [Norvig] took questions and to my surprise called first on the rumpled old guy who had wandered in just before the talk began and eased himself into a chair just across the aisle from me and a few rows up.

This guy had wild white hair and a scraggly white beard and looked hopelessly lost as if he had gotten separated from the tour group and wandered in mostly to rest his feet and just a little to see what we were all up to. My first thought was that he would be terribly disappointed by our bizarre topic and my second thought was that he would be about the right age, Stanford is just down the road, I think he is still at Stanford -- could it be?

"Yes, John?" Peter said.

I won't pretend to remember Lisp inventor John McCarthy's exact words which is odd because there were only about ten but he simply asked if Python could gracefully manipulate Python code as data.

"No, John, it can't," said Peter and nothing more, graciously assenting to the professor's critique, and McCarthy said no more though Peter waited a moment to see if he would and in the silence a thousand words were said.


That is a cool story...

Though, may I add that Python (or any other modern programming language) can manipulate its own code as data - only not as gracefully as Lisp. In other words, a Lisp program is its own AST - but in other languages the AST is only a "parse" away (and Python specifically makes computing it very easy).
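A small sketch of that "parse away" using only the standard library: Python's ast module hands you the tree as ordinary objects that you can walk, rewrite, recompile, and run.

```python
import ast

# Parse a snippet of Python source: the AST really is just a parse away.
tree = ast.parse("x + 1", mode="eval")

class SwapAddToSub(ast.NodeTransformer):
    # The tree is ordinary data we can rewrite: turn every + into -.
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if isinstance(node.op, ast.Add):
            node.op = ast.Sub()
        return node

new_tree = ast.fix_missing_locations(SwapAddToSub().visit(tree))
code = compile(new_tree, "<ast>", "eval")
print(eval(code, {"x": 10}))  # x + 1 rewritten to x - 1, so this prints 9
```

Graceful is debatable — the transformer is far noisier than quoting an s-expression — but the capability is there.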

"... My first thought was that he would be terribly disappointed by our bizarre topic and my second thought was that he would be about the right age, Stanford is just down the road, I think he is still at Stanford -- could it be? ..."

I've often wondered why McCarthy has never been asked to Startup School to talk about developing and using Lisp, and its advantages.

He is probably not interested in talking about stuff when he could be doing stuff.

The Lisp vs. Python story really hasn't changed terribly much in the past five years or so. Both are still great languages once you learn to speak the idioms of the language. Both languages have persistent problems with people refusing to do so and then bitching that it's not $SOME_OTHER_LANGUAGE. Both languages have places where I'd suggest one of them over the other. Neither language is even close to a replacement for the other, and programming Lisp in Python is as big a mistake as programming Python in Lisp.

(Of the two though, Python seems to have more problems with people refusing to speak the native idioms and insisting on writing $LANGUAGE in Python instead. Python Is Not A Functional Language. It is a multiparadigm language where the functional is definitely the foreign and borrowed paradigm on top of an imperative/OO core. Ignoring that will bring you grief, but it won't be Python's fault.)

Later edit: In fact, refusing to speak Python's native idioms has been getting noticeably worse in the last six months. If you want a solid OO language with some decent functional borrowing, learn Python. If you want a pure functional language for whatever reason, do us all a favor and don't learn Python. Or at least don't add to the chorus of people complaining Python isn't Haskell, just go learn Haskell. Or Clojure, or whatever.

> The Lisp vs. Python story really hasn't changed terribly much in the past five years or so.

Actually there have been three significant developments in the last five years that IMO tilt the scales back over to the Lisp side:

1. Clojure

2. Clozure Common Lisp a.k.a. CCL (a very unfortunate confluence of names -- the fact that Clojure and Clozure differ by only one letter but otherwise bear almost no resemblance to each other causes no end of confusion).

3. The state of Common Lisp libraries has gotten a LOT better in the last five years.

The state of Common Lisp libraries has gotten a LOT better in the last five years.

And a lot (perhaps equally LOT) better still in the last couple weeks, with Quicklisp. Not that I've tried it yet :)

Fair about Clojure, I was assuming Common Lisp. Point 3 I consider not much net change, though, as the same is true of Python, and pretty much every other competitive language.

Before, Lisp had the edge in native code compilation and functional programming, Python had the edge in libraries. In the modern landscape, that made Python win IMO. Now CL has caught up to Python (mostly -- it's 90% of the way there) in libraries and retains its edge in the other two areas, so in my book CL has pulled back into the lead. YMMV.

Could you elaborate a little more on CCL? I thought it was mainly used to access Cocoa frameworks, but maybe things have changed since the last time I checked it. Are there any chances it can now work with Cocoa touch?

CCL has very good ObjC/Cocoa integration, and a well integrated IDE. But it also runs on Linux and Windows, it has native threads (that feature alone puts it head-and-shoulders over Python), and it has a wicked fast compiler. So it makes a kick-ass webapp development platform.

I don't know about Cocoa touch, but the CCL compiler was recently ported to run on ARM processors.

One down side to CCL is that it will not run on Intel boxes that don't have the SSE2 instruction set extensions. That isn't a big deal if you're doing a web application on the server of your choice (which will probably have SSE2), but if you want to use CCL on a desktop application that runs on Windows (or Linux), it could be a very big deal. People still use old computers, and I believe some of the newer netbooks don't have SSE2. If you're doing a desktop app, the last thing you want your customer to see is "sorry, you can't run that on this machine".

You can read more about this on clozure.com:


And that points to a list of SSE2-capable (and some non-SSE2-capable) machines:


I bring this up only as a caveat to those thinking about using CCL. I think it is great that it is available as a free Common Lisp implementation. It would be even better, however, if this limitation didn't exist.

> that feature alone puts it head-and-shoulders over Python

Uh no it doesn't. Python uses OS threads. It cripples them with the GIL (hence multiprocessing), but it never used green threads.

And of course, using green threads can be an advantage if you're not dumb about it (see Erlang).

> it has a wicked fast compiler. So it makes a kick-ass webapp development platform.


> Uh no it doesn't.

Uh, yes it does.

> Python uses OS threads. It cripples them with the GIL

And CCL doesn't. QED.

> using green threads can be an advantage

Not if you have multiple cores.

> What?



> Uh, yes it does.

That makes no sense; having the exact same feature doesn't make CCL superior.

> And CCL doesn't. QED.

QED nothing; not having a GIL does something, using OS threads doesn't.

> Not if you have multiple cores.

Yes if you have multiple cores as well. You just have to map your green threads onto OS threads or processes, that's what Erlang does.

> ;-)

Smileys don't give meaning to nonsense.

CCL has no GIL.

Which has absolutely nothing to do with the original claim that:

> it has native threads (that feature alone puts it head-and-shoulders over Python)

As I wrote previously (and both you and lisper apparently decided to ignore), Python uses OS threads as well.

Native threads that are locked against each other by a GIL are still native threads, but less useful native threads. An important purpose of native threads is that they are scheduled by the OS and not hindered by a GIL, so that on multicore machines multiple threads can run concurrently.

CCL does that. Python, with the GIL, does not.
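The distinction is visible from Python itself (a small sketch, not from the thread): its threads are real OS threads, and blocking calls release the GIL, so I/O waits overlap; it is pure-Python CPU-bound work that the GIL serializes.

```python
import threading
import time

results = []

def io_task(n):
    # time.sleep releases the GIL, so these blocking waits overlap;
    # only pure-Python CPU-bound code is serialized by the lock.
    time.sleep(0.1)
    results.append(n)

threads = [threading.Thread(target=io_task, args=(i,)) for i in range(4)]
start = time.time()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
print(sorted(results))  # [0, 1, 2, 3]
print(elapsed < 0.35)   # True: four 0.1s sleeps ran concurrently, not serially
```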

I tried writing an extensive Common Lisp vs. Python paper (yes, I have seen all of the existing ones), but it got too big and out of hand. One of the interesting developments in Python recently is decorators, which allow some interesting metaprogramming. This can do some of the things that Lisp macros are used for. It's still not Lisp macros, but it's picking off the low-hanging fruit, which helps Python coders a lot.
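To make the decorator point concrete, here is a minimal sketch (memoization, a classic macro-flavored wrapper):

```python
import functools

def memoize(fn):
    # A decorator is just a function applied to a function at definition
    # time: a restricted metaprogramming hook, nothing like full macros,
    # but enough for wrapping and caching patterns.
    cache = {}
    @functools.wraps(fn)
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

@memoize
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040, computed in linear rather than exponential time
```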

Any chance we might see the draft? :-)

Oh, it's so ugly, and it hasn't been checked for mistakes, etc. I'd be embarrassed to have my name associated with it.

I read your comment quickly and only now realized it's from you (Dan Weinreb, über Lisp guru of ITA fame). I understand your feeling (about putting an early draft out). But is there any chance you could send a copy to interested people who know the quality of your work (the man worked on the Common Lisp specification, for god's sake!), who will read it as an early draft (no citations allowed ;-) and will promise not to send it to anyone? ;-)

You would probably entangle him in a two-hundred thousand dollar bet with Scott Aaronson.

Anything to say about MetaPython here, Dan?

I think most people switching from Lisp to Python do so for practical reasons. I suspect Peter Norvig probably wouldn't use as much Python if he didn't have to adapt to the Google culture.

I think Python is a surprisingly nice language in many ways. It falls into the category of being an "acceptable Lisp". But I think most Lispers still prefer the real thing and only use Python when they need to be pragmatic.

I don't know of any Lisp programmer who thinks Python is an acceptable Lisp. In fact, there are some things about Python that are completely unacceptable as far as a Lisp goes: (1) crippled lambda, (2) no easy way to pass chunks of code around, etc. Obviously, not having macros is a given in anything but Lisp. Ruby (because of blocks) is far more of an acceptable Lisp.

Yeah, the lambda issue in Python is definitely not very Lispy.

But I think even though Ruby offers a bit more for Lispers, I just don't like the design of the language as much in general as Python. If I don't code Python for a year and have to delve into it again it takes only a few seconds to get back to speed again. In Ruby, on the other hand, I always forget how the blocks & variables behave and am thrown by the strange predicate syntax, making this far more unpleasant.

I agree; I like Ruby better for the syntax, the cleaner OO integration, and so on -- but after a year and a half away from each, I can pick up Python again a lot quicker. And Python is friendly to scripting, low-level, and embedded applications, which Ruby (Matz) is not interested in.

I agree. But if you find yourself with an employer who requires that everything be written one of a few languages, and Lisp isn't on the menu, I'd take Python over C++ in a heartbeat.

> Obviously not having macros is a given in anything but Lisp

At least these languages (that I know of) have macros:


True, but the syntax for all three of these is pretty hairy.

Lisps might not be the only languages with macros, but they're the only ones with elegant and easy to use macro systems.

You could argue of course this is a moot point, since even Lispers only write macros infrequently. So who cares if it's hard to write one on the rare occasion you need to?

I am amazed that people say Lisp is about macros. I think it's about being homoiconic. You can make some ugly form of macros in ANY language; homoiconicity is what makes you want to write macros in Lisp. But it is about much more than just macros. REBOL, for example, is fully homoiconic.

  >> a: [ 300 3 ]
  >> insert a 'add
  >> probe a
  [ add 300 3 ]
  >> print first a
  add
  >> do first a 100 1
  == 101
  >> do a
  == 303
  >> f: func [] a
  >> f
  == 303
  >> change second :f 'subtract
  >> f
  == 297
All code is data (blocks of words and values).

From the Io example below, I would say it's a little different. It's more like it has a runtime API to change/generate its runtime. I think Smalltalk has something similar to this.

Factor has compile-time macros. At runtime it has quotations, which are blocks of code (separate from its other data structures, I think, but don't shoot me if I'm wrong) that can be modified and executed by your code with a few specific words like curry and compose. This means you have a little less freedom than in REBOL, where a block of code is in no way different from a block of data. What is awesome about Factor is that it also compiles quotations at runtime.

Otherwise Factor is very very cool, and I envy some runtime features of Io a lot.

And Python has as little to do with Lisp as Visual Basic. Python is the world's best dynamic imperative language, IMHO :)

From the Io example below I would say it's a little different. It's more like it has a runtime API to change/generate its runtime. I think Smalltalk has something similar to this.

In Io, all code is data. Below is an example of changing a function's behaviour (from addition to subtraction):

    Io> plus := block (a, b, a + b)
    ==> method(a, b, 
        a + b
    )
    Io> plus call (1, 2)
    ==> 3
    Io> plus message next setName ("-")
    ==> -(b)
    Io> plus
    ==> method(a, b, 
        a - b
    )
    Io> plus call (1, 2)
    ==> -1
ref: Io Has A Very Clean Mirror (WayBackMachine copy) - http://web.archive.org/web/20080212010904/http://hackety.org...

Very cool!

There are few properties about concurrency, coroutines, embed-ability, and I suppose nice process of making bindings that I value really a lot and Io HAS. Looking at your example, I will definitely look again at Io. Thanks!

Io (http://www.iolanguage.com) is a bit of an exception then.

It's homoiconic but it's not a Lisp. And it doesn't have macros, yet its code is fully inspectable/modifiable at runtime in a very easy & elegant way.

For eg. if you wanted to have a line like this:

    list(1, 2, 3) each foo bar
which needs to be transformed into below at runtime:

    list(1, 2, 3) foreach(foo bar)
Then this can be done like so:

    List each := method(
        m := call message
        m setName("foreach") setArguments(list(m attached)) setAttached(nil)
        self doMessage(m)
    )
ref: http://www.iolanguage.com/blog/blog.cgi?do=item&id=86

Does it have lazy evaluation?

Yes. The List each above is acting lazily.

Here is a better example of how it works: http://news.ycombinator.com/item?id=1627777

You could argue of course this is a moot point, since even Lispers only write macros infrequently.

This simply isn't true. In fact it's so far from the truth it hurts my brain. Well written Lisp applications are in large part (and sometimes mostly) macros. PG once mentioned that the ITA guys said over half of their codebase is (was?) macros.

Can you find the quote? I'd be surprised to learn that.

For my own code, it consists of maybe 10% macros. I would say the same is true for PG's code (for the Hacker News app; the Arc core libraries have more macros but are a special case).

Unless, of course, you mean _calling_ macros; I would agree that Lisp code often has 50% of its calls be macro calls instead of function calls.

50% code by volume being macros sounds terrifying.

Macros are like 2nd-amendment rights. You hope you never have to use them, but when you do you'd be in a world of hurt without them.

Factor supposedly has macros; from my cursory knowledge of the language I could see them being as convenient as Lisp.

Forth and Prolog also have static metaprogramming via transformations on the parse tree, i.e., Lisp-style macros. They're not emphasized as much in Prolog, though.

If your language doesn't have Lisp syntax (or more precisely, a lack of syntax entirely) writing macros will be very unpleasant. How are you going to logically think about manipulating the structure of a program that looks like C# or Perl? Macros are already difficult enough to get right as is, without introducing the problem of syntax. Those languages don't have macros any more than Python has a lambda.

How are you going to logically think about manipulating the structure of a program that looks like C# or Perl?

In Perl 6, you extend the active grammar within a delimited scope.

My point still holds; just replace "manipulating the structure of a program" with "manipulating the structure of a grammar". In no way is that comparable to manipulating raw parse trees.

But you are manipulating parse trees: surely you have to add new grammar rules to the parser, but then the macro itself does just that.

Furthermore, when adding syntax you are not limited to a Lisp-style grammar.

Furthermore, when adding syntax you are not limited to a Lisp-style grammar.

You aren't limited to Lisp-style grammar in Lisp either. You can write reader macros if you have to. The point is, that on a basic level manipulating s-expressions is far easier to think about than manipulating the syntax of whatever the flavor of the day language is. You're essentially trying to invent reasons why macros in other languages are the same as Lisp macros and it's simply not true.

Lisp macros manipulate Lisp code once it has been read (parsed and tokenized into lists) and sometime before it is executed. That has nothing to do with grammars. Common Lisp also offers a facility to augment or replace the parsing and tokenizing mechanism in an unrestricted fashion.

Furthermore, when adding syntax you are not limited to a Lisp-style grammar

And this is a very good point because you can add things like XML grammar to Perl6: http://github.com/krunen/xml

At least P. Norvig thinks Python is an acceptable language, if not Lisp. I think the point about being optimized for small teams is a significant one. Also, Lisp appeals mostly to programmers, but it often "repeals" people with expert knowledge (e.g. scientists without a strong programming background). P. Norvig mentioned that when he converted his AI book from Lisp to Python code examples, it seemed much more natural to most people.

I think you meant "repels", unless you meant "refudiates", though perhaps your finger stumbled in the middle of "repls".

"refudiate" isn't a word (though it appears to be a Palin-ism -- http://www.urbandictionary.com/define.php?term=Refudiate). Perhaps you meant "repudiate?"

I have at least the excuse of not being a native speaker, contrary to S. Palin :) Thanks for the correction.

I think the phrase "acceptable Lisp" itself needs to go away. It's caused quite a bit more heat than light.

Python is a great language, sure, but it's not a Lisp, and it's best appreciated on its own terms. Both because Lisp and Python really aren't very similar, and because people coming to Python with transliterated Lisp idioms are going to be disappointed.

I've used Python for about 4 years and am just starting to use Clojure, so I'll just add a few comments that others haven't mentioned. I'm not trying to offer a definitive comparison.

Clojure's data structures seem a lot like Python's but are a bit more elegant and avoid many of the little Python headaches that come up often, like d['k'] vs d.k vs d('k'). In Clojure it would be (d :k) or (:k d), and both work. If you need memoization there are functions to help you. In Clojure there definitely seems to be an attempt to make all the core data structures as compatible as possible (even having a formal abstraction, ISeq, for all of them).

Culturally, Python seems to care more about a minimal core language. Clojure.core has probably 3-4 times as many built-ins as Python. Many of the clojure functions are supporting features Python doesn't have or handles with syntax, like macros and conditional expressions, but there are also clojure functions like even?, that probably won't ever be a Python built-in.

Especially for predicates, functions like even?, every?, ffirst,

> and avoid many of the little python headaches that come up often like d['k'] vs d.k vs d('k').

When does that come up? d['k'] is a key access (to a collection), d.k is an attribute access (to an object) and d('k') is a method call. I'm not sure where the headache/confusion would be here, unless you're trying to do very weird things with Python (which you should not)

They are all mapping functions, though. You have a key 'k', which is accepted by d and will return a unique result. Why should I have to care whether d is a collection, object, or method? In fact in many cases it's pretty easy to implement all three.

    class months(object):
        def __init__(self):
            names = 'jan feb mar apr may jun jul aug sep oct nov dec'.split()
            for i, name in enumerate(names):
                self.__dict__[name] = i + 1
        def __getitem__(self, key):
            return self.__dict__[key]
        def __call__(self, key):
            return self.__dict__[key]

    d = months()
    print(d['jan'])  # 1
    print(d('jan'))  # 1
    print(d.jan)     # 1

> I'm not sure where the headache/confusion would be here, unless you're trying to do very weird things with Python (which you should not)

I never said there was confusion. The headache comes when you're trying to decide whether to present an interface to a mapping function as a collection, an object, or a function.

Especially for predicates, functions like even?, every?, ffirst,

(For the record, this line was supposed to be edited out, but apparently I forgot to actually delete it)

Quotes from PG's essays: "If you look at these languages in order, Java, Perl, Python, you notice an interesting pattern. At least, you notice this pattern if you are a Lisp hacker. Each one is progressively more like Lisp. Python copies even features that many Lisp hackers consider to be mistakes. You could translate simple Lisp programs into Python line for line. It's 2002, and programming languages have almost caught up with 1958. Macros (in the Lisp sense) are still, as far as I know, unique to Lisp. This is partly because in order to have macros you probably have to make your language look as strange as Lisp. It may also be because if you do add that final increment of power, you can no longer claim to have invented a new language, but only a new dialect of Lisp."

I think it is true even today.

When I last tried Lisp (SBCL) I was surprised how hard it is to find standard, common-sense 'batteries' as you would in Python, and how hard it is to get things going if you're not experienced. I think the point where I jumped out of the boat was trying out 'Hunchentoot' or something like that (a webserver).

I haven't seen that with Python, Ruby, C++, or Java; even Haskell feels 'modern' in that way. Why must it be so hard to get simple stuff going?

(Note: I like functional programming concepts, but I would expect a programming language to be easier to bind into a context of reality.)

> standard, common sense 'batteries' ... even Haskell feels 'modern' in that way

Glad you noticed! We've been working hard on this: http://haskell.org/platform

For the macro side of things you may find these two old HN posts of interest:

* Why Lisp macros are cool, a Perl perspective (2005) http://news.ycombinator.com/item?id=795344

* Can a language have Lisp's powerful macros without the parentheses? http://news.ycombinator.com/item?id=9172

Python versus Lisp is not so much a dichotomy. Some bridges:

1. Mathematica: you can use both infix and prefix form; FullForm[a+b] is Plus[a,b]. Mathematica internally uses prefix notation. Evaluation is more flexible than in Lisp: you can define a function and decide whether it evaluates some, all, or none of its arguments.

2. Maxima: a layer over Lisp defining an infix language, in which you can define operators to resemble math notation, for example f(x):=x^2, similar to (defun f (x) (* x x)).

3. Dylan: a Lisp with infix notation.

4. Willem Broekema's cl-python: Python in Lisp.

5. Clojure: brings some nice syntax for getters and setters, function arguments, and much more.

6. comp.lang.lisp versus Clojure: Clojure has a great community; Lisp has some problems with lords.

7. ABCL is here, that is, Lisp in Java. ABCL can run Maxima without errors, and that is great.

8. Ruby, JRuby, Ioke, Duby: efforts to achieve a very expressive language.

9. JavaScript, the good parts: JavaScript with some annotations can be the next Lisp.

10. Quicklisp: a better installer than ASDF.

There is a language and there is an ecosystem surrounding it (libraries, community, etc). Ignoring the ecosystem, the question "to lisp or not to lisp" pretty much boils down to "syntax or macros" - if you want macros, you go with a lisp, if you want syntax, you go with Python or another modern language.

I used to think macros matter more than syntax, because you can freely define your own micro-languages. I didn't really practice it, because of the practical limitations of the available lisps [1]. Now I think the opposite, that syntax matters more. Syntax helps you parse the code visually and you use lower level parts of your cortex to understand the code [2]. You can build arbitrary DSLs in lisps, but they all have no syntax, so they are of limited cognitive help. I think the real win are modern languages with syntax, that is malleable enough to facilitate the cognitive apparatus of the programmer in most cases, or at least most cases that matter. For example, an obvious DSL is the mathematical notation - Python / Ruby handle it well enough with operator overloading, Lisp actually does worse because of the prefix notation.
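To make the operator-overloading point concrete, here is a hypothetical Python sketch (the Vec2 class is invented for illustration) showing how a user-defined type can reuse the familiar infix mathematical notation:

```python
# Hypothetical example: a 2-D vector type reusing Python's infix syntax.
class Vec2:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __add__(self, other):
        return Vec2(self.x + other.x, self.y + other.y)

    def __mul__(self, k):  # scalar multiplication
        return Vec2(self.x * k, self.y * k)

    def __repr__(self):
        return "Vec2(%r, %r)" % (self.x, self.y)

# Infix reads close to the mathematical notation:
v = Vec2(1, 2) + Vec2(3, 4) * 2
# versus a prefix rendering like (v+ (vec2 1 2) (v* (vec2 3 4) 2))
```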

It is important to understand that you can approximate the bottom-up style of building abstractions with libraries (instead of DSLs), parameterizing the proper things, with minimal syntax noise. The remaining difference between macros and using higher level functions is mostly in run time optimization.

I guess seasoned lispers learn to "see through" all the brackets and engage the lower-level part of the brain in parsing lisp code. Ironically, something similar happens to Java developers - after enough hours looking at the code they start to ignore the ugly try/catch clauses that can't be properly abstracted away because of language limitations.

[1] with the exception of one big project in Common Lisp, but I did only a little programming in it, and this was before I fully appreciated macros - but the guy before me used them extensively to build two layers of domain specific languages

[2] L Peter Deutsch talks about this in Coders at Work and this is probably more valuable than what I have to say:

Deutsch: I can tell you why I don’t want to work with Lisp syntax anymore. There are two reasons. Number one, and I alluded to this earlier, is that the older I’ve gotten, the more important it is to me that the density of information per square inch in front of my face is high. The density of information per square inch in infix languages is higher than in Lisp.

Seibel: But almost all languages are, in fact, prefix, except for a small handful of arithmetic operators.

Deutsch: That’s not actually true. In Python, for example, it’s not true for list, tuple, and dictionary construction. That’s done with bracketing. String formatting is done infix.

Seibel: As it is in Common Lisp with FORMAT.

Deutsch: OK, right. But the things that aren’t done infix; the common ones, being loops and conditionals, are not prefix. They’re done by alternating keywords and what it is they apply to. In that respect they are actually more verbose than Lisp. But that brings me to the other half, the other reason why I like Python syntax better, which is that Lisp is lexically pretty monotonous.

Seibel: I think Larry Wall described it as a bowl of oatmeal with fingernail clippings in it.

Deutsch: Well, my description of Perl is something that looks like it came out of the wrong end of a dog. I think Larry Wall has a lot of nerve talking about language design—Perl is an abomination as a language. But let’s not go there. If you look at a piece of Lisp code, in order to extract its meaning there are two things that you have to do that you don’t have to do in a language like Python.

First you have to filter out all those damn parentheses. It’s not intellectual work but your brain does understanding at multiple levels and I think the first thing it does is symbol recognition. So it’s going to recognize all those parenthesis symbols and then you have to filter them out at a higher level.

So you’re making the brain symbol-recognition mechanism do extra work. These days it may be that the arithmetic functions in Lisp are actually spelled with their common names, I mean, you write plus sign and multiply sign and so forth.

Seibel: Yes.

Deutsch: Alright, so the second thing I was going to say you have to do, you don’t actually have to do anymore, which is understanding those things using token recognition rather than symbol recognition, which also happens at a higher level in your brain. Then there’s a third thing, which may seem like a small thing but I don’t think it is. Which is that in an infix world, every operator is next to both of its operands. In a prefix world it isn’t. You have to do more work to see the other operand. You know, these all sound like small things. But to me the biggest one is the density of information per square inch.

Seibel: But the fact that Lisp’s basic syntax, the lexical syntax, is pretty close to the abstract syntax tree of the program does permit the language to support macros. And macros allow you to create syntactic abstraction, which is the best way to compress what you’re looking at.

Deutsch: Yes, it is.

Seibel: In my Lisp book I wrote a chapter about parsing binary files, using ID3 tags in MP3 files as an example. And the nice thing about that is you can use this style of programming where you take the specification—in this case the ID3 spec—put parentheses around it, and then make that be the code you want.

Deutsch: Right.

Seibel: So my description of how to parse an ID3 header is essentially exactly as many tokens as the specification for an ID3 header.

Deutsch: Well, the interesting thing is I did almost exactly the same thing in Python. I had a situation where I had to parse really quite a complex file format. It was one of the more complex music file formats. So in Python I wrote a set of classes that provided both parsing and pretty printing. The correspondence between the class construction and the method name is all done in a common superclass. So this is all done object-oriented; you don’t need a macro facility. It doesn’t look quite as nice as some other way you might do it, but what you get is something that is approximately as readable as the corresponding Lisp macros. There are some things that you can do in a cleaner and more general way in Lisp. I don’t disagree with that. If you look at the code for Ghostscript, Ghostscript is all written in C. But it’s C augmented with hundreds of preprocessor macros. So in effect, in order to write code that’s going to become part of Ghostscript, you have to learn not only C, but you have to learn what amounts to an extended language. So you can do things like that in C; you do them when you have to. It happens in every language. In Python I have my own what amount to little extensions to Python. They’re not syntactic extensions; they’re classes, they’re mixins—many of them are mixins that augment what most people think of as the semantics of the language. You get one set of facilities for doing that in Python, you get a different set in Lisp. Some people like one better, some people like the other better.
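Deutsch doesn't show his code, but the class-per-record style he describes might be sketched like this (all class and field names here are invented for illustration): each record class declares its fields once, and a shared superclass derives both parsing and pretty-printing from that single declaration - no macros required.

```python
import struct

# Invented sketch of a declarative, class-based binary parser.
class Record:
    fields = ()  # sequence of (name, struct-format) pairs

    @classmethod
    def parse(cls, data, offset=0):
        """Read each declared field from `data` in order."""
        obj = cls()
        for name, fmt in cls.fields:
            (value,) = struct.unpack_from(fmt, data, offset)
            setattr(obj, name, value)
            offset += struct.calcsize(fmt)
        return obj

    def pretty(self):
        """Pretty-print driven by the same field declaration."""
        return "\n".join("%s: %r" % (name, getattr(self, name))
                         for name, _ in self.fields)

# The per-format code is reduced to the declaration itself:
class Header(Record):
    fields = (("version", ">H"), ("flags", ">B"), ("length", ">I"))
```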

Lisp has a lot of syntax. It is just a bit different and it looks different externally.

Lisp has a 2-stage syntax.

The first stage is the syntax of s-expressions, which is surprisingly complex. S-Expressions provide a textual syntax for data: symbols, lists, pairs, strings, various number formats, arrays, characters, pathnames, ...

The first stage is implemented by the 'reader' and can be reprogrammed by an ancient API to the reader via read tables.

The second stage is the syntax of the Lisp programming language. This is defined on top of s-expressions and is really a syntax over data structures (not text). This Lisp syntax deals with: data items, function calls, special forms (thirty something) and macro forms.

This syntax stage is implemented as part of the interpreter/compiler (EVAL, COMPILE, COMPILE-FILE) and can be extended by writing macros, symbol macros and compiler macros. In earlier dialects it could also be extended by writing so-called FEXPRs, functions which get called with unevaluated source code (-> data in Lisp).

So, we get a lot of complex syntax due to special forms and macros. It just looks a bit different, since the data syntax is always underneath it (unless one uses a different reader).
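The first stage is easy to demonstrate: an s-expression reader turns text into plain data (nested lists and atoms) before any Lisp syntax rules apply. A toy stage-1 reader, sketched in Python (handling only lists, integers, and symbols):

```python
def read_sexpr(text):
    """Toy stage-1 reader: text -> nested lists of atoms. No evaluation,
    no knowledge of DEFUN, IF, or any other Lisp syntax."""
    tokens = text.replace("(", " ( ").replace(")", " ) ").split()

    def read_from(tokens):
        token = tokens.pop(0)
        if token == "(":
            lst = []
            while tokens[0] != ")":
                lst.append(read_from(tokens))
            tokens.pop(0)  # discard the closing ")"
            return lst
        try:
            return int(token)
        except ValueError:
            return token  # a symbol, kept here as a string

    return read_from(tokens)

# Stage 1 does not know that DEFUN needs a name and a lambda-list;
# it just produces data:
read_sexpr("(defun foo (a b) (+ (sin a) (sin b)))")
# -> ['defun', 'foo', ['a', 'b'], ['+', ['sin', 'a'], ['sin', 'b']]]
```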

For example a function definition would be:

   (defun foo (a b) (+ (sin a) (sin b)))
The syntax for that is:

    defun function-name lambda-list [[declaration* | documentation]] form*
With more complex syntax for 'function-name', 'lambda-list' and 'declaration'.

Lambda-list has this syntax:

    lambda-list::= (var* 
                    [&optional {var | (var [init-form [supplied-p-parameter]])}*] 
                    [&rest var] 
                    [&key {var | ({var | (keyword-name var)}
                      [init-form [supplied-p-parameter]])}*
                      [&allow-other-keys]]
                    [&aux {var | (var [init-form])}*])

Not every valid Lisp program has an external representation as an s-expression - because it can be constructed internally and can contain objects which can't be read back.

Not every s-expression is a valid Lisp program. Actually most s-expressions are not valid Lisp programs.

For example

   (defun foo bar)
is not a valid Lisp program. It violates the syntax above.

"The first stage is implemented by the 'reader' and can be reprogrammed by an ancient API to the reader via read tables."

Readtables aren't any more ancient than the rest of ANSI INCITS 226-1994 (R2004) (the language standard formerly known as X3.226-1994 (R1999)), but the interface to them is very low-level and non-modular.
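As an illustration of how low-level it is, the classic readtable exercise adds [a b c] as list syntax by hooking individual characters (a standard textbook example, not tied to any particular library):

```lisp
;; Make [a b c] read as (LIST a b c), one character hook at a time.
;; First let #\] behave like #\) so READ-DELIMITED-LIST can stop on it:
(set-macro-character #\] (get-macro-character #\)))

;; Then hook #\[ to read everything up to the matching #\]:
(set-macro-character #\[
  (lambda (stream char)
    (declare (ignore char))
    (cons 'list (read-delimited-list #\] stream t))))

;; After this, [1 2 3] reads as (LIST 1 2 3).
```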

The current readtable is specified by a dynamic variable, so the readtable facility can be made modular with a nicer interface, in a portable manner. This is exactly what the library Named-Readtables does: http://common-lisp.net/project/named-readtables/

Now realize the significance of this: Common Lisp is the only language allowing total unrestricted syntactic extension and *modification* in a modular and (somewhat) composable way.

I've been using named-readtables for the past month, and between it and Quicklisp, I haven't been this excited about programming in CL since I started (which is 8 years ago, not that long, but I'm not a total noob either).

Read tables existed before CL. CLOS, for example, did not; it was developed for ANSI CL (based on experience with LOOPS and Flavors). See, for example, READTABLE in Maclisp:


Also note that I wrote that the API is ancient. It is. It is old and could be easier to use.

'Named readtables' are related to 'syntaxes' on the Lisp Machine. For example source files have a syntax attribute in the header, which switches between the various Lisp dialects (or other languages), including using different readers. This is for example used by the file compiler and Zmacs.

Cool, I didn't know readtables were in Maclisp, or about LM syntaxes.

"It is old and could be easier to use."

Aside from something like named-readtables, how would you design the lowest-level interface to readtables? Or you wouldn't do that, and just specify something like named-readtables to be the interface? I'm curious because this could be something for http://www.cliki.net/Proposed%20Extensions%20To%20ANSI

I haven't thought about it much, but the character level interface is very primitive. Second, what about things like symbols, numbers, etc.? There is no sane way to specify number syntax or symbol syntax. That might be useful. Currently the reader provides an interface on a character level, but not on the level of s-expressions components.

What I think you're saying is that the reader should be customizable in terms of some DSL for a grammar. It would be nice, but I'm not sure how composable it would be (in the general case, I think it would come down to having the behavior of READ dependent upon a black-box current-parser procedure).

The nice thing about readtables is that it exposes what in essence are transition hooks for each character, so you really don't need to care about the grammar of stuff you're not interested in parsing.

OTOH like you said, extending the syntax for numbers or symbols becomes quite hairy. But even with a DSL grammar approach, you'd need to change major parts of the grammar (which means copying and modifying the normative grammar of CL syntax - not that different from grabbing a portable CL reader implementation today (http://twitter.com/vsedach/status/26484049015)).

The big downside is deciding on which class of grammars that will support, and how they will be represented.

You are right of course, but this kind of syntax doesn't help the programmer to cognitively parse the code, which is what I am talking about.

No, why not?

I can parse Lisp quite well.

I find it hard to maintain a large-ish Python codebase - it tends to tangle itself into a mess (in my case). It's probably due to my inability, but I don't have the same problems with C code.

Is it possible you're comparing lines of code directly? The "size" of a code base is tricky to define, but many people accept a rough rule of thumb that a project of complexity X that might require, say, 10K LOC in competent/straightforward C would require say 2K LOC in competent/straightforward Python/Ruby/etc. The implication is that an average 10K LOC Python program would in fact be a more complicated program than an average 10K LOC C program, and therefore "messier", all things being equal.

I can't really describe it. It's as if, as the literal size grows, it's harder for me to wrap my head around what is where and why - I suspect it is because I don't write very pythonic code... more like I'm writing C code with Python syntax, if you know what I mean.

I am not particularly familiar with Python either; I should educate myself properly and delve into it more. I only get by for what I need to do in Maya with it.

Ah, right, different problem. There's much to be said and disputed about what makes a 10K program Pythonic; my oversimplification would be something like:

- between half a dozen and two dozen files in 2-3 directory levels

- lots of small classes with short methods

- some short functions

- lots of dictionaries

- no God objects

- some comprehensions, generators, and decorators

- about 2-4 levels of abstraction between the problem statement and the core routines in inner loops

- pivot away from any of the above if the problem calls for it, but many problems don't

Even simpler: if you're writing short callables (functions/methods, mostly) and using them as the values in dictionaries, you're most of the way to writing intermediate Python.
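For instance, the dictionary-of-callables pattern replaces a chain of conditionals with a data structure (a hypothetical example; all names are invented):

```python
# Hypothetical example: dispatching on a command name via a dictionary
# of short callables instead of an if/elif chain.
def deposit(balance, amount):
    return balance + amount

def withdraw(balance, amount):
    return balance - amount

COMMANDS = {
    "deposit": deposit,
    "withdraw": withdraw,
}

def apply_command(name, balance, amount):
    # Look up the handler by name and call it.
    return COMMANDS[name](balance, amount)
```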

Am I missing something - why does it have to be either Python or Lisp? For all I know, Peter Norvig has embraced Python in addition to Lisp, rather than switched completely as in "no more Lisp". Same goes for everyone else - some tools are more fit for some tasks than others, especially with corporate restrictions being a constraint.

Paul, this would be a great opportunity for you to plug Arc, an increasingly active Anarki (http://github.com/nex3/arc), and the growing Arc community (http://arclanguage.org/forum).

I feel like the Arc project pivoted until it became YC.

I used to write a reasonable amount of Lisp at my first job, in an industrial research lab, in addition to a fond exposure to it at my institution of higher learning.

I really, really like Lisp. The syntax, as a result of it having been introduced to me fairly early, never was a hangup of any kind. SLIME is quite wonderful. I always yearn for quasiquotation.

However, I cannot justify using it for a small-scale project that has to ship relatively soon with limited resources. The reason: Python has useful projects with good release culture and documentation that are not seen in the Lisp world, if only for the lack of contributors. (Considering the number of authors, I found Common Lisp documentation to be quite good, in fact, but I still find it pretty hard pressed to compete with, say, the Django manual, or most of the Python standard library itself.)

Software — especially at the small, early stage scale — is still loaded with problems that are not matters of extensive, inspired, and unique programming. Or, they could be, but did you really want to write your own whiz-bang form validation instead of building all the other features of your product?

Clojure I think represents a very interesting escape from this trap, and if I had a lot of Java dependencies (whether bequeathed or because of unique library advantages Java may have. WEKA comes to mind for some things) I would most certainly want to use it. But in a project I control from the beginning, Python wins, hands down, in a number of domains.

I will also expose one heretical opinion of mine about Lisp Macros: they're great, but easy to overuse, and seldom So Necessary if one makes good use of higher order functions. Some code, admittedly, comes out nicer using macros, but I can almost always get the expressive power I need with little contortion using higher order functions.
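As an illustration of that claim, a Lisp WITH-...-style macro can often be replaced by a higher-order function at the cost of a lambda or named function for the body (all names here are invented):

```python
# Invented sketch: where Lisp might use a WITH-RETRIES macro wrapping
# a body of code, Python can pass the body as a callable.
def with_retries(times, thunk):
    """Call thunk(), retrying up to `times` times on exception."""
    for attempt in range(times):
        try:
            return thunk()
        except Exception:
            if attempt == times - 1:
                raise  # out of retries: re-raise the last failure

attempts = []
def flaky():
    """Fails twice, then succeeds - stands in for an unreliable call."""
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = with_retries(5, flaky)  # "ok", after two failed attempts
```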

If I had the time to follow my heart, I'd want to do something to continue to vitalize Lisp. But not while I need to ship.

To round out this post, here are some minor peeves on the Python side though:

* No symbol data type. I don't like using strings for these. I use SymbolType for this, but it's still clunky compared to CL.

* Hyphens are the best intra-token-delimiter (see what I mean?)

* No macros

* No quasiquotation

* Slow

* Have to avoid stepping on myself because of the mutable environment frames you see a bit more frequently as opposed to say, Scheme or Erlang where tail recursion is the name of the game. I also make use of "labels" in cl.

* Did I mention tail recursion elimination? No? Well...sometimes this would be nice.

Banana vs Oranges?

Bananas have much more starch than oranges, but oranges are easier to juice.

@Goldus - you seem like an old timer here. Why post comments like these? Are you adding any value to the main conversation - Python/Lisp?

The point of the comment was to demonstrate that sometimes, it's worthwhile or at least interesting to compare (or actually, contrast) bananas and oranges, like Python and Lisp.

(Also, I had already posted my small contribution to the original topic)

They're both good fruit.
