
Is Lisp Still Unique? Or at Least Different? (2002) - tosh
http://norvig.com/Lisp-retro.html
======
bachmeier
Last update was 16 years ago. I'm not sure _this particular page_ is in any
way relevant in 2018. We could have a discussion about the question posed in
the title, and I'm sure some might anyway, but we won't learn anything useful
from the posted article.

------
ebzzry
The key problem with this article is that Norvig cherry-picked features of
Lisp that are also present in the languages he contrasts it with. This gives
the impression that Lisp somehow lost its uniqueness, or whatever makes it
stand out among the others. The premises were, at best, loaded.

If we’re going to talk about CL, here are some of the features that still make
it unique:

    
    
      - live update of a running program, including (re)definition of classes, condition handlers, etc.
      - an object system with multimethods, multiple inheritance, and multiple dispatch
      - a debugger and stepper with complete access to the stack, with unwind protection
      - a very powerful, unhygienic macro system
    

The average programmer does not need a lot of these things because:

    
    
      - the tasks do not demand those features
      - the programmer doesn’t know them
      - the programmer doesn’t want to or can’t invest time in them

~~~
nerdponx
The main problem I had when learning CL was its standard library, which felt
both enormous and strangely deficient in utilities for day-to-day work. CLHS
is fine, but the definitions can be obtuse, navigation isn't all that easy, and
is fine but the definitions can be obtuse, navigation isn't all that easy, and
it's just a generally daunting approach to learning a language. There's a big
gap between the code you learn in Practical Common Lisp and what you see if
you open the source for any popular CL library.

It's like Haskell, without the hype. The underlying paradigms aren't that
terribly different, but the actual terminology for expressing ideas
idiomatically is wildly different, and represents an unusual obstacle.

~~~
qbrass
>The underlying paradigms aren't that terribly different, but the actual
terminology for expressing ideas idiomatically is wildly different, and
represents an unusual obstacle.

Haskell uses a lot of terminology that's familiar to mathematicians, but
unfamiliar to everyone else.

CL uses a lot of strange terminology, because it incorporated concepts before
their names were standardized and used in other languages. You're left
spending a lot of time trying to figure out that a wheel is called a frob, or
you just end up reinventing the wheel even though the frob was already in the
standard.

------
neokantian
As I see it, Lisp is a language in (pretty much) pure prefix notation, whose
core data structure is the linked list and in which functions are first-class
values. The advantages of this notational purity are manifold, but schools
train people in the use of infix/Eulerian notation:

f(a+b,g(x,y)) versus (f (+ a b) (g x y))

The problem of notational convention has not been solved since 1958. Other
languages painstakingly support infix/Euler, requiring comparatively
incredible compiler construction effort. Doing so also removes the manifold
advantages of Lisp's notational purity.

JSON is much more popular than S-expressions. Therefore, we can assume that
nested arraylist/hashtables are more popular than linked lists as the core
language data structure.

Pretty much every serious language treats functions as first-class values
nowadays. So, this is no longer a strategic advantage in Lisp.
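The point generalizes across mainstream languages; as a minimal sketch in Java (a first-class-function facility the language has had since Java 8), functions can be stored, passed, and composed like any other value:

```java
import java.util.function.Function;

public class FirstClassDemo {
    // A higher-order method: it takes a function as an ordinary argument.
    static int applyTwice(Function<Integer, Integer> f, int x) {
        return f.apply(f.apply(x));
    }

    public static void main(String[] args) {
        Function<Integer, Integer> inc = n -> n + 1;            // a function stored in a variable
        Function<Integer, Integer> twiceInc = inc.andThen(inc); // functions composed into a new function
        System.out.println(applyTwice(inc, 0)); // prints 2
        System.out.println(twiceInc.apply(0));  // prints 2
    }
}
```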

~~~
WalterBright
> incredible compiler construction effort

Not really. BYTE magazine, in the 1970s, published the complete source code
for a Pascal compiler, including the code generator.

~~~
neokantian
Pascal can undoubtedly be shoehorned into the target of a simple, single-pass
LL(1) parser. Doing so comes at a real "flexibility" cost, though. For
example, don't try to define your functions in any arbitrary order. It won't
fly. As a programmer, you will surely notice the limitations of single-pass
LL(1) compilers. Nowadays, infix/Eulerian notations are generally targeted
with at least an LALR(1) parser. The notational complexity of languages like
C++ (in gcc) and C# generally even exceeds what LALR(1) can handle. Apparently,
their parsers need to be painstakingly hand-coded. In my book, that is clearly
overshoot. Do we really need to create a dependency on languages that even
LALR(1) cannot handle? What are you even getting in exchange? Debatable ...

~~~
WalterBright
Those are all valid points, but not relevant to infix vs prefix notation.

I don't know about C#, but parsing C++ is complex because you need a symbol
table to do it.

D doesn't need a symbol table to parse, but it does require arbitrary
lookahead. It still isn't hard to parse, and the parser fits in one file:

[https://github.com/dlang/dmd/blob/master/src/dmd/parse.d](https://github.com/dlang/dmd/blob/master/src/dmd/parse.d)

------
tabtab
I've always been intellectually fascinated by Lisp, but would NOT want to use
it in a production (team) environment. It has two problems:

First is that it's _too_ "meta", in that one can shape it to be just about
anything they want. This can make it difficult for others to read your code,
for they have to know your mind's style. Our typical "office" languages tend
to hard-wire idioms, such as control structures (if, while, case, try, etc.)
which make it easier for other readers to digest. It may be less parsimonious
(more code), but it's overall easier for an outsider to come in and read it.

Second is that nested lists are the wrong base structure for humans. Our
natural languages are closer to maps or ordered maps (AKA "associative
arrays"), not nested lists. I was working with others at the C2.com wiki to
form something similar for maps: a meta-heavy map-oriented language. We
nicknamed the project "MASP". C2 fell apart, but I have some new ideas for
MASP that I may write up one of these days.

MASP may never go mainstream because of its heavy meta-ing, but could be great
for experimental, hobby, and rush-to-market startup projects. Paul Graham has
said Lisp's meta ability gave him an edge in the "stores race". Being first-
to-market matters in dot-com-ville more than team-friendly code.

~~~
lispm
> This can make it difficult for others to read your code

Actually I found that one can write very nice code even in a macro-heavy
style. Reading the code is not the real problem - the problem is
debugging/extending. There are some macros which are difficult to use - but
Lisp has all the mechanisms to write very clean code with excellent error
handling.

> but it's overall easier for an outsider to come in and read it.

That's not different from a large layered object-oriented architecture - like
in Java. There are enough stories of Java teams where the architect went fully
generic and reimplemented all kinds of layers in his/her own style. That's no
less difficult to understand than a layered language in Lisp - with the added
advantage that a layered language can be made much more problem-specific and
thus is easier to read - because the code is actually shorter.

> Second is that nested lists are the wrong base structure for humans.

That would be surprising, since people are used to working with lists.

Actually lists in Lisp are only a carrier for things. A property list or an
assoc list is a map.

The very first object systems were based on property lists hanging off of
symbols:

    
    
        CL-USER 1 > (setf (symbol-plist 'hans)
                          '(:class        :computer
                            :machine-type :lispm
                            :vendor       :symbolics
                            :model        :3600
                            :memory-size  20
                            :site         (:city hamburg :project :hamans)))
        (:CLASS :COMPUTER :MACHINE-TYPE :LISPM
         :VENDOR :SYMBOLICS :MODEL :|3600|
         :MEMORY-SIZE 20 :SITE (:CITY HAMBURG :PROJECT :HAMANS))
    
        CL-USER 2 > (get 'hans :model)
        :|3600|

~~~
tabtab
Re: _people are used to working with lists._

To a degree, yes. But it's not the best root _linguistic_ structure, by my
observation, though I do realize it probably varies from person to person.
Others have agreed with my map observation more or less, and thus there is a
potential audience for it, perhaps even bigger than for nested lists.

And I do realize one can emulate maps in Lisp, but it's not native, and they
end up getting mixed in with lists unless somebody with a big stick stops it.
I'm working on a "native" map language so sticks are not necessary.
(Similarly, MASP drafts can emulate lists, but it's just not as easy or
natural.)

~~~
lispm
> root linguistic structure

The root linguistic structure of Lisp is lists of symbols:

    
    
       (move ship harbour)
    
       (move ship :location harbour)
    
       (collide ship asteroid :with-speed 100)
    
       (differentiate '(x * 2 + 3) 'x)
    
   (loop for i below 100 when (evenp i) collect i)
    

> even bigger than nested lists

Since lists have been used in Lisp for map-like data a lot, there is very
little advantage on a 'linguistic' level.

    
    
       (loop for (name age) in '((fred 10) (jane 21) (alfred 9)) ...)
    

It makes little difference if the map data uses different delimiter
characters. A Lisp user would probably lay out the data so that it is easily
recognizable as a map/table. A typical Lisp user is trained to read structure
and not so much parentheses. It's a bit like riding a bicycle - it looks
difficult if you haven't learned it already.

    
    
      ((fred  10)
       (jane  20)
       (alfred 9))
    

What Lisp has already made familiar to many is using a data structure -
s-expressions - to encode programs in.

> but it's not native

Lisp-based maps have been in Lisp since basically day one and were widely used.
There was little demand for different maps for a long time. There is little
added benefit from having 'maps' native. Lisp developers instead added
flexible object systems, where the instances were map-like. There are really a
zillion of them in all kinds of shades. Hash-tables were added to Lisp
probably in the 70s not as a new notation, but as an efficient data structure
for large amounts of key/value data - for example to implement the symbol
table.

There is even an example in CLTL2 of how they looked in Connection Machine
Lisp:

    
    
      {moe->"Oh, a wise guy, eh?"
       larry->"Hey, what's the idea?" 
       curly->"Nyuk, nyuk, nyuk!"}
    

[https://www.cs.cmu.edu/Groups/AI/util/html/cltl/clm/node192....](https://www.cs.cmu.edu/Groups/AI/util/html/cltl/clm/node192.html#READTABLESECTION)

There are many other examples of these, but generally people were working
towards objects as maps, with types and inheritance, etc.

Personally I don't think anonymous maps scale well for the large interactive
applications I'm interested in. I prefer to see at runtime what type this map
is about (is it price data, is it a product list, ...) - usually in form of an
object-system, a frame system, or similar. If I don't need that and want to
write literal data, I can just as well use a list (assoc list, ...). But then
I've seen a lot of Lisp code and am used to it.

~~~
tabtab
You ignored my "stick" point; it still stands. Anyhow, I'd like clarification
on "doesn't scale well". (Large apps should probably be partitioned into
smaller apps that use a database or the like to share info/state.)

And I think we have a different concept of "native". But anyhow, I don't want
to continue a "language war": people like what they like.

------
mark_l_watson
I had lunch with Peter around the time he updated this article. I had written
two Lisp books for Springer-Verlag about 5-6 years before. I was still using
Lisp but I think he had more or less switched to Python although he did write
a nice Scheme system in Java around that time.

In 2018 I still use Common Lisp for a major personal project (hybrid AI,
wrapping deep learning models so they can be used as functional components); I
just blogged about this yesterday:
[http://markwatson.com/blog/2018/08/19/hybrid-artificial-
inte...](http://markwatson.com/blog/2018/08/19/hybrid-artificial-intelligence-
systems.html).

At the end of the day, language matters but libraries, tooling, and platform
support matter more.

------
shagie
I'm really curious about the benchmarks used for Java (and C++). List
processing being 20x slower than C++? Array access being 7x slower than C++?

So yep. A 16-year-old benchmark from when Java 1.4 was current (Java 5 was
2004, which brought with it a collections rework and a lot of improvements,
with not everything needing to be synchronized), based on a 20-year-old update
of something that was written in '91.

> Built-in Support for Lists. Java has the Vector type, which allows for
> sequences of dynamically varying length

That's ancient history.

> Dynamic Typing. Java attaches run-time type information to instances of the
> class Object, but not to primitive data elements. However, Java requires a
> type declaration for every variable. This has some advantages for production
> code, but has disadvantages for rapid prototyping and evolution of programs.
> Java does not have a generic/template system that would allow types like
> Vector<String>, and suffers greatly because of it. Python's object model is
> the same as Lisp's, but Python does not allow optional type declarations as
> Lisp does.

More ancient history.

The Java IDE being compared is BlueJ.

Some of those points were valid in '97. Some were interesting in 2002. But
languages have changed and Java (1.4 -> Java 8 and beyond), Python (2.1 ->
3.7) and Perl (5.7 -> 5.28.0) have all matured quite a bit since then. And
then there's the ideas of functional programming that are seeping into other
languages.

This article aged worse than the bumperstickers from '85 that are also
currently on the front page.

~~~
kamaal
>>But languages have changed and Java (1.4 -> Java 8 and beyond), Python (2.1
-> 3.7) and Perl (5.7 -> 5.28.0) have all matured quite a bit since then. And
then there's the ideas of functional programming that are seeping into other
languages.

Apart from Perl, no other language has Lispy features like CL's. Clojure is
the only language that comes close. Perl is the only option you have, even
now, if you want to do decent higher-order practical programming.

[https://hop.perl.plover.com/](https://hop.perl.plover.com/)

Also there is a big difference between having some token features (like
lambdas) bolted on as an afterthought and enabling full-blown functional
programming that is well designed and integrated into the language.

Apart from Perl, functional programming looks very half-baked, hard, and
brittle in most of these languages.

If you want a full-featured programming language, including both functional
and OO styles, take a look at Perl 6. It's designed to work with all
programming styles.

>>This article aged worse than the bumperstickers from '85 that's also
currently on the front page.

Every single thing from that article is true. What are you talking about?

~~~
shagie
> Java has the Vector type, which allows for sequences of dynamically varying
> length

The Vector type in Java has not been best practice since Java 1.2, when the
Collections framework came out. In particular, the pre-Collections classes
(Vector, Hashtable) were all synchronized, leading to some not insignificant
performance hits (even in single-threaded applications). ArrayList and
LinkedList are the classes that came with the rework. In particular, while
Vector implements List, it's backed by an array, and so various list-like
operations (like adding a large number of elements without presizing the
array) can take a performance hit.
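As a minimal sketch of the distinction (both classes live in `java.util`; for sequential code they implement the same List contract, with Vector paying a per-call synchronization cost):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Vector;

public class VectorVsArrayList {
    public static void main(String[] args) {
        // Legacy: every method on Vector is synchronized,
        // paying a locking cost even in single-threaded code.
        List<Integer> legacy = new Vector<>();

        // Modern: ArrayList is the unsynchronized replacement
        // from the Collections framework.
        List<Integer> modern = new ArrayList<>();

        for (int i = 0; i < 1000; i++) {
            legacy.add(i);
            modern.add(i);
        }

        // Same contents, same List semantics.
        System.out.println(legacy.equals(modern)); // prints true
    }
}
```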

> Automatic Storage Management. Java and Python support this. Lisp
> implementations tend to be more mature and perform better.

There have been numerous changes to the JVM for garbage collection since 1.4.
While I can't speak to Lisp's garbage collection implementations, Java has
come a long way in the past 16 years.

> Dynamic Typing. Java attaches run-time type information to instances of the
> class Object, but not to primitive data elements. However, Java requires a
> type declaration for every variable. This has some advantages for production
> code, but has disadvantages for rapid prototyping and evolution of programs.
> Java does not have a generic/template system that would allow types like
> Vector<String>, and suffers greatly because of it.

Specifically, Vector<String> has been available since Java 1.5. Java 10 brings
local type inference ( [https://developer.oracle.com/java/jdk-10-local-
variable-type...](https://developer.oracle.com/java/jdk-10-local-variable-
type-inference) ) so instead of

    
    
        Map<User, List<String>> userChannel = new HashMap<User, List<String>>();
    

in Java 1.7 one can write:

    
    
        Map<User, List<String>> userChannel = new HashMap<>();
    

and in Java 10:

    
    
        var userChannels = new HashMap<User, List<String>>();
    

> First-Class Functions. Java has anonymous classes, which serve some of the
> purposes of closures, although in a less versatile way with a more clumsy
> syntax. In Lisp, we can say (lambda (x) (f (g x))) where in Java we would
> have to say new UnaryFunction() { public Object execute(Object x) { return
> (Cast x).g().f(); } }

Lambdas are available in Java 8.
[https://docs.oracle.com/javase/tutorial/java/javaOO/lambdaex...](https://docs.oracle.com/javase/tutorial/java/javaOO/lambdaexpressions.html)
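To make the contrast with the quoted anonymous-class style concrete, here's a minimal sketch (Norvig's `UnaryFunction` interface is hypothetical; `java.util.function.UnaryOperator` serves as its modern stand-in):

```java
import java.util.function.UnaryOperator;

public class LambdaDemo {
    public static void main(String[] args) {
        // Pre-Java-8 style: an anonymous class implementing a one-method interface.
        UnaryOperator<Integer> incOld = new UnaryOperator<Integer>() {
            @Override
            public Integer apply(Integer x) {
                return x + 1;
            }
        };

        // Java 8 style: the same function as a lambda.
        UnaryOperator<Integer> incNew = x -> x + 1;

        System.out.println(incOld.apply(41)); // prints 42
        System.out.println(incNew.apply(41)); // prints 42
    }
}
```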

As a note, the bit about python closures being read only ("The only drawback
is that closed-over variables are read-only.") gets to a very interesting
debate on first class environments that is described at
[http://funcall.blogspot.com/2009/09/first-class-
environments...](http://funcall.blogspot.com/2009/09/first-class-
environments.html)

> Interactive Environment. Some Java environments allow Lisp-like features
> such as an interactive command loop and stop-and-fix debugging. Lisp
> environments are still ahead, but that probably won't last long. BlueJ in
> particular has most of the major features of a good Lisp environment: you
> can recompile a method into a running program, and you can type in an
> expression and have it evaluated immediately. It is intended for teaching
> purposes, and I can't tell if it is suitable for production use. Python has
> the same interactive approach as Lisp, but the environment is less mature
> than Lisp's.

Eclipse, Netbeans and IntelliJ all have rather advanced debuggers. Eclipse has
a 'display' view as shown in [https://www.ibm.com/developerworks/library/os-
ecbug/](https://www.ibm.com/developerworks/library/os-ecbug/) that allows
compilation and execution of arbitrary Java code.

> Extensibility. This may prove to be Java's weak point. Java works well as
> long as you are willing to make everything a class. If something new like,
> say, aspect-oriented programming takes off, Lisp would be able to
> incorporate it with macros, but Java would not.

Aspect oriented coding is well supported in Java with libraries such as
[https://en.wikipedia.org/wiki/Spring_Framework#Aspect-
orient...](https://en.wikipedia.org/wiki/Spring_Framework#Aspect-
oriented_programming_framework) and
[https://en.wikipedia.org/wiki/AspectJ](https://en.wikipedia.org/wiki/AspectJ)
.

---------------

Yes, in the Java 1.4 world, what was said was true. That is a very, very long
time ago for Java and many of the criticisms that were provided are no longer
valid.

The higher order functions of Java are not as Lispy as Perl's, but they exist
as Streams. In Java 8, one can write:

    
    
        int sum = widgets.stream()
            .filter(b -> b.getColor() == RED)
            .mapToInt(b -> b.getWeight())
            .sum();
    

Note that filter and mapToInt both take lambdas as arguments.

I would encourage you to glance at Groovy, which is my favorite Perl for the
Java world.

    
    
        ['a', 'b', 'c'].eachWithIndex { it, i ->
            println "$i: $it"
        }
    

or for a perl style map:

    
    
        assert [1, 2, 3].collect { it * 2 } == [2, 4, 6]
    

and a perl style grep:

    
    
        assert [1, 2, 3].findAll { it > 1 } == [2, 3]
    

It also has nice support for closures, lambdas, currying, and other functional
programing concepts ( [http://groovy-lang.org/closures.html](http://groovy-
lang.org/closures.html) ). There's also some neat things that one can do with
JSR 223 and groovy - [http://groovy-
lang.org/integrating.html#jsr223](http://groovy-
lang.org/integrating.html#jsr223)

~~~
kamaal
Sorry, none of this comes remotely close to what Lisp provides.

Also the code you wrote won't pass production code review at any decent Java
shop.

~~~
zmmmmm
If you want you can do very functional stuff in Groovy. Here's an example of
function composition:

    
    
       f = { it * 3 }
       g = { it + 4 }
       h = f << g
       h(2)
       18
    

The reason this is not particularly common usage is not because you can't do
it, but because most people don't find it very intuitive.

~~~
vorg
The unintuitive part is the choice of the operator `<<` instead of `>>`. In
your example `f << g` means do `g` first (i.e. 2 + 4 = 6), then do `f` on the
result (i.e. 6 * 3 = 18). I would have expected `f >> g` for this behavior.
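For comparison, Java's `java.util.function.Function` encodes the same two orderings with named methods rather than operators:

```java
import java.util.function.Function;

public class CompositionOrder {
    public static void main(String[] args) {
        Function<Integer, Integer> f = x -> x * 3;
        Function<Integer, Integer> g = x -> x + 4;

        // f.compose(g): g runs first, like Groovy's f << g.
        System.out.println(f.compose(g).apply(2)); // prints 18

        // f.andThen(g): f runs first, like Groovy's f >> g.
        System.out.println(f.andThen(g).apply(2)); // prints 10
    }
}
```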

~~~
dllthomas
I really don't follow your expectation here. "Data moves in the direction of
the arrows" seems natural to me. Can you unpack it at all? I'd like to
understand where you're coming from.

------
bitwize
Lisp is largely obsolete. Clojure is the truly modern Lisp. It integrates
seamlessly with the world's most widely used runtime, its data structures are
immutable and persistent by default, vectors and maps are first-class objects
alongside lists, and its list data structure is really a tree of vectors.
Every C++ developer knows that you should always prefer vectors to linked
lists because of how poorly linked lists perform under modern CPU caching
hierarchies. Clojure lists are much more suited to modern CPUs than
traditional Lisp lists are. Clojure also provides the abstraction of the
"conjable", which allows transducers to be abstracted over many different
collection types.

~~~
lispm
> poorly linked lists perform under modern CPU caching hierarchies

It's poor practice to have all data structures dictated by machine
architecture.

Actually there is very little evidence that a purely vector-based Lisp would
be faster. From what I've seen, Clojure programs are not faster than CL
programs. Maybe you have numbers which prove otherwise?

> Clojure lists are much more suited to modern CPUs than traditional Lisp
> lists are

'much more'? Evidence?

------
blarg1
My favourite thing about Lisp is the lists and symbols; they are so nice for
representing data because you don't have to declare data structures etc., if
that makes any sense.

No other language I know of does that.

~~~
henrikeh
Rebol and Mathematica/Wolfram Language are two others that come to mind,
albeit Wolfram is kind of a Lisp. Possibly also Lua and its tables concept.

------
bla2
(2002)

~~~
dang
Thanks, added.

------
thayne
Yes, just like every other language.

------
roenxi
The premise of the title is a bit odd, because technically anything a lisp can
do assembly language can do. The bar on being able to do 'anything' is so low
that many things accidentally become Turing complete.

Can lisp programmers still feel smug about using lisp? No, because it never
was and never will be productive to feel smug.

Is lisp still the best language for expressing a programmers thoughts? Yes,
because it has basically no syntax, so any program will be a direct
representation of a programmer's mental model for how their program works. No
language with syntax will do that, but the average programmer is probably best
actively discouraged from utilising this freedom because they probably have
flawed mental models of what is going on.

~~~
mamon
>> because it has basically no syntax, so any program will be a direct
representation of a programmer's mental model for how their program works

That's actually a problem with Lisp, because every programmer would have a
different mental model, and without some syntax rule imposing at least basic
structure, most Lisp programs would be a PITA to maintain for the unlucky next
programmer. It also seriously limits the scale of teams of Lisp programmers
(is that even a thing?), because every one of them will have their own "right
way" of doing things.

~~~
cultus
I think this criticism of Lisp is overblown. It's never idiomatic to introduce
a macro unless you really need one. In the Clojure community at least, macros
are used pretty sparingly. The base language is expressive enough.

~~~
pharrington
Macros are only tangentally related to mamon's point.

~~~
cultus
Really? Because apart from macros, Lisp has less syntax and less complexity
than most languages. The issue mamon raised is different programmers using
different mental models or features. If macros aren't overused, this isn't
nearly as big of a deal as it is for C++ or Scala.

I should except Common Lisp here. That is indeed a complex language. But
Clojure and Scheme aren't.

~~~
zeveb
No need to except Common Lisp here. I typically find it quite easy to
understand other programmers' Common Lisp. It's not a difficult-to-understand
language, and neither are macros, properly used (and they _are_ indeed rarely
improperly used).

