A Critique of Common Lisp (1984) [pdf] (dreamsongs.com)
22 points by susam 41 days ago | 15 comments

The critique is premised on a fundamental aesthetic mistake. It couches itself as a dissertation committee's review. But Common Lisp is an industrial artifact, not an academic one. It was engineered from off-the-shelf parts. Its development was brownfield, not greenfield; consensus, not BDFL.

The eventual standard still stands up after more than three decades because Common Lisp was designed by people using Lisp, and for people using Common Lisp.

Seems like the authors soon overcame most of the problems they were writing about. They went on to found a Lisp company (Lucid Inc.) and sell a portable, high-performance Common Lisp implementation.


This is two of the people whose work gave rise to Common Lisp criticizing work in progress on the language. True, they were both academics at MIT, but both were also industrialists--Gabriel founded Lucid and Brooks founded iRobot.

The paper is probably best understood as two insiders criticizing their own work in progress, trying to steer the community they were part of in a direction they thought would be beneficial. Gabriel was a member of the standardization committee, and Brooks was the author of papers influential in development of the standard.

Brooks was also one of the Lucid Inc. co-founders. Looks like he was also instrumental in writing Lucid CL.


"He was one of ten founders of Lucid Inc., and worked with them until the company's closure in 1993."


There are some cool tidbits in this interview:


> We argue that the resulting language definition is too large for many short-term and medium-term potential applications (...)

I don't think this passage from the abstract stood the test of time. Common Lisp's standard library was large at the time, but nowadays it is comparable to, or small (or tiny!) next to, the standard libraries shipped with C++, Python, Java, or Haskell. Software complexity has skyrocketed since 1984.

I think GNU Common LISP was OK in 1994 if you had a 486 or other 32-bit computer.

The standard library of Python is remarkably similar to that of CL, and the issues in compiling CL are much like those in Python. JIT compilers can beat the 'microcoded' system by dealing with cases like

   x = y + z

where y and z are 'always' doubles, so you can remove the dynamism, and even keep calm and carry on when the assumption is broken.
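A minimal sketch of that guarded specialization, in Python for illustration (toy code modeling the idea, not any real JIT's machinery):

```python
def generic_add(y, z):
    # Fully dynamic path: dispatch on the operand types at every call.
    return y + z

def make_specialized_add():
    def specialized_add(y, z):
        # Guard: the JIT observed that y and z were 'always' doubles.
        if type(y) is float and type(z) is float:
            # Fast path: dynamism removed, plain float addition.
            return y + z
        # Guard failed: keep calm and fall back to the generic path.
        return generic_add(y, z)
    return specialized_add

add = make_specialized_add()
print(add(1.5, 2.5))  # fast path
print(add(1, 2))      # assumption broken, generic fallback
```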

If you ask 'whatever happened to CL?', the answer is Python, Erlang, Clojure, Java, and JavaScript: they all ripped off most of the good ideas from CL.

I am not sure about most of the good ideas. I would say it is "enough of the good parts with syntax people can relate to".

People are re-discovering Lisp all the time. The nice things are still seemingly CL-exclusive: conditions and restarts, proper macros (being adopted in many languages NOW, about 30 years later), and multiple-dispatch OO.
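For readers who haven't met multiple dispatch: the applicable method is chosen by the types of all arguments, not just the receiver. A toy sketch in Python (the `multimethod` helper and the collision examples are hypothetical, not a library API):

```python
def multimethod(registry):
    # Dispatch on the tuple of all argument types, CLOS-style.
    def dispatch(*args):
        impl = registry.get(tuple(type(a) for a in args))
        if impl is None:
            raise TypeError("no applicable method")
        return impl(*args)
    return dispatch

class Asteroid: pass
class Ship: pass

collide = multimethod({
    (Asteroid, Ship): lambda a, s: "asteroid hits ship",
    (Ship, Ship):     lambda a, b: "ships collide",
})

print(collide(Asteroid(), Ship()))  # asteroid hits ship
print(collide(Ship(), Ship()))      # ships collide
```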

And first and foremost: proper interactive development. Using a Python REPL (or even a notebook) I have the feeling I get all the frustrating bits of the REPL experience without any of the good parts, and with very lackluster performance on top of that.

I was kind of tired of not being impressed by new languages. Clojure and Erlang were the first ones in a very long time where it was the language (and not any additional libraries) that made me go "Wow. This is really cool".

Now, I still write most of my code in guile scheme and occasionally ocaml, but not without at least glancing once or twice at Erlang.

To provide perspective: "new languages" is understandable from a CL POV, but Erlang is 30 years old and Clojure is 15.

I chuckled when I got to "FORMAT". It sounded familiar, like the C "printf" function often used in the IOCCC.

If you are interested in some cool developments in the Scheme world regarding string formatting, you should have a look at srfi-166.


On-topic: How common is the usage of linked lists in real-world Lisp projects? Doesn't the prevalence of this data structure in books encourage writing inherently inefficient programs?

Depends on what you mean.

In one sense, it's the most common thing in the world, because Common Lisp source code is made of linked lists, so every Lisp program makes extensive use of them. But that doesn't have any necessary relationship to the efficiency of running programs, because the compiled code doesn't have to traverse linked lists just because that's what the source code is made of.

In another sense, it's still pretty common because linked lists are a standard feature of the language with extremely convenient surface-level support, and because linked lists are sufficiently flexible to represent arbitrary data structures (albeit not necessarily efficiently). Those characteristics make it really convenient to use lists for modeling data while you're figuring out how you want things to work, and I personally do that all the time.

Again, that doesn't have any necessary relationship to efficiency in your completed program. I commonly use lists to knock together speculative data models and figure out access patterns. That doesn't mean I'll stick with them once I figure out the needed access pattern. It depends on the access pattern.

If it happens to be the case that linked lists are close to optimal for the access pattern I discover, then I'll stick with them. Why wouldn't I? If not, I'll use another data structure. Lisp makes it easy to design APIs that are agnostic about the data structures they operate on, which makes it easy to swap out one structure for another once you figure out which is the right one.
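That swap-friendly style is easy to approximate in most languages; a sketch in Python, where code written against the generic iterable protocol doesn't care which concrete structure holds the data:

```python
from collections import deque

def total_weight(items):
    # Only sequential access is assumed, so any iterable of
    # (name, weight) pairs will do: list, tuple, deque, generator...
    return sum(w for _, w in items)

as_list  = [("a", 1), ("b", 2)]
as_deque = deque(as_list)

print(total_weight(as_list))   # 3
print(total_weight(as_deque))  # 3
```

Because `total_weight` never commits to a concrete structure, swapping the list for a deque (or anything else iterable) once the access pattern is known requires no change to the function itself.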

Whenever you need random access you use something else. If all you need is sequential access, there are lots of optimizations that can be done to the representation of lists that make them more efficient.

In Guile Scheme you gain very little by using vectors instead of lists unless you want random access. There is, however, a never-ending flood of people coming from Python using lists in exciting ways to create accidentally O(n^3) programs :) Those can almost always be rewritten to be O(n) using linked lists; otherwise you point them to vectors instead.
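A smaller-scale version of the same trap, sketched in Python (the O(n^3) cases usually just nest one more copy-the-accumulator loop like this one):

```python
def rev_quadratic(xs):
    # Accidentally quadratic: [x] + acc copies the whole accumulator,
    # so n prepends cost O(n^2) total.
    acc = []
    for x in xs:
        acc = [x] + acc
    return acc

def rev_linear(xs):
    # The linked-list fix: consing onto the front is O(1) per step,
    # O(n) total. A cons cell is modeled as a (head, tail) pair.
    acc = None  # None plays the role of the empty list
    for x in xs:
        acc = (x, acc)
    out = []
    while acc is not None:
        head, acc = acc
        out.append(head)
    return out

print(rev_quadratic([1, 2, 3]))  # [3, 2, 1]
print(rev_linear([1, 2, 3]))     # [3, 2, 1]
```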

Besides random access, I also mean data locality, and cache-friendliness in general.

That is usually done as part of the GC. With a generational or copying GC you can "defragment" memory so that the list ends up in a contiguous region.

Not as good as a vector of course, but still decent.
