Ask HN: Will the committee that built Common Lisp make a new one in the future?
41 points by behnamoh 38 days ago | 36 comments

No. The ANSI X3J13 committee has been disbanded.

It seems that going through ANSI to make a new standard is inadvisable. The process is cumbersome, and there are copyright issues with ANSI.

If there is a need to revise the standard, Common Lisp folks can form a less bureaucratic organization to do it. Library standardization can happen organically.

There is a trend that the second and subsequent revisions of ANSI and ISO languages tend to increasingly suffer from major "second system effects".*

Instead of just standardizing what is out there, the committees become self-serving; they keep meeting just for the sake of keeping the committee going, and begin inventing things for others to implement.

That X3J13 got the job done and then dissolved is very, very respectable.


* https://en.wikipedia.org/wiki/Second-system_effect

Even software companies do it.

This is the reason we have bloatware.

It's too bad, since being a "standard" has helped CL grow so much, despite some warts the language inherited from old habits.

Without a standard, it seems most efforts to build a new Lisp come to naught (think Racket and Clojure and Arc and ...)

Clojure come to naught? Better tell everyone who is hiring! [0]

[0] https://www.indeed.com/jobs?q=clojure

Yeah, I second this. Clojure is a major success. Depending on how you look at it, Racket is a decent success too. It just seems more focused on things like helping universities and students explore language creation, instead of being a tool for use in businesses.

Racket seems focused on education and experimentation and in that arena I think it's very successful. I'm pretty impressed with where Racket has gone and I think it's very cool.

I use CL quite a bit, and disagree with you.

The standard was great for removing incompatibilities between all of the old competing Lisps, but now that all of the implementations are on the same page and support the base line standard, I don't see the point in updating and revising the standard.

Is there really anything to add that couldn't be done as a library? And while there are warts, it's easy enough to avoid those parts or use macros and libraries to smooth them over. There may be some undefined behavior or implementation-specific things in the standard, but it's far less than in C or C++, and facilities like the features list make it pretty easy for libraries to detect implementation differences and do the right thing.
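A trivial sketch of that feature-detection mechanism (the branches shown are just examples; any implementation-specific code could go in them):

```lisp
;; Reader conditionals keyed off *FEATURES* let one source file
;; cover implementation differences without separate ports.
(defun implementation-name ()
  #+sbcl "SBCL"
  #+clisp "CLISP"
  #-(or sbcl clisp) (lisp-implementation-type))
```

The reader simply drops the forms whose feature expressions don't match, so each implementation compiles only its own branch.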

> Without a standard, it seems most efforts to build a new Lisp come to naught (think Racket and Clojure and Arc and ...)

Seems like a non sequitur. There are plenty of languages without standards that have become popular (Python, Perl, Java, Ruby, Go, etc.) and also plenty of non-Lisp languages that don't have standards and have gone nowhere. And then there's ISLISP, which has a standard and has still gone basically nowhere.

> I don't see the point in updating and revising the standard.

I'd like to see the outstanding issues noted in the existing standard resolved.

For instance, there is one ambiguity in unwinding. Unwinding occurs under the locus of a dynamic control transfer which has selected an exit point. Unwinding can be intercepted by a form like unwind-protect. That form can "hijack" the control transfer by initiating another one, possibly to a different exit point. The question is: what exit points are still visible and hence available when a given unwind protect cleanup block is executing? Can the unwind-protect choose a less distant exit point than the original control transfer?

ANSI CL basically leaves this aspect nonportable: when an unwind protect cleanup is executing, it could be the case that all of the exit points which were "on the way" to the current exit point are already "torn down". Or it could be the case that the implementation performs "tear down as you unwind": exit points are torn down as their binding forms are terminated during unwinding.

In a nutshell, it is implementation-defined whether to tear down intervening exit points during the original search for the exit point, or leave them up and take them down during the actual unwinding.

The latter behavior is probably less surprising and should probably be ANSI CL standardized.
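A minimal sketch of the ambiguity (whether this portably returns :hijacked, or whether MIDDLE is already torn down when the cleanup runs, is exactly the implementation-defined part described above):

```lisp
(defun hijack-demo ()
  (block outer
    (block middle
      (unwind-protect
          ;; Initiate a transfer to the distant exit point OUTER ...
          (return-from outer :original)
        ;; ... and hijack it mid-unwind, redirecting to the *nearer*
        ;; exit point MIDDLE.  Under "leave them up and tear down
        ;; during unwinding", this returns :HIJACKED; under "tear
        ;; down during the original search", MIDDLE may already be
        ;; gone and the consequences are undefined.
        (return-from middle :hijacked)))))
```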

This is a good point.

I have two things I wish were in the standard:

* I wish the abstract syntax generated by backquote were standardized. This would permit backquote forms to be used for pattern matching in something like Optima.

* I wish there were a standard coroutine mechanism, that could be used for CLU-style iterators (aka Python generators). Although threads can be used for this, the context-switch cost makes it undesirable to do so.

So it's not that hard to come up with examples of things we would like to have added or clarified. But I think they're things we can, after all, live without.
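On the coroutine point: closures already cover the simple cases, which is why the wish is specifically about suspendable control state rather than expressiveness. A minimal closure-based sketch (no coroutines involved):

```lisp
(defun make-range-generator (from to)
  "Return a closure that yields FROM, FROM+1, ... TO-1, then NIL."
  (lambda ()
    (when (< from to)
      (prog1 from (incf from)))))
```

A generator like this can't express CLU-style iteration over, say, a recursive tree walk; that's where real coroutines (or threads, with their context-switch cost) come in.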

> Is there really anything to add that couldn't be done as a library

Yes. One example is tail call elimination. Without that being standardized, programmers are dissuaded from writing recursive code. And it can't be added with a library.
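For example (the recursion here is in tail position, but ANSI CL does not require the implementation to eliminate it):

```lisp
;; COUNT-DOWN calls itself in tail position.  With tail-call
;; elimination this runs in constant stack space for any N; without
;; it, a large N can exhaust the control stack.  Since the standard
;; leaves the choice to the implementation, no portable program can
;; rely on it.
(defun count-down (n)
  (if (zerop n)
      :done
      (count-down (1- n))))
```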

Clojure seems to be thriving, as far as I can tell.

Clojure is a fine language, but it's not a substitute for Common Lisp. They are different languages with different philosophies.

It has reached its peak and is slowly declining.

{{Citation needed}}

You have to think about this ecologically. What niche would a new CL occupy? What use would anyone have for something similar to CL but not identical? How could such a thing catch on?

If the features you think should be added are just extensions, it's very likely they can be added as libraries (though there are occasional exceptions). If they actually change the language so as to break existing code, it's very hard to see why anyone currently using CL would want to use your language. The CL ecosystem is already considered to be behind other major languages in terms of library coverage; your new language would be starting from zero on that point.

I think the only way a completely new Lisp would catch on is for some company to create their own, and then to spend enough years using it for enough different things that they create an adequate library on their own -- requiring that they get pretty large. But why would a small company, just starting out, create their own Lisp when they could use CL? (Okay, okay, of course PG did exactly that with Arc. But it's still an unusual choice.)

CL is unusually mutable as languages go, anyway. It's even possible for a library to turn it into a subtly but pervasively different language -- I think my functional collections library, FSet [0], is a great example. And of course, this is done without breaking existing libraries.

[0] https://github.com/slburson/fset

> How could such a thing catch on?

The way C++11 catches on: implementations update to the standard and it creeps into use.

C++ implementations catch up to the standard because they know there's user demand. The hard thing for a new CL standard would be getting users to care. Do you know of anything a new standard could offer that would attract wide interest?

- Unicode support

- Special characters in string literals via something analogous to \x3F, \177, \n, \t, \u+1234.

- Way to write long string literals split across lines with indentation, without involving format:

  TXR Lisp:
  (foo bar "this is just one \
           \ string literal with only single spaces")

- Standardized code walking primitives: one body of user code with no #+this #-that which correctly walks all special forms.

- expand-full function: perform all expansion on an expression in a given macro environment. Optionally report all free variable and function references still emanating from the expanded code.

- native lazy list via lazy-cons type which satisfies consp.

- require numerically equivalent fixnums to be friggin' EQ, damn it.

- Overhaul of path names, w.r.t. the current OS landscape: one standard way to parse a POSIX or Windows path string, or a URL, into a path name. Path names should have a :method for this.

- Standardize the Meta-Object Protocol for CLOS.

- Standard support for weak hash table keys and values.

- Hash tables with arbitrary :test function.

- GC finalization support: register callback for finalized object.

> - Unicode support

Agreed, it would be nice if this were standard.

> - Special characters in string literals via something analogous to \x3F, \177, \n, \t, \u+1234.

> - Way to write long string literals split across lines with indentation

I haven't needed these often, but I guess some people would use them a lot and there's no downside.

> - Standardized code walking primitives: one body of user code with no #+this #-that which correctly walks all special forms.

Also agreed. Do you happen to know of an open source code walker that has all the required conditionalizations? I would be curious to see it.

> - expand-full function

Can't this be in a library?

> - native lazy list

I haven't done enough programming with lazy lists to have a good feel for the value of this.
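For reference, a closure-based sketch of what lazy lists buy you. The result of LAZY-CONS here is an ordinary cons whose cdr is a thunk, so standard list functions don't work on it; making the lazy cons transparently satisfy consp and friends is exactly what would need language-level support:

```lisp
(defmacro lazy-cons (head tail)
  "Delay evaluation of TAIL until LAZY-CDR forces it."
  `(cons ,head (lambda () ,tail)))

(defun lazy-car (lc) (car lc))
(defun lazy-cdr (lc) (funcall (cdr lc)))

(defun integers-from (n)
  "Infinite lazy list N, N+1, N+2, ..."
  (lazy-cons n (integers-from (1+ n))))

(defun lazy-take (n lc)
  "Force the first N elements into an ordinary list."
  (if (zerop n)
      '()
      (cons (lazy-car lc) (lazy-take (1- n) (lazy-cdr lc)))))
```

(lazy-take 3 (integers-from 5)) yields (5 6 7) without ever building the infinite tail.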

> - require numerically equivalent fixnums to be friggin' EQ, damn it.

Heh :-) Why? Seems like this would be a big problem for ABCL.

> - Overhaul of path names, w.r.t. current OS landscape.

I agree, it would be nice if there were more standardization here.

> - Standardize the Meta-Object Protocol for CLOS.

Agreed; this should be easy by now.

> - Standard support for weak hash table keys and values.

I would prefer a more general weak-reference mechanism that would let us build our own weak data structures. Java has this. It's going to be hard for some implementors, though.

> - Hash tables with arbitrary :test function.

Desirable, though I don't use the built-in hash tables that much anymore anyway.

> - GC finalization support

I've never found finalization to be as useful as it sounds like it ought to be. It also seems to have a lot of overlap, both in use cases and in implementation difficulties, with weak references. I would just go with the latter.

Overall, this is a good list, though I don't agree with every item, and I have my own desires, as I mentioned previously. The question is whether the community can be pushed to any kind of consensus, at least on the point that there are enough things worth doing that it's worth setting up some kind of formal process to decide on them. It seems like an uphill battle to me. (In any case, I have no time to devote to it myself.)

Some of the things in that list are being done already or provided in different ways.

Implementations have weak hash support; not all exactly the same. I haven't seen weak references; that would be an invention.

There are expand-all type functions; not all the same.

Could it be a library function? Not one that supports everything in every implementation. The code walker support would help.

What if an implementation has a &foobar keyword that is supported in its macro lambda lists? A portable expander that doesn't know about this won't properly handle macros that use it. (It will handle global macros because it just calls those, but expand-all has to implement macrolet itself.)

Code walking is done, with ugly portability switching and whatnot.

Also w.r.t. that, add:

- useful accessors on macro environment objects!

You can't do anything with a macro &env now other than pass it to macroexpand or macroexpand-1. It holds useful information like "is X a lexical variable in this scope?"
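To illustrate, the one portable use of &ENVIRONMENT today is threading it through to the expansion functions (EXPAND-ARG is just an illustrative name, not a standard operator):

```lisp
(defmacro expand-arg (form &environment env)
  ;; Portable: hand ENV to MACROEXPAND so FORM is expanded in the
  ;; scope where EXPAND-ARG appears, including any MACROLET bindings.
  ;; Not portable: asking ENV questions like "is X a lexical variable
  ;; here?" -- the accessors being wished for above.
  `(quote ,(macroexpand form env)))
```

(macrolet ((two () 2)) (expand-arg (two))) evaluates to 2, because ENV carries the MACROLET binding into MACROEXPAND.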

Speaking of lambda lists:

- standard parser for lambda and macro lambda lists.

That's a very good question. I asked it at a dinner after the European Common Lisp Meeting in Madrid in 2013 (a wonderful, memorable gig, BTW). I was surprised how unanimous the CL elites were on this subject. They were strongly against even small, incremental improvements or additions. IMHO, it would be nice to have a standard socket library, for example. One of the reasons mentioned was the cost of standardization, including updates to all the implementations. I kind of understand it, but, on the other hand, it makes me a bit sad. Things either grow, even very slowly, or become fossils.

I served on NCITS/J13 many years ago when we were trying to decide whether to revise the standard or merely reaffirm the old one. I believe we chose the latter course after many months of discussion but I don't remember why. Was anybody else here on the committee who remembers this? My vague recollection was that there were many directions we could have gone with a revised standard and we couldn't come to a consensus that wouldn't have ended up splitting the community into factions.

I think most standardization in the Lisp sphere happens around the RnRS family of standards, i.e., Scheme.

RnRS is barebones compared to ANSI CL, and the last few versions of Scheme met severe resistance from the community.

Do you think it's because they got old and fat?

(Sorry for being snarky; there is a huge subjective difference between R5RS and R6RS.)

There is no reason to build a second Common Lisp. Those ideas are all fairly well represented, and you only need something new if you're diverging to different ideas: Shen or Haskell for types and values, Julia for method dispatch, or Pony for safe concurrency.

But there is a need for an update of the Common Lisp standard. There are lots of improvements needed without creating a new language. There are different implementations for various extensions, and one needs to standardize them instead of providing a portable wrapper library. Stuff like Unicode strings/characters/files/streams really needs to be defined at the language level.

Typical things to fix:

  * basic threading
  * Unicode characters and strings
  * less undefined behavior
  * environments
  * extensible loops
  * CLOS conditions and streams
  * extensible sequences
  * security (fixing reader eval, ...)
and more...



So, there's nothing to improve in CL? My experience seems to be almost diametrically opposed to yours. CL seems like a fossil from 20+ years ago that hasn't incorporated any of the Computer Science learnings and advancements that have been made since its standardization.

I'm confused how you got to this interpretation after I listed a bunch of cool languages that (I think) have made many improvements to CL based on CS research.

There is only so far you can go with an "everything is mutable -- but you can do anything you want with syntax" model. Common Lisp 2.0 is not Common Lisp, it's lessons learned from Common Lisp plus new ideas rolled into something completely different.

Just as an example, Julia's take on CLOS is far more powerful, preserves compile-time macros, and takes SBCL to the cleaners on any benchmark you can find.

> hasn't incorporated any of the Computer Science learnings and advancements

somehow I fear that this is a good thing

Can you elaborate?



Getting monads into Lisp in a useful way would require the introduction of features that would certainly make it a different language. You may consider this a feature but you'd be arguing with a lot of people who consider it a bug. (I mean that sentence very seriously; no sarcasm, no joke, it is an accurate description of the situation, nor am I implying I would find the idea horrifying... it would simply be a quite significant change to the core type system.)

Are monads an /advance/, or are they just a language feature that might be nice in some cases, pointless or harmful in others -- like any other feature?

Some monads, like the maybe monad, would be a dramatic improvement over defaulting everything to nil everywhere. Likewise, error monads would nicely complement the existing CL error mechanism.

You can get some monad behavior in CL now. Bind is fairly easy to write in CL but unit is messy. And CL's lack of automatic currying doesn't help.
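A minimal sketch of such a bind, with NIL playing the role of "nothing" (which, as noted above, conflates "no value" with "the value is NIL" -- the very wart a real maybe monad would fix):

```lisp
(defun maybe-bind (value fn)
  "Call FN on VALUE, short-circuiting to NIL when VALUE is NIL."
  (if (null value)
      nil
      (funcall fn value)))

;; Chaining lookups that may each fail:
(defun lookup-street (person)
  (maybe-bind (cdr (assoc :address person))
              (lambda (addr) (cdr (assoc :street addr)))))
```

(lookup-street '((:address . ((:street . "Main St"))))) returns "Main St", and any missing link collapses the whole chain to NIL instead of signaling an error.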
