Ask HN: Will the committee that built Common Lisp make a new one in the future? - behnamoh
======
nabla9
No. The ANSI X3J13 committee has been discontinued.

It seems that using ANSI to make a new standard is inadvisable. The process is
cumbersome and there are copyright issues with ANSI.

If there is a need to revise the standard, the Common Lisp folks can form a
less bureaucratic organization to do it. Library standardization can happen
organically.

~~~
behnamoh
It's too bad, since being a "standard" has helped CL grow so much, despite
some warts the language inherited from old habits.

Without a standard, it seems most efforts to build a new Lisp come to naught
(think Racket and Clojure and Arc and ...)

~~~
jlarocco
I use CL quite a bit, and disagree with you.

The standard was great for removing incompatibilities between all of the old
competing Lisps, but now that all of the implementations are on the same page
and support the base line standard, I don't see the point in updating and
revising the standard.

Is there really anything to add that couldn't be done as a library? And while
there are warts, it's easy enough to avoid those parts or use macros and
libraries to smooth over them. There may be some undefined behavior or
implementation specific things in the standard, but it's far less than C or
C++, and facilities like the _features_ list make it pretty easy for
libraries to detect implementation differences and do the right thing.

> Without a standard, it seems most efforts to build a new Lisp come to
> naught (think Racket and Clojure and Arc and ...)

Seems like a non sequitur. There are plenty of languages without standards
that have become popular (Python, Perl, Java, Ruby, Go, etc.), and also plenty
of non-Lisp languages without standards that have gone nowhere. And then
there's ISLISP, which has a standard and has still gone basically nowhere.

~~~
kazinator
> _I don't see the point in updating and revising the standard._

I'd like to see outstanding issues noted in the existing standard to be
resolved.

For instance, there is one ambiguity in unwinding. Unwinding occurs under the
locus of a dynamic control transfer which has selected an exit point.
Unwinding can be intercepted by a form like _unwind-protect_. That form can
"hijack" the control transfer by initiating another one, possibly to a
different exit point. The question is: _what exit points are still visible and
hence available_ when a given unwind protect cleanup block is executing? Can
the unwind-protect choose a less distant exit point than the original control
transfer?

ANSI CL basically leaves this aspect nonportable: when an unwind protect
cleanup is executing, it could be the case that all of the exit points which
were "on the way" to the current exit point are already "torn down". Or it
could be the case that the implementation performs "tear down as you unwind":
exit points are torn down as their binding forms are terminated during
unwinding.

In a nutshell, it is implementation-defined whether to tear down intervening
exit points during the original search for the exit point, or leave them up
and take them down during the actual unwinding.

The latter behavior is probably less surprising and should probably be ANSI CL
standardized.
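A minimal sketch of the ambiguity in portable CL; which of the two outcomes you get (or whether it is an error at all) is precisely the nonportable part:

```lisp
;; The INNER exit point lies "on the way" from the THROW to its CATCH.
;; While the cleanup form runs, is INNER still intact ("tear down as
;; you unwind") or already abandoned ("tear down during the search")?
(catch 'outer
  (block inner
    (unwind-protect
        (throw 'outer 1)
      ;; Under the first behavior, this hijacks the transfer and the
      ;; whole form yields 2; under the second, INNER is already gone
      ;; and the consequences are undefined.
      (return-from inner 2))))
```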

~~~
ScottBurson
This is a good point.

I have two things I wish were in the standard:

* I wish the abstract syntax generated by backquote were standardized. This would permit backquote forms to be used for pattern matching in something like Optima.

* I wish there were a standard coroutine mechanism that could be used for CLU-style iterators (aka Python generators). Although threads can be used for this, the context-switch cost makes it undesirable to do so.
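The backquote point can be made concrete. Today, patterns like the one below work only via libraries such as fare-quasiquote plus Trivia/Optima, which must reimplement backquote to get a known representation; a standardized abstract syntax would make this portable (a sketch assuming those libraries and the fare-quasiquote readtable are loaded):

```lisp
;; A backquote form used as a Trivia pattern, destructuring code the
;; same way backquote constructs it:
(trivia:match '(let ((x 1)) x)
  (`(let ((,var ,init)) ,body)
   (list var init body)))
```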

So it's not that hard to come up with examples of things we would like to have
added or clarified. But I think they're things we can, after all, live
without.

------
ScottBurson
You have to think about this ecologically. What niche would a new CL occupy?
What use would anyone have for something similar to CL but not identical? How
could such a thing catch on?

If the features you think should be added are just extensions, it's very
likely they can be added as libraries (though there are occasional
exceptions). If they actually change the language so as to break existing
code, it's very hard to see why anyone currently using CL would want to use
your language. The CL ecosystem is already considered to be behind other major
languages in terms of library coverage; your new language would be starting
from zero on that point.

I think the only way a completely new Lisp would catch on is for some company
to create their own, and then to spend enough years using it for enough
different things that they create an adequate library on their own --
requiring that they get pretty large. But why would a small company, just
starting out, create their own Lisp when they could use CL? (Okay, okay, of
course PG did exactly that with Arc. But it's still an unusual choice.)

CL is unusually mutable as languages go, anyway. It's even possible for a
library to turn it into a subtly but pervasively different language -- I think
my functional collections library, FSet [0], is a great example. And of
course, this is done without breaking existing libraries.

[0] [https://github.com/slburson/fset](https://github.com/slburson/fset)
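For a flavor of what that means in FSet's case: every "update" on a collection returns a new collection, so ordinary CL code ends up written in a pervasively functional style (a sketch; constructor and accessor names per the FSet library):

```lisp
;; FSet maps are immutable: WITH returns a new map and leaves the
;; original untouched.
(let* ((m1 (fset:map (:a 1) (:b 2)))
       (m2 (fset:with m1 :c 3)))
  (values (fset:lookup m2 :c)     ; 3 in the new map
          (fset:lookup m1 :c)))   ; NIL -- m1 is unchanged
```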

~~~
kazinator
> _How could such a thing catch on?_

The way C++11 caught on: implementations update to the standard and it creeps
into use.

~~~
ScottBurson
C++ implementations catch up to the standard because they know there's user
demand. The hard thing for a new CL standard would be getting _users_ to care.
Do you know of anything a new standard could offer that would attract wide
interest?

~~~
kazinator
\- Unicode support

\- Special characters in string literals via something analogous to \x3F,
\177, \n, \t, \u+1234.

\- Way to write long string literals split across lines with indentation,
without involving _format_. TXR Lisp, for example:

      (foo bar "this is just one \
               \ string literal with only single spaces")

\- Standardized code walking primitives: one body of user code with no #+this
#-that which correctly walks all special forms.

\- _expand-full_ function: perform all expansion on an expression in a given
macro environment. Optionally report all free variable and function references
still emanating from the expanded code.

\- native lazy list via lazy-cons type which satisfies _consp_.

\- require numerically equivalent fixnums to be friggin' EQ, damn it.

\- Overhaul of pathnames w.r.t. the current OS landscape: one standard way to
parse a POSIX or Windows path string, or a URL, into a pathname. Pathnames
should have a :method for this.

\- Standardize the Meta-Object Protocol for CLOS.

\- Standard support for weak hash table keys and values.

\- Hash tables with arbitrary :test function.

\- GC finalization support: register callback for finalized object.
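To make the hash-table item concrete: the standard restricts MAKE-HASH-TABLE's :test to EQ, EQL, EQUAL, and EQUALP, so anything else needs an extension. The second half of this sketch is SBCL-specific (sb-ext:define-hash-table-test):

```lisp
;; Portable CL: only the four standard tests are allowed.
(defparameter *tbl* (make-hash-table :test #'equal))
(setf (gethash "key" *tbl*) 1)
(gethash "KEY" *tbl*)          ; NIL -- EQUAL is case-sensitive

;; An arbitrary :test needs an implementation extension, e.g. SBCL:
#+sbcl
(progn
  (defun string-ci-hash (s) (sxhash (string-upcase s)))
  (sb-ext:define-hash-table-test string-equal string-ci-hash)
  (let ((tbl (make-hash-table :test #'string-equal)))
    (setf (gethash "key" tbl) 1)
    (gethash "KEY" tbl)))      ; case-insensitive lookup finds 1
```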

~~~
ScottBurson
> \- Unicode support

Agreed, it would be nice if this were standard.

> \- Special characters in string literals via something analogous to \x3F,
> \177, \n, \t, \u+1234.

> \- Way to write long string literals split across lines with indentation

I haven't needed these often, but I guess some people would use them a lot and
there's no downside.

> \- Standardized code walking primitives: one body of user code with no
> #+this #-that which correctly walks all special forms.

Also agreed. Do you happen to know of an open source code walker that has all
the required conditionalizations? I would be curious to see it.

> \- _expand-full_ function

Can't this be in a library?

> \- native lazy list

I haven't done enough programming with lazy lists to have a good feel for the
value of this.

> \- require numerically equivalent fixnums to be friggin' EQ, damn it.

Heh :-) Why? Seems like this would be a big problem for ABCL.

> \- Overhaul of path names, w.r.t. current OS landscape.

I agree, it would be nice if there were more standardization here.

> \- Standardize the Meta-Object Protocol for CLOS.

Agreed; this should be easy by now.

> \- Standard support for weak hash table keys and values.

I would prefer a more general weak-reference mechanism that would let us build
our own weak data structures. Java has this. It's going to be hard for some
implementors, though.

> \- Hash tables with arbitrary :test function.

Desirable, though I don't use the built-in hash tables that much anymore
anyway.

> \- GC finalization support

I've never found finalization to be as useful as it sounds like it ought to
be. It also seems to have a lot of overlap, both in use cases and in
implementation difficulties, with weak references. I would just go with the
latter.

Overall, this is a good list, though I don't agree with every item, and I have
my own desires, as I mentioned previously. The question is whether the
community can be pushed to any kind of consensus, at least on the point that
there are enough things worth doing that it's worth setting up some kind of
formal process to decide on them. It seems like an uphill battle to me. (In
any case, I have no time to devote to it myself.)

~~~
kazinator
Some of the things in that list are being done already or provided in
different ways.

Implementations have weak hash support, though not all exactly the same. I
haven't seen weak references; those would be an invention.

There are _expand-all_ type functions; not all the same.

Could it be a library function? Not one that supports everything in every
implementation. The code walker support would help.

What if an implementation has a &foobar keyword that is supported in its macro
lambda lists? A portable expander that doesn't know about it won't properly
handle macros that use it. (It will handle global macros, because it just
calls their expander functions, but _expand-all_ has to implement _macrolet_
itself.)
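A concrete case of why _expand-all_ must implement _macrolet_ itself, in plain CL:

```lisp
;; MACROEXPAND alone cannot expand (FROB 1 2): FROB exists only in the
;; local macro environment that MACROLET establishes.  A portable
;; walker must parse the lambda list (A B) and construct the local
;; expander itself -- and an implementation-specific &FOOBAR in that
;; lambda list is exactly what its parser cannot know about.
(macrolet ((frob (a b) `(+ ,a ,b)))
  (frob 1 2))   ; a correct walker rewrites this to (+ 1 2)
```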

Code walking is done, with ugly portability switching and whatnot.

Also w.r.t. that, add:

\- _useful accessors on macro environment objects!_

You can't do anything with a macro &env now other than pass it to macroexpand
or macroexpand-1. It holds useful information like "is X a lexical variable in
this scope?"
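This is essentially the never-adopted environments API from CLtL2 section 8.5. A sketch of the wished-for query (VARIABLE-INFORMATION is the CLtL2 proposal, not ANSI CL; today it exists only behind compatibility libraries such as trivial-cltl2, and only on some implementations):

```lisp
;; Ask the macro environment whether SYM names a lexical variable at
;; the macro call site.
(defmacro lexical-var-p (sym &environment env)
  (multiple-value-bind (kind) (variable-information sym env)
    `',(eq kind :lexical)))

(let ((x 1))
  (declare (ignorable x))
  (lexical-var-p x))   ; T wherever the API is available
```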

Speaking of lambda lists:

\- standard parser for lambda and macro lambda lists.

------
piokuc
That's a very good question. I asked this question at a dinner after European
Common Lisp Meeting in Madrid in 2013, a wonderful, memorable gig, BTW. I was
surprised how unanimous CL elites were on this subject. They were strongly
against even small, incremental improvements or additions. IMHO, it would be
nice to have a standard socket library, for example. One of the reasons
mentioned was cost of the standardization, including updates of all the
implementations. I kind of understand it, but, on the other hand, it makes me
a bit sad. Things either grow, even very slowly, or become fossils.

------
dreamcompiler
I served on NCITS/J13 many years ago when we were trying to decide whether to
revise the standard or merely reaffirm the old one. I believe we chose the
latter course after many months of discussion but I don't remember why. Was
anybody else here on the committee who remembers this? My vague recollection
was that there were many directions we could have gone with a revised standard
and we couldn't come to a consensus that wouldn't have ended up splitting the
community into factions.

------
zokier
I think most standardization in the Lisp sphere happens around the RnRS family
of standards, i.e. Scheme.

~~~
erikj
RnRS is barebones compared to ANSI CL, and the last few revisions of Scheme
met severe resistance from the community.

~~~
convolvatron
do you think it's because they got old and fat?

(sorry for being snarky, there is a huge subjective difference between r5 and
r6)

------
oconnore
There is no reason to build a second Common Lisp. Those ideas are all fairly
well represented, and you only need something new if you're diverging toward
different ideas: Shen or Haskell for types and values, Julia for method
dispatch, or Pony for safe concurrency.

~~~
masukomi
So, there's nothing to improve in CL? My experience seems to be almost
diametrically opposed to yours. CL seems a fossil from 20+ years ago that
hasn't incorporated any of the computer-science advances made since its
standardization.

~~~
bykovich
Can you elaborate?

~~~
lispm
Monads?

~~~
bykovich
Are monads an /advance/, or are they just a language feature that might be
nice in some cases, pointless or harmful in others -- like any other feature?

~~~
dreamcompiler
Some monads, like the Maybe monad, would be a dramatic improvement over
defaulting everything to nil everywhere. Likewise, error monads would nicely
complement the existing CL error mechanism.

You can get some monad behavior in CL now. Bind is fairly easy to write in CL
but unit is messy. And CL's lack of automatic currying doesn't help.
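The "bind is fairly easy" half can be sketched in plain CL, using NIL as the Nothing case (note this conflates Nothing with a legitimate NIL value, which is exactly what a real Maybe type would fix):

```lisp
;; Short-circuiting bind over a chain of possibly-NIL computations:
;; the first NIL aborts the chain and skips the remaining forms.
(defmacro maybe-let* (bindings &body body)
  (if (null bindings)
      `(progn ,@body)
      (destructuring-bind ((var form) . rest) bindings
        `(let ((,var ,form))
           (when ,var
             (maybe-let* ,rest ,@body))))))

;; Usage: ENTRY is NIL, so VAL's form and the body never run, and the
;; whole form returns NIL instead of signaling an error in (1+ VAL).
(let ((tbl (make-hash-table)))
  (maybe-let* ((entry (gethash :missing tbl))
               (val   (car entry)))
    (1+ val)))
```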

