
The Evolution of Lisp (1993) [pdf] - steven741
https://www.dreamsongs.com/Files/HOPL2-Uncut.pdf
======
vindarel
To refresh your ideas on CL (I know I needed it):
[https://github.com/CodyReichert/awesome-
cl](https://github.com/CodyReichert/awesome-cl) There may be more libraries
than you think. Also, some common complaints have recently been addressed:

\- hard to set up an environment? ->
[http://portacle.github.io/](http://portacle.github.io/) There's also Lem, and
Atom is getting good. [https://lispcookbook.github.io/cl-cookbook/editor-
support.ht...](https://lispcookbook.github.io/cl-cookbook/editor-support.html)

\- horrible website? [http://common-lisp.net/](http://common-lisp.net/) (they
fixed it last year) Also [http://lisp-lang.org/](http://lisp-lang.org/)

\- no cross-platform, featureful GUI library? ->
[https://github.com/lispnik/iup/](https://github.com/lispnik/iup/) (alpha)
There's also Qtools for Qt4. And Ltk, but Tk lacks widgets.

\- no object-oriented, generic math or equality operators? ->
[https://github.com/alex-gutev/generic-cl/](https://github.com/alex-
gutev/generic-cl/)

\- no decent library for numerical computing? ->
[https://github.com/numcl/numcl](https://github.com/numcl/numcl)

\- waiting one month to publish your library to Quicklisp is too much? ->
[http://ultralisp.org/](http://ultralisp.org/)

\- lack of online beginner-friendly documentation? ->
[https://lispcookbook.github.io/cl-
cookbook/](https://lispcookbook.github.io/cl-cookbook/)

~~~
0x445442
The lack of a really good open source IDE or IDE plugin is a real roadblock
for a number of people. Pointing folks to Emacs or an Emacs derivative is
not the solution. I know there are a couple of IDE plugins for Clojure and I
think it would help if there were similar options for Common Lisp.

I know there used to be a plugin for Eclipse (Cusp) but that appears to have
died.

~~~
m463
I watched a fascinating presentation, and when I saw the speaker typing (lambda
...) and having it change to (λ ...), and not having to deal with parens... I
was jealous.

Does emacs do this?

EDIT: the presentation:

 _William Byrd on "The Most Beautiful Program Ever Written"_

[https://www.youtube.com/watch?v=OyfBQmvr2Hc](https://www.youtube.com/watch?v=OyfBQmvr2Hc)

~~~
dTal
Some languages understand λ directly (and in all lisps that support unicode,
you can explicitly alias it) - I've occasionally mapped AltGr+L to λ, when
playing with those. It's really nice.

APL perhaps goes a little far, but I'd like to see more standardized symbols
in programming languages. Imagine how nice it would be to succinctly use
things like the actual outer-product operator ⊗...

~~~
m463
> APL perhaps goes a little far

When I learned APL long ago in a languages course, it was on a dedicated APL
setup with an APL keyboard.

In that context, it was a "how SHOULD things be" sort of design decision. I
found I quickly started thinking in terms of math and the language, not in
terms of composing symbols.

Of course, APL keyboards have gone the way of the dodo and so the language has
a higher barrier to entry.

------
mruts
I’ve always thought it unfortunate that fexprs were dropped for macros in
lisp. For a language that touts homoiconicity and first class
functions/objects, it feels strange that macros aren’t first class.

My understanding is that fexprs have some performance problems compared to
macros, but I’m not really sure why, or whether the state of the art has
advanced since then.

A really interesting evolution of lisp is Shen. The license is terrible and I
seriously doubt it will ever achieve widespread use. And while the idea of a
kernel lisp sounds great, in practice I doubt that it actually works.

The performance trade-offs that you’re going to need to make become very
unclear when you have a language that can run on a Python, CL, JS, or whatever
runtime. The JVM is successful because it unifies language and implementation,
instead of fragmenting them further like Shen does.

Racket is the clear horse to pick in this race, I think. Hopefully Racket on
Chez will improve the performance to close to that of native binaries. If so,
I really think that it would be the last piece of the puzzle.

~~~
kmill
I'm not sure what the relationship between fexprs and homoiconicity is
supposed to be. If anything, the arguments to an fexpr in a Lisp with lexical
scoping are opaque and cannot be manipulated, since they are wrapped up in a
closure by the interpreter. Plus, not every macro can be written as an fexpr
(for example, LOOP). In contrast, fexprs seem to work well in PicoLisp because
it uses dynamic scoping exclusively.

Traditional Lisps are only functional in the first-class-function sense, but
not in the mathematical sense where functions give the same output for the
same inputs. Furthermore, they use an eager reduction rule for function
applications, and not lazy reduction. Put together, programmers rely on
knowing the implicit control flow so that effects occur in their proper order.
All fexprs really do is let the programmer define constructs that manipulate
this control flow in non-standard ways. If you think about it: why have fexprs
be part of the language if all you need to do is wrap each argument in (lambda
() ...)?[1] (If you make { and } be a reader macro for (lambda () (progn
...)), then you would possibly save a little typing.)
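To make the thunk-wrapping idea concrete, here is a minimal Common Lisp sketch (MY-IF is a made-up name):

    ;; A conditional as a plain function: the caller wraps each branch
    ;; in a zero-argument lambda, so only the chosen branch is ever
    ;; evaluated -- no fexpr needed.
    (defun my-if (test then-thunk else-thunk)
      (if test (funcall then-thunk) (funcall else-thunk)))
    
    (my-if (> 2 1)
           (lambda () 'yes)
           (lambda () (error "never evaluated")))  ; => YES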

(In Haskell, eager evaluation is sort of abstracted into the IO monad, and
pervasive lazy evaluation means that everything behaves somewhat like an
fexpr.)

I think of Lisps (Common Lisp in particular) as actually being two languages
joined together: the metaprogramming (macro) language and the core Lisp
language. It can be a bit confusing distinguishing them because of how
metaprogramming stages are interleaved and how the metaprograms are written in
Lisp itself. The macro expansion language is a simple tree rewrite language,
where for each node, the system looks to see whether there is a corresponding
rewrite rule (a _macro_ ), then replaces that subtree with that rule's result.
This continues until quiescence, at which point the completed program is
passed to the core Lisp interpreter/compiler.
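For concreteness, a toy example of such a rewrite rule and one expansion step (MY-UNLESS is a made-up name):

    ;; A macro is a rewrite rule on the code tree; MACROEXPAND-1 shows
    ;; a single rewrite before the result goes to the evaluator/compiler.
    (defmacro my-unless (test &body body)
      `(if ,test nil (progn ,@body)))
    
    (macroexpand-1 '(my-unless done (print "working")))
    ;; => (IF DONE NIL (PROGN (PRINT "working")))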

One strange Lisp variant is Mathematica, where the _entire_ language is based
around these tree rewrite rules --- it's all basically macro expansion. In
principle, this means Mathematica does not need macros, but in practice it can
be very difficult to control the order of the rewrites.

Lisp macros _are_ first class in a certain way, at least in that you can refer
to them with (macro-function ...). I know what you mean though, how you can't
just apply #'and to a list of booleans and get the actual boolean "and." I
sort of think that Common Lisp should have distinguished between the function
and macro slots of a symbol. That way you could define both a macro and
function version of a symbol if they both make sense. This would also better
distinguish the metaprogramming and evaluation stages.
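To illustrate the #'and problem, a minimal sketch (AND-FN is a made-up name, and note it loses AND's short-circuiting):

    ;; AND is a macro, so #'AND is an error and cannot be APPLYed.
    ;; A function version has to be written separately by hand:
    (defun and-fn (&rest args)
      (every #'identity args))
    
    (apply #'and-fn '(t t nil))  ; => NIL
    (apply #'and-fn '(1 2 3))    ; => T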

Something I wish were true about Lisp is that eval(eval(x)) == eval(x) (at
least if the expression x has no side effects!). There are a few things in the
design that make this difficult or impossible: (1) evaluating a symbol gets
its lexical or global binding, so double evaluation of 's will give the value
of s and (2) there is no distinction between a form and a list literal, other
than through quotation. A way around (1) is to use MDL-style explicit
accessors to get the bound value of a symbol and make symbols self-evaluating
(so .x gets the lexical value of the symbol x), and a way around (2) is to
have tagged lists like in MDL, where the tag is part of the pointer to the
first cons cell. Mathematica does this too, to some extent. For example,
{1,2,1+2} in Mathematica is short for List[1,2,Plus[1,2]], which evaluates to
List[1,2,3]. "List" is not an element of the list, but rather the tag (called
the "head"). (There's a hole in my proposal, however: if the lexical value of
a symbol is a form, then double evaluation will evaluate to what that form
represents. I don't know a way around this. This might simply be a formal
logical contradiction from trying to collapse different levels of
representation into one, a Russell-like paradox.)
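The quotation issue in (2) is easy to see at a Common Lisp REPL (a minimal sketch):

    ;; Each pass of EVAL strips one level of quotation, so EVAL is not
    ;; idempotent on quoted data:
    (eval ''(+ 1 2))         ; => (+ 1 2)
    (eval (eval ''(+ 1 2)))  ; => 3
    ;; Only self-evaluating objects are stable under repeated EVAL:
    (eval (eval 3))          ; => 3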

I'll just throw something else I've been thinking about here: the old Lisp-1
versus Lisp-2 debate is about whether a symbol should refer to a single thing,
or whether using it as a function vs using it as a value should refer to
different things. The utility of two namespaces seems to be that when Lisps
were dynamically scoped, setting the value of a symbol wouldn't cause a clash
if that symbol happened to be used as a function elsewhere. However, with the
advent of lexical scoping in Lisps in the 80s this became a bit less of an
issue, though it remained in the design to avoid collisions in macro
expansion. I wonder if the "correct" Lisp-2 ought to be global vs local
bindings for a symbol, at the cost of needing to annotate symbols like .x and
,x for local and global bindings, like in MDL. As a convenience, (f .x) would
be short for (,f .x).

[1] This is the paper in which Kent Pitman argued against fexprs in Lisp:
[https://www.nhplace.com/kent/Papers/Special-
Forms.html](https://www.nhplace.com/kent/Papers/Special-Forms.html)

~~~
lispm
> One strange Lisp variant is Mathematica

I wouldn't call Mathematica a Lisp, given that it is a runtime rewrite system
and not based on a defined evaluator (which is also what makes Lisp relatively
easy to compile, and to compile to relatively efficient code).

> That way you could define both a macro and function version of a symbol if
> they both make sense.

There is a variant of that in CL called 'compiler macro'.

> The utility of two namespaces seems to be than when Lisps were dynamically
> scoped

One still has the same problems with global symbols in Scheme.

    
    
    (define (foo list) (list list))

Here the parameter variable LIST shadows the global function LIST.

~~~
kmill
> Mathematica

It's derived from Wolfram's conception of Macsyma, and it is designed so that
all code is made out of numbers, strings, symbols, and arrays (sure, not cons
cells). I guess it's nice to know you wouldn't call Mathematica a Lisp, but I
didn't say it was strictly in the Lisp family, just a strange variant. (Does
this mean you do not consider PicoLisp a Lisp either, since it is not really
compilable, as far as I can see? Or is it fine to you because it has a defined
evaluator?)

Mathematica does have a built-in compiler, but granted it is unclear exactly
what subset of the language it is compiling.

> There is a variant of that in CL called 'compiler macro'.

I am aware, but that certainly does not have the same effect because CL is not
required to invoke your compiler macro. This means you cannot rely on compiler
macros for control flow manipulations, like what 'and' requires. (I know you
said "variant," but they are more of an optimization than what I was talking
about.)

> (define (foo list) (list list))

As I said, with lexical scoping "this became a _bit_ less of an issue." What I
had in mind is that at least any of these kinds of shadowing errors will be
entirely local. You won't have the problem that, in a dynamically scoped
Lisp-1, using 'list' as the name of an argument would mess up any function you
might call that itself uses 'list'. Lexical scoping at least prevents this
spooky action at a distance, which under dynamic scoping you would need a
Lisp-2 to prevent.

~~~
lispm
> I guess it's nice to know you wouldn't call Mathematica a Lisp,

Thanks!

> but I didn't say it was strictly in the Lisp family, just a strange variant.

Right, that was my point, I don't consider term rewriting systems as Lisp
variants, even though they may have numbers and strings as data types. The
actual execution engine is too different.

Fexprs had the interesting property of seeing the actual source at runtime
(which makes them very different from lazy evaluation or wrapping things in
lambdas). This appeals especially to users/implementors of computer algebra
systems (see Reduce, written in Portable Standard Lisp, which provides fexprs)
and other math software dealing with formulas (like R, which has a feature
similar to fexprs).

> Mathematica does have a built-in compiler, but granted it is unclear exactly
> what subset of the language it is compiling.

From what I read of its very unspecific documentation, I doubt that it does
the term rewriting at compile time (like Lisp does macro expansion at compile
time) - does it? To me it looks like the compiler targets numeric code and
basic control flow...

They probably have better documentation of their language implementation,
internally. Or is there an externally available document, which actually
describes their implementation?

Mathematica as a rewrite language

[https://www3.risc.jku.at/publications/download/risc_342/1996...](https://www3.risc.jku.at/publications/download/risc_342/1996-11-01-A.pdf)

------
gumby
Nice discussion starting around page 23 of how the end of / limitations of the
PDP-10 led to an explosion of innovation, including the development of
continuations (still so unfamiliar back in 1993 that they had to be explained
in simple language).

------
a3n
\---- 7 Conclusions [The following text is very sketchy and needs to be filled
out.] We’re still working on this. [End of sketchy section.] \----

------
kazinator
I have this printed in a three-ring binder, lying on the floor under the desk
here.

