
The Idea of Lisp - rbanffy
https://dev.to/ericnormand/the-idea-of-lisp
======
rntz
This article has many misstatements in its first half.

> John McCarthy wrote 6 easy things in machine code, then combined them to
> make a programming language.

John McCarthy didn't implement Lisp in machine code. Steve Russell did.
Implementing Lisp properly in machine code is not easy; you have to write a
garbage collector. To do that in the early 60s, you had to first _invent
garbage collection_. Lisp was and is brilliant, but not as easily
bootstrappable as this makes it out to be.

> It's not obvious that these six things are computationally complete (AKA
> Turing Complete).

`lambda` and function application alone are Turing-complete, as McCarthy would
have known. The credit here belongs with Turing and Church, not McCarthy.
`atom`, `cons`, `car` and all the rest are just icing on the cake of the
lambda calculus when it comes to computability.
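To make the lambda-only claim concrete, here's a quick sketch in Python (my own illustration, not anything from McCarthy's paper; names like `to_int` are made up): Church numerals encode numbers and addition using nothing but `lambda` and function application.

```python
# Church numerals: the number n is "apply f to x, n times".
# Nothing below uses built-in arithmetic except the final decoder.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Decode a Church numeral by counting how many times f gets applied.
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
```

Multiplication, pairs, and even recursion (via the Y combinator) can be built the same way, which is the sense in which everything beyond `lambda` and application is icing.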

> All other meaning can be defined in terms of them.

Yes, and you can build everything on top of the SK combinator calculus if you
like, but that doesn't make it a good idea. Lisp is _surprisingly practical_
given how few core constructs it has, but real Lisp implementations have
always added more primitives (e.g. _numbers_ and _addition_) for reasons of
practicality.

> The language was defined in terms of itself as an interpreter. This is a
> proof by construction that the language is computationally complete.

No, it isn't. To prove Turing-completeness you need to show that you are as
powerful as Turing machines. To do this it suffices to show that you can
interpret a language already known to be Turing-complete. Showing you can
interpret yourself does not suffice. It's easy to define a language which can
do nothing useful _except_ interpret itself, for example. (See also wyager's
comment.)

> Well, Lisp is defined as an interpreter in terms of itself from the get-go,
> just like a Universal Turing Machine.

No. Defining a language _only_ in itself is nonsense, for exactly the reason
given above: it means nothing yet! It's like writing in a dictionary:

    
    
       qyzzyghlm, v. intr. To qyzzyghlm.
    

It explains nothing unless you already understand it!

> Lisp is a universal language because it can interpret its own code. While
> you can certainly write a JavaScript interpreter in JavaScript, none of the
> work is done for you.

Almost none of the work is done for you in Lisp either. The core of Lisp is
just a relatively easy language to implement, while Javascript is a difficult
one. Lisp is easy to implement because it has simple syntax (s-expressions)
and few core constructs. The only thing that is special about implementing
Lisp in Lisp is that Lisp uses s-expressions as its core data structure, so
you don't have to invent an AST representation. The article, to its credit,
explores this idea later.
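To see how little work "Lisp in Lisp" actually saves, here's a hypothetical sketch in Python of both halves: a reader for s-expressions and a tiny evaluator. In Lisp itself the `parse` half comes for free, because programs already are nested lists; the `evaluate` half is roughly the same size either way.

```python
def parse(src):
    """Turn a string like "(+ 1 (* 2 3))" into nested lists: ['+', 1, ['*', 2, 3]]."""
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()
    def read(pos):
        token = tokens[pos]
        if token == "(":
            lst, pos = [], pos + 1
            while tokens[pos] != ")":
                node, pos = read(pos)
                lst.append(node)
            return lst, pos + 1
        try:
            return int(token), pos + 1
        except ValueError:
            return token, pos + 1  # a symbol
    node, _ = read(0)
    return node

def evaluate(expr, env):
    # Numbers are self-evaluating, symbols are looked up,
    # and a list is a function applied to evaluated arguments.
    if isinstance(expr, int):
        return expr
    if isinstance(expr, str):
        return env[expr]
    op, *args = expr
    fn = evaluate(op, env)
    return fn(*[evaluate(a, env) for a in args])

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
```

Under these assumptions, `evaluate(parse("(+ 1 (* 2 3))"), env)` gives 7. The only Lisp-specific savings is that in Lisp the nested-list representation is handed to you by `read`.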

~~~
crististm
Garbage collection is not necessary for lisp. Garbage collection only provides
the illusion of infinite memory. Just like malloc/free.

~~~
theseoafs
I assure you that "malloc" and "free" don't "provide the illusion of infinite
memory". Quite the opposite, in fact.

~~~
pjmlp
They do if always matched 1:1, which doesn't usually happen, especially in big
codebases, thus leading to CVEs.

~~~
theseoafs
They surely do not. Malloc is specified in such a way that the request for
memory can fail (in which case it returns NULL).

~~~
rntz
The specification allows for this, yes. However, on some platforms (including
linux glibc by default, I believe), malloc() never fails, but allocates
virtual memory optimistically; the first you hear of an out of memory
condition is when the system slows down due to paging, and the next thing you
notice is when the OOM killer nixes a process.

Of course, other platforms, especially embedded ones, behave differently.

~~~
amag
Actually there is one reason for malloc to return NULL even with virtual
memory: your process can run out of address space.

~~~
roganp
Or run out of room in the paging file. Your addressable memory cannot be
larger than physical memory without a backing store.

~~~
prodigal_erik
Depending on `vm.overcommit_memory`, Linux might give out address space well
beyond the size of the pagefile, hoping many of those pages are never written
to (e.g., most threads never get anywhere near the bottom of their default
stacks).

------
agentgt
The conditional expression, or more specifically everything being an
expression, is my favorite thing about Lisp. I did not know that McCarthy
pushed to add it to Algol; today it survives as the ternary operator in most
languages.

It is annoying that so many languages (C, Java, C#, etc) have both a
conditional statement (if-else) and conditional expression (ternary ?:).
Really the if-else should be an expression (I think the ternary operator is
hideous).
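Python happens to show the same split (hypothetical example): the if/else statement forces a temporary variable, while the conditional expression has a value directly, like Lisp's `(if ...)`.

```python
# Statement form: the result has to be smuggled out through a variable
# assigned in each branch.
def sign_statement(n):
    if n < 0:
        label = "negative"
    else:
        label = "non-negative"
    return label

# Expression form: the conditional itself is a value, as in Lisp.
def sign_expression(n):
    return "negative" if n < 0 else "non-negative"
```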

~~~
mroll
If you like everything being an expression, check out tcl. A lot of ideas from
lisp show up in tcl, especially the idea of everything as an expression. Tcl
embodies this idea while also having the look of an algol-like language. Funny
it can pull this off while having basically no syntax.

~~~
vertex-four
Rust also has (almost) everything being an expression - things like this
aren't uncommon:

    
    
        let x = if something {
            foo()
        } else {
            bar()
        };
    

Things which don't have a logical value evaluate to `()` (the empty tuple), I
believe.

~~~
virtualwhys
> Things which don't have a logical value evaluate to `()` (the empty tuple),
> I believe.

If Rust follows Scala then `()` is not the empty tuple, but rather Unit (void
in C*).

~~~
jimktrains2
It's the same thing. It's a type with only a single value.

~~~
theseoafs
`void` in C does not have a value. You can't make a variable and put a void in
it because there is no such object as "void".

~~~
Ericson2314
That is an artificial restriction of C. Also see how `!` in Rust is also
losing its artificial restrictions.

~~~
lomnakkus
Indeed, and there has been some talk of removing this restriction in C++
because it makes certain kinds of metaprogramming a lot more cumbersome than
they should be.

------
QuadrupleA
It's interesting, I'm reading Black Swan at the moment by Nassim Taleb, and
one of his big rants is about how we get blinded by idealized, platonic forms
and ideas when the real world is messy and inherently unpredictable. E.g.
trying to explain the forms of nature with platonic archetypal shapes like
circles, rectangles and triangles. Lisp and the community around it kinda has
that flavor - getting lost in a world of "pure forms" and grand ideas, but
downplaying the important but messy practical reality of hardware, useful
libraries, and getting cool stuff done with a minimum of fuss. I'm
periodically fascinated by Lisp (I wrote an interpreter or two in C) but I
wonder if its "Platonicity" is part of its downfall.

~~~
mbrock
I don't think actual Lisp programmers share this obsession with purity and
ideal forms. It's more something that shows up in blog posts about Lisp by
people who probably don't actually use it. The title of this one is telling:
it's about "the _idea of_ Lisp."

On the other hand, if you look at, say, ANSI Common Lisp, it's not at all some
kind of perfectionistic attempt at divine elegance. It's a pragmatic
compromise resulting from years, decades, of actual use on real computers.

Just browse around the SBCL compiler source code and you'll see that this
stuff is developed by people who definitely aren't afraid of the messy
practical reality of hardware:

[https://github.com/sbcl/sbcl/tree/master/src/compiler/x86-64](https://github.com/sbcl/sbcl/tree/master/src/compiler/x86-64)

Generally spend some time within the Lisp community and see how many people
you see fretting over Platonic archetypal shapes and compare to people solving
actual problems and using the language as just a nice way to program a
computer.

Emacs is another example that demonstrates the spirit of actual Lisp
programming as opposed to armchair theorizing about lambda calculus
fundamentals.

~~~
pjlegato
ANSI Common Lisp is rather a design-by-committee monstrosity which was forced
on the unwilling Lisp vendors by the Defense Department.

Most of the feature set was designed via backroom political horse trading
("We'll let you include pet feature X if you support us for our pet feature
Y".) There is no coherent overall plan or design to it at all.

(Source: personal communication from a member of the committee that designed
it.)

It's based on actual use on real computers of the late 1970s and early 1980s
-- e.g. the file opening mechanism is a complex abstraction designed to
support filesystem paradigms that nobody has used for 30 years, yet there is
no standard way to open a TCP socket.

I heartily recommend Clojure (clojure.org) as an alternative: a modern,
pragmatic Lisp designed for 2016-era software engineering.

~~~
lispm
Given that most of the hundreds of Lisp implementations each have their own
way to open a TCP connection, Clojure just added another incompatible one. To
claim that it is a standard one is a bit funny. Each of the hundreds of other
implementations could claim that, too.

Clojure generally added incompatibilities, since it is fully incompatible with
any other Lisp before it in fundamental ways. Clojure was designed with zero
backwards compatibility. Lisp concepts were removed, renamed, redesigned. Even
identifiers with the same name are doing completely different things. If it
did something similar to what Lisp did, Clojure sure has it renamed and
redesigned.

I doubt that you ever talked to anyone from the ANSI CL committee. It would
also have been easy to find out that Common Lisp was designed by a few core
people (the gang of five) with lots of community input from 1980 to 1984. This
part is well documented. In 1984 the first version of Steele's book Common
Lisp the Language was published. The ANSI Common Lisp standardization was
started later, in 1986, when the core of Common Lisp was already defined. Even
there the major extensions were designed by small groups with community input.
See for example how CLOS was designed by a few people (Daniel G. Bobrow, Linda
G. DeMichiel, Richard P. Gabriel, Sonya E. Keene, Gregor Kiczales, and David
A. Moon) and by providing a complete reference implementation (PCL).

~~~
pjlegato
You missed the point. Clojure's way of opening TCP connections is standardized
_within Clojure_ -- all Clojure programs use a standard API call to do that.

Common Lisp, on the other hand, has no standard way of opening a TCP socket at
all, because TCP was uncommon when it was designed. It relies entirely on
(poorly documented, often unmaintained) third party libraries to do that.

As for your doubts, you can doubt all you like, that changes nothing. I am
well aware of the history of ANSI CL, and I'm not sure what point you are
trying to make with all the namedropping.

~~~
lispm
> all Clojure programs use a standard API call to do that.

Clojure says nothing about creating TCP sockets, since Clojure implementations
(all three) need to call the hosting system's socket API or emulate it
somehow. The JVM Clojure uses a different call than the CLR Clojure.

Which makes it worse than Common Lisp, which has widely used socket support
with usocket and some others.

> Common Lisp, on the other hand

Is a real language standard with many different implementations.

> It relies entirely on (poorly documented, often unmaintained) third party
> libraries to do that.

Each Common Lisp implementation has a documented and maintained way to open a
TCP socket. Additionally there are compatibility layers like usocket

[https://common-lisp.net/project/usocket/](https://common-lisp.net/project/usocket/)

> I am well aware of the history of ANSI CL,

Then why are you writing obviously wrong things?

------
wyager
> This is a proof by construction that the language is computationally
> complete.

The definition of Turing completeness in the article is not correct. A
language being able to execute programs written in itself is not a sufficient
condition of Turing completeness. Trivial example: define a language with one
pre-defined term, x, which is a routine that takes as input a string, checks
if it's "x", and executes it if it is. The empty language is a counter-example
as well, but that's cheating.
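That trivial counterexample can be sketched in a few lines of Python (my own hypothetical encoding of it): the language's only program is the string "x", whose meaning is the interpreter itself, so the language "interprets itself" while computing nothing.

```python
def run_x(program):
    # The whole language: one primitive, x, which takes a string, checks
    # whether it is "x", and if so "executes" it -- which just means
    # handing back this same routine. Self-interpreting, yet obviously
    # not Turing-complete: no program computes anything.
    if program != "x":
        raise SyntaxError("the only valid program is 'x'")
    return run_x  # executing "x" yields the interpreter itself
```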

I'm also not sure if the use of the phrase "fixed point" is a misunderstanding
of the definition of a fixed point or just an unfortunate use of a term that
already has great significance in LISP.

~~~
zodiac
I agree...I think a real "proof by construction that the language is [Turing]
complete" would just be an interpreter for Turing machine programs written in
Lisp, which is pretty boring.

------
mroll
Nice article. Check out Paul Graham's The Roots of Lisp for a similar
exploration, in which he shows how to build the metacircular interpreter.

> John McCarthy wrote 6 easy things in machine code

It was actually Steve Russell, McCarthy's grad student, who had the idea of
writing McCarthy's eval function in machine code.

~~~
clifanatic
> It was actually Steve Russell

I seem to recall reading that McCarthy was actually surprised to discover that
Lisp _could_ be run by a real computer; he intended it to be a completely
theoretical tool.

~~~
mroll
I have also heard that, but only from secondary sources. Here is a clip of
Russell talking about the time he wrote the first lisp interpreter. He doesn't
mention McCarthy being surprised, but he seems to imply that he, Russell, was
quicker to grasp the idea of translating the functions McCarthy had been
writing to machine code.

[http://www.computerhistory.org/pdp-1/1020b307d766e0019de2b4a...](http://www.computerhistory.org/pdp-1/1020b307d766e0019de2b4addcc86dee/)

------
lisper
I'm working on a Lisp-based introductory programming book:

[https://github.com/rongarret/BWFP](https://github.com/rongarret/BWFP)

Still very much a work in progress. Feedback appreciated.

~~~
tyleraldrich
You may be interested in checking this out for inspiration:
[http://www.ccs.neu.edu/home/matthias/HtDP2e/](http://www.ccs.neu.edu/home/matthias/HtDP2e/)

(It's the Intro to CS book used at my alma mater, teaching programming in
Racket)

~~~
ChicagoBoy11
Tried doing some of that with High School kids -- I have to admit, at the
beginning it was very difficult for them to wrap their head around the basic
concepts in functional programming. The other issue was that the few
syntactical rules and prefix notation, while great in the long term, required
the kids to do a bit more mental gymnastics for even basic things at the
beginning, so that didn't help either.

But man after the initial hump, I was convinced that this is the route to go
through in order to introduce someone to programming with very solid first
principles.

~~~
lisper
That's why I'm using a library
([https://github.com/rongarret/ergolib](https://github.com/rongarret/ergolib))
to smooth over some of Common Lisp's rough edges. The goal of the book, what I
hope will make it unique, is to teach all of the basics without having to get
too hung up on the details of CL.

The reason chapter 3 is taking so long is that I can't figure out a good way
to get around one of those details. I want chapter 3 to be about parsing i.e.
I want the reader to build READ before they build EVAL. So I want to introduce
READ-FROM-STRING, and one of the things I want to be able to read from strings
is characters. Unfortunately, CL uses the same character (backslash) as the
reader dispatch macro character for characters (e.g. #\x) as it does for the
escape character in strings. So if you type #\x you get the character x, but
if you type "#\x" you get a reader error because the backslash is consumed as
an escape character inside the string. I have yet to find a satisfactory
solution.
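The same wrinkle shows up in most languages' string syntax. As a Python analogy (not CL, so the details differ): to get the reader token `#\x` through a string literal, the backslash itself has to be escaped, or the string written raw.

```python
# Inside an ordinary string literal the backslash starts an escape
# sequence, so it has to be doubled; a raw string sidesteps that.
escaped = "#\\x"   # three characters: #  \  x
raw = r"#\x"       # same three characters, as a raw string
```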

------
vram22
For anyone interested, John McCarthy's original paper on Lisp is here:

RECURSIVE FUNCTIONS OF SYMBOLIC EXPRESSIONS AND THEIR COMPUTATION BY MACHINE
(Part I)

[http://www-formal.stanford.edu/jmc/recursive.html](http://www-formal.stanford.edu/jmc/recursive.html)

From the page:

"This paper appeared in Communications of the ACM in April 1960. It is the
original paper on Lisp."

I had mentioned it in this blog post in which I gave a few examples of doing
simple computations recursively in Python (for beginners).

Recursive computation of simple functions:

[http://jugad2.blogspot.in/2016/03/recursive-computation-of-s...](http://jugad2.blogspot.in/2016/03/recursive-computation-of-simple_2.html)

------
kluck
Lisp was developed because McCarthy needed a tool for experimenting with AI.
Found a video of McCarthy talking about AI:
[https://www.youtube.com/watch?v=Ozipf13jRr4](https://www.youtube.com/watch?v=Ozipf13jRr4)

And if anyone cares, here is a nice shirt with McCarthy on it ;)
[https://www.teepublic.com/t-shirt/666689-john-mccarthy-lisp-...](https://www.teepublic.com/t-shirt/666689-john-mccarthy-lisp-man)

I think it should be mandatory for CS students to implement their own little
Lisp using the building blocks McCarthy described! Instead they are learning
Java and its crappy OO...

~~~
lolive
Is there a kind of walkthrough/tutorial about how to develop a little Lisp
interpreter? That sounds like a fun experiment.

PS: Sorry, I am a Java OO developer. But I like to learn :)

~~~
kornakiewicz
For example:
[http://www.buildyourownlisp.com/](http://www.buildyourownlisp.com/)

But there's plenty more.

~~~
agumonkey
Another one [https://github.com/kanaka/mal](https://github.com/kanaka/mal)
(quite famous AFAIK)

------
stcredzero
The thing about the way ideas about programming are "sold" to other
programmers, is that it has as much to do with the actual profession of
programming as a typical tween's conception of being a "rockstar" has to do
with the actual profession of being a touring musician. A lot of the really
vital hard work is glossed over, and huge amounts of attention are paid to
certain abstracted "sexy" ideas.

When people watch someone soldering, their attention is drawn to the iron, and
to the shiny melted flowing metal. However, it's really _cleaning_ the tip of
the iron and having an iron that can provide enough power at the right
temperature that matters.

------
georgeecollins
When I was a kid they made us learn C and Lisp as part of a Cognitive Science
degree. I don't really use either language, unless you count C++. But I do
feel that between those two languages you can understand two ideals really
well. One is the idea of a clean symbolic expression, the other is the idea of
a portable language that lets you get to the core of what the machine is
really doing. Both are useful ways to think about programming.

~~~
goatlover
Some years ago, Paul Graham wrote about there being two conceptually clean
approaches to programming languages, C and Lisp. The C family is far more
popular, but the trend is to take C as your starting point and add Lisp
features to it.

Gosling said that Java dragged the C++ crowd halfway to Lisp.

~~~
mypalmike
I wonder what aspects of Lisp he was referring to. The programming model of
Java is basically C++ with garbage collection, minus a lot of stuff that makes
C++ unsafe and hard to parse.

~~~
goatlover
Good question. Perhaps because Java's OOP is closer to Smalltalk than Simula?

Here is the context:

[http://people.csail.mit.edu/gregs/ll1-discuss-archive-html/m...](http://people.csail.mit.edu/gregs/ll1-discuss-archive-html/msg04045.html)

------
erik14th
I wonder why lisp isn't as popular as say python for AI, ML, and stuff. I see
these fields as having a strong academic tone, and it feels like racket or
clojure could be bigger when it comes to that.

~~~
Blackthorn
Well, it was. Specifically, Common Lisp was. But that language's standard was
etched in stone in 1994 whereas languages like Python (where most deep
learning user-facing code is done) continue to evolve.

I think Python really took off for that because it already had quality and
widely-used libraries for writing the code in Python and doing the work in a
more efficient place (numpy, scipy). Clojure has one of those for matrix
multiplication but not much else there, and I'm not sure Racket has anything
at all.

~~~
kbp
> Well, it was. Specifically, Common Lisp was. But that language's standard
> was etched in stone in 1994 whereas languages like Python (where most deep
> learning user-facing code is done) continue to evolve.

This is an apples to oranges comparison. The Common Lisp _standard_ hasn't
been updated since 1994. The Python standard has not been written at all yet.
Lisp and Python implementations both continue to evolve and be released.

~~~
pg314
To add to that, the transition from Python 2 to Python 3 is also an example of
how not to evolve a language.

------
cr0sh
I can't comment too much on this article, as I have a very, very limited view
on LISP - basically just a couple of minor tutorials and one of the open-
source interpreters. For me, it's always been one of those "I need to learn
this" kind of languages, but I've never had a use case for it, and so it
remains a curiosity to me more than anything.

I do know, though, that LISP allows the creation (or at least so I have heard)
of DSLs -- so I am curious what people here think about this.

I'm also curious if anyone has an opinion on JetBrains MPS:

[https://www.jetbrains.com/mps/](https://www.jetbrains.com/mps/)

...and whether that would be a better thing to learn before or after learning
LISP, as well as how it compares to LISP?

It's yet another "thing" that has caught my eye over the years, but again - no
use case, and so it remains on the back burner for now...

~~~
mroll
Lisp is an interesting thing because once you learn it, you start seeing use
cases for the ideas you've picked up during the process all over the place.

------
bpicolo
Does anybody have a few examples of DSLs people make in a lisp (ideally
clojure because I have worked with it a tad)? I've seen plenty of cases where
people make a pseudo-dsl via optional arguments, but not seen this so-oft
mentioned "yeah we just wrote a dsl for it because lisp" sort of deal.

~~~
GavinMcG
[http://beautifulracket.com/appendix/domain-specific-languages.html](http://beautifulracket.com/appendix/domain-specific-languages.html)

~~~
bpicolo
Hmm, I guess I have seen datascript / datalog. My main question is more about
how many articles seem to suggest that it's basically par-for-the-course in
lisp programming to just make a DSL in your projects, so was wondering when
that might actually occur.

------
amelius
Question: what would a LISP dialect with static typing look like?

EDIT: Found an answer: [http://stackoverflow.com/questions/3323549/is-a-statically-t...](http://stackoverflow.com/questions/3323549/is-a-statically-typed-full-lisp-variant-possible)

~~~
shakna
I designed and built a statically-typed F-Expr LISP for my thesis, using some
partial AST evaluation to trace and ensure every value had at least an initial
value, and from that, type inference.

So it looked like:

    
    
        (define fib
          (lambda (n)
            (let loop ((a 0) (b 1) (n n))
              (if (= n 0) a
          (loop b (+ a b) (- n 1))))))
    

Which would do nothing by itself, and be eliminated as dead code unless
called.

If we called it with:

    
    
        (fib 10)
    

It would expand, after the macro stage, to:

    
    
        (define fib
          ((Type/Number lambda) ((Type/Number n))
            (let loop (((Type/Number a) 0) ((Type/Number b) 1) (n n))
              (if (= n 0) a
          (loop b (+ a b) (- n 1))))))
    

If a value couldn't be inferred after ensuring the validity of the AST, it was
supposed to error out with some helpful messages, but tracing the entire AST
forward and back repeatedly always managed to type every value that was at
least initialised, and if not, eliminate it as dead code.

Tradeoffs:

Compiling can be _very_ lengthy, and it would be theoretically possible to
write a program that would take ridiculous times to compile.

Once compiled, we can ensure type safety, and in the underlying
implementation, JIT everything for a decent amount of speed.

Edit: Forgot to add lambda return type. Then added it in the wrong place.

~~~
ehsanu1
Do you happen to have a link to your thesis?

~~~
shakna
Unfortunately not.

I had to surrender publishing rights, and the university only publishes about
10 submissions a year... So unlikely to appear anytime soon.

~~~
kerneis
> I had to surrender publishing rights

Even as a pdf on your own webpage? What kind of university would do that? Even
the most abusive CS publishers have a more relaxed policy…

~~~
shakna
Well, the nearest competitor requires full copyright transfer, so there's
that.

And CS publishing usually happens only within STEM, not as a standalone,
within this circle of Universities.

No, not in America. There is far less interest locally in CS, and so a sort of
mild tyranny rules in academia.

------
qwertyuiop924
Since this takes so much from things Alan has said, I'd be interested in
seeing what he thinks.

Alan, if you're there, would you care to comment?

------
bcherny
Out of curiosity, what's the alternative to if/else? Assuming polymorphism
wasn't around in the 50's, did people express the idea of conditional
execution based on the result of evaluating some expression using and/or? Does
this mean that lazy evaluation was around before conditionals?

------
WhitneyLand
I know there is an active community around Lisp and it's still used for
development, but I apparently have not dug deep enough to appreciate when it's
the best choice for a new project.

Can someone mention a few features or scenarios that make it the best choice
for starting a new project?

~~~
shakna
There's two places where I find LISPs useful.

* Where I might prototype something in Python usually, I can build the Scheme equivalent just as fast, and thanks to Gambit or Chicken's speed and static compiling, it can grow to be bigger than just a prototype with few tradeoffs.

* Anytime I need a DSL, if it isn't LISP, I find myself either disappointed by slowness, or fighting with the language. Example: HTML templates. (x-expr are great!)

In short, speed and flexibility.
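For a concrete taste of the DSL point, here is a hypothetical few-line x-expression renderer, sketched in Python: the nested-list shape of the data is what makes this kind of thing so cheap in a Lisp.

```python
def html(node):
    """Render a nested-list "x-expression" such as
    ["p", {"class": "note"}, "hello"] into an HTML string."""
    if isinstance(node, str):
        return node  # plain text passes through
    tag, *rest = node
    attrs = ""
    if rest and isinstance(rest[0], dict):
        # An optional dict right after the tag holds the attributes.
        attrs = "".join(f' {k}="{v}"' for k, v in rest[0].items())
        rest = rest[1:]
    body = "".join(html(child) for child in rest)
    return f"<{tag}{attrs}>{body}</{tag}>"
```

Under these assumptions, `html(["p", {"class": "note"}, "hello"])` renders `<p class="note">hello</p>`; the "DSL" is nothing more than agreeing on a list shape.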

------
clarkd99
This great idea of Lisp (the simple syntax of function calls in round
brackets) isn't much different than a good macro assembler even back in the
1960's. The only major difference was that more than 1 function could be
defined in 1 source code line. (I think that machine code is nothing but a
sequence of function calls where the function is the logic encoded in the CPU
itself for each opcode.) Is it fair to compare the complexity of expression
evaluation etc (Fortran) with a macro assembler? Obviously any program can be
coded in a macro assembler and therefore that would also be true from a syntax
like Lisp.

When I was in my 20's, I programmed at least 100,000 lines of Z80 assembler
for the first micro computers. One project was at least 40,000 lines and so I
know how difficult it is to program larger assembler programs. The biggest
problem is that it is hard to see the structure of the loops and conditionals
that we normally indent in higher level languages. (You can indent a Lisp
program in any way but the language doesn't require any at all.) It is also
difficult to recognize expressions. Both of these problems are also there in
Lisp (unlike most other high level languages).

One last point about the linked list structure at the heart of Lisp. Linked
lists perform poorly on modern computers, which rely heavily on locality of
data to make good use of the L1 cache. Lisp was very easy on the compiler/interpreter
writer but wasn't very good at optimizing the readability of the code for the
programmer. (I don't want a religious war but I will point out that most
programmers have never programmed in Lisp even though it was one of the first
computer languages created.) Before I get a lot of dissing comments, I think
with practice, some programmers developed an eye for the lack of structural
clues and made some reasonable size code. You could say the same about some
programmers making quite good large scale programs in assembler but that
doesn't mean that writing in assembler or Lisp should be encouraged.

~~~
hindenburg
You make some specific claims here that sound a little odd to this LISP and
assembly language hacker.

Assembly language doesn't provide any datatypes. LISP does. Assembly language
doesn't provide any type checking. LISP does. Assembly language doesn't
provide automatic storage reclamation. LISP does. Assembly language doesn't
provide naming. LISP does.

You also make a claim about L1 caches and locality of reference. Every LISP
compiler writer, and every LISP garbage collector writer, knows about CDR-
coding. We also know about how Cheney copying garbage collectors and their
descendants like the Baker incremental collector compact data, precisely for
locality of reference. The compiler writer of course is thinking about cache
performance and how lines are mapped in particular target architectures.

You should probably educate yourself a little more about LISP if you are so
interested in it as to make statements in a public forum.

~~~
clarkd99
I was only talking about the 'list of function calls' aspect of assembler and
Lisp, not the type system. I agree that Lisp has a type system and assembler
doesn't. Forth is another language that also has very simple syntax that
approximates the 'list of function calls' style that I would say isn't unlike
a macro assembler either.

I am writing a new language with built-in garbage collection that I think is
quite superior to other languages. I have created a full standard library with
almost 1,000 built-in functions and none of my data structures (lists, maps,
trees, stacks, indexes, tables etc) contain pointers or linked lists (that use
pointers). I sold over 30,000 copies of a language/database system in 1987 so
I think your last comment is quite inappropriate. I have known about Lisp since
I started CS in University in 1975.

Linked lists are horrible data structures when being used as well as when
being freed (your garbage collection comment). I use simple dynamic multi-
typed arrays instead of linked lists (pointers) and they can be freed in 1
chunk or a bigger version can be freed with a few memory de-allocations. I get
full cache locality and improved speed of allocation and de-allocation.

I would love to see an incremental GC that can copy all linked list nodes
into contiguous memory automatically. Nice trick if you can do it but that
doesn't help you if your linked list doesn't cause a GC.

~~~
tlack
Big time array fan here. I'd love to check out your language when you publish
it. Sooner the better.. we need new ideas! Email in profile if you'd like to
chat about it.

