
Lively Linear Lisp – 'Look Ma, No Garbage' (1992) - tosh
http://home.pipeline.com/~hbaker1/LinearLisp.html
======
kibwen
_" Some have suggested that garbage collection not be done at all [White80] or
after the program has finished [Moon84] (comment on the Boyer benchmark).
Linear Lisp provides a hyper-clean environment in which garbage is never
produced, and therefore garbage collection is not necessary."_

 _" However, linear logic's lack of sharing may introduce significant
inefficiencies of its own."_

 _" The sharing of data structures can be efficient, because sharing can
substitute for copying, but it creates ambiguity as to who is responsible for
their management."_

These quotes foreshadow the goals of Rust 20 years later, whose eventual
intent was to realize both the doesn't-require-a-garbage-collector property of
linear logic and the efficiently-sharing-data-is-still-possible property of
traditional systems languages, by separating these concepts into ownership
(covering the former) and borrowing (covering the latter, and built on top of
ownership).
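
To make that split concrete, here's a minimal Rust sketch (mine, not from the
paper or this thread): ownership gives the move-once, no-GC discipline, while
borrowing recovers cheap sharing without copies.

```rust
fn consume(s: String) -> usize {
    s.len() // `s` is owned here; its memory is freed on return, no GC involved
}

fn peek(s: &str) -> usize {
    s.len() // a borrow: shared, read-only access, no copy and no ownership transfer
}

fn main() {
    let greeting = String::from("hello");
    assert_eq!(peek(&greeting), 5); // borrow...
    assert_eq!(peek(&greeting), 5); // ...and borrow again: sharing is fine
    assert_eq!(consume(greeting), 5); // ownership moves into `consume`
    // peek(&greeting); // would not compile: `greeting` was moved
}
```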

~~~
cwzwarich
> These quotes foreshadow the goals of Rust 20 years later, whose eventual
> intent was to realize both the doesn't-require-a-garbage-collector property
> of linear logic and the efficiently-sharing-data-is-still-possible property
> of traditional systems languages, by separating these concepts into ownership
> (covering the former) and borrowing (covering the latter, and built on top
> of ownership).

The idea of borrowing is not original to Rust in any way, and is found from
the first application of linear logic to PL type systems in Wadler's "Linear
types can change the world!" onward, including Vault, Cyclone, etc. Any system
without it would be impractical for writing real programs.

~~~
pjmlp
It appears that, just as there are those who think C was the genesis of
systems programming, with very little attention given to what happened 15
years earlier and beyond Bell Labs' walls, the same might apply 20 years from
now to Rust and the languages that were its inspiration.

~~~
bch
> It appears that, just as there are those who think C was the genesis of
> systems programming, with very little attention given to what happened 15
> years earlier

This is why I think it’s important to get familiar with, and listen to, Old
Masters like Rob Pike, Brian Kernighan, John Ousterhout, and even the war
stories of Richard Stallman, and to chase down their references and get
familiar with them, too. It’s not that there’s nothing left to do, but an
awful lot _has_ been done. We can at least learn from it...

> and beyond Bell Labs walls

As I cite Pike, Kernighan, ... :)

~~~
pjmlp
There is a Rob Pike talk, for example, where he praises the compile times of
the Algol compiler he used as a student.

------
hollerith
The 'use-once variable' described in the OP is the intellectual ancestor of
Rust's borrow checker. It got a lot of attention from programming-languages
researchers (such as the author of the OP) after it got a lot of attention
from academic logicians after the publication in 1987 by logician Jean-Yves
Girard of a seminal paper entitled 'Linear Logic', the full text of which is
available at
[https://www.sciencedirect.com/science/article/pii/0304397587900454](https://www.sciencedirect.com/science/article/pii/0304397587900454)

That paper is the most abstract, airy, 'ungrounded' paper I have ever read
that will probably have a significant positive practical impact on human
flourishing (via Rust's borrow checker).

~~~
opnitro
More specifically, while linear logic demands that each value be used exactly
once, Rust implements "affine types", where values must be used no more than
once.
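
A tiny Rust sketch of the difference (my illustration, not from the thread):
affine means a value may be moved at most once, and, unlike strict linearity,
discarding it unused is also allowed.

```rust
struct Token; // no `Copy`, so every use of a `Token` is a move

fn spend(_t: Token) -> bool {
    true // consumes its argument and reports success
}

fn main() {
    let a = Token;
    assert!(spend(a)); // used exactly once: fine
    // spend(a);       // would not compile: use after move

    let _unused = Token; // never used at all: allowed under affine typing,
                         // but a strictly linear system would reject it
}
```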

~~~
cwzwarich
Strictly speaking, Rust doesn't even have affine types, since even move-once
variables can be borrowed multiple times before they are moved. You could
argue that Rust could be elaborated into a system with affine types, but the
elaboration isn't particularly simple in all cases.

~~~
hollerith
I prefer to think of a reference to a type as a type in its own right
_separate_ from the borrowed type. Looked at that way, Rust does have affine
types.

To bring the discussion back to my first comment, it would've been less likely
for anyone to have invented the borrow checker if programming-languages
researchers weren't paying a lot of attention to use-once variables, and
languages researchers would've been less likely to be paying attention were it
not for Girard.

I haven't read Girard's paper in 20 years, but I would be extremely surprised
if it contained any mention of borrowing or references, so Girard must share
any credit for Rust's borrow checker with later innovators, and here I would
be remiss not to mention Graydon Hoare.

~~~
cwzwarich
> I prefer to think of a reference to a type as a type in its own right
> separate from the borrowed type. Looked at that way, Rust does have affine
> types.

The reference type is a separate type, but the & borrowing operator is a use
of the original path. You can borrow (i.e. use) the same path multiple times
before it is moved or implicitly destroyed.
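
Concretely (my own sketch, assuming the current borrow-checker rules): each
`&` is a use of the path `v`, and those uses can repeat freely before the
single move.

```rust
fn main() {
    let v = vec![1, 2, 3];
    let r1 = &v; // first borrow of the path `v`
    let r2 = &v; // second borrow of the same path, before any move
    assert_eq!(r1.len() + r2.len(), 6);
    let owned = v; // the move itself can still happen only once
    assert_eq!(owned.len(), 3);
    // let r3 = &v; // would not compile: `v` has been moved
}
```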

~~~
hollerith
Good point: my assertion in grandparent that "Rust does have affine types" is
wrong.

Do you agree with my belief that it is significantly less likely the borrow
checker would've been invented if programming-languages researchers hadn't
paid a lot of attention to linear and affine types? Even though a Rust coder
can take as many immutable references to a location in memory as he wants,
there are certain operations (e.g., move) that the coder can perform only once
on it. The inventor(s) of the borrow checker (probably Graydon Hoare) must
have explored that part of the design space extensively, and it seems to me it
would have been very non-obvious that it was worth exploring to someone not
influenced, directly or indirectly, by Girard.

~~~
cwzwarich
> Do you agree with my belief that it is significantly less likely the borrow
> checker would've been invented if programming-languages researchers hadn't
> paid a lot of attention to linear and affine types?

As I mentioned in another comment, the concept of borrowing was already
present in the earliest applications of linear types to programming languages.
There was contemporaneous research into region systems for ML. Later languages
like Vault and Cyclone combined these two ideas, using substructural types to
manage regions.

From a language feature perspective, the biggest innovation of Rust was
integrating the ideas from Cyclone and other research languages with the
emerging C++11 style of programming with implicit destructors, move semantics,
and smart pointers.
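
As a sketch of that combination (my example, not from the thread): Rust's
`Drop` plays the role of the C++11 implicit destructor, and move semantics
transfer the cleanup obligation so it is discharged exactly once, with no GC.

```rust
use std::cell::Cell;

// A resource whose destructor records that it ran (a stand-in for
// closing a file, freeing memory, etc.).
struct Resource<'a> {
    dropped: &'a Cell<bool>,
}

impl<'a> Drop for Resource<'a> {
    fn drop(&mut self) {
        self.dropped.set(true); // implicit destructor, like C++11 RAII
    }
}

fn main() {
    let flag = Cell::new(false);
    {
        let r = Resource { dropped: &flag };
        let _moved = r; // move: `_moved` now owns the cleanup obligation;
                        // `r` itself is not dropped, so no double free
        assert!(!flag.get()); // resource still alive inside the scope
    } // `_moved` leaves scope: the destructor runs deterministically, no GC
    assert!(flag.get());
}
```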

------
ghosthamlet
Linear Logic, Linear Lisp, Linear Types and Concatenative Languages:
[https://cdiggins.github.io/blog/linear-logic-and-linear-lisp.html](https://cdiggins.github.io/blog/linear-logic-and-linear-lisp.html)

~~~
carapace
Oh awesome.

See also Conal Elliott's "Compiling to categories"
([http://conal.net/papers/compiling-to-categories/](http://conal.net/papers/compiling-to-categories/)),
where he converts Haskell automatically to point-free form (like a
concatenative language, but not quite) and then instantiates that over
different Categories to get different correct programs from the same
expression.

I've been working with Joy recently and I think this stuff is "the next big
thing" for PLs.
[http://joypy.osdn.io/notebooks/Types.html](http://joypy.osdn.io/notebooks/Types.html)

~~~
bjourne
Cool project! I've been working on typing for stack-based languages, although
I got stuck trying to make the inferencer work on higher-order functions.
Kleffner's master's thesis supposedly explains how to accomplish that, but I
haven't been able to wrap my head around it yet. I'd be interested to hear how
you have solved the problem in Thun.

~~~
carapace
My original implementation is in Python and I documented a bit of research and
implementation of type inference here: "The Blissful Elegance of Typing Joy"
[http://joypy.osdn.io/notebooks/Types.html](http://joypy.osdn.io/notebooks/Types.html)
The “Type Inference in Stack-Based Programming Languages” talk given by Rob
Kleffner informed it. I don't recall now whether I read his thesis but it's
likely. I probably couldn't wrap my head around it either.

For typing combinators (Joy's higher-order functions) I tried making a hybrid
inferencer and interpreter that just evaluated them and it worked.
(Incidentally that's what drove home to me the categorical nature of Joy. When
I read Conal Elliott's "Compiling to Categories" I recognized what I had
done.) In Joy the higher order combinators "don't care" if they are working on
e.g. values or types. In other words they only care about the shape or
structure (structural typing) of the data on the stack.

When I wrote the interpreter in Prolog and then wrote the inferencer in
Prolog, I noticed they were the same code, so I deleted one of them. In
Prolog, you can pass a stack and compute values, or pass logic variables and
it will tell you what kind of stack a given expression expects/generates. If
you implement the math ops with CLP(FD), you get a nice constraint compiler
that can generate new Prolog implementations of Joy expressions. Sick, eh?

In _both_ Prolog and Python I haven't yet closed the loop for recursive
combinators, meaning the type inferencer generates the base case, then the
case for recurring once, then twice, and so on. I know the answer is some
simple application of fixed-point theory or something, but I'm an idiot, and
I've been working on other aspects (because I'm sure the solution is decades
old in the "compiling FP languages" literature: "Somebody else has had this
problem.")

In any event, I don't think I'll have to figure it out, because I just found
out that the next steps I was going to take have already been done by the
"Seven Sketches" folks and then some:
[https://news.ycombinator.com/item?id=20376325](https://news.ycombinator.com/item?id=20376325)

I'm pretty sure most of that stuff would make great Joy combinators. And
_something_ in there would be the way to deal with e.g. _genrec_ and _x_
combinators.

------
rwmj
Garbage is always produced somewhere. It might be values that are no longer
used but have not yet been freed, or wasted space inside the allocator itself.
What actually matters is (a) whether this performed better than GC'd Lisp
equivalents of the era, and/or (b) whether a GC'd Lisp is easier/faster/cheaper
to write code for.

~~~
kibwen
Agreed that it's important not to overlook that allocator implementations
often curiously resemble garbage collectors. Even keeping this in mind,
though, eliminating what is usually referred to as "garbage collection" still
turns out to be valuable, since that garbage collector lives on top of an
allocator already; having only one garbage collector in one's program can be
advantageous compared to having two garbage collectors, one atop the other.

~~~
rwmj
The OCaml GC takes huge, long-lived blocks from the underlying allocator
(malloc), so I don't think having two allocators is really a problem. Even
less so for a Lisp running on a Lisp system, where the Lisp _is_ the system
allocator.

------
sansnomme
Any authors of the Carp programming language reading this?

~~~
mncharity
(Carp is a statically-typed GC-free lisp for real-time applications.)

The "Carp 0.3.0 release" is being discussed today at
[https://news.ycombinator.com/item?id=20368969](https://news.ycombinator.com/item?id=20368969).

------
dang
Thread from a couple years ago:
[https://news.ycombinator.com/item?id=14248419](https://news.ycombinator.com/item?id=14248419)

------
canjobear
Why is linear logic called linear? What is the connection to the usual idea of
linearity?

~~~
nabla9
The semantics resembles structures from linear algebra: linear logic was
initially modeled on coherent spaces, where a coherent space is a reflexive
undirected graph.

Alternatively: The category of finite dimensional vector spaces over finite
fields is a model of linear logic.
[http://www.cs.bham.ac.uk/~drg/bll/steve.pdf](http://www.cs.bham.ac.uk/~drg/bll/steve.pdf)
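
One informal bridge (my gloss, not from the comment above): a linear map uses
its argument exactly once, neither duplicating nor discarding it, which
mirrors how a linear-logic hypothesis must be consumed exactly once.

```latex
\[
  f(x + y) = f(x) + f(y), \qquad f(c\,x) = c\, f(x)
\]
% A linear map "uses its argument exactly once": no $x^2$ terms
% (duplication), no constant terms (discarding).
% In linear logic, a proof of $A \multimap B$ likewise consumes the
% hypothesis $A$ exactly once; duplication (contraction) and discarding
% (weakening) are recovered only via the exponential $!A$.
```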

