
Not Lisp again (2009) - kamaal
https://funcall.blogspot.com/2009/03/not-lisp-again.html
======
dharmon
If this presentation of Lisp and model of computation clicks with you, you owe
it to yourself to read Structure and Interpretation of Computer Programs
(SICP).

The Abelson from the OP's lecture is the co-author, and the presentation
described exactly follows the structure of the book (including the derivative
example).

It opened my eyes to a new way of thinking about coding when I first read it
(and worked through the exercises) many years ago.

~~~
justaj
I've always wondered whether recommending SICP would also work for a person
who has never programmed before.

Pragmatism aside, would you?

~~~
xyzzyz
SICP is literally the textbook for a programming 101 class. Its whole purpose
is to teach programming to people who may never have programmed before.

~~~
tjr
One potential downside for novice programmers is that some of the examples are
a bit deeper than what you find in most beginning programming books. For
example (as exhibited in the blog post topping this thread), writing programs
pertaining to differential calculus. Obviously, lots of non-programmers have
studied calculus, but lots haven't, as well. To the extent a reader might
stumble on some example programs, SICP could be challenging, even if the
programming content itself starts at zero and builds up.

~~~
jacobolus
It’s supposed to be a rigorous and challenging introductory course for first-
year MIT students 40 years ago who had no previous computer programming
background.

For someone who is not as well prepared as a typical first-year MIT student,
something else might be better, but it’s reasonably accessible in my opinion.
I would have loved to have a course like SICP as a high school student.

~~~
leoc
The target audience for SICP seems to be basically a freshman Gerry Sussman.
There's no doubt it's reasonably accessible to people with a suitable
combination of background and aptitude, but the historical evidence seems to
be that it's not accessible to the typical CS undergraduate: according to
[https://www2.ccs.neu.edu/racket/pubs/jfp2004-fffk.pdf](https://www2.ccs.neu.edu/racket/pubs/jfp2004-fffk.pdf)
it had a wave of adoption as a CS 101 text but enthusiasm soon soured. In my
experience, even _The Little Schemer_ , which covers much of the same
material, from a similar perspective on computing, but in a much more
streamlined and learning-friendly form, isn't close to being universally
accessible for _unsupervised_ learning.

~~~
jacobolus
My quick take is that the authors of the linked essay have a circumscribed and
very industry-focused view of what an introductory CS course should teach and
what kind of things a student should be expected to learn. Their primary goal
seems to be preparing students for a follow-up Java course (of the early 2000s
style of Java), but without dropping all of the incidental complexity of
Java/C++ on the students right away. To that effect they have pared down most
of the “computer science” content of a SICP type course, and focused heavily
on the computer programming process, analogous to the way middle school
English classes teach about writing.

(Disclaimer: I have not directly evaluated their curriculum/textbook.)

This differs markedly from my own opinion about the proper goals of a
‘computer science’ course at the undergraduate level, which is to teach
timeless principles and flexible thinking without bending to fashion
(especially fashion which is now a decade out of date), and to prepare
students for follow-up computer science courses such as data structures /
algorithms, theory of computation, programming languages, or in a more applied
direction databases, networking, graphics, operating systems, numerical
analysis, machine learning, and so on. A lot of these follow-up courses will
be substantially mathematical and will rely heavily on analytical skills
developed in an introductory CS course and in mathematics courses (not just on
programming skills per se).

But if updated for 2018 the authors’ curriculum would be appropriate as a
course titled “introduction to programming” or the like. I agree it sounds
like an improvement vs. first courses that start students out on C++ or Java
(their main comparison in the linked paper).

YMMV.

~~~
leoc
The authors are Matthias Felleisen and other core PLT Scheme/Racket guys.
They're hardcore members of the academic Scheme community, absolutely not very
industry-focussed guys, and the real industry-friendly contingent apparently
doesn't much like HtDP either:
[http://www.ccs.neu.edu/home/matthias/Thoughts/colleagues.htm...](http://www.ccs.neu.edu/home/matthias/Thoughts/colleagues.html)
. HtDP is often described as a bit of a grey slog (I've dipped into it but
haven't done it or read it cover to cover yet), but the authors absolutely
haven't given up on the ideal of teaching timeless principles and modes of
thinking, they're just more realistic about how the material has to be paced
and prepared to get a normal body of first-year undergraduates through it
successfully.

------
Insanity
Lisp is a fun language to program in. I learned Lisp after already being
familiar with Java / Python and some other languages, which maybe made it even
more beautiful.

One of my favourite pieces of code is a Lisp REPL in Lisp:

    
    
        (defun repl ()
          (loop (print (eval (read)))))

~~~
keymone
looks even better with threading in imaginary lisp dialect:

    
    
        (-> read eval print loop)

~~~
earenndil
I think clojure has that; I could be wrong, though, I've never used clojure.

~~~
intertextuality
Clojure does indeed have -> (thread-first), ->> (thread-last), some->,
some->>, cond->, and as->. [0]

(->) inserts each form's value between the function name & the first argument
in the next form.

(->>) inserts each form's value as the last argument in the next form.

The latter forms are much more niche and I haven't found need for them yet.

[0]:
[https://clojure.org/guides/threading_macros](https://clojure.org/guides/threading_macros)
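For readers who haven't seen them, the thread-first idea can be loosely
sketched in Python as a left-to-right pipeline (a rough analogy only:
Clojure's versions are macros that rewrite forms, and `->`/`->>` choose
argument positions, which a plain pipeline can't express):

```python
from functools import reduce

def thread_first(value, *fns):
    # feed value through each function in turn, left to right
    return reduce(lambda acc, fn: fn(acc), fns, value)

# roughly what (-> 5 inc str) means in Clojure:
thread_first(5, lambda x: x + 1, str)  # → "6"
```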

~~~
iterati
as-> is great for when you have to mix thread-first and thread-last.

cond-> is great for building maps where some keys might not be needed:

    
    
        (cond-> {:k1 v1}
          v2 (assoc :k2 v2)
          ...)
    

some-> and some->> I don't use as often, but when mixing in some java code
that could throw an NPE, these will avoid that and just return nil early (they
short-circuit when getting a nil value).

~~~
dustingetz
You can nest ->> inside ->; it took me far too many years to realize:

    
    
        @(-> ctx
            :hypercrud.browser/fiddle
            (reactive/cursor [:fiddle/ident])
            (->> (reactive/fmap name)))
    

another

    
    
        (-> [10 11] 
            (conj 12) 
            (as-> xs (map - xs [3 2 1])) 
            (reverse))

------
QuadrupleA
My previous, somewhat-cranky comment got downvoted away, but I do think Lisp's
"higher-order magic" and "ability to do calculus" is being a bit exaggerated
here. From below - evaluating an Nth derivative in 80s-era C is not so
atrocious:

    
    
        #define DX 0.0001
        typedef double (*func)(double);
    
        double NthDeriv(int n, func f, double x) {
            if (n == 1) {
                return (f(x + DX) - f(x)) / DX;
            }
            else {
                return (NthDeriv(n - 1, f, x + DX) - NthDeriv(n - 1, f, x)) / DX;
            }
        }
    
        double Cube(double x) { return x * x * x; }
    
        double result = NthDeriv(3, &Cube, 5.0);
    

As mentioned in previous discussions of this article, the equivalent in Python
is pretty elegant:

    
    
        >>> def deriv(f):
        ...   dx = 0.0001
        ...   def fp(x):
        ...     return (f(x + dx) - f(x)) / dx
        ...   return fp
        ...
        >>> cube = lambda x: x**3
        >>> deriv(cube)(2.0)
        12.000600010022566

~~~
derefr
I think the thing that impressed the author at the time was more the fact that
functions, being first-class values, can be introduced at runtime in a REPL,
rather than having to be "planned" at compile time. So the C code isn't really
analogous, but the Python code is.

But re: the Python code—I would say that, from the perspective of the 1960s,
all modern "dynamic" languages that have REPLs (like Python) _are_ Lisps in
essential character.

"Lisp", back then, referred less to "a language that uses a lot of
parentheses", and more to things like:

• runtime sum-typing using implicit tagged unions;

• parameterization of functions using linked lists (or hash-maps) of paired
interned-string "keys" and arbitrary product-typed values, rather than
parameterization using bitflags or product-types of optional positional
parameters;

• heap allocation and garbage-collection;

• a compiler accessible by the runtime;

• "symbolic linkage" of functions and global variables, such that a named
function or variable "slot" can be redefined (even to a new type!) at runtime,
and its call-sites will then use the new version.

We only notice the parens as the differentiating feature of Lisps nowadays,
because everything else has become widely disseminated. Perl and Python and
PHP and Ruby (and even Bash) are fundamentally Lisps, in all of the above
ways. Lisp "won."
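As a quick illustration of that last point in Python (a made-up example, not
from the thread): a function referenced by name is looked up at call time, so
redefining the name changes what existing call-sites run.

```python
def area(r):
    return 3.14 * r * r          # crude first version

def report(r):
    # `area` is resolved by name each time report runs,
    # not frozen in when report was defined
    return area(r)

before = report(1.0)             # uses the crude constant

def area(r):                     # rebind the global name "at runtime"
    import math
    return math.pi * r * r

after = report(1.0)              # the same call-site now runs the new version
```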

~~~
goatlover
However, PHP, Python, Ruby, JS, etc. aren’t homoiconic, which would disqualify
them as “lisps” in the minds of most Lispers, including Lisp's creator.

~~~
derefr
One might say that they're all implementations of
[https://en.wikipedia.org/wiki/M-expression](https://en.wikipedia.org/wiki/M-expression)s.

My point, though, was that while the _proponents_ of Lisp define Lisp one way
(by the things _only_ Lisp can do due to e.g. homoiconicity), the _opponents_
of Lisp (like the author was, coming into the course) define Lisp by the set
of features that make Lisp "not a Real Programmer†'s programming
language"—i.e. the set of things that make them not want to use it.

† [http://www.catb.org/jargon/html/R/Real-Programmer.html](http://www.catb.org/jargon/html/R/Real-Programmer.html)

My assertion was, from these opponents' perspectives, there are very few
languages left for "Real Programmers"; most modern languages have inherited
nearly all of those horribly convenient Lisp-isms. Heck—modern CPUs are so
good at pointer-chasing, they may as well be Lisp Machines!

------
okket
Previous discussions:

2017:
[https://news.ycombinator.com/item?id=14247269](https://news.ycombinator.com/item?id=14247269)
(261 comments)

2013:
[https://news.ycombinator.com/item?id=5375735](https://news.ycombinator.com/item?id=5375735)
(176 comments)

2009:
[https://news.ycombinator.com/item?id=504667](https://news.ycombinator.com/item?id=504667)
(39 comments)

~~~
bruth
It's a matter of time before there is a "bot" that auto-posts this kind of
response for historical reference.

~~~
FPGAhacker
There is a link at the top called “past” that does exactly what all these
“helpful” posts do. And it’s been there for a long time.

------
tombert
This isn't identical, but I remember when I first learned first-class
functions and partial application in Haskell about ten years ago, it was a
"the world is different now" moment. I had reinvented partial-application
approximately thirty billion times by using overloading or with elaborate if-
elseif-else chains. When I learned Haskell, and saw that I could magically
make my non-list functions automatically lift across a list, and how I could
_define things like this myself_ , I was immediately convinced that I would
never go back to C again (I didn't know about function pointers).

I learned Clojure about a year ago (which was my first introduction to Lisp
outside of reading SICP), and it gave me a similar feeling. I felt like
Clojure was a "better Java than Java", and now it's my go-to JVM language. I
think McCarthy was really onto something with Lisp :).
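The partial-application moment described above can be sketched in Python
rather than Haskell (in Haskell it falls out of currying for free;
`functools.partial` is the closest standard-library analogue):

```python
from functools import partial

def add(x, y):
    return x + y

add5 = partial(add, 5)              # fix the first argument: a new function
add5(10)                            # → 15

# "lifting" a scalar function across a list:
list(map(add5, [1, 2, 3]))          # → [6, 7, 8]
```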

~~~
joekunin
Could you give an example of partial application with Haskell in the way you
mention? I've had similar feelings with Clojure.

------
phoe-krk
_I was impressed first by the fact that whoever designed this particular Lisp
system cared about efficiency._

This isn't a particularly unusual approach. Common Lisp, a contemporary Lisp
dialect, was designed with efficiency in mind. The approach of "Lisp = slow +
inefficient + spending 80% of time on garbage collection" is a myth.

~~~
SolarNet
This article is set about 20 years prior to Common Lisp's standardization. It
was a reasonable belief then.

~~~
dasmoth
It’s set in 1983. “Common LISP The Language” was published in 1984.

(The ANSI standard was 1994. Still nowhere near 20 years).

~~~
coldtea
Well, 10 years is the same order of magnitude as 20.

------
n4r9
Isn't the derivative relatively easy in C?

    
    
        #include <stdio.h>
      
        const double delta = 1.0e-6;
        double cube(double x) { return x * x * x; }
        double deriv(double (*f)(double), double x) { return (f(x+delta) - f(x)) / delta; }
    
        int main()
        {
            printf("%f", deriv(&cube, 2));
            return 0;
        }
    

Am I missing something?

~~~
mishac
I think it's that you can calculate the value, but you can't (or can't as
easily) create a new function that is the derivative of another function, and
pass that around.

~~~
gameswithgo
yeah you can return a new function, but it is less easy than in lisp, or F#

but in general a lot of that stuff in the article seems mundane today, you
have to imagine seeing it in 1983

~~~
omaranto
> yeah you can return a new function

How?

~~~
fasquoika
You could do it with JIT compilation or self-modifying code. I'm not sure if
there's a simpler way in C

~~~
3b9cd7355317f55
You can exec a C compiler and use dlopen on the generated object file, or use
libtcc[1].

[1]: [https://www.bellard.org/tcc/](https://www.bellard.org/tcc/)

------
kleopullin
I thought it would be a "down with Lisp" article, but instead it mirrored my
own experience with the language: omg, how cool. From that ancient
introduction to Lisp, all I learned was that Lisp was as close to perfection
and beauty as I would get in a programming language.

~~~
wycy
Serious question: what is it about the "beauty" of Lisp that the HN community
seems to like so much? To me, how Lisp looks is its worst quality--the number
of parentheses is just mind boggling. I want to understand why it is so loved.

~~~
lisper
> the number of parentheses is just mind boggling

There aren't actually more parens in a Lisp program than a program written in
a C-like syntax (which is really an algol-like syntax). They just stand out
more for two reasons:

1. There is less punctuation in general, so the parens are more obvious.
Instead of f(x, y, z) you write (f x y z). Without the commas, the parens
stand out because that's all that is left.

2. There is only one kind of paren in Lisp, whereas C-like languages use at
least three: (), [], and {}, so that makes any particular kind of paren less
prominent.

~~~
dxhdr
I believe it's purely seeing lines that end with )))))))) that leads to the
OP's observation ("To me, how Lisp looks is its worst quality--the number of
parentheses is just mind boggling"). You don't get that in a C-like syntax. In
practice you don't read each individual closing paren to understand the code,
so it doesn't matter. But it sure looks "scary."

~~~
lisper
> You don't get that in a C-like syntax.

No, instead you get something like );}]);) except that that's usually split up
over several lines.

BTW, it's pretty easy to tweak Lisp's syntax so that your parens don't get so
deeply nested. See

[https://github.com/rongarret/tweetnacl/blob/master/ratchet.l...](https://github.com/rongarret/tweetnacl/blob/master/ratchet.lisp)

for an example.

------
GlenTheMachine
When I was in high school I had access to three programming tools: Commodore
BASIC, 6502 assembly, and Turbo Pascal. I spent years hand coding 6502
assembly because BASIC was clearly for amateurs and I could only get at the
Pascal system at school.

At one point I wrote my own Pascal interpreter for the C64. In BASIC, because
I wasn't good enough to do it in assembly. It was a little slow.

I desperately wish someone had introduced me to lisp. The entire course of my
life would have been different. The most frustrating thing is that I probably
could have written a naive lisp interpreter in assembly.

~~~
colomon
I got a Forth for my C64, and it was the PERFECT language for the machine:
fast, compact, structured, extensible. Generally much easier to understand the
code than C64 Basic, but you could also easily call into assembly language if
you needed a bit more speed.

------
lixtra
I’m too young (or studied in the wrong time or place) so my CS teachers failed
to impress me to such an extent.

However, I very well remember reading the Compilerbau book by N. Wirth and the
awe I felt. It’s just a few pages but it was a revelation.

~~~
sures
Can you share the name of the book? Is it "Compiler Construction" by Niklaus
Wirth?

~~~
tralarpa
Check also his other books: Algorithms and Data Structures, Programming in
Oberon, Project Oberon, etc.

They are all available on the ETH website and his homepage (in various
revisions). Basically everything you need to know about imperative programming
on CS bachelor level for free.

~~~
lixtra
Indeed, a small search yields:
[https://www.inf.ethz.ch/personal/wirth/](https://www.inf.ethz.ch/personal/wirth/)

Also:
[https://news.ycombinator.com/item?id=10764672](https://news.ycombinator.com/item?id=10764672)

------
ilovecaching
Any computer scientist worth his salt has read SICP. I had similar reactions
as the author of the post to the expressiveness of Lisp, and it set me on a
journey towards functional programming that has been so rewarding to my
career. I also got a chance to meet some of the people who worked on Lisp
machines at MIT, who interacted with Stallman back in the day. The Lisp
machines were truly ahead of their time, and Lisp is still as beautiful in its
homoiconic glory today as it was back in the 60s.

I now focus primarily on Rust, which has given me a lot of the same feelings
as when I first encountered Lisp. It's truly a language ahead of its time, and
I hope it will continue to grow beyond what even Lisp was able to accomplish.

------
lunchladydoris
For anyone who wants to feel what this might have felt like (and you've yet to
watch them), MIT posted videos of Abelson and Sussman presenting the course on
YouTube [0]. I felt a similar sense of magic the first time I saw the
derivative section.

[0]
[https://www.youtube.com/watch?v=2Op3QLzMgSY](https://www.youtube.com/watch?v=2Op3QLzMgSY)

------
divs1210
Differentiation/Integration implementation in Clojure:

[https://gist.github.com/divs1210/4ca74577711eb996a89a36d86a3...](https://gist.github.com/divs1210/4ca74577711eb996a89a36d86a38ea78)

------
chris_mc
Scheme was my favorite language in school back in 2006, but the only class I
ever used it in was a class where we learned search algorithms and the like
(it was either AI or Robotics). I was dismayed to find out there isn't a lot
of software written in LISP dialects, because I would love to write LISP code
all day, it's so much fun.

------
void_starer
> Hal went on to explain how the substitution model of evaluation worked in
> this example and I carefully watched. It seemed simple. It couldn't be that
> simple, though, could it? Something must be missing. I had to find out.

Well, not that simple... as the author hasn't taken the potential problem of
free variable capture into account.

------
Rezhe
I've been working on a webpage for translating the SICP lectures; it's always
something worth sharing:
[https://learningsicp.github.io](https://learningsicp.github.io)

------
bribri
I wish so much that I had known about lisp in college. I only knew Java, C++,
Javascript, and Python.

------
rifung
I feel like I am missing something here, but are the things mentioned in the
article (tail-call recursion, first-class functions) unique to LISP?

They seem available in other functional languages.

I was under the impression LISP was unique due to its macros.

~~~
cnasc
Where do you think those languages got those features from? This story appears
to take place in the 80s, incidentally.

------
moocowtruck
my question is yes, lisp.. but what is beyond lisp?

~~~
gizmo385
LISP is more akin to a class of languages than to a single language in
isolation, so it is kind of tricky to ask what's "beyond" LISP.

~~~
moocowtruck
if we can solidly say what lisp is, then i'd assume there's beyond the
horizon... if we can't say what lisp is, then what's the point? is it just
anything?

i mean, if i have a lisp with static types vs not static types, why is the
lisp part important at all? and you just have two different languages one with
static types and one without?

seems very unimportant? i dunno could someone enlighten me

------
pweissbrod
So the author was excited about the concept of function composition? I'm a fan
of lisp but I'm not seeing what's particularly novel about lisp here.

~~~
reikonomusha
Context is also helpful. “I’m not sure what’s novel about function
composition” is a little bit of a smug take on a language and concept that was
pretty cutting edge back in the early 80s. Now we Meat-and-Potatoes
Programmers know that composition is useful, but I’d be hard pressed to find
first-class composition in 1980 in any popular language that wasn’t {academic,
Lisp}.

~~~
int_19h
Was it, though? If I remember correctly, Algol 68 had facilities sufficient to
do everything the author describes, and just as neatly.

------
johan_larson
Lisp: gnosticism for programmers.

------
QuadrupleA
All this could be done easily in C (function pointers, factorial for loop
etc.). And sampling a function twice at 0.0001 dx's apart and calling it
"doing calculus" is a stretch.

The author seems to be making the case that, contrary to his original
skepticism, Lisp is indeed some magical higher-order language, but I honestly
don't see it. Lisp's ratio of philosophizing to noteworthy projects seems to
tend toward infinity.

~~~
ehaliewicz2
How could you create the equivalent semantics without closures and
higher-order functions in C? For example, if you wanted to take the second
derivative of f:

    
    
        (deriv (deriv f))

or if you wanted to take the nth derivative:

    
    
        (define (nth-derivative f n)
          (if (= n 0)
              f
              (deriv (nth-derivative f (- n 1)))))

You could do it with some data structures, which you'd then have to interpret,
but not as nicely or easily.
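For comparison, the same shape in Python, where closures make the derivative
itself a first-class value you can pass around (a sketch using the same
forward-difference approximation used elsewhere in the thread):

```python
DX = 1e-4

def deriv(f):
    # returns a *new function*, not a number
    return lambda x: (f(x + DX) - f(x)) / DX

def nth_derivative(f, n):
    return f if n == 0 else deriv(nth_derivative(f, n - 1))

cube = lambda x: x ** 3
second = nth_derivative(cube, 2)    # still a function
second(5.0)                         # ≈ 30, the exact value of (x³)'' at x = 5
```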

~~~
QuadrupleA

        #define DX 0.0001
        typedef double (*func)(double);
    
        double NthDeriv(int n, func f, double x) {
            if (n == 1) {
                return (f(x + DX) - f(x)) / DX;
            }
            else {
                return (NthDeriv(n - 1, f, x + DX) - NthDeriv(n - 1, f, x)) / DX;
            }
        }
    
        double Cube(double x) { return x * x * x; }
    
        double result = NthDeriv(3, &Cube, 5.0);

~~~
ehaliewicz2
Touché. It's not as general as the Lisp example, though.

