How Lisp Became God's Own Programming Language (twobithistory.org)
614 points by chwolfe 5 months ago | 307 comments



> McCarthy built Lisp out of parts so fundamental that it is hard to say whether he invented it or discovered it.

I loved that sentence! I'm guessing epistemology or a similar field has pondered the "invented or discovered" question already, and if so, I want to read about it.

> [on SICP:] Those concepts were general enough that any language could have been used

What?? In chapter 4, you write your own Lisp interpreter. If they had chosen C++, would you be writing a C++ compiler? Or a Lisp interpreter in C++? Either way, it would be ugly. And most languages would encounter problems even before they got to chapter 4. What made SICP great was building abstractions out of primitives. Most languages give you some abstractions, and others simply can't be built (at least not elegantly). I can't imagine SICP using a language that doesn't feel a lot like Lisp.

That said, count me among the people whose first Lisp exposure was SICP. And yeah, it was really fun and really enlightening. I am loving Racket now, but Racket is big and practical. SICP is small and beautiful - as I recall, the authors deliberately avoid using most of the language. (They used Scheme, but I think Lisp would have worked fine too, right?)


> In chapter 4, you write your own Lisp interpreter. If they had chosen C++, would you be writing a C++ compiler? Or a Lisp interpreter in C++? Either way, it would be ugly.

It's an interpreter for a small subset of Scheme, but Scheme is of course itself small and simple†, which means that it's not hard to mentally extrapolate from the small subset to the rest of the language. C++ is a large language, and so it would be less compelling to implement a small subset of it. And you could of course use C++ to implement Lisp, as you say; chapter 4 actually covers several programming paradigms quite alien to Scheme itself, so implementing a functional programming language in C++ would be quite reasonable. As I read it, the main point of chapter 4 is that it's not true that "Most languages give you some abstractions, and others simply can't be built." Metacircularity makes chapter 4 very effective at convincing you that what you've written is actually a real programming language, but it has confused you into missing (what I read as) the main point of the chapter. Perhaps it was counterproductive.

If not, though, it would be quite reasonable to achieve the same pedagogical metacircularity in Lua, Prolog, Forth, ML, Smalltalk, GHC Core, my own Bicicleta, or any number of other small, simple languages. I don't know about you, but those languages feel very different (from Lisp and from one another) to me.

† "Simple" here refers, for better or for worse, to the observed behavior of the language, not the difficulty of its implementation. Much of this is easy to gloss over in a metacircular interpreter, where you can inherit garbage collection, procedure call semantics, dynamic typing, threading, call/cc, and even parsing, lazy evaluation, and deterministic evaluation order, from the host language. (The book points out many of these points.) If you were going to design a language at Scheme's programming level with an eye to making the implementation as simple as possible, Scheme would not be that language.


Even though it doesn't look like it, I could've sworn Prolog was homoiconic.

Smalltalk can also do a lot of the cool Lisp wizardry. I know Alan Kay (a user on here who was on the team that designed Smalltalk at Xerox PARC) likes both langs. I've never heard him describe what he thinks are the pros and cons of both though.

You can write a lisp in a few lines of Forth. I've never done it, but people talk about it a lot, so I'm going to assume fact.


> You can write a lisp in a few lines of Forth.

Seems unlikely. Forth isn't garbage collected, is untyped, etc etc. So you'd have to add those things in "a few lines".

Though I guess it depends what you mean by "a few". I was thinking a couple of dozen. Looks like someone's done it in 500 (though without GC -- is that still Lisp?): https://forums.parallax.com/discussion/160027/lisp-technical...


Not all Lisps had GC in the early days.

There were some mainframe implementations that would just allocate until there was no more memory left.


This is actually a random side project of mine! https://github.com/tgvaughan/scheme.forth.jl


In addition to not technically needing a GC, a simple GC can be small if you don't care (much) about performance. One variation on a basic tricolor mark-and-sweep, using linked lists for each "color", is basically the following (Ruby-ish pseudocode):

  # At this point every object is in the "white" set. We don't yet know
  # which of them are reachable; whatever is still white at the end is garbage.

  # Mark the roots
  push_registers_to_stack
  stack.each {|ob| white.delete(ob); grey.append(ob) } # Do this for any other roots as well, e.g. any pointers from bss.

  # Scan "grey" objects (reachable but not yet scanned) until none remain
  until grey.empty?
    ob = grey.pop
    # Only move white references to grey; anything already grey or black
    # has been seen, which is what stops us looping forever on cycles.
    ob.referenced_objects.each {|ref| grey.append(ref) if white.delete(ref) }
    # "ob" has now been scanned, so move it into the black set (reachable *and* scanned)
    black.append(ob)
  end

  # What remains in the "white" set is now the garbage, so free it. You can
  # also, if you want, move this aside and free it lazily.
  white.each {|ob| ob.free }
  white, black = black, []
A "real" implementation of the above can be done in <100 lines in many languages. Doing efficient GC gets more complex very quickly, though.

The mruby GC is a good place to see a more realistic tricolor mark-and-sweep that is also incremental and also includes a generational collection mode. It's about 1830 lines of verbose and well-commented C, including about 300 lines of test cases and a Ruby extension to interact with it. A simpler one like the above could certainly be much smaller, even in C.


Ruby-ish pseudo code is somewhat remote from working Forth.


It's remote in that it looks very different, but not in terms of complexity. The operations used in the pseudocode are basically just:

- Pushing registers onto the stack

- Iterating over the stack

- Appending to a linked list

- Popping the first entry off a linked list

- Iterating over a linked list

- Calling a function/method ("free") to release the memory. In its most naive form this would mean appending it to a linked list of free memory.

The first two require dipping into assembler in a lot of languages. The rest will be tiny in most languages, including Forth, but my Forth knowledge is very rudimentary, which is why I didn't try.


I always have problems with the definition of homoiconicity...

Prolog code is represented the same way as data, but (AFAIK) there is no functionality for manipulating code at run-time or interpretation-time. I believe allowing macros is a necessary condition for homoiconicity; if I'm right, then Prolog isn't it.


> there is no functionality for manipulating code at run-time or interpretation-time

Sure there is: http://www.swi-prolog.org/pldoc/man?section=manipterm

Example:

    ?- foo(Bar, Baz) =.. List.
    List = [foo, Bar, Baz].
And:

    ?- F =.. [foo, Bar, Baz].
    F = foo(Bar, Baz).


No, I think it's just that code is data is code, right?

Also, I'm not sure macros are needed for a language which doesn't do compilation.


You can do all that with term rewriting in Prolog: you can use a grammar to parse a file of arbitrary text and transform it into Prolog terms to be executed, etc.


>>> It's an interpreter for a small subset of Scheme, but Scheme is of course itself small and simple†, which means that it's not hard to mentally extrapolate from the small subset to the rest of the language. C++ is a large language, and so it would be less compelling to implement a small subset of it

Exactly. This is the point of Lisp. I think many languages are overcomplicated for no good reason.


① Yes, in a sense, this is the point of "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I." But Lisp is a much larger and more diverse phenomenon than McCarthy's 1960 paper. For example, Common Lisp is not small and simple, although it is not as complicated as C++, and it's still part of "Lisp".

② Regardless of whether it is or is not the point of _Lisp_, it is not really the point of _SICP_, which is about different approaches to structuring programs. One of them is the Lisp approach.


McCarthy himself said, "Pure Lisp was a discovery, everything that has been done with it since has been an invention." He said this to a class at Stanford in 2008, as reported on HN! [1]

[1] https://news.ycombinator.com/item?id=185348


Chuck Moore always says he did not invent FORTH, but discovered it.

The relationship between Lisp and Forth is fascinating. I am quite sure there is something about these two old simple languages and the concept of duality in mathematics that we are missing.


When I read SICP, I worked through the exercises in Forth. I had to implement my own garbage collector, create dialect extensions for nice list notation, build a thunk-based mechanism for closures... with the right groundwork laid, Forth can keep up pretty well. I was always somewhat dissatisfied that cute metacircular Lisp interpreters leave as tautologies so many important details of their own mechanics.


It functions as a good way to whet the appetite of the reader or student. SICP's meta-circular implementation occupies a kind of sweet spot: it shows students who have only just grokked recursion that the programming language itself is just a program they can understand and tinker with. The state machine implementation of the Scheme interpreter sort of breaks the meta-circularity to dismantle the tautologies.


How did you learn how to write your own Forth? Brodie shows how to just think in and use one already written, right?


Jones Forth is an extremely readable tutorial on how to write a Forth.


I've gone through it before, but it has been ages. Maybe I'll check again...thanks.


You're not the only one to observe this. It was observed and explored by Manfred von Thun with Joy, a purely functional programming language based on Forth. Modern "concatenative" languages continue this tradition:

https://en.wikipedia.org/wiki/Concatenative_programming_lang...


I've been working with Joy for a few years now and I have to say that it seems to me to combine the best parts of Forth and Lisp.

It's ridiculously elegant.


Do you have a link to whatever project you are working on? I'm interested in Joy but not aware of anyone working on or with it in recent decades.


Sure! I developed an interpreter for Joy in Continuation-Passing Style in Python: http://joypy.osdn.io/

I had just implemented type inference[1] when I got smashed over the brain by Prolog. So now the Python code is in limbo because the Prolog implementation of Joy is blindingly elegant[2].

In Prolog, the Joy interpreter is also the Joy type inferencer.[2]

The compiler (from Joy expressions to new Prolog rules) is five lines long.[2]

[1] http://joypy.osdn.io/notebooks/Types.html

[2] https://osdn.net/projects/joypy/scm/hg/Joypy/blobs/tip/thun/...


Thanks!

Yes, there's a real connection between concatenative languages and logic languages. In some sense you can think of a concatenative logic language as Joy with the ability to push "unknown" values onto the stack, as well as a few constraint manipulatives. That's somewhat orthogonal, but it makes sense that an interpreter for Joy would be quite compact in Prolog.

Thanks for the links, I'll review your work later!


Cheers!

You might also like: http://conal.net/papers/compiling-to-categories/

He transforms Haskell to a point-free form that's suspiciously like Joy... ;-)


Starting Forth by Leo Brodie is a classic book. Forth is a very elegant language. I came across a javascript implementation of it recently, which you can look up yourself :)


As a kid I went from BASIC to assembly. When I was shown Forth by my dad's friend I was totally caught up in Forth on my C64. I later tried Pascal, which is also great, but it wasn't close to Forth in my mind. It was awesome, except you had to pay for the language, and I am guessing that is why it died.

Currently my preferred language is Racket because it is so fun and practical; it reminds me of the earlier days of Python but is much more thought out. You write your code (which is fun) and you have an executable, all within one setting.


Armed with that and a copy of Threaded Interpretive Languages by Loeliger (how I found out about the book is beyond me) from the library, I managed to put together a complete(ish) Forth for the Apple ][ running ProDOS.

I think the magic bit with Forth is that it's easy to see how to write one by the time you've finished Starting Forth.


Is it though? Man, maybe I'm dumb :)

I wish there was an easy checklist: create a stack data structure, create a return stack, create primitives... etc.



What does Factor have to do with my question? Yes, it's a concatenative language that is still maintained, although Slava left. I'm guessing it wouldn't make a great example if you're suggesting going through the source.


And it seems to be free for personal use: https://www.forth.com/starting-forth/


> It's fascinating the relationship between Lisp and Forth.

Is there more to the relationship than "Lisp is written as a preorder tree, and Forth is written as a postorder tree"?


There is an ancestral relationship. Chuck Moore studied under McCarthy directly.


Yep. And back in the 1970s Chuck Moore wrote a book describing the implementation and design of FORTH, and it is pretty close to LISP. And I asked Chuck Moore directly, in person, if there was a connection to LISP, and he said "yes".


They are both very different, technically, but culturally they are very similar. Both of them are extremely simple and build complex programs from a few fundamental principles. Both of them are disengaged from the more mainstream languages. Both of them broaden your perspectives as a programmer. Both of them have excellent books that were written a long time ago but are still relevant today. Both of them gave rise to specialized hardware (stack computers and Lisp machines). Both of them encourage you to write your own interpreter when you are learning them... Although both languages are very different, learning one or the other actually is a quite similar experience.


Forth is actually disengaged from mainstream languages; Lisp being disengaged from mainstream languages is just a meme believed by some non-Lisp programmers (something to do with those parentheses). Basic Lisp is very similar to mainstream languages in the Algol family. Lisp dialects usually have nested block structure, with local variables that can be assigned, functions with formal parameters with ordinary by-value argument passing and so on. Javascript is a mainstream language and has strong ties to Lisp, much more so than to Forth. Python is more closely related to Lisp than to Forth. C is more closely related to Lisp than to Forth. Java is more closely related to Lisp than to Forth ....


Short answer: no.

Longer answer: actually another common feature is that both can execute code at compile-time (macros).

Full answer: really, no. They are fundamentally different: Lisp is a decent symbolic language, Forth is a decent portable assembler.


Really, no. They are fundamentally identical executable lambda calculus, despite Lisp's attempts to look abstract by shrouding itself in parentheses and Forth's attempts to look like "portable assembler" by exposing most implementation details.


Forth isn't written as a tree; the tree structure in the evaluation/compilation comes from the semantics of the words (how many elements of the value stack they consume).

In Lisp we know that (eql 1 2 3) is a "too many arguments" error. It doesn't just take 1 2 from the stack, push back the answer, and leave 3 beneath it.

Lisps are actually fairly conventional block-structured languages, having more in common with Pascal than with Forth. Lexical scopes, local variables, function calls with arguments going to formal parameters and all that.


> In Lisp we know that (eql 1 2 3) is a "too many arguments" error. It doesn't just take 1 2 from the stack, push back the answer, and leave 3 beneath it.

That error also comes from the semantics. You can't tell that (eql 1 2 3) is a too-many-arguments error just by looking at the tokens. That could be a 3-ary function, or an "eql is not a function" error.

The parentheses in Lisp code certainly suggest a particular tree structure, but they don't require it by virtue of their graphical form. And once you're willing to look up the arity of functions, Forth is written directly in its own tree structure to exactly the same degree that Lisp is, just with structural markers elided.

Think about it another way: a classmate in compilers class asked me once how Lisp handled operator precedence. Obviously, with the tree structure of the code fully specified, Lisp doesn't handle operator precedence because the question can't arise.

Note that this is just as true of Forth.


We can tell that 3 arguments are being applied to eql from that expression; when this is compiled, the generated call will pass 3 arguments, regardless of whether eql likes that, or whether such a function even exists. In the absence of any static checks, it will have to be caught at run time; but it will be caught.

Also, Lisp supports variadic functions with the same safety. Delimitation of arguments is necessary for variadic functions, like list.

The structure of every top-level form is an actual tree object in memory that is constructed before anything else happens. It can be walked by macros, or quoted so that the program accesses it as data directly.
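
To illustrate (Clojure shown, but any Lisp behaves the same way; a hypothetical REPL session):

    (def form '(eql 1 (+ 2 3)))  ; read into an ordinary tree of lists
    (first form)                 ;=> eql
    (nth form 2)                 ;=> (+ 2 3), itself a list you can descend into
    (count (rest form))          ;=> 2, the number of argument forms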

Basically, nothing like Forth.


> Delimitation of arguments is necessary for variadic functions, like list.

This is not the case; you can easily have a Forth word that consumes however many arguments are on the stack.


Ah, but the argument was made that Forth code has an implicit tree structure based on looking up the arity of functions, in place of explicit delimitation with parentheses. How can that be if some function takes N arguments, where N is, say, a run-time value, itself on the stack?


This resonates. I once sought to create a DSL on top of F# for workflows, only to end up inventing Lisp again. It was a humbling experience. The thing about Lisp, and why I don't use it in the workplace regularly, is that it's dense and inconsistent. Dense because it's easy to write a profound amount of logic in one line, which hurts readability if you're not careful. Inconsistent because you and I could be master programmers and write perfectly effective solutions that do the same thing, and the code can look entirely different. This hurts interchangeability and it doesn't scale well, which disqualifies it from most enterprise projects. Similar to Perl. That being said, all my personal projects are done using Clojure because it's such an easy/fun platform to work with.


Regarding the discovery vs. invention question: it has indeed already been discussed within philosophy. Here are just two examples in philosophy of mathematics: https://philpapers.org/rec/DETDIA-3 https://philpapers.org/rec/FINMDO


Incidentally, I found building a Lisp interpreter in C++ using the SICP interpreter recipe a great exercise.


We had a follow-up course to SICP that started with the meta-circular interpreter, broke that one down even further, and eventually moved completely to a C implementation. Most insightful course I ever had. We got assignments such as: "add co-routines to the continuation-passing style meta-circular interpreter, and to the stack-based C interpreter."


>What?? In chapter 4, you write your own Lisp interpreter. If they had chosen C++, would you be writing a C++ compiler?

There was a site [0] that had SICP exercises in various other languages.

[0]: https://web.archive.org/web/20100503233725/http://www.codepo...


> What?? In chapter 4, you write your own Lisp interpreter. If they had chosen C++, would you be writing a C++ compiler? Or a Lisp interpreter in C++?

That's in chapter 5 :)

Exercise 5.51. Develop a rudimentary implementation of Scheme in C (or some other low-level language of your choice) by translating the explicit-control evaluator of section 5.4 into C. In order to run this code you will need to also provide appropriate storage-allocation routines and other run-time support.


Ha, well I guess my secret's out - I skipped some of the exercises! And I'm not surprised to see I skipped that one. Even now, 5.51 and 5.52 seem like they would be more hard than fun. But given that SICP is my favorite CS book, maybe I owe it to myself to go for 100% completion.


> I can't imagine SICP using a language that doesn't feel a lot like Lisp.

SICP's main value proposition is not a language but developing thinking in terms of creating abstractions through composition. A Lisp-like language certainly helps, but most modern languages do have this ability, although the results are verbose and possibly subjectively ugly in many cases.

SICP with Python: https://wizardforcel.gitbooks.io/sicp-in-python/content/

SICP with JavaScript: https://www.comp.nus.edu.sg/~cs1101s/sicp/


"epistemology or a similar field has pondered the "invented or discovered" question already, and if so, I want to read about it."

Yep. Me too. In fact, request for essay! It sounds like a topic I'd love to read pg meander into, for example.

I think anytime invention starts to seem like a discovery, it's probably a good invention.

I'm not sure there is a fundamental difference between the two. If you strain hard enough and get precise enough with your definitions, you could probably describe any invention as a discovery and the reverse. Recursively, that probably means any such definitions are inventions rather than discoveries.


It's discussed briefly [0] in Dirk Gently.

[0] https://www.goodreads.com/quotes/488211-sir-isaac-newton-ren...


> I'm guessing epistemology or a similar field has pondered the "invented or discovered" question already, and if so, I want to read about it.

Any course or book on the Philosophy of Science would cover that. Are numbers discovered or invented? Did the ancient Hindu mathematician invent the number zero or did he discover it? You could argue that the symbolic representation of zero was invented (or was it?) but the idea of the number zero was discovered. The more you delve into it, the more you realize things aren't as straightforward as you might have imagined.

Or quicksort. Is it an invention or a discovery? You could go back and forth on it forever.


Lisp is a powerful language that allows one to mix multiple logical levels within a program (especially through its homoiconicity).

SHRDLU is perhaps the most classic illustration of Lisp's power[1]: a classic program from 1968-70 that allowed natural language communication about a micro world. It was written in a version of Lisp even more free-form than today's. When I looked at the source code a while back, its parsing of natural language involved a wide variety of on-the-fly fix-ups and such to take into account the multiple irregularities of human language.

The thing about Lisp's power is it allows the production of complex programs quickly but doesn't particularly have a standard way of gluing these programs together. The other classic Lisp work, The Bipolar Lisp Programmer[2], describes how Lisp has multiple partial implementations of important libraries and programs simply because it allows a certain type of personality to produce, by themselves, something that's remarkably good, but doesn't encourage any particular group effort.

Lisp is certainly evidence that "language matters" but not evidence that Lisp is where one should stop.

[1] https://en.wikipedia.org/wiki/SHRDLU

[2] http://marktarver.com/bipolar.html


> The thing about Lisp's power is it allows the production of complex programs quickly but doesn't particularly have a standard way of gluing these programs together.

I'd argue Clojure is setting some new standard here with its powerful literal notation for basic data structures (lists, vectors, sets, and maps) and its practice of integrating/composing first on data. Most of the time that's enough to get some libraries to work together. If that's not enough, integrating/composing on functions is the next stage. Only after having covered data and functions should you consider macros, which should be easier to write if you have done a good job at the data and function levels.

If such powerful and ubiquitous data structures work so well for programs written in clojure(script), why would that not be true in the larger picture for integrating systems? JSON worked much better than XML because it actually represents simple data structures, not objects talking to each other. Redis, Avro, and similar tech are continuing that story while Kafka adds a fantastic transport and storage mechanism. Sounds like we're closer to having better foundations for integration in the small and in the large.
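
A small sketch of what integrating on data can look like (a made-up order map; everything below is built from Clojure's literal data structures):

    (def order {:id    42
                :items [{:sku "a-1" :qty 2} {:sku "b-7" :qty 1}]
                :tags  #{:rush}})

    ;; Any library that speaks maps and vectors composes on this directly:
    (update order :items conj {:sku "c-3" :qty 5})  ; add a line item
    (reduce + (map :qty (:items order)))            ;=> 3, total quantity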


I have used Clojure on several customer projects and my own Cookingspace.com hobby project. I like Clojure except for the way recursion is handled. I have been using various Lisps since 1982 and Clojure’s support for recursion bugs me.


IANA Clojurian, but its inelegant handling of tail recursion is a limitation of Java, correct?


Yeah, because the JVM can't do TCO you can't have nice recursion. The options the language provides are good enough but definitely not the most ergonomic.


I can't remember the last time I wrote a recursive call in Clojure, other than for multi-arity. In general, if I were in a code review and saw the "recur" keyword, I think I would regard it as a code smell and see if I could re-implement it using higher-order functions.
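
For example, the same fold written both ways (a sketch; sum-recur and sum-reduce are made-up names):

    (defn sum-recur [xs]             ; explicit loop/recur, no TCO required
      (loop [xs xs, acc 0]
        (if (seq xs)
          (recur (rest xs) (+ acc (first xs)))
          acc)))

    (defn sum-reduce [xs]            ; higher-order style, usually preferred
      (reduce + 0 xs))

    (sum-recur [1 2 3])   ;=> 6
    (sum-reduce [1 2 3])  ;=> 6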


That is how we coded configuration files in Tcl back in the late '90s: serialized data structures which we only needed to source on load.


> Lisp is a powerful language that allows one to mix multiple logical levels within a program (especially through it's homoiconicity).

Statements like these are what make me suspect Lisp code would be a maintenance nightmare. I am regularly refactoring code to separate multiple logical levels so we can more easily maintain our codebases. If this is a misunderstanding, can someone clarify?


>>Statement like these are what make me suspect Lisp code would be a maintenance nightmare.

After having maintained Java codebases that need 300 classes just to post a JSON to a REST endpoint, I'm fairly confident Lisp code can't be that hard to maintain.


There are perhaps languages out there which are better than Lisp for maintenance in some respect, but it's none of the mainstream ones. Lisp code is not particularly harder or easier to maintain; most of that depends on the original author(s) as with any codebase.


It's the reason Aaron Swartz gave for rewriting Reddit in Python:

>The others knew Lisp (they wrote their whole site in it) and they knew Python (they rewrote their whole site in it) and yet they decided they liked Python better for this project. The Python version had less code that ran faster and was far easier to read and maintain.

http://www.aaronsw.com/weblog/rewritingreddit

I suspect the problem is that LISP is too powerful. Language power is inversely related to maintainability/readability, as per the greatly underrated rule of least power: https://en.wikipedia.org/wiki/Rule_of_least_power


> Language power is inversely related to maintainability/readability,

Is that really true?

Having worked on VB6 codebases in the past, the less powerful nature of the language just meant a LOT more code which is definitely harder to maintain.


No, it’s not, and even if it were true it wouldn’t disqualify lisp, since you can quite happily wield lisp in a context which lets you express logic in simpler/constrained forms. Consider hiccup, datalog DSLs, etc.


It's inversely proportional to length too. Maintainability is about maintaining a delicate balancing act between several competing fundamental concerns - two of which are expressiveness and verbosity.

I think python's popularity is partly because it found a sweet spot that was neither too expressive nor too verbose. It's certainly possible to dial up the expressiveness (LISP, Perl) or dial it down (Golang, VB) and get something that is harder to maintain both ways.


Python derives a fair amount of expressiveness from the power it gives to library authors. A lot of this in turn comes from its dynamic implementation.

As a library author you can write effective python at different levels. Some involve quite heavy meta-programming (jinja2, collections.namedtuple, django's ORM) and most users would never find this out.


Python has C++/Lisp level of power, but since many just do toy scripts they don't realize it.


Again, I think Perl / Python are a perfect example of why the post I was responding to is not true.

> Language power is inversely related to maintainability/readability,

Perl and Python are both considered equally powerful languages, yet Perl is a fair bit more expressive and as a result is often harder to maintain.


Well,

The rule of least power linked above says to use the least powerful language appropriate for the task. So it's possible VB6 fell into the category not-powerful-enough/not-appropriate.


This. I find Lisp optimal for the small-band-of-coder-heroes scenario. They must align on CS philosophy and taste. Grow the team, or have divergent approaches, and the efficacy of Lisp quickly disappears and problems appear.


Is that hearsay, experience or a guess?

There have been a few projects with 10 to 100+ Lisp programmers. What was their experience?


I have experience of great success and productivity in small teams with Common Lisp and Clojure. I don't have direct experience of dysfunction in a larger Lisp team. That was a guess based on how being a senior dev in a 50 man C# shop felt :)


Well, at some point we have to face reality. There have been hundreds if not thousands of successful projects with teams of hundreds if not thousands of developers working on them, at the same time or over time. These projects have been generally in Cobol (ok, we can chalk that up to brute-forcing a tech from when they didn't know any better), C, C++, Java, C#.

There seems to be a connection between scaling a project's developer numbers and programming language "power". Having a smaller, shared vocabulary seems to greatly outweigh programming language "power" as the number of programmers goes up.

Lisp had 60 years to prove its human scaling capabilities. So far it hasn't convinced.


> There seems to be a connection between scaling a project's developer numbers and programming language "power". Having a smaller, shared vocabulary seems to greatly outweigh programming language "power" as the number of programmers goes up.

That goes against evidence from current enterprise software experience. The Java ecosystem has the absolute largest programming vocabulary ever (J2EE, JEE, ...) and is widely used.

AT&T/Lucent once wrote the software for a telephony switch in Lisp - I think the project ran over a decade and created more than one generation of working/shipping hard&software. The team was easily 100+ people. I heard a talk of the responsible manager years ago. They wrote basically the same functionality as a ten times larger C++ project and the Lisp team lead was extremely satisfied with what they wrote.

AT&T Management favored the C++ project for mostly non-technical reasons - C++ being an 'industry language' with a larger supply of developers. Not surprising for AT&T - Ericsson made a similar decision at some point in time with their (later lifted) 'Erlang ban'.


We have to decide what's enough to convince (and convince whom) I guess.

Just to throw out a couple of examples:

One of the larger businesses I'm aware of that's using a Lisp is Nubank in São Paulo, valued at $1bn+ with 4 million paying customers[1]. Finance is a fairly complex domain.

King in Stockholm has also fairly recently rewritten their main game creation tooling in Clojure[2].

1: https://www.reuters.com/article/nubank-creditcards/brazilian...

2: https://techblog.king.com/how-we-use-clojure-at-king-to-writ...


> Lisp had 60 years to prove its human scaling capabilities. So far it hasn't convinced.

It can take more than you might expect to "convince".

> In 1601, an English sea captain did a controlled experiment to test whether lemon juice could prevent scurvy. He had four ships, three control and one experimental. The experimental group got three teaspoons of lemon juice a day while the control group received none. No one in the experimental group developed scurvy while 110 out of 278 in the control group died of scurvy. Nevertheless, citrus juice was not fully adopted to prevent scurvy until 1865.

( https://www.johndcook.com/blog/2008/03/25/innovation-ii/ )


Sooo...what would be the "programmer scurvy" Lisp (and especially Lisp) could help to prevent?


And even in 1911, Scott's expedition suffered from scurvy, mainly because they misunderstood its cause! http://idlewords.com/2010/03/scott_and_scurvy.htm


I have heard that the British knew what caused scurvy, and how to treat it with citrus, but they kept it a closely guarded secret to keep an advantage vis-a-vis other countries' navies.


That is more or less correct, but it's a very weird situation. Scurvy had been known -- along with effective cures -- for thousands of years. The method of curing scurvy hasn't been secret since -- at the latest -- the Ebers papyrus of ~1500 BC, which correctly prescribed feeding the patient an onion. And in fact cures were widely known throughout the world since that time, including in Europe. Note again that James Lancaster had heard that lemons were effective against scurvy, 200 years before the Navy got around to requiring them.

From https://tudorblog.com/2012/06/16/voyage-of-the-scurvy/ :

> from about 1500 to 1800 two million sailors are estimated to have died from scurvy on expeditions to Asia, Africa and the New World.

> Why no cure? In Tudor England an effective treatment, scurvy grass (a corruption of cress), was commonly recommended; at the same time the Portuguese knew a cure, and so did the Spanish and the Dutch. For unknown reasons such wisdom was applied inconsistently (even by Britain’s Royal Navy after the Napoleonic Wars) until the identification of vitamin C in the 20th century.

It looks more like a case of the people making the decisions being unfortunately disconnected from the people who knew what scurvy was and how to deal with it. (And wilfully ignoring those who tried to point it out to them.)


I know about the whole lime - lemon thing. Two things:

1. That was before the scientific age. They didn't know about vitamin C at the time; they couldn't even see it, even if they'd somehow believed in it.

2. There is no scientific proof, 0, none, that Lisp is superior. And it's almost impossible to prove it, since it requires controlled studies at a large scale - good luck with taking away hundreds of productive programmers for that study :) All we have is hearsay and personal opinions.


> I know about that the whole lime - lemon thing.

Be careful; my quote has nothing to do with the confusion between lemons and limes that occurred hundreds of years later. The Navy instituted a lime juice ration in 1799. They switched out lemons ("limes") for what we would call limes in 1865, setting themselves up for the reintroduction of scurvy. But James Lancaster performed his experiment (and reported his result of 100% scurvy prevention to Naval authorities) in 1601.


And all managed languages that still provide halfway solutions for what was the Interlisp-D/Lisp Machines developer experience.


I love how I'm being downvoted for saying that there's no scientific proof that Lisp is superior. Come on, show it, I want to see it. I want to see thorough, statistically representative studies that show Lisp's superiority :)


"How the strengths of Lisp-family languages facilitate building complex and flexible bioinformatics applications."

https://www.ncbi.nlm.nih.gov/pubmed/28040748

-- US National Library of Medicine National Institutes of Health

"Clasp: Common Lisp using LLVM and C++ for Designing Molecules"

https://www.youtube.com/watch?v=0rSMt1pAlbE

-- GoogleTechTalks


That's nowhere near something like medical clinical trials, though.

Think of studies where you:

a) have a statistically representative sample size (100+ developers)

b) have actual numbers and compare them (the average development time needed for the Java/C++/etc. applications was Y and the average development time needed for Lisp applications was Y - Z, where Z > 0, etc.; the average execution time for Java/C++/etc. applications, etc.; you get the idea)

Computer Science studies are still in their infancy.

All this Lisp hype is just hype and gut feelings.


There's Lisp as an Alternative to Java: http://www.flownet.com/gat/papers/lisp-java.pdf


It strikes me as a matter of structure. Lisp provides almost none, whereas python provides more. Anything not provided by the language and environment must be provided by the team. If they can, that's great, but it's going to be easier to get a small team to agree on a structure than a large one.


Wouldn't say so; there are plenty of conventions and common practices shared by the overwhelming majority of Lisp developers. Project structure, system definition, development environment, naming conventions, idiomatic use of data structures and the object system - all that is established across the board.


Well I don't find mature Python codebases (e.g. Homeassistant) particularly easy to inspect or change. Guess that's where me and 2005 Aaron Swartz would disagree.


What language's mature codebases do you find particularly easy to inspect or change?


None, which was my original point. The difference between the projects/teams easily trumps whatever is there between the languages.


When I talked with Alexis Ohanian he told me that the Lisp version was difficult to keep running. I think the term he used was that it often fell down.


I don't use it, but I'm not noticing any such backlash for Clojure


Good SW engineering practice helps. Code review, collaborative design, etc.


The more power and liberty a language offers, the more likely you will end in the state of "write once, never understand again".


I agree, strong barriers between layers are important. Right now the best way to build them is by using different languages or runtimes - it’s hard to mix layers if they require explicit network or shell calls.

If everything were in Lisp, the layers would naturally blur.


Racket is a Lisp that makes these layers somewhat explicit.


Yes, this is why macros have been abandoned in most enterprise languages (C#, Java).


As if they've tried them and then abandoned them? Java, for one, didn't even have generics and closures when it came out.

It didn't have macros because it was intended for the ho-hum enterprise programmer of the time, who, the thought was, would not know what to do with them.

Instead, they re-invented all those things badly (e.g. through gobs of XML and ugly reflection based metaprogramming).


Nobody still uses them, even if they are available. In fact, anything that doesn't look enterprisey enough never goes past code reviews.

The adoption cycle for even the simplest of syntactical features when it comes to enterprise Java is 10+ years. In some companies it's never.

The saddest part isn't even that. The sad part is that a whole generation of programmers has been raised and turned into architect astronauts without ever using something like a lambda or a closure.

The only hope for programming as a craft now is hoping Oracle kills Java (even if by mistake), and then something like Perl 6 comes along to replace it.


I'm not sure that will help. I see a kind of "family resemblance" between COBOL and enterprise Java. I wonder if any language that is going to play in this space is destined to become a monstrosity - destined by the nature of the problem space rather than the nature of the language.

I am aware that when I say this, I am basing my opinion on a sample size of two...


Nonsense, the enterprise programmer of the time was programming in C or C++ and was making heavy use of macros.

Macros, unlike generics, are super easy to implement, so there is no reason not to implement them, unless you specifically don't want to implement them.

Java didn't have macros because no one wanted the kind of dialects that macros allow, i.e. re-learning a language every time you change company.


> Nonsense, the enterprise programmer of the time was programming in C or C++ and was making heavy use of macros.

C macros are entirely different in use from Lisp macros. C macros are written in a text substitution language that knows virtually nothing about C, whereas Lisp macros are written in Lisp and operate on regular Lisp data structures. Even trivial things are very difficult to get right with C macros, whereas very complicated things are often elegant and simple with Lisp macros. It's an entirely different experience. They're not at all comparable.

C programmers learn to fear macros because C macros have a lot of problems. Few of those carry over to Lisp macros.
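
For a feel of the difference, here's a while loop as a Lisp macro (sketched in Clojure; it's named while-loop only to avoid clashing with the built-in clojure.core/while, which is defined essentially this way):

    (defmacro while-loop [test & body]
      `(loop []                  ; the macro body is ordinary Lisp code
         (when ~test             ; operating on code as list structure
           ~@body
           (recur))))

    ;; Usage:
    (let [i (atom 0)]
      (while-loop (< @i 3)
        (println @i)
        (swap! i inc)))          ; prints 0, 1, 2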


Then again, enterprise software and languages are also full of DSLs, plugins, frameworks, libraries and other pieces and parts that enable the same kind of power that Lisp has.

People haven't really given up on the idea of building out functionality for specific portions of their codebase (and we really shouldn't); we've just seemingly exchanged one very powerful solution for a sea of different tools.


Actually, C# and Java do have macros; they just aren't on the syntax level.

Attributes/annotations, compiler plugins, expression trees and aspects take the role of macros.


Makes me think of Cyc, which is still around ( https://www.cyc.com ). Wonder if any HNers have experience with it? I tried OpenCyc and was fascinated. It's a Prolog-y Lisp married to a huge database of rules and knowledge. The open source version doesn't ship with the knowledge part, so it's hard to evaluate if it's of any practical use. Was meaning to apply for a research license but didn't find time yet.


Lispers praise Lisp's homoiconicity but today's networked world calls for a strict separation of data and executable code as in NX bits (and yes I'm aware of the classic "Can programming be liberated from the von Neumann style?", and don't agree with it).

I don't know. I've always seen Lambda calculus as a model for computation to reason about program execution, but not as an actual implementation technique.


You are mixing up concepts. The Lisp concept of "the program is data" has little to do with modifying the program at runtime. It might be used for that in uncommon cases, but it is not recommended. A Lisp program usually is compiled into a static executable with a strict separation of data and executable code.

The mixing between program and data happens at build time, in the compiler, through the macro facilities. The first important thing is that a Lisp compiler is not something which runs as an abstract process, but runs in a Lisp system and can execute arbitrary Lisp code. So Lisp macros can execute any Lisp functions - written by the user - to transform the input code, represented as data which can be easily processed, into the output code, which gets compiled. The whole point is that with macro expansion you are not tied to some pattern language, but can run any user-written code to process the code for output. This gives Lisp great power and extensibility. But this happens at compile time, not at run time.
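
A tiny sketch of that (Clojure here, with a made-up macro name; the same idea holds for Common Lisp's defmacro):

    (defmacro compile-time-sum [& nums]
      (reduce + nums))          ; this addition runs inside the compiler

    (compile-time-sum 1 2 3)    ;=> 6, compiled as if you had written the literal 6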

A good example of the power of this is that the Common Lisp Object System (CLOS) can be entirely implemented in Common Lisp. You take any Common Lisp implementation which doesn't have an object system, load the CLOS code, and you have the object system available.


Well, it's not like that's a thoroughly unknown problem. eBPF (https://lwn.net/Articles/740157/), for example, which effectively relies on running user-supplied bytecode in kernelspace, attempts to solve this (and does it reasonably well) by imposing a couple of constraints on the code you supply. These constraints, in turn, allow the program to be statically-analyzed before running it.

In my experience, 90% of the cases where homoiconicity is useful are cases where the code you execute is trivial enough that you can statically analyze it. E.g. you use the data to generate trivial processing code that would be tedious to write by hand.


I don't think that's the case. Template Haskell lets you generate Haskell code at compile time, including by doing arbitrary IO. This is technically unsafe but is usually okay because you're only reading local files and so forth. It's occasionally really useful for generating things like database mappings, lenses, etc.


I first came across SHRDLU in Gödel, Escher, Bach.

I thought it was fictional!


I think the vast majority of coders would struggle to write anything like SHRDLU.


The problem is that you can't arrive at such a thing by ordinary exploratory coding without a strong conceptual framework guiding what you're doing.


Indeed, Winograd wrote a short but serious book explaining the linguistic and other theories behind shrdlu.

However, I don't think even a complete understanding of the points of his book would be sufficient to understand shrdlu's code.


> I don't think even a complete understanding of the points of his book would be sufficient to understand shrdlu's code

That sounds like a point against the use of Lisp. There's certainly quite a lot of handwritten case-specific parsing code:

https://github.com/stuartpb/shrdlu/blob/master/gramar

Perhaps the bit which looks most lispish is the dictionary: https://github.com/stuartpb/shrdlu/blob/master/dictio

(It also looks like this was written on one of those ancient systems that didn't support lowercase and only allowed 6-character filenames)


> (It also looks like this was written on one of those ancient systems that didn't support lowercase and only allowed 6-character filenames)

A limitation that also led to 'Schemer' dropping the R.


Could you tell me which of his publications you refer to here?


>> Two decades after its creation, Lisp had become, according to the famous Hacker’s Dictionary, the “mother tongue” of artificial intelligence research.

More precisely, Lisp became the "mother tongue" of AI research in the United States. Europe and Japan, which at the time also had a significant output of AI research, instead used Prolog as a lingua franca.

This is interesting to note, because a common use of Lisp in AI was (is?) to write an interpreter for a logic programming language and then use that interpreter to perform inference tasks as required (this approach is evident, for example, in Structure and Interpretation of Computer Programs, which devotes chapter 4.3 to the development of a logic programming language, the query language, which is basically Prolog with Lisp parentheses).

Hence the common response, by Prolog programmers, to Greenspun's tenth rule, that:

  Any sufficiently complicated Lisp program contains an ad-hoc,
  informally-specified, bug-ridden, slow implementation of Edinburgh Prolog.


> This is interesting to note, because a common use of Lisp in AI was (is?) to write an interpreter for a logic programming language and then use that interpreter to perform inference tasks as required

Not sure how you come to that conclusion. Literally none of the notable Lisp AI programs (ELIZA, STUDENT, SHRDLU, AM, EURISKO, MYCIN) had anything to do with reimplementing Prolog.

Implementing Prolog however is trivial in Lisp, so many textbooks used it as an intermediate level exercise.
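
To give a sense of why it works as an exercise: the core of such an interpreter is unification, which fits on a screen. A toy sketch in Clojure (no occurs check; variables are symbols starting with ?, bindings are a map, :fail marks failure):

    (declare unify-var)

    (defn variable? [x]
      (and (symbol? x) (= \? (first (name x)))))

    (defn unify [x y binds]
      (cond
        (= binds :fail) :fail
        (= x y)         binds
        (variable? x)   (unify-var x y binds)
        (variable? y)   (unify-var y x binds)
        (and (sequential? x) (sequential? y) (seq x) (seq y))
        (unify (rest x) (rest y)
               (unify (first x) (first y) binds))
        :else :fail))

    (defn unify-var [v x binds]
      (cond
        (contains? binds v)                     (unify (binds v) x binds)
        (and (variable? x) (contains? binds x)) (unify v (binds x) binds)
        :else                                   (assoc binds v x)))

    (unify '(likes ?x lisp) '(likes kim ?y) {})  ;=> {?x kim, ?y lisp}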


It's the response to Greenspun's tenth rule that mentions Prolog explicitly. My previous comment states that AI programs written in Lisp implemented an interpreter for a logic programming language, unspecified. Although to be fair, that usually means "an informally-specified, ad-hoc, bug-ridden, slow implementation of Prolog" (1).

Now, my knowledge of Eliza, Shrdlu, etc is a little limited (I've never read the source, say; btw, have you?) but let's see. Wikipedia is my friend, below.

According to Wikipedia, MYCIN was a backward-chaining expert system. That means a logic programming loop, very likely a resolution-based theorem prover, or, really, (1).

ELIZA was first implemented in MAD-SLIP, which, despite the name, was not a Lisp variant (although it was a list-processing language). STUDENT is actually part of ELIZA - it's one of the two scripts that came with the original implementation, by Weizenbaum. The other script is DOCTOR, which is the one more often associated with ELIZA (it's the psychoanalyst). If I understand correctly, STUDENT solves logic problems, so I'm guessing it incorporates an automated theorem prover; so basically, (1).

Eurisko was written in a frame-based (think OOP) representation language called RLL-1, the interpreter for which was written in Lisp.

SHRDLU was written in Lisp and MicroPlanner, an early logic programming language (with forward chaining, if memory serves). In the event, (1) was not necessary, as MicroPlanner was already all that - and more!

The Automated Mathematician (AM) was indeed written in Lisp, but I don't know anything about its implementation. However, from the description on its Wikipedia page it appears to have included a very basic inference procedure and a rule-base of heuristics. Sounds like a case of (1) to me.


MYCIN is a fuzzy logic inference engine, among the first ones. Would love to see that implemented in period-true Prolog, but tbh won't hold my breath.

RLL-1 and MicroPlanner are domain-specific languages, the traditional Lisp approach to building complex applications. The fact is both were written in Lisp; that's just sophistry.

Neither AM nor EURISKO would map naturally to unification with their very special heuristic-driven task prioritizing and scheduling algorithms.


I don't know what "period-true Prolog" is. You can still run programs found in Prolog textbooks from the 1970s in modern Prolog interpreters - just like with Lisp.

>> Fact is both were written in Lisp, that's just sophistry.

I don't see the sophistry. If I implement Lisp in Prolog, the interpreter may be Prolog, but the programs it then evaluates are Lisp.

I don't see why unification would be an impediment in implementing any heuristics or scheduling algorithms.


Period-true as in from the 1970s. There were attempts to build fuzzy Prolog dialects in the late 1980s, although it seems they were not particularly successful.

> If I implement Lisp in Prolog, the interpreter may be Prolog, but the programs it then evaluates are Lisp.

But it does not implement Lisp in Prolog. It implements a domain-specific language in Lisp, in a time-proven manner. Googling "DSL Lisp" will give you probably hundreds of references. I did it before for my own projects, and my "languages" even had no name. I could call it VARLANG-22 or whatever, but unless I told you, all you'd see is some Lisp code.

> I don't see why unification would be an impediment in implementing any heuristics or scheduling algorithms.

It does not bring anything to the table there, because that's not how those systems work. Not every project is a first order predicate logic expression, as incredible as it sounds.


We're in a weird situation here. In this thread, you're arguing that a programming language implemented in Lisp, which retains the syntax of Lisp, is a DSL, and so still just Lisp; but in the other thread, you're arguing that Prolog implemented in Lisp, with the syntax of Lisp, is not a DSL, but Prolog.

I recognise that the above comes across as an attempt at sophistry, again, but I sincerely think you are not being objective, with this line of thinking.


Thanks for taking the time to break that down.


>> Implementing Prolog however is trivial in Lisp, so many textbooks used it as an intermediate level exercise.

I've heard this kind of thing before - but it's not true. What is usually implemented in Lisp textbooks is not Prolog. It's a logic programming loop with pattern matching, sure, though I'm not even sure it's unification (i.e. Turing-complete pattern matching) rather than regular expression matching.

Even the SICP book makes sure to call the language it implements the "query language", rather than Prolog. The whole point of Prolog is to use the syntax and semantics of first order logic as a programming language. You can't have Prolog without the representation of programs as Horn clauses, any more than you can have Lisp without S-expressions.

And those "trivial" implementations you note, don't do that- they implement a logic programming loop, but retain the full syntax of Lisp. That's why they are "trivial" to implement- and that's why they are not Prolog.


> What is usually implemented in Lisp textbooks, is not Prolog

When a Lisp textbook says it implements Prolog, it implements it with unification. See for example Paradigms of AI Programming, by Peter Norvig - the chapters on implementing a Prolog compiler...


My mistake. I don't think I've read that book.


No, they do implement Horn clauses; it's not really super hard. Had it as homework long, long ago. Syntax is irrelevant, and it's not like Prolog has a complicated grammar if you want to go for canonical syntax anyway.


Oh, syntax is important. Otherwise, I can implement Lisp in Prolog trivially, by rewriting each function of arity n, by hand, as a predicate of arity n+1 (the extra argument binding to the output of the function). After that, I wouldn't even need to write an explicit Lisp interpreter- my "Lisp" program would simply be executed as ordinary Prolog.

I believe that if I were to suggest this as a honest-to-God way to implement Lisp in Prolog, any decent Lisper would be up in arms and accusing me of cheating.

And yet, this is pretty much the approach taken by the Lisp textbooks we're discussing. Their "Prolog interpreters" cannot parse Prolog syntax, therefore programs must be given to them as Lisp. Then, because Lisp is not an automated theorem prover, one must be implemented, accepting the Lisp pretending to be Prolog. That's not a trivial implementation of Prolog- it's an incomplete implementation.

Yes, Prolog has very simple syntax: everything is a Horn clause, plus some punctuation. That's simpler even than Lisp. You keep saying how simple Prolog is to implement, as if it was a bad thing. According to the article above, the simplicity of Lisp is where its power comes from.

Well, Prolog is simpler.


> Otherwise, I can implement Lisp in Prolog trivially, by rewriting each function of arity n, by hand, as a predicate of arity n+1 (the extra argument binding to the output of the function)

Programming languages don't consist only of functions. Lisp is no exception.

> Their "Prolog interpreters" cannot parse Prolog syntax, therefore programs must be given to them as Lisp.

There are Lisp-based Prolog implementations which can parse Edinburgh syntax and have a Lisp syntax variant. I have a version of LispWorks, which does that.

But the syntax is just a surface - the Lisp-based Prolog does unification, etc., just as a typical Prolog implementation does. It supports the same ideas of adding and retracting facts, etc. The MAIN difference between REAL Prolog implementations and most Lisp-based ones is that the real Prolog implementations provide a much more sophisticated version of the Prolog language (and more of the typical Prolog library one would expect) and some have extensive optimizations and native code compilers - thus they are usually quite a bit faster.

Interested Lisp users are usually only looking for the reasoning features of Prolog (to include those directly in programs) and not for the specific Prolog syntax. Sometimes, also to have Prolog-based parsing techniques in a Lisp program.


>> There are Lisp-based Prolog implementations which can parse Edinburgh syntax and have a Lisp syntax variant.

I was talking specifically about the Prolog implementations given as exercises in Lisp textbooks. Those, usually, are incomplete in the ways that I describe above. LispWorks, for instance, seems to be a commercial product, so I'd expect it to be more complete.

I don't expect a Prolog-as-a-Lisp-exercise to go full-on and implement the Warren Abstract Machine. But, equally, I think the insistence of the other commenter, varjag, on the simplicity and even triviality of implementing Prolog as an exercise in Lisp textbooks is due to the fact that those exercises only implement the trivial parts.

>> Interested Lisp users are usually only looking for the reasoning features of Prolog (to include those directly in programs) and not for the specific Prolog syntax. Sometimes, also to have Prolog-based parsing techniques in a Lisp program.

That's my intuition also- as encoded in the addendum to Greenspun's tenth rule. Tongue in cheek and all :)


Lisp books, especially in AI programming, have implemented various logics - predicate-logic-based programming like standard Prolog is just one. Peter Norvig has a relatively extensive example of how to implement Prolog and how to compile it. Others concentrate on some other logic calculus. It's not surprising that various books go to different depths in showing how to implement various embedded languages. In the case of Norvig's implementation, others have used it in their applications.

It has nothing to do with Greenspun at all. Lisp and then Common Lisp was used in many AI programming frameworks, and these integrated various paradigms: rule systems, various logics, objects, frames, constraints, semantic networks, fuzzy logic, relational, ... Lisp was explicitly designed for this - there is nothing accidental about it. It is one of the main purposes of Lisp in AI programming to enable experiments and implementations of various paradigms. That means that these frameworks EXPLICITLY implement something like predicate logics or other logics. These features are advertised and documented.

Greenspun claimed that complex C++ programs more or less ACCIDENTALLY implement half of a Lisp implementation, because large enough programs need these features: automatic memory management, runtime loading of code, dynamic data structures like lists and trees, runtime scripting, saving and loading of state, configuration of software, a virtual machine, ... These programs implement some random subsets of what a Common Lisp system may provide, but they don't implement Common Lisp or its features directly. Most C++ developers don't know that these features actually exist in Lisp, and they don't care. Similarly, the Java VM implements some stuff one could find in a Smalltalk or Lisp system: virtual machine, code loading, managed memory, calling semantics, ...


Well, the addendum to Greenspun's tenth rule works even if the "ad-hoc, etc." implementation of half of Prolog is not accidental.

Like I say, it's tongue in cheek. I think (hope) programmers are somewhat over having any serious arguments about which language is "best" at this point in time. Although perhaps my own comments in this thread disagree. I am a little embarrassed about that.


> Their "Prolog interpreters" cannot parse Prolog syntax, therefore programs must be given to them as Lisp

Unless the project has a clear requirement to re-use unmodified/untranslated Prolog code, or to produce code that will be shared with Prologs, this would only be a disadvantage. The Prolog syntax as such has no value; it is all in the semantics.

> Then, because Lisp is not an automated theorem prover, one must be implemented, accepting the Lisp pretending to be Prolog.

Also, Lisp isn't a virtual machine, so one must be implemented. There is a compiler routine which pretends to be Lisp, accepting the special forms and translating them into executable code. It's just a big cheat.


> I believe that if I were to suggest this as a honest-to-God way to implement Lisp in Prolog, any decent Lisper would be up in arms and accusing me of cheating.

If you had quote and macros and such working in this manner, just with the f(x, y) style syntax, that would be a valid implementation. There is a convenience to the (f x y) syntax, but the semantics is more important.

We can have infix syntax in Lisp as a separate module that works independently of a given DSL - a toy sketch below.

Separation of concerns/responsibilities and all that.
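For instance, a minimal converter for fully parenthesized binary infix expressions (just a toy; a real module would also handle precedence and hook into the reader):

    (defun infix->prefix (expr)
      "Convert (1 + 2) style expressions to (+ 1 2) style."
      (if (atom expr)
          expr
          (destructuring-bind (lhs op rhs) expr
            (list op (infix->prefix lhs) (infix->prefix rhs)))))

    ;; (infix->prefix '((1 + 2) * 3))  =>  (* (+ 1 2) 3)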


Symbolics had a Prolog compiler for their Lisp Machines, with optimized microcode for the processor. It also supported Japanese...


I'm guessing they were trying to entice the Japanese market. If I have that right, it must have all been around the time of the Fifth Generation Computer Project, which aimed to create an architecture that would run a massively parallelisable logic programming language natively.

That eventually imploded and took logic programming and, I think, the whole of symbolic AI with it. I think Symbolics, selling a very expensive computer running Lisp natively, failed soon after?

That must have been a brutal time to be involved in AI research.


> That must have been a brutal time to be involved in AI research.

They don’t call it the AI Winter because it was fun!


Even if that is true, it's probably an issue of low severity, since that Greenspunned Prolog will probably be used for a very specific task in the overall application, not as the master glue to hold it all together. (Though we can't rule that out, of course). Also, there have been some decent inference systems written in Lisp; they can just be used off the shelf.

I would say that Prolog as a standalone language is silly: an entire programming language developed from the ground up, with numerous details such as I/O and memory management, just for the sake of supporting a logic programming DSL.

Writing all this stuff from scratch for every domain-specific language is very unproductive, and has the effect of segregating all those languages into their own sandboxes that are hard to integrate into one application.


A DSL? The first-order predicate calculus, a DSL?

I don't think we're really on the same page here.


To be fair, the first task of any AI project in Prolog is to implement a different search algorithm. So I don't see much gain (or loss).


I've assisted with the teaching of Prolog at my university (as a TA) and the first task is usually something like the append/3 predicate (used to append two lists, or split them, etc).

Last year, the end-of-term exercise involved the farmer-wolf-goat-cabbage problem, which can be (and was) solved just fine with Prolog's built-in depth-first search.

I think, though, you may be talking of forward chaining rather than depth-first search? That is indeed a classic exercise in Prolog.


http://kingjamesprogramming.tumblr.com/ contains a selection of "verses" generated with a Markov chain trained on SICP, the King James Bible and some other works. The results are oftentimes hilarious, and some of the Lisp-related quotes are very thematic:

    13:32 And we declare unto you the power and elegance of Lisp and Algol.

    Lisp, whose name is Holy
(and one that doesn't necessarily mix in something from King James)

    A powerful programming language should be an effective program that can execute any Lisp program.


For whatever reason I really enjoyed the Symbolics graphics and animations demo[0] linked in the article.

I was born in the 90s and have constantly heard the narrative that technological progress is getting faster and faster, and that we're currently at the forefront. And then I see something like this. They had basically photoshop before I was even alive!?

[0] https://www.youtube.com/watch?v=gV5obrYaogU


Paint programs go back to Xerox PARC in the '70s [1], at the very least. In the '80s and '90s there were a bunch of high-end paint programs for Symbolics, SGI workstations, the Quantel Paintbox [2], etc. They were more video-graphics oriented than Photoshop, which, historically at least, had a strong print emphasis.

I love computer graphics history, and loved the mystique of "high-end" computing when I was making crap pixel art on my Amiga :)

Another cool thing is, in that video, the artist is basically using the same box modelling techniques I did when I took a Maya course a couple of years ago. In 1991 I was 3d modelling on the Amiga, and although it was awesome to have access to that software at home, modelling was much more cumbersome and limited.

Now, to blow your minds even more, here [3] is a recent demonstration of that same 3d modelling program. At 19 minutes in, he switches to the Lisp Listener console and uses Lisp to inspect and modify the data belonging to the 3d model, before switching back to the 3d app to view the changes. Lisp machines were incredibly well-integrated platforms for the expert user.

[1] https://creators.vice.com/en_uk/article/wnpqnm/the-1970s-gra...

[2] https://www.youtube.com/watch?v=BwO4LP0wLbY

[3] https://vimeo.com/125771177


Caligari?


Imagine 2.0, from an Amiga Format coverdisk (like most of my application software when I was a teenager). I remember reading about Caligari, but I never used it.


Ah thanks. I only knew about Caligari, but never used it. Just saw it at trade shows back in the day.

I was the PC guy at our computing parties. :/


According to Wikipedia, Adobe Photoshop was created in 1988. So yes, we literally had that Photoshop before you were alive, not just some photoshop.

I used GIMP in 1996, on a Sun Sparcstation 20, running Red Hat Linux.


http://www.computerhistory.org/atchm/macpaint-and-quickdraw-...

"The Apple Macintosh combined brilliant design in hardware and in software. The drawing program MacPaint, which was released with the computer in January of 1984, was an example of that brilliance both in what it did, and in how it was implemented."

"The high-level logic is written in Apple Pascal, packaged in a single file with 5,822 lines. There are an additional 3,583 lines of code in assembler language for the underlying Motorola 68000 microprocessor"

But almost two decades before that, in 1968(!), there was not a paint program directly, but all kinds of interactivity and simple drawing using the first mice ever:

https://m.youtube.com/watch?v=yJDv-zdhzMY

https://en.wikipedia.org/wiki/The_Mother_of_All_Demos


*heard the narrative that technological progress is getting faster and faster*

What direction though? Some would claim upward, others would claim circular. Given the prevalence of "x implemented in y" I tend toward circular myself.


Photoshop in its first versions had fewer features than MSPaint in 2018 :)


What are you talking about? Paint is still incredibly barebones and Photoshop 1.0 already had lots of major features that Paint lacks. Magic wand select, stamp, filters, perspective transform, etc.


>> They do this even though Lisp is now the second-oldest programming language in widespread use, younger only than Fortran, and even then by just one year.

And the third-oldest is COBOL (it's at least as widespread as FORTRAN; arguably, it's even more widespread than both FORTRAN and LISP together, considering that it's used by pretty much every financial org on the planet).

It seems that, even from such an early time, the kind of languages we would end up creating was already pretty much set in stone: FORTRAN, as the granddaddy of languages aimed at scientists and mathematicians, that modern-day R, Python, Julia etc draw their heritage from; LISP as the grandmother of languages aimed at computer scientists and AI researchers, still spawning an unending multitude of LISP variants, including Scheme, ML and Haskell; and COBOL, the amorphous blob sitting gibbering and spitting at the center of the universe of enterprise programmers, that begat the Javas, VBs and Adas of modern years.

(Note that I'm referring to language philosophy and intended uses- not syntax or semantics).

(I'm also leaving out large swaths of the programming community: the Perl users and the C hackers etc. It's a limited simile, OK?).


I would say that R (and probably Julia as well, although I'm much less familiar with it) draws its heritage from Lisp more than FORTRAN.


R was originally a Scheme dialect, so that's a pretty safe assertion.


Don't forget Algol (1958). It had a profound influence on syntax and how programs are organized.


Followed by ESPOL in 1961, the first almost-safe systems programming language variant of Algol, with support for UNSAFE blocks.


In the history of Lisp, the paper by Richard Gabriel, "Lisp: Good News, Bad News, How to Win Big" (https://www.dreamsongs.com/WIB.html), is insightful and beautifully written.

I am surprised it is not mentioned in the article wrt the “winter period” in which Lisp popularity waned.


If you want to read SICP, it is available in HTML and EPUB here: https://github.com/sarabander/sicp and also here: https://sicpebook.wordpress.com/ebook/. It is not available in EPUB on Amazon or Google.


I think this article reverses cause and effect. It seems to start from the assumption that there's nothing special about LISP and then points to big cultural moments where programmers revered it and says "this accounts for 20% of the meme" etc.

I'd say those cultural moments exist because LISP /is/ something special. You could write SICP in Java (someone probably has) but the code would be way longer and less beautiful.


The principle of orthogonal design is something I learned in CS, but hardly anyone mentions it any more. The idea boils down to building software parts in a consistent way such that they can be combined and re-used to form new things. The way you accomplish this is by having very few rules. The more "syntaxy" a language is, the less orthogonal it is.

For more reading and discussion on this topic: https://softwareengineering.stackexchange.com/questions/1035...
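A small illustration of that in Lisp terms (my own example, not from the linked discussion): a handful of orthogonal parts - functions, lists, higher-order application - combine freely, with no special syntax per combination:

    ;; sum of the squares of the even elements
    (reduce #'+
            (mapcar (lambda (x) (* x x))
                    (remove-if #'oddp '(1 2 3 4 5 6))))
    ;; => 56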


As a dev from a non-CS major, I personally haven't learnt anything about Lisp, but I would like to. Is Clojure (specifically ClojureScript) a good place to start studying it?


Clojure is a modernised version of Lisp. It compiles to Java bytecode and runs on the JVM. ClojureScript compiles to JavaScript for use in the browser. Some amazing programming tools have been written for ClojureScript. The community is very robust and opinionated, in a good way I think.

Clojure is very modern with its vectors and maps. Lisp is more of an antique - a very valuable antique, though. It's very interesting to learn about both at the same time, as I did :)


Contrary to popular belief, Common Lisp has support for vectors, maps, records, and stack and heap allocation - a quick sketch below.

What Clojure has going for it is the wealth of Java libraries.
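For reference (standard Common Lisp, nothing exotic):

    (vector 1 2 3)                    ; vectors
    (make-hash-table :test #'equal)   ; maps (hash tables)
    (defstruct point x y)             ; records

    ;; stack allocation is a declaration the compiler may honor:
    (defun sum-pair (a b)
      (let ((v (list a b)))
        (declare (dynamic-extent v))
        (reduce #'+ v)))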



I just want to say that the tooling setup isn't great, and you kinda have to buy into a small ecosystem to do ClojureScript dev; the easiest is probably to use lein + lein-figwheel. If you have any questions feel free to ask!


Yes.


Far too many intelligent programmers swear by lisp for there to be no "there" there. I'm trying to learn it in my nonexistent spare time.


I know what you mean, and I think that feeling comes from a difference between Lisp the language (learnable in a matter of hours) and Lisp the ecosystem (which takes much longer to get confident with). And you really need to learn both, unless you have time to make everything from scratch.


Well, you need to learn both if you're going to make a career of working in Lisp.

But even if that isn't in the cards (and it isn't for most of us, for all sorts of reasons), there's still a whole lot of value in learning the language enough to go through and make a few things from scratch.

I tend to agree with ESR on the subject: "LISP is worth learning for a different reason — the profound enlightenment experience you will have when you finally get it. That experience will make you a better programmer for the rest of your days, even if you never actually use LISP itself a lot." (http://www.catb.org/esr/faqs/hacker-howto.html)


Look, here is the deal with Lisp. It shows you the data structure of your computer program and lets you operate on it, and that is neat. But it's not useful for actual work, because the way you manipulate it makes it hard to understand what is happening without knowing all the ways your code is being manipulated. I have to write Lisp for my editor (Emacs) and I don't hate it, but I don't love it either. The syntax is hard to read quickly because it's cluttered and (usually) nest-y.

If you're willing to trade complete purity away, try Ruby. It's basically everything you want from Lisp without the mess. Give up purity, get comprehensibility.

Blocks are a really great way of doing things. You can even investigate the block source code as a string if you really want to.

Dynamic method definition is well supported and predictable. Data structures are easy to compose and operate on. It's basically all the power but in a more comprehensible way. There's a reason why Rails came out of Ruby. It's naturally powerful.


I don't get the whole "macros make your code incomprehensible" thing that is always brought up. A macro is just another way to add abstraction to your program.

You might as well say functions are bad because without reading the source code for the function you don't know what the function does.

A macro that looks like one thing but does something else is a bad macro. We don't throw away functions just because someone can write a function named "sort" that actually randomizes its argument rather than sorting it.
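For a concrete, well-behaved example of the kind of abstraction being defended here - a control construct no function could provide, since its body must not be evaluated before the call (a minimal sketch in Common Lisp):

    (defmacro while (test &body body)
      "Run BODY repeatedly as long as TEST is true."
      `(loop (unless ,test (return))
             ,@body))

    ;; (let ((i 0))
    ;;   (while (< i 3)
    ;;     (print i)
    ;;     (incf i)))      ; prints 0, 1, 2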


Macros are way more powerful than functions. They can hide way more side effects, can fill a namespace, and can surprise you in many other ways.

This wouldn't be a problem if those surprises were rare and clearly marked, but the entire reason for macros to exist is to carry the surprises. As a consequence, having macros as the default (ok, second choice, not much better) tool of your language is bad. It's not that macros are bad by themselves, but they shouldn't be used often.

Besides, powerful tools do not go well together with dynamically typed languages.


> Macros are way more powerful than functions. They can hide way more side effects, can fill a namespace, and can surprise you in many other ways.

Abstractions that surprise you are bad. That doesn't mean the tool used was necessarily bad.

> This wouldn't be a problem if those surprises were rare and clearly marked, but the entire reason for macros to exist is to carry the surprises.

See above RE: surprises. Also, I can usually identify a macro from indentation, as most macros tend to have lambda lists similar to:

    ((FOO BAR &key BAZ) &body b)
which SLIME will pick up on and indent appropriately.

> It's not that macros are bad by themselves, but they shouldn't be used often.

If by "use" you mean "write" I agree. I do write macros far less often than I write functions, and this is common advice for lisp programmers anyways.

> Besides, powerful tools do not get well together with dynamically typed languages.

Not all lisps are dynamically typed, even common lisp has optional type declarations and typed racket takes this further. Also the GP post suggested Ruby, so that's not a great alternative by this argument.


I upvoted you, as I share your sentiments with regard to Ruby and Lisp, but language evangelism belongs in discussions centered on that language. It's just offtopic otherwise. This is a thread glorifying Lisp. People want to hear what makes Lisp great.


In addition to SICP, John Allen's "The Anatomy of Lisp" is also a great book to learn computing concepts through Lisp.


It is especially good if you want to implement a Lisp.


I loved this book as a kid, but I'm really not sure I would recommend it today. The Lisp of that era had "dynamic scope," and a great deal of Allen's book is concerned with fancy data structures and techniques to implement it in a reasonable fashion. But today I think we understand that stuff as basically wrong, and that "lexical scope" works better (and is much closer to the original lambda calculus that served as an inspiration). There are probably some proponents of dynamic scoping, but I think it's a lost battle.
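To make the distinction concrete, a quick sketch in today's Common Lisp (which is lexically scoped by default, with dynamic scope as an opt-in via special variables):

    (defvar *depth* 0)   ; special, i.e. dynamically scoped

    (defun show-depth ()
      (print *depth*))   ; sees the caller's rebinding

    (defun descend ()
      (let ((*depth* (1+ *depth*)))
        (show-depth)))   ; prints 1

    (let ((x 1))                  ; lexical: only the closure sees x
      (funcall (lambda () x)))    ; => 1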

So definitely yes, if you want to implement a historic Lisp. Otherwise, not so much.


A later text on the same topic (implementing Lisp/Scheme) is:

https://en.wikipedia.org/wiki/Lisp_in_Small_Pieces

Highly recommended.


>nobody has or will make anything practical with Lisp

I made a website with Clojure (a Lisp) that is indeed practical: http://practicalhuman.org


> I made a website with Clojure (a Lisp)

Seems like you’d be in good company ‘round here then.

https://en.m.wikipedia.org/wiki/Viaweb


Reddit was originally coded in Common Lisp


Any idea why they rewrote it?

edit: https://redditblog.com/2005/12/05/on-lisp/


So was Amazon.


I've never heard that. I do remember seeing here on Hacker News an old job ad for Amazon (close to founding) that was looking for C++ programmers.

[edit] Just found it: https://groups.google.com/forum/#!topic/ba.jobs.offered/-rvJ...


Barefoot Networks designs chips using an internal design tool written in Common Lisp.


I suppose HN is pretty impractical.


Wasn’t PG’s original, pre-yahoo-acquisition e-commerce site written in Lisp?


Yep, and he made mucho dinero when they sold it to Yahoo. As far as I understood it, Lisp enabled them to implement features so fast that their would-be competitors couldn't keep pace.


and yet. pg has been cheerleading for lisp since 2002 or so, and almost nobody has followed in his footsteps. the reddit guys believed him, tried it, and wound up doing a complete rewrite in python.

https://redditblog.com/2005/12/05/on-lisp/

so i'd have to say that the prevailing evidence says, to me, that pg's advice pretty much doesn't work, in the average case.


Seems to me that the uptake has been picking up, in no small part due to "Hacker News". Bear in mind that since Lisp programs get compiled to machine code, you might be using software written in it and not know it. And that's how it should be: high-quality software should be small, fast, and easy to install, and the user shouldn't have to care what it's written in.


I have seen his essay on this, and I am curious how relevant it is today - not so much because of languages, but because most languages now have frameworks that speed up a lot of the development time when used properly.


How do you build a simple beautiful website like this?


It's a Jekyll [1] powered static site; here is the source code: https://github.com/sinclairtarget/sinclairtarget.github.io

[1] https://jekyllrb.com/


Thank you!


With HTML and minimal CSS?


Right, how do you build a website that looks like the OP's, as opposed to one like this: http://math.mit.edu/~etingof/

(Not that it's a bad website - it's the most common format in academia - just an example of a plain HTML website.)


You’d get 90% of the way there with just font selection, paragraph spacing, and margins. Maybe colors as well.

Butterick's typography stuff and Tufte will carry you through, and then just steal some more specific detailing from sites like this one, gwern, xxyxyz, etc.

I compiled a list of such sites I was planning to steal details from for my own blog, but never actually followed through on, if you care: https://github.com/setr/pollen-site/


I do, thank you so much!


Just add a bit more CSS then :-)


Would it be worth going through SICP without prior knowledge of Lisp (i.e. could I pick it up from the book), or is it better to have some knowledge beforehand?


At the time the book was written, it was not uncommon for a new MIT student to arrive without having used a computer at all. 6.001 was intended to be the first introduction to computing, so not only is no Lisp assumed, but no programming at all!


It also depends on how motivated you are. Even as late as 2005, most of us in India couldn't afford computers. I remember we wrote 8085 programs all on paper. Limited time was available, with limited kits in the lab, but we were motivated enough to do it on paper alone.

These days, everything is supposed to be easy, newbie-friendly, accommodating and all that. People have a tendency to quit early and expect the ecosystem to make it easy for them.

These days you can get a decent computer for under $100 if you use a Raspberry Pi. I would have done anything for something like that a decade back.


I wouldn't quite say motivated - we had no choice. You either wrote and debugged the 8085 assembly on paper before you ran it on the board, or you didn't do it at all :)

While I understand (and respect) the sentiment, having gone through much the same, I wouldn't disparage something being newbie-friendly. That isn't necessarily a bad thing.

Edit: s/play down/disparage


Yes. Lisp is fundamentally simple and is introduced from the beginning. I can recommend Racket with the SICP language extension to get the dialect used in the book.


How is the math? I've heard it's not exactly easy.


I’m just finishing chapter 1 (1/5th of the book) and the math so far hasn’t been too challenging, but you should certainly go in with the expectation that you’ll do some Googling to figure out some context to complete the exercises.


SICP only uses a small subset of Scheme, and it's only using it as a minimal vehicle for exploring computer science. You can absolutely learn the amount of Scheme used in the book as you go (that's why they chose that subset: it's not supposed to distract you from the topics they're discussing; they're trying to get as close as they can to simple axioms), but you won't really "know Lisp" (or Scheme) in practical terms when you're done, either (you'll be well-equipped to pick it up quickly, though).


Is assembly the devil's own programming language?


Assembly is too honest about what the program is doing to be the Devil's work. That would be C with undefined behavior kicking in on malicious input from the Devil's children.


Indeed. Assembly is the language of the nameless ancient horror that was there before the devil was born. The language of the devil is C and the languages of his demons are JavaScript and PHP.


Seriously? Wow. Assembler is the most beautiful, simplest language that I have ever programmed in.

O tempora, o mores...


i think there are way too many widely divergent CPUs for that statement to make any sense. there is 6809 assembler, 8086 assembler, 68000 assembler, etc. some of them are simple, quite a few are not.


Lisp is the most beautiful. Brainfuck is the simplest.

I miss punchcards so much BTW, I used to draw on them when I was a child...


I find Lisp very beautiful and elegant but I find assembler even more so.


And before Assembly there was VHDL and Verilog, and the eldritch instruments that turn them into baroque patterns of silicon that cannot be contemplated by man (well, not since the 1970s...)


I would think he would sway toward VB ;)


I mean, if we stick with the spirit of the article I think that it's even more likely to be Fortran. Seemingly interesting, accomplishes certain very specific tasks well, turns into an absolute monolithic nightmare when encountered in the wild.


Not if you plan and architect it well. BT back in the day wrote a billing system for its Dialcom / Telecom Gold online service mostly in Fortran 77.

Apart from one module which our US colleagues wrote in a completely different language (PL1/G) - we only found that out when it was delivered.

And at my first job, at BHRA / BHR Group, our fluid dynamics simulator was written in Fortran and was well structured; I certainly don't recall any major problems.


No, the devil's programming language is Lisp in mid-size or larger team settings. People feel so empowered individually with the language, but I doubt all those same people have spent time maintaining someone else's code. Death knell of Perl, too.


No, that would be Malbolge.



I think the test to find out if a programmer can eventually learn to work on a Lisp codebase lies in his/her opinion about the conditional (ternary) operator.

If the programmer hates the C(T)O because it is too confusing, that programmer is hopeless about using Lisp.

If the programmer sees the C(T)O as a trivial syntax that helps to make the code short and neat, then that programmer will love Lisp.
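In Lisp the analogue is simply that if is an expression (the C version is shown in a comment for comparison):

    ;; C:  int y = x > 0 ? x : -x;
    (defun absolute (x)
      (if (> x 0) x (- x)))   ; if returns a value, like ?: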


I haven't dived into lisp, but if you're telling me it'll make my code shorter, I'm sure I'll love it.


I can't think of anyone who'd hate the CTO; it's hardly more confusing than an if/else statement.


I work in JavaScript on a regular basis, and I have grown to despise them because of their overuse when a simple if statement would have made the code much clearer.


Having spent months learning Haskell, I'm interested in picking up another mind-expanding language. If I read SICP (and also watch the MIT lectures), what dialect should I follow along in? Ideally I would learn something people are using today, so there would be usable libraries. I mostly write website backends and APIs.

Clojure? Common Lisp? Something else?


DrRacket IDE [0] + the SICP compatibility language [1], and you can start writing it instantly in a well-built and maintained environment that's Racket-based and pretty fleshed out library-wise. It's certainly nothing compared to Clojure, but among the rest it's the best (imo). I recall Carmack writing a server in Racket for fun and praising the experience a few years back.

[0] - https://racket-lang.org

[1] - http://docs.racket-lang.org/sicp-manual/index.html?q=sicp#%2...

Additionally, if SICP proves too slow-going or difficult math-wise [3], you can always use DrRacket for HtDP [4] and its corresponding (misnamed) edX course(s) [5], and later on, PLAI [6].

[3] - http://cs.brown.edu/~sk/Publications/Papers/Published/fffk-h...

[4] - https://htdp.org/2018-01-06/Book/part_preface.html

[5] - https://www.edx.org/course/how-code-simple-data-ubcx-htc1x

[6] - http://cs.brown.edu/courses/cs173/2012/book/



I just used mit-scheme.

https://www.gnu.org/software/mit-scheme/

It's the one that was used for the course at MIT. You're not really going to dig into real-world useful libraries when doing SICP anyway. It's a bit barebones - personally, I enjoyed writing a bit of tooling while working through the book - but admittedly it doesn't have the best dev experience out of the box.

Then when you do an actual project you can pick what works for that project. The fundamentals should carry over easily enough.


SICP should be done in Scheme. You can use Racket with the SICP package, which has all the necessary functions.


DrRacket has an SICP language mode, and I can attest that most of the code in the book works without modification. After that you can easily switch to the Racket language, which seems to be a modern, popular dialect of Scheme with plenty of libraries.


SICP is not so much about a particular language. I think there's a Clojure port of SICP somewhere, and Clojure has a good impedance match with web backends. If you want an "authentic" experience, Scheme would be a good choice.


> Ruby got… well, Ruby is a Lisp

If only it had the same level of AOT/JIT compilers that most Lisp variants enjoy.


Another two factors in the '00s re-rise of Lisp were Steve Yegge's Drunken Blog Rants (https://sites.google.com/site/steveyegge2/blog-rants) & Practical Common Lisp (http://gigamonkeys.com/book).

The former was, at the time, very influential (and deservedly so); the latter was and remains the best reference for actually using Common Lisp to write real software (Edi Weitz's Common Lisp Recipes is an excellent companion volume).


Back in 2007 I thought it was silly that SBCL (and Smalltalk too) distributed applications as "images". Seems they've been re-invented today as "containers", which are suddenly an amazing idea.
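In SBCL, dumping such an image is one call (a minimal sketch; MAIN is a hypothetical entry point of your own):

    ;; creates a self-contained executable containing the whole image
    (sb-ext:save-lisp-and-die "my-app"
                              :toplevel #'main
                              :executable t)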


Prolog is probably the language of god.


No.

(note: this is a genuine Prolog joke)


?- What


Prolog looks like a really interesting language. Why isn't it used more?



Probably, ... or provably?

