Why Lisp Syntax Works (borretti.me)
331 points by Tomte on June 5, 2023 | 306 comments



One way I've come to answer "why Lisp syntax" is through the following proxy question:

> If you want extensible syntax, what should the base syntax be?

The regularity and austerity of Lisp syntax come from this idea. If Lisp, by default, were loaded up with all sorts of syntactic constructs [1] that many of us take for granted today (which may in and of themselves be good!), it would leave less room for the programmer to extend it in their own way. It turns out that the syntaxes we take for granted today—like for(;;){} loops or pipe | operators—are perfectly serviceable in their S-expression-equivalent form to the working Lisp programmer.

The author is right about why Common Lisp's syntax extension facilities (macros) work; the language is in a sort of syntactic Goldilocks zone.

[1] To properly discuss Lisp, we really ought to distinguish meta-syntax (the parentheses) and syntax (the grammar of symbols and lists). Common Lisp has lots of syntax, like

    ; variable binding 
    (LET ((<var> <expr>)*) <decl>* <expr>*)

    ; type declaration
    (DECLARE (TYPE <type> <var>))

    ; function definition
    (DEFUN <var> (<arg>*) <decl>* <doc>? <expr>)
All of this is different syntax! These are different rules about how symbols etc. are allowed to be arranged to produce a semantically meaningful program. But they wear the same clothes of meta-syntax, which is relatively small, mostly based off of the fundamental idea of S-expressions:

    <s-expr> := <atom> | ( <s-expr>* )

    <atom> := <symbol> | <number> | ...
Ordinary macros allow extension of the former class of syntax, while reader macros allow extension of the latter class of syntax. When talking about "macros" unqualified, we usually mean the former, but Common Lisp supports both.
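
For instance, both kinds of extension fit in a few lines (a minimal sketch in portable Common Lisp; MY-UNLESS and the #[...] bracket notation are invented for illustration):

    ; ordinary macro: extends the grammar of symbols and lists
    (defmacro my-unless (test &body body)
      `(if ,test nil (progn ,@body)))

    ; reader macro: extends the meta-syntax itself, so that
    ; #[a b c] reads as (list a b c)
    (set-macro-character #\] (get-macro-character #\)))
    (set-dispatch-macro-character #\# #\[
      (lambda (stream subchar arg)
        (declare (ignore subchar arg))
        (cons 'list (read-delimited-list #\] stream t))))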


You can also use mixfix syntax [1] if you want extensible syntax.

[1] https://wiki.portal.chalmers.se/agda/ReferenceManual/Mixfix

It is used by Agda and Maude and is quite versatile.

Note that one of the examples above is the if_then_else_ function, which implements a typical language construct; in Agda, due to laziness, it is just an ordinary function.

[2] https://news.ycombinator.com/item?id=20697414

McCarthy did not understand the lambda calculus at the time he specified Lisp, and what could have been a function became a special form, if I'm not mistaken.

So, for another approach at extensible syntax you may consider mixfix notation and lazy evaluation.


I think the reason it really works so well is that most people follow the already established conventions when writing their own macros.

Binding constructs look like let (ITER, Racket's for loops, BB, my looping construct: https://git.sr.ht/~bjoli/goof-loop which pushes the limits of what can be comfortably done without breaking hygiene). Derived conditional constructs look like cond (trivia, match, case).

After some time you get a taste for how things are supposed to work to integrate (and to some extent compose) with the rest of the system. The discussions on the SRFI mailing lists can sometimes obsess over small details that, when finished, always seem to come together in a nice way.


I write Lisp professionally and love s-expressions, but the comparison with ALTER TABLE's grammar is pretty unfair. Postgres can detect many invalid ALTER TABLE formulations by checking the grammar, whereas in Lisp it's much easier to have grammatically-correct but semantically-wrong formulations.

A major saving grace here (which was not covered in the section on macros) is that you can write clever macros that do appropriate checking at macroexpansion time, not runtime. This lets you use Lisp to write checkers for your Lisp code, which gives you a Turing-complete language you already know to do the checking instead of the (formally and practically) more limited checking of a grammar.
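
Concretely, it looks something like this (a minimal sketch; DEFINE-ROUTE and REGISTER-ROUTE are invented names, not from any particular library):

    ; these checks run when the call site is compiled, not when it runs
    (defmacro define-route (method path &body body)
      (unless (member method '(:get :post :put :delete))
        (error "DEFINE-ROUTE: unknown method ~S" method))
      (unless (stringp path)
        (error "DEFINE-ROUTE: path must be a string literal, got ~S" path))
      `(register-route ,method ,path (lambda () ,@body)))

    ; (define-route :patch "/x" ...) => error at macroexpansion time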


Can you share where you use Lisp? I'm curious what its niches are nowadays.


We use it for stock market analysis over at https://feetr.io. Honestly, I couldn't imagine a better language as it makes large tasks almost effortless. Plus the fact that I can connect to the running image and query data feels like magic and makes farming content for social media almost facile as I can pipe some of the data through cl-table and just post a screenshot.


I'm curious what your deployment story is like. Do you deploy on k8s? On Linux VMs?


We don't really overcomplicate it. A merge into development will pull the code onto the development server and then either restart lisp or recompile (depends on the application), and the same thing will happen when merged to prod.

We're currently experimenting with Guix for server configuration and it's nice but sometimes the context switch from CL to Guile does trip you up for 0.0001ms.

I think the only real difference between CL and another language is that we can connect to the running image and create/recompile functions, and that can land you in a scenario where you think you have committed a function because it works on the image, so you build on it; then the image restarts and suddenly a whole bunch of code doesn't work. That will happen exactly once before you draw up a guide on how the team should use Sly/Slime.

...Me. It was me. I didn't commit the function.


Interesting, I had forgotten about Guix and now is just the right time for me to try it out, thanks for reminding me.

> Me. It was me. I didn't commit the function

Haha. Yeah, modifying "live" systems without a strict mindset tends to cause that. The worst is Linux images running on production VMs which are only partially specified by e.g. Ansible: there are little modifications here and there, done by one person or another, and when that machine needs to be recreated you get a nice surprise.


Guix is in a really good place right now and I'd have no reservations recommending it. Of course, there is the initial onboarding experience which can be tough if it's a completely new concept but overcoming that is 100% worth it IMO.

> the worst is Linux images running on production VMs which are only partially specified

I could absolutely see that being worse in a really big and scary way. Lisp is great because if you try to call a function which doesn't exist, it'll scream bloody murder at you, so we got lucky that it was immediately picked up. But that's lisp for you, incredibly nice and easy to use.


What's the scale of the codebase, in kilolines of code? Do you find the dynamic typing to be a problem at scale?


`cloc` has it at 24k lines of code, so it's by no means a huge project, but it's big enough that I feel we would have encountered a large variety of issues. As of yet there hasn't been anything major. We have felt the lack of libraries at times, but that just means we need to write more lisp, which is a good thing, as lisp is fun.

Honestly, I don't know that I'd classify our use of CL as dynamic. We're happy customers of `deftype` and `declaim`. While it's true that not every function makes use of them, most of them do. So in that regard, I can't comment but that's the beauty of lisp: it's the language that you need it to be.
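
To give a flavor of what that looks like (PRICE and SPREAD are invented examples here, not our actual code):

    (deftype price () '(real 0))

    (declaim (ftype (function (price price) real) spread))
    (defun spread (ask bid)
      (- ask bid))

With declarations like these, an implementation such as SBCL will flag many type mismatches at compile time.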


Are you guys hiring?


We aren't currently, no, sorry. However, I'll reach out when that changes!


Ok great!


What are you using? CL?


Yeah, we use CCL for local development and SBCL for running code on servers. I hear that the development story is better with CCL due to improved error messages, but I'm not sure how much I agree with that. We continue the practice anyway because it ensures that we're writing portable code and aren't tied to a single implementation.


That sounds pretty cool. In my next life I'd love a Lisp job.


The old joke about lisp ruining all other languages holds true (at least in my experience), so it's a monkey's paw wish: yes, you get to work with lisp, but you'll never be able to enjoy another job again.


Lisp spoils you for other languages, but it also crystallizes and clarifies them. As you enjoy the latter, the former will fade.

By "clarifies" I mean that if something is being done badly in some language, your mind has a reference model for it being done well. That model can guide you around the distracting crap. It also gives you a vocabulary for talking and thinking about it.


There are worse fates to suffer than that.


Clojurians seem to still be active and somewhat numerous; the data-science subgroups held regular online meetups not long ago. Some would say Clojure is not a true Lisp, but well.


Clojure and ClojureScript communities are thriving. Our product, orgpad.com, is written completely in those two. I have written about some of the technologies we use before.


I’ve seen people saying that but never with an argument as to why. What do they mean?


There are some other explanations given in sibling comments. They may be right, but there's another point that may also explain some of this sentiment.

In Common Lisp and its antecedents, a "list" is a chain of cons cells, as mentioned by some of the sibling comments, but that's not all. Another important point is that in Common Lisp and its antecedents, source code is not made of text strings; it's made of Lisp data structures--atoms and chains of cons cells.

A text file containing "Common Lisp source code" does not actually contain Common Lisp source code. It contains a text serialization of Common Lisp source code (one of many possible text serializations, actually). It isn't actually Common Lisp source code until the reader gets done with it.

This might sound like a trivial pedantic point, but it isn't. Because Common Lisp source code is made up of standard Common Lisp data types, its standard library provides everything you need to walk arbitrary source code, deconstruct it, transform it, construct it, and compile it. Those features are all built into the language in a way that they are not in most languages.
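
For example, a toy source-to-source transformation needs nothing beyond the standard list functions (RENAME-SYMBOL is just an illustrative sketch):

    (defun rename-symbol (form from to)
      (cond ((eq form from) to)
            ((consp form) (cons (rename-symbol (car form) from to)
                                (rename-symbol (cdr form) from to)))
            (t form)))

    (rename-symbol '(defun f (x) (old-fn (old-fn x))) 'old-fn 'new-fn)
    ;; => (DEFUN F (X) (NEW-FN (NEW-FN X)))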

For those of us who are accustomed to using those features regularly, working with a language that lacks them is such an impoverished experience that I can understand folks objecting that "that's not a proper Lisp." I don't tend to make that objection myself, but I do understand it.

If your Lisp does not represent its source code in this way, or if it doesn't even have the data structures that source code is made of in Common Lisp and its antecedents, then there is a nontrivial sense in which it's not a Lisp--or at least not the kind of Lisp that those older ones are.


>In Common Lisp and its antecedents, a "list" is a chain of cons cells, as mentioned by some of the sibling comments, but that's not all. Another important point is that in Common Lisp and its antecedents, source code is not made of text strings; it's made of Lisp data structures--atoms and chains of cons cells.

Source code in Clojure is also not made of text strings, it also reads text as the serialization of data structures, which are then interpreted as source.

The difference is, Clojure uses data structures other than cons cells for source.

What do you see as particularly important about cons cells? What advantages do they give what some might call a "real LISP" over Clojure, which, I'd argue, smartly abstracts around more modern data structures like vectors, maps, sequences, collections as opposed to being "married" to cons cells as Rich Hickey once put it?


Common Lisp uses cons cells. So do a bunch of older Lisps whose design fed into the design of Common Lisp. That's all. There's nothing else special about cons cells in my mind.

Cons cells are not the important point--at least not for me. The important point in this context is that expressions are represented by something better (that is, more conveniently-structured) than flat text strings, and that the representation be a standard data structure in the language, and that operations on source code are implemented by APIs that are exposed as standard parts of the language.

As far as I'm concerned, if a language does that, then it's nailed this particular part of being Lispy. There are some other parts, but that discussion is outside the scope of this one.


The fancy word is "homoiconicity" but that is more about appearance. Homostructural? Homotypic?

I think it's also important, at least in my opinion, that the data structure be extremely simple, and cons cells are about as simple as you can get. When you start adding "well, a vector is different from a string, n-tuple, or array, so your code has to figure out which one it is dealing with", that's when you run into issues. You could step back and just go object-oriented, but at its core an object is just a struct with a pointer to a function table and/or dictionary data, so we're right back at "cons cell if you squint".

Internally it may be sped up but conceptually "everything is a small integer value or a cons cell" maps pretty closely to how I think about low level data structures. something something build your own arbitrary precision floating point number...

(123 (0 (123 nil))) ~= 123.123 ~= "{\0{" ~= [123, 0, 123]


There's a decent argument for keeping such a foundational data structure as simple as possible, but there's also a decent argument for not making it too simple.

Cons cells are certainly very simple. They're so simple that, as Moon once observed to me, there's no place on them to hang metadata. For example, if your source code is made of cons cells, you might wish that they had some sort of metadata slot so that you could use it to keep track of where a given hunk of source code came from. You can't though. You have to kludge up some out-of-band solution for things like that.
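
The usual out-of-band kludge looks something like this (a sketch; the names are invented):

    ;; a side table keyed by the cons itself (EQ identity), since the
    ;; cons has no slot of its own to hold the metadata
    (defvar *source-locations* (make-hash-table :test 'eq))

    (defun note-location (form file line)
      (setf (gethash form *source-locations*) (list file line))
      form)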

We were talking about my hobby Lisp, Bard. He liked that it separated protocol from representation, so you could have Lists that were made of something other than cons cells. In fact, in Bard your Lists can be made of anything you like, as long as it participates in the List protocol. In particular, they can be made of something that has some place to hang metadata.

Rich Hickey of course also gave a bunch of Clojure's data structures places to hang metadata, possibly for similar reasons.


Secret meta-data in a cons cell is not out of the question.

In TXR Lisp, cons cells are four pointer-sized fields wide. So one field is not used. Almost. The field is used in the hash table implementation in which entries are conses. It sticks the hash code in there. That hash code is a pointer sized word with no tag; the garbage collector can safely ignore it.

The extra field is currently not used for tracking source location information, though it could be. Source location info is instead tracked in an external hash table. (The table is configured with weak semantics, so when the code becomes garbage, the entries vaporize.)

That representation could change in the future. It would mean that when the garbage collector traverses conses, it has to look at that hidden field of each one. And each time we allocate a fresh cons cell, we have to make sure it is initialized.

I'd have to benchmark it.

Associating expressions with source location info is a cost that we bear only when processing source code. If we shoehorn it into conses, then there is some nonzero cost to all cons cell processing, whether we are scanning code or not.

An important problem is that meta-data attached to cons cells (whether internal or external) is not copied across traditional tree-structure rewriting operations.

TXR Lisp's expander does some work behind the scenes to propagate location info, like from macro calls to their expansions. The parser has a flag for whether to attach the info to objects in the first place. It's on by default if we are reading code, but not when reading data.

Outside of the expander, a few places in the compiler have to be aware of this (when the compiler performs its own tree rewriting outside of the macro framework).

Overall I'm satisfied with the reporting. From time to time I see a bug: an error occurs for which source location info isn't available but should be.

I didn't give it character precision: I think that compiler messages that report line number and character column are too rococo for my taste. If you can't figure out the problem from a line number, maybe your code is stuffing too much into one line of code.


> Associating expressions with source location info is a cost that we bear only when processing source code. If we shoehorn it into conses, then there is some nonzero cost to all cons cell processing, whether we are scanning code or not.

That's true only because cons cells have a specific representation, but they don't have to. Bard classes are defined by protocols, not representations.

If I remember right, that's what Moon liked: because Bard's classes were defined by protocol and not representation, source code could be made of lists, and could have a place to hang metadata, without imposing that cost on other lists, because lists were not any specific representation; they were just any representation for which the list protocol was defined.


Even though I have two kinds of cons cells (regular and lazy) as well as the ability of objects to implement car and cdr and then work with those functions, I'm still fairly reluctant.

I wouldn't want source code to use objects, but real cons cells. Objects are heavier-weight. Each object is a cons-cell sized object, plus something in dynamic heap.

There are print-read consistency issues. Lazy conses and regular conses are indistinguishable in print. If you print some lazy conses, and read that back, you get regular conses. Of course, an infinite list made using lazy conses will not print in a readable way, so we can sweep that under the rug.

Objects implementing car and cdr have arbitrary print methods too. They won't print as lists. Those programmed to print as lists won't have print-read consistency.


Point taken, but I feel like I should explain that the word "class" has an idiosyncratic meaning in Bard.

Bard classes are not conventional object-oriented classes; they aren't even CLOS-style classes. A Bard class is a set of representations that participate in a given protocol. That being the case, a hypothetical Bard cons cell representation could be exactly the same as a TXR Lisp cons cell, or the same as a Common Lisp cons cell. In either case it need not be the only representation of a cons cell in the language.

(I feel like someone is going to object that I shouldn't use the word "class" for a concept that is so different from what it usually means in object-oriented languages, and that might be true. If someone suggests a better term for a set of representations defined by a protocol in which they all participate, I'll consider adopting it.)


I have experience with both: multiple deeply-integrated cons objects that satisfy the consp function, as well as allowing non-cons objects (including classes in the OOP sense) to take operations like car and cdr.

Once you merely go from one cons type to two, with their own tags, every place in the run-time which checks for a cons cell has to now check for two possible type tags. An atom is everything that is not a cons, as you know, so that function is also affected.

(I wonder whether it wouldn't be better to just have one cons tag, and use some flag field to distinguish lazy conses.)


Also fair points. In Bard I’ve been willing to pay that cost because exploring types that are defined by protocol was one of the motivating reasons for working on it.


The original question was why people don’t consider Clojure a lisp. By your yardstick, it is clearly a lisp.

As far as I can tell, the only argument against it being one is that it does not specifically use cons cells.


I did not intend to classify Clojure as "not a Lisp". I didn't intend to comment on Clojure at all, specifically, so commenting on this particular thread was an ill-conceived choice on my part.

I wanted to describe the source-code peculiarity because it hadn't been discussed elsewhere in the comments and I think it's important in what you might call old-fashioned Lispiness. I should have chosen another place for my comment. Sorry about that.


It's a slight cultural shift. Rich Hickey probably tried to modernize/homogenize things sensibly, adding a few literals for vectors, maps, and sets (a very interesting idea ergonomics-wise; I'm 80% for it personally, as having easy data notation is such a bliss), plus some underlying changes (immutable data structures first) which make Clojure feel different from Lisps/CL/Scheme.


My understanding is that Clojure is meant to be Scheme-like, but it is not fully compliant with a Scheme spec; my only guess is that this is due to JVM-specific nuances, but I could be wrong. It has been some time since I last dived into Clojure specifics. I will say, though, that aside from Racket, I think Clojure is top notch, although it seems a lot of my favorite projects from a few years ago have been abandoned. One of my favorite things with Clojure is using the REPL to build a GUI using JVM libraries.


>> My understanding is that Clojure is meant to be Scheme like, but it is not fully compliant to a Scheme spec, my only guess is due to JVM specific nuances, but I could be wrong.

Clojure's syntax and semantics are quite different from Scheme and Common Lisp:

https://clojure.org/reference/lisps

https://www.more-magic.net/posts/thoughts-on-clojure.html

Most differences are due to design choices, not merely JVM nuances. A few differences are due to JVM limitations at the time that Clojure was designed.

I don't think any attempt was made to comply with a Scheme spec.


Good to know, thank you! I have only done a small amount of Clojure and Racket.


> I’ve seen people saying that but never with an argument as to why. What do they mean?

I don't know, but if I had to guess, it's because Lisp is a list-processing language, and Clojure doesn't really support lists (I mean, it's possible to make some, but there are none out of the box); instead it has a variety of trees that mimic the runtime performance of lists, arrays, hashes, etc.


I'm not sure I understand how there are no lists out of the box in Clojure. What makes the data structures not valid lists? Basic Lisp lists are nestable; doesn't that make them trees? The underlying structure is there to support immutability by default, but that's under-the-hood stuff. Conceptually and, I think more importantly, syntactically, they're lists.


When lispers talk about lists, they are usually speaking about a very specific type of data structure: a linked list of cons cells [0]. Clojure's lists are not actual chains of conses; they are immutable hash array mapped tries. This means that Common Lisp code can't interoperate with Clojure code.

The precise/pedantic lisper may insist that since Lisp stands for List Processing and Lists are chains of cons cells and since Clojure doesn't have cons cells for the built-in lists, then Clojure is not a Lisp.

If, however, you view lisp (lower case) as a family of homo-iconic languages that use s-expressions, then Clojure happily fits under that umbrella.

The trick is to pay close attention to whether the person is talking about a Lisp (ANSI Common Lisp implementation) or a lisp (the family). Sometimes people say "lisp" when they are talking about "Lisp", which can cause some confusion.

[0] https://en.wikipedia.org/wiki/Cons


Clojure's lists are effectively cons cells. Clojure's vectors and maps are immutable hash array mapped tries.

    => (class (read-string "(some list)"))
    clojure.lang.PersistentList
They are written in Java, and implement a bunch of interfaces, so the implementation looks complicated, but they are basically just classes with _first and _rest fields.

https://github.com/clojure/clojure/blob/master/src/jvm/cloju...


AFAIU, "list" has a specific technical meaning when used by someone complaining about Clojure not having them. It has to do with the implementation details, not just the semantics of how they're used.

Clojure has "sequences" or "seqs", which are semantically the closest to what lispers mean by a "list"—but also "vectors," denoted by square brackets, "maps," denoted by curly brackets, and "sets", denoted by curly brackets prefixed with a hash sign. Having these as core data structures violates the Lisp principle of "everything is just a list," and they introduce something other than round parens into the syntax, which looks really weird to those practiced with traditional/conventional Lisps.

For those well-practiced in "true" Lisps, Clojure reads like a very strange hybrid (one might say "corruption") of Lisp and JSON.


Clojure uses a high-level data-structure for list like data. Lisp usually uses a very primitive data-structure (two-slot linked list cells).

The main point is that Lisp actually has a decades-old established set of data structures, their operators, and programming concepts. For example, when I read a Lisp book, I find explanations of these core data structures and operators. In a Clojure book many things work sufficiently differently and are also named differently. In Lisp, for example, "atom" is a central concept: it's anything which is not a linked-list element. In Clojure this word is used to name a completely different concept: a data structure to manage state.

Clojure is more like Lisp, Java, and SML put into a blender. The result may have a similar color and some traces of the flavor, but other than that it is a new language.


I'm probably the wrong person to have this conversation with - I agree with you for the most part, and I think Clojure is a lisp.

My guess was the only thing that I could really think of. That and the syntax support and integration of non-list datatypes into the core language (e.g. function syntax).


It literally has a list type though.

https://clojuredocs.org/clojure.core/list


I'm sure it matters for purists (a valid concern), language designers, and "lisp power users (sic?)".

I do feel, however, like that statement is kind of like a "tomato is a fruit" statement. Technically true, but for the vast majority (again, not all) of tomato users, does it really matter?


I don't think the article is necessarily claiming that specific syntaxes like SQL are bad. I think they just wanted to highlight how different the languages are.


What kind of work do you do? I occasionally daydream about working in something like APL/q, Forth, or Lisp.


I write Clojure for https://www.metabase.com/

Lots of interesting problems involving working with different database engines, processing data, compiler design, etc.


That sounds super cool!


I think of specialized language syntax on a spectrum similar to "dynamic types <-> static types". Very general syntax like S-expressions provides maximum flexibility, but minimal guidance on semantics, minimal protection against mistakes, and less opportunity for nicer, more detailed error messages.


I don't agree. One can write a serious, syntactically bullet-proof grammar for a language built out of S-expressions, with nice error messages and all. We can imagine a parallel universe in which TFA's Postgres example were written using S-expressions:

    <alter> := (ALTER-TABLE (:IF-EXISTS? :ONLY?) <symbol> :*? <action>+)
            | ...

    <action> := ...
Despite using S-expressions, this is parseable with as much rigor as more free-form character syntax.
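
For instance, a rigorous checker for the sketch above is a few lines of ordinary Lisp (VALID-ALTER-TABLE-P is invented here; it ignores the optional :* and the <action> productions):

    (defun valid-alter-table-p (form)
      (and (consp form)
           (eq (first form) 'alter-table)
           (listp (second form))              ; modifier list
           (every (lambda (m) (member m '(:if-exists :only)))
                  (second form))
           (symbolp (third form))             ; table name
           (consp (cdddr form))))             ; at least one <action>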

I think Coalton [1] is a good example of this. The language is embedded as S-expressions in Common Lisp, but you get Rust-like error messages, showing exact source lines (with line numbers) and embedded underlines showing the location of the offending code.

With that said, what is true is that if you're using Common-Lisp-style macros and you're manipulating S-expressions programmatically, lexical information may get thrown out. This is not unlike doing a bunch of string manipulation in an ORM to generate SQL, which will have similarly poor error messages.

[1] https://github.com/coalton-lang/coalton


Personally I find that Lisp syntax removes a layer of complexity by directly exposing the AST to my brain, instead of adding a layer of internal parsing.

    1 + x * 2 - 3 % x
is longer to decipher than

    (% (- (+ (* x 2) 1) 3) x)
which is itself harder than

    (-> x (* 2) (+ 1) (- 3) (% x))
But it takes a while to get used to it. And yes, it really helps with writing macros, but I wouldn't say this is always a good thing. Macros are far from being the alpha and omega of programming, as they add an implicit layer of transformation to your code, making it easier to write but very often harder to read and reason about.


I started working through Crafting Interpreters, building up a language syntax and grammar from scratch. A lot of work and 75 pages of lex/parse logic later, we now have an AST... that we can debug and inspect by looking directly at its sexp representation.

It was the ah-ha moment for me... why not express the source code directly as that AST? Most languages require lots of ceremony and custom rules just to get here. Sexps are a step ahead (inherently simpler) since they're already parseable as an unambiguous tree structure. It's hard to unsee: reading any non-Lisp language now feels like an additional layer of complexity hiding the real logic.


Much of the complexity and error reporting that exists in the lexer or parser in a non-Lisp language just gets kicked down the road to a later phase in a Lisp.

Sure, s-exprs are much easier to parse. But the compiler or runtime still needs to report an error when you have an s-expr that is syntactically valid but semantically wrong like:

    (let ())
    (1 + 2)
    (define)
Kicking that down the road is a feature because it lets macros operate at a point in time before that validation has occurred. This means they can accept as input s-exprs that are not semantically valid but will become valid after macro expansion.
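
The classic example is an anaphoric macro: the call site refers to a variable that does not exist until the expansion creates it.

    (defmacro aif (test then &optional else)
      `(let ((it ,test))            ; deliberately captures IT
         (if it ,then ,else)))

    (aif (find 3 '(1 2 3))
         (* it 10))                 ; => 30; IT is only valid post-expansion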

But it can be a bug because it means later phases in the compiler and runtime have to do more sanity checking and program validation is woven throughout the entire system. Also, the definition of what "valid" code is for human readers becomes fuzzier.


> later phases in the compiler and runtime have to do more sanity checking

But they always have to do all the sanity checking they need, because earlier compiler stages might introduce errors and propagate errors they neglect to check.

> program validation is woven throughout the entire system

Also normal and unavoidable.

As far as processing has logical phases and layers, validation aligns with those layers (the compiler driver ensures that input files can be read and have the proper text encoding, the more language-specific lexer detects mismatched delimiters and unrecognized keywords, and so on); combining phases, e.g. building a symbol table on the go to detect undefined identifiers before parsing is complete, is a deliberate choice to improve performance at the cost of increased complication.


> because earlier compiler stages might introduce errors and propagate errors they neglect to check.

Static analyzers for IDEs need to handle erroneous code in later phases (for example, being able to partially type check code that contains syntax errors). But, in general, I haven't seen a lot of compiler code that redundantly performs the same validation that was already done in earlier phases. The last thing you want to do when dealing with optimization and code generation is also re-implement your language's type checker.


Those rules help reduce runtime surprises though, to be fair. It's not like they exist for no purpose. They directly represent the language designer making decisions to limit what is a valid representation in that language. Rule #1 of building robust systems is making invalid states unrepresentable, and that's exactly what a lot of languages aim to do.


Note that this approach has been reinvented with great industry success (definitions may differ) at least twice - once in XML and another time with the god-forsaken abomination of YAML, both times without the lisp engine running in the background which actually makes working with ASTs a reasonable proposition. And I’m not what you could call a lisp fan.


You left out the more common options:

    (1 + x * 2 - 3) % x
and

    (1 + (x * 2) - 3) % x
both are clearer than the S-expression in my opinion


I don't find them to be clearer. With the background of knowing many languages, I now have to worry about precedence, and I'd better double-check so I don't get it wrong or read it wrong.


I agree, and this is why, for math expressions that are built from basic operators rather than just compositions of named functions, I like to use a macro that lets me type them in as infix. It's the one case where lispy syntax just doesn't work well, IMO.


As someone who isn't a trained programmer (and has no background or understanding of lisp) that looks like you took something sensible and turned it into gibberish.

Is there a recommended "intro to understanding lisp" resource out there for someone like myself to dive in to?


  (-> x (* 2) (+ 1) (- 3) (% x))
The part that is confusing if you don't know Clojure is (->). This is a threading macro, and it passes "x" through a list of functions.

So it basically breaks this down into a list of instructions to do to x. You will multiply it by 2, add 1 to it, take 3 from it, then do the modulus by the original value of x (the value before any of these steps).

Clojurists feel like this looks more readable than the alternative, because you have a list of transformations to read left to right, vs this

  (% (- (+ (* x 2) 1) 3) x)
Which is the most unreadable of them all, to me.


I feel like there's a joke in here somewhere about a backwards Forth dialect, but this also reminds me of chaining idioms used in other languages.

Currying with OOP:

  res = x
    .mult(2)
    .add(1)
    .sub(3)
    .mod(x)
Currying with assignment operators:

  res = x
  res *= 2
  res += 1
  res -= 3
  res %= x
Naming things instead of Currying:

  scale = 2
  offset = 1 - 3
  with_scale = x * scale
  with_offset = with_scale + offset
  result = with_offset % x
Or aping that in Scheme:

  (let* ((scale 2)
         (offset (- 1 3))
         (with_scale (* x scale))
         (with_offset (+ with_scale offset)))
       (remainder with_offset x))


this is very readable if you know to read the operations inside-out instead of left-to-right


But that reading requires looking back and forth to read the operator and the operand. The further you move out the more you shift your eyes and the harder it becomes to quickly jump back to the level of nesting that you are currently on at the other side.


Be that as it may, it's less readable than the other ones, to me


  '(
    1. This is a list: ( ). Everything is a list. Data structures are all lists.
    2. A program is a list of function calls
    3. Function calls are lists of instructions and parameters. For (A B C), A is the function name, B and C are parameters.
    4. If you don't want to execute a list as a function, but as data, you 'quote' it using a single quote mark '(A B C)
    5. Data is code, code is data.)


> Is there a recommended "intro to understanding lisp" resource out there for someone like myself to dive in to?

Practical Common Lisp - https://gigamonkeys.com/book/

Casting SPELs in Lisp - http://www.lisperati.com/casting.html

Structure and Interpretation of Computer Programs - https://mitp-content-server.mit.edu/books/content/sectbyfn/b...

One of many prior discussions here on HN: https://news.ycombinator.com/item?id=22913750

... amongst many resources for learning LISP.


The language fundamentals section of "Clojure for the Brave and True" (the best intro-to-Clojure book, IMO) is excellent (if you consider Clojure a lisp). I find the author's style/humor engaging.

https://www.braveclojure.com/clojure-for-the-brave-and-true/


It is gibberish. The expression means:

  (1 + (x * 2)) - (3 % x)
So the correct S-exp (let's use mod for modulo rather than %):

  (+ 1
     (* x 2)
     (- (mod 3 x)))

It's a sum of three terms, which are 1, (* x 2) and something negated (- ...), which is (mod 3 x): remainder of 3 modulo x.

The expression (% (- (+ (* x 2) 1) 3) x) corresponds to the parse

  ((x * 2 + 1) - 3) % x
I would simplify that before anything else by folding the + 1 - 3:

  (x * 2 - 2) % x
Thus:

  (% (- (* 2 x) 2) x).
Also, in Lisps, numeric constants include the sign. This is different from C and similar languages where -2 is a unary expression which negates 2: two tokens.

So you never need this: (- (+ a b) 3). You'd convert that to (+ a b -3).

Otherwise, trailing constant terms in formulas written in Lisp need extra brackets around a - function call.


In real Lisp code you'd likely indent it something like this:

    (% 
      (-
        (+ (* x 2)
           1)
        3)
      x)
This makes the structure clearer, although it's still wasteful of space, and you still have to read it "inside-out". The thread macro version would be:

  (-> x
    (* 2)
    (+ 1)
    (- 3)
    (% x))
It's more compact, there's no ambiguity about order-of-operations, and we can read it in order, as a list of instructions:

"take x, times it by 2, add one, subtract 3, take modulus with the original x".

It's pretty much how you'd type it into a calculator.

EDIT: care to explain the downvote?


For what it's worth (speaking only for myself), I could not live without the threading macros (-> and ->>) in Clojure. Below is an example of some very involved ETL work I just did. For me this is very readable, and I understand if others have other preferences.

    (defn run-analysis [path]
      ;; load data from converted arrow file
      (let [data (load-data path)]
        (-> data
            ;; Calc a Weeknumber, Ad_Channel, and filter Ad_Channel for retail
            add-columns
            ;; Agg data by DC, Store, WeekNum, Item and sum qty and count lines
            rolled-ds
            ;; Now Agg again, this time counting the weeks and re-sum qty and lines
            roll-again)))

    ;; Run the whole thing
    (time (run-analysis arrow-path))

This code processes 27 million lines in 75 seconds (shout out to https://techascent.github.io/tech.ml.dataset/100-walkthrough... library)


If such an expression gets indented at all, it would be more like this:

  (% (- (+ (* x 2)
           1)
        3)
     x)


> In real Lisp code you'd likely indent it something like this:

Not only would that not be idiomatic, the operator for modulus in Common Lisp is mod, not %, and the brackets you and the parent used in the s-expr are around the wrong groups of symbols. So you're more likely to see:

    (mod (* (+ 1 x) (- 2 3)) x)

or maybe with some limited indentation, such as:

    (mod
      (* (+ 1 x) (- 2 3))
      x)


Nobody said it had to be Common Lisp. I'm going by the notation the grandparent commenter used. My point was that indentation can clarify the structure of nested sexps vs putting them on one line. And that is actually what people do. "mod" vs "%" hasn't the least to do with it. This isn't even really about arithmetic; those are just at-hand examples the GP commenter chose. Could just as well have been

  (foo 
    (bar
      (baz (bax x 2)
           "hello")
      "world")
    "!")
> the brackets you and the parent used in the s-expr are around the wrong groups of symbols

No they're not. Yours is wrong. Multiplication has higher priority than addition so the order of evaluation begins with (x * 2) not (1 + x).


> No they're not. Yours is wrong. Multiplication has higher priority than addition so the order of evaluation begins with (x * 2) not (1 + x).

OK, I shouldn't have gone that far.

FWIW, modulus has the same operator precedence as multiplication and division.

So really, it is more like:

    (- (+ 1 (* x 2))
       (mod 3 x))

or using the increment function:

    (- (1+ (* x 2))
       (mod 3 x))


Alright well I guess this is an object lesson in why infix notation is bad -- nobody can remember all the precedence rules once you get beyond +-*/^.


We'll call it even :-)


Interestingly, the second form is just infix notation where every operator has the same precedence and thus is evaluated left to right. That says to me that it's not infix notation that's inherently weird; it's the operator precedence rules of mathematical infixes that are weird.


> that looks like you took something sensible and turned it into gibberish

This is the main thing I use Lisp (well, Guile Scheme) for. I used to use bc for little scratch pad calculations, now I usually jump into Scheme and do calculations. I don't recall if I thought it looked like gibberish at first but it's intuitive to me now.



Reject PEMDAS, return to monky:

x * 2 + 1 - 3 % x

https://mlajtos.mu/posts/new-kind-of-paper-2


My preferred version of this:

x.*(2).+(1).-(3).%(x)

Unfortunately our brains are broken by pemdas and need clear delineations to not get confused; this syntax also extends to multiple arguments and is amenable to nesting.


When I learned APL, the expression evaluation order at first seemed odd (strictly right to left with no operator precedence; 5*5+4 evals to 45, not 29). After working with it a couple of hours I came to appreciate its simplicity, kind of like the thread operator in your last example.


Well, the easiest way to write

  1 + x * 2 - 3 % x
would just be "x-2".

But if we're talking more generally, if I have an expression like

  2*x^3 + x^2 - 5*x
a trained eye immediately can read off the coefficients of the polynomial and I'm not sure if that's true of

  (+ (* 2 (^ x 3)) (^ x 2) (- (* 5 x)))


For writing a program, the s-expression form might become:

    (+ (* 2 (^ x 3))
       (^ x 2)
       (- (* 5 x)))
Whereas:

    2*x^3 +
    x^2 -
    5*x
Would probably error out in most languages, due to parsing issues and ambiguity. The ambiguity is even worse if you put the signs in front, as then every line could be an expression by itself:

    2*x^3
    + x^2
    - 5*x
Could be 3 expressions or 2 or 1.


It might do the wrong thing in some languages but wouldn't necessarily raise a compiler error, and I'm fairly certain e.g. sympy should have no issue with it.


> `(-> x (* 2) (+ 1) (- 3) (% x))`

Love the single pipe operator. What language works this way?


That's a Clojure norm; it may exist in other lisps.


That's how my brain feels: it connects information (compound terms) to entities directly. It's almost the minimum information required to represent something, unlike ALGOL-based languages.


I prefer infix over prefix. And don't forget postfix

    x 3 1 2 x * + - %
I definitely prefer white-space over commas and other syntactic obstacle courses.


I mean I would probably punch that in as

    x 2 * 1 + (-) 3 + x %
When I was using my rpn calculator. (-) flips the sign of the number on top of the stack.

Or use flip -. But you can avoid growing the stack long enough to be confusing.


Using piping for arithmetic is reminiscent of grade school arithmetic dictations.

"Start with 10; add 5; divide by 3; ..."

:)


If the language has no operator precedence then

  1+x*2-3%x
is just as easy if not easier to decipher compared to both of your other examples imo. The above is equivalent to

  (1+(x*(2-(3%x)))) 
in APL/j/k. You get used to it pretty quickly.


> In particular, image-based development is a rarity nowadays, a Galápagos island feature that is undesirable in many contexts, but it’s the thing that makes it possible to have Turing-complete macros that are defined in the same place as the code, without needing to involve a build system.

This is false. Lisps do not have to have an image-based development system in order to have Turing-complete macros in the same language as the code, inside the code.

The proof is the existence of such systems. There are Lisp dialects which do not have image dumping, yet have working macros.

What you need is for the macros (and any helper functions of the macros) to be loadable into the compiler process. You do not need image saving and restoring for that to work.
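
In Common Lisp, for example, EVAL-WHEN is what gets a macro's helper functions into the compiler process; no image is saved anywhere. A minimal sketch:

    (eval-when (:compile-toplevel :load-toplevel :execute)
      (defun helper-double (n) (* 2 n)))   ; helper used by the macro below

    (defmacro twice (n)
      (helper-double n))                   ; runs inside the compiler

    (twice 21)                             ; compiles to the literal 42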

Image saving and restoring is an independent feature of memory management; it traverses the heap and writes a representation of all the objects which can be used to re-animate the session.

Some Lisps rely on image saving for producing executables, but that's not related to how macros work.


> Lisp’s unusual syntax is connected to its expressive power. How? Not because of “homoiconicity”, a word that has no meaning but leaves people somewhat impressed, because it sounds fanciful and mathematical. It’s because of uniformity.

No, it _is_ because of homoiconicity, which though a fancy word, does mean something: code is data. That is the reason Lisp syntax works. Clojure is less uniform, but it is still homoiconic, and that is why it has the same powers as CL.


Doug McIlroy coined homoiconic as meaning that the programs are stored in the way that the programmer has entered them.

The POSIX shell is homoiconic because after you have defined some functions, you can execute the command "set" (with no arguments) to see the definitions. They are stored in a form that can be copy and pasted into the shell. (There may be reformatting and comments removed).

The homoiconic feature in Lisp (Common Lisp) is the ED function. It will recall the textual definition of a function, allowing it to be edited.

The big idea in Lisp of using a data structure for manipulating code isn't homoiconicity; the data structure definitely isn't in the source-code format. It's homoiconic to the extent that uncompiled code can be stored as a data structure and converted back to a character-level representation. (This is possible due to something that Lisp calls print-read consistency, an important concept in Lisp.)
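
A round trip makes the point concrete:

    (let ((code '(defun add (x y) (+ x y))))
      (equal code (read-from-string (prin1-to-string code))))
    ;; => T: code -> text -> code recovers an EQUAL structure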

The code-to-code transformations in Lisp do not work by converting code to text and back again.

We can take advantage of that even if we go through some front-end that provides a surface syntax, like infix.cl, which loses homoiconicity.

So rather than homoiconicity, the two concepts important in Lisp are print-read consistency and code as a data structure.


Code is also data in C++, because a source file is a vector of bytes, which is a type of data that C++ can manipulate.

I think when people say “code is data” what they really mean is something more specific, like the AST is easy to access and manipulate in the language.


When people say 'code is data' they obviously don't mean that source code is stored in a text file which consists of bytes...

    (def add (x y) (+ x y))
    (def sub (x y) (- x y))

    (add 1 2)                        ; 3
    (sub 1 2)                        ; -1

    (first '(add 1 2))               ; 'add
    (rest '(add 1 2))                ; (list 1 2)
    (apply 'sub (rest '(add 1 2)))   ; -1
    (reverse '(add 1 2))             ; '(2 1 add)

Excuse my pseudo lisp quote syntax, it's been a while.


I've mentioned it before, but maybe I can help one "almost-lisper":

IF the "Syntax / Parenthesis" is the "only" thing that puts you off LISP (I'm sure there are many other semi-valid reasons), keep pushing. It's a little bit like smoking, most smokers didn't enjoy their first two or three cigarettes, but suddenly one day you realise you addicted and now all other non S-Expr-Languages looks ugly. It's kinda like magic the "sudden switch" from hate-to-love. At least for me :)

YMMV


As someone who is semi obsessed with functional/array programming and lisps, framing it as an addiction is a turn off. We should use things because they are good, not because we are unable to stop.


Framing it as addiction was a mistake, but the intention is accurate: S-expressions are weird when you're used to C-like languages until they're not. You have to push yourself long enough to get used to them, and most people don't want to do that. Once you do, everything makes sense.

What did it for me was realising that after some time, you don't pay attention to the parentheses. Indentation in lisps is semantic, which looks weird when you're used to one level at a time. But try using a sensible formatter (e.g. Clojure's cljfmt, Racket's fixw). They will show you good ways to format your code, making it very readable, parentheses and all.

Then you get used to structural editing with Paredit or vim-sexp, and you're sold on S-expresssions.


Maybe an "acquired taste" is a better metaphor, like stinky cheese, sour beers, and Captain Beefheart & The Magic Band.


That is a terrible metaphor, because it is dismissive and implies that there is really no difference between LISP and other languages.

I repeatedly hear smart and productive people say LISP is special because of macros. Personally I think we all should try to listen, and not miss out on learning something important, when very smart people keep repeating similar messages.

I believe the power of LISP macros causes one cost and one risk.

The cost: source code is a regular tree of data with only one type of node and a few primitive types of leaves. This heavily restricts syntax, and people unfamiliar with the reason behind the syntax choices find the syntax ugly.

The risk: the power of macro transformations can make code completely unreadable. Writing macros is not for low skilled programmers or developers without good taste. Macros allow customisation, but that power can be used well or badly. LISP lacks safety features, and its power makes it trivial to code footguns (or much worse). In return you get productivity, if you and your team are wise enough to use the power for good!

Basically, macros restrict syntax choices but gift semantic choices in return.

A simple example: in most imperative languages, you can tell from the code when a function is being called. In LISP with macros you can't, as sketched below (although I would guess each codebase will have semantic and stylistic (non-syntax) norms that would help you).
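
A sketch of the point (MY-OR is invented here; the built-in OR macro behaves the same way):

    (defun both-args (a b) (list a b))    ; a function: evaluates both args

    (defmacro my-or (a b)                 ; a macro: may skip its second arg
      (let ((v (gensym)))
        `(let ((,v ,a)) (if ,v ,v ,b))))

    (both-args (print 1) (print 2))       ; prints 1 and 2
    (my-or     (print 1) (print 2))       ; prints only 1

The two call sites read identically; only the definitions reveal the difference.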

Disclaimer: I don't program in LISP so much of the above is academic.


> Disclaimer: I don't program in LISP so much of the above is academic.

I don't mean to be disrespectful, but what's the point of the above post if you don't use Lisp? I've been writing Lisp at the hobby level for a few years now, and my opinion is my own.

I also think you misread what I wrote. Lisp and s-expressions are weird. That they take some time to get comfortable with (an "acquired taste") is both my personal opinion, obtained from my own experience, and also generally accepted among the Lisp users that I've spoken to. It's hardly dismissive.


Oh, sorry, I'm an idiot. I completely misconstrued what you wrote, as well as the other replies. I do try not to jump to conclusions!


See also: exercise


Haha, I think you're right!


I don't know if it helps, but I used reverse Polish notation calculators before doing much lisp. To this day, for little calculations I will either type an Elisp expression into my Emacs scratch buffer or, if I am out and about, use the HP 15C app on my phone. Even if I am transcribing from an infix thing it is easier. Reverse Polish means no parens, which is easier.


Yeah, 100% accurate. Sorry, English is not my first language.


For what reason would you be addicted to something like a programming language, other than it being good?

It's not physically addictive.

Would you be addicted to a video game that sucks?

"I want to play a game because it's critically acclaimed and has stunning graphics, not because I can't stop."

"I want to read a novel because it's author is a winner of literary prizes, not because I can't put it down ..."


If you want to convince someone to start doing something, comparing it to cigarettes is probably the worst possible idea.


Lisp syntax is just a nice format for textual representation of trees in general and ASTs in particular. Having done similar work in both Common Lisp and Go, I can confidently say that working with ASTs in CL is considerably less burdensome.

Lisp in general and Common Lisp in particular have a lot more going on than the syntax, though. I think there's an old saw about syntax being the bikeshedding of the programming language world, when it's semantics where all the real thought and expertise are required. It's the semantics that make CL interesting, like for example its well-defined notions of compile, load and eval time [1] and the ability to redefine existing types and functions, even while in the debugger.

[1] http://clhs.lisp.se/Body/s_eval_w.htm


I'd recommend this for more reading on the subject: https://www.defmacro.org/ramblings/lisp.html

Personally I switched to clojure 8 years ago after this article along with PG's essays pushed me over the edge.


If I understand correctly, I believe elixir carries similar benefits:

https://elixir-lang.org/getting-started/optional-syntax.html


You don't.


Hey, not sure if this was your intention, but your comment came across as quite rude, and makes me feel discouraged from commenting on something in the future that I'm not 100% confident about. And of course that's no way to learn.

A more encouraging response would be one that doesn't focus on whether or not I understand correctly (I'd already acknowledged that I may not), but on what the differences are, thereby spreading your knowledge rather than using it to make people feel excluded.


Hi, I'm sorry for coming across as rude, but your post had no content other than a statement that you think Elixir has similar syntax benefits as Lisp. Maybe if you wrote, "I think this is similar to Elixir because so and so", someone would be able to give you a more informative reply as to why you are wrong


I disagree. The comment pointed out another language with similar benefits to the article, and linked to another article which specifically talked about syntax rules of Elixir. It was useful for me, personally.


Some consider posting a link to content you want to use as an argument to be bad form. Usually what is expected is to make your claim and use that article as a reference to support it. This also helps lay bare your understanding of the content you are linking to.

with that said, what precisely do you think are the similar benefits of elixir syntax rules to those of common lisp?


What's the point of a comment like this? I think I disagree with your reply, but the lack of argument makes it difficult.

While I wouldn't go as far as saying that Elixir has the same benefits as Lisp regarding uniform syntax in combination with a flexible macro system, Elixir's macro system is undoubtedly powerful. The big difference from a Lisp is that Elixir's macros operate on an intermediate AST representation that looks a lot like Lisp code. So while Elixir doesn't have a very uniform syntax itself, I think its macro system gives comparable power and flexibility.


And because of the nonuniformity difference that you mention, for anything but the most basic macros you're going to find that the level of comparability with Lisp decreases pretty exponentially.

BTW, the other poster didn't mention Elixir macros. They mentioned optional syntax, which frankly terrifies me :)


The code mentorship platform Exercism has recently launched "The Summer of Sexps!"

  This month for #12in23 we're exploring languages that are Lisp dialects. We're digging into Clojure, Common Lisp, Emacs Lisp, Racket and Scheme, with interviews and live streaming, exclusive swag and lots of fun exercises for you to complete!
https://exercism.org/challenges/12in23

https://youtu.be/5Kg7gC1YcWs


In my long journey to make computers do what I want spanning several decades, systems, languages and trends, I've come across a certain class of geeks.

This group would swear by Lisp. And by that I mean, they would treat Lisp as the gold standard. Sometimes, the only standard that matters. They would go to unimaginable lengths to explain why Lisp is really the purest form of computer science expression; how it sharpens your mind and elevates your skill; how it is akin to poetry, only better, because it's poetry in code.

And after all this was done, and repeated several times over, I would gently nod at this exalted madman who had been blessed by the touch of the Lisp God. And then we'd both go back to whatever language we were actually using at the time.


Same for FORTH. I don't think I've ever seen a FORTH programmer and a LISP programmer in the same room; I can't imagine the discussion terminating.

Over the years, as a left-handed person, I've come to see this as evidence of underlying diversity in thought patterns that can't simply be learned/trained around. Certain people find certain metaphors and modes of computing much easier to work with than others. Few people on either side can see this. The result is that you have people who, having felt like they were fighting scissors all their lives, suddenly discover that a different sort of scissors exists that gives better results, and start evangelizing it to everyone. The rest of the world tries it, finds it impossible to use, and concludes the weird-scissors advocates are mad.

Neither the LH people holding LH scissors nor the RH people holding RH scissors are "wrong". It's just the RH people trying to use LH scissors who are having a bad time.

(The limit case of this is a few people who are using languages which are uniquely tailored to themselves and absolutely nobody else; colorforth, the late author of TempleOS, Urbit, etc)


I actually think it's a lot more about initial training, and corporate sponsorship.

People would rather die/retire than change how they program. The assembly guys died/retired rather than learn structured languages. The imperative people died/retired rather than learn OOP. The OOP people are currently dying/retiring as languages shift to be much more functional.

ALL of these things have existed for 50+ years, so why did they take so long to be adopted (as people now almost universally agree these things are better)? It's all about what people are taught and use early on in their career. With the exception of 5-10% of outliers, everyone takes what they learned in school, adds a little more as they are learning to code, then cements their synapses into that pattern for the rest of their career.

A confounding factor here is corporate backing. If you look at the popular programming languages, only a couple became successful without a large corporate sponsor. Corporations tend to be conservative in their investments too. This creates a secondary effect where even when you encounter a superior language pattern, you cannot use it, because your company isn't likely to pay you to rewrite complex libraries in a new language.

JS pushed the world so far toward functional programming because even though it was different, it was the only standard. OOP devs in the 90s and 00s would rather die/retire than actually learn JS and more functional programming paradigms. But because it was the standard, big companies like Google and MS were forced to pour in resources and make it accessible. In turn, that has led to a glut of functional features being bolted on to other languages now that a generation of programmers has adopted them.

I'd hypothesize that if Eich had made a Scheme instead of JS (as he originally intended), we wouldn't be having this debate today. It would have forced corporate backing of a lisp and of a functional language. The current crop of devs would have been introduced to both of these ways of thinking and nobody would question the merits of a lisp-like language compared to ALGOL/C derivatives that have been foisted into mainstream education and corporate support for the last 50+ years.


> The OOP people are currently dying/retiring as languages shift to be much more functional.

I don't know what programming language landscape you're looking at, but to me it doesn't look like that at all. I grant you that more and more languages are getting functional bits glued on. That doesn't make them FP languages, though. And my perception is that they aren't being used as FP languages. They're being used as other-paradigm languages that can do a bit of FP when desired. If people actually wanted an FP language, they'd switch to one. And they aren't.


> People would rather die/retire than change how they program

As the saying goes, academia advances one funeral at a time.


More strongly (and accurately, I think): The world advances one funeral at a time.

Good saying, in any case, however strongly one wishes to phrase it. :)


> This creates a secondary effect where even when you encounter a superior language pattern, you cannot use it because your company isn't likely to pay you to rewrite complex libraries in a new language

I think the software dying or retiring is perhaps a more important factor than the developers, who can be flexible if the money/kudos is there for them. Rust has been picked up in various places to "forcibly retire" C/C++ programs which have had too many CVEs in their lifetimes; the only reason they were not retired earlier is the lack of a language with the required properties. If there were a bug-free, highly performant implementation of SSL in Lisp, that might encourage people to take a look at adopting it despite their lack of familiarity.

> ALL of these things have existed for 50+ years

Especially Lisp. Lisp has been around for a long time. At this point I think it's suffering from a "true communism has never been tried" argument which overlooks that people have tried it and takeoff has not been achieved, and its advocates continue to blame everyone else (as in your post) rather than engage in a little introspection or reflection.

> I'd hypothesize that if Eich had made a Scheme instead of JS (as he originally intended)

As the world turns, WASM now has a canonical format in S-expressions.


I’ve seen the programmer battles over programming paradigms in years past. They’ll pick inferior tech simply because it’s what they know.

I’d argue that alternatives to C/C++ exist, but they aren’t so C-like, so nobody wanted to try them.

Unlike communism, Lisp has been successful at pretty much every place it’s been tried until someone showed up and demanded everyone change to their preferred language.

Ironically, WASM is the least usable “lisp” ever created because of its stack design.


> Neither the LH people holding LH scissors nor the RH people holding RH scissors are "wrong". It's just the RH people trying to use LH scissors who are having a bad time.

True, and good insight, but it doesn't mean that all scissors, or all languages, are equal. They can all perform computation, but a person with a particular language might create incredible things that another person with another language would have a hard time with.

Let's appreciate the difference in tastes, which reflects our differences in thought processes and approaches to the world. But it does not mean that every language is the same, nor every person. A Lisp virtuoso will build tools which are completely alien in design and operation compared to tools built by a Java virtuoso.

I really don't want to take this off-topic, but this fallacy is very present in our modern approach to human diversity, where instead of celebrating our differences, we reduce them to a one-size-fits-all approach. There is enough space for everybody, and the result is effectively the same, but you simply can't replicate a Lisp-wielding Paul Graham in any other language.


>, but you simply can't replicate a Lisp-wielding Paul Graham in any other language.

But anyone who wants to really dig deeper will wonder about replicating which particular aspect of PG's productivity.

If the aspect they care about is the Viaweb ecommerce site that he sold to Yahoo for roughly $49 million, then all of PG's essays[1] about (paraphrasing) "Lisp syntax being their secret weapon for agility and flexibility compared to others using inferior blub languages" ... aren't going to be convincing to others who notice that Amazon's ecommerce was built on Perl/Java/C/C++ instead of Lisp. And later, Shopify was built with Ruby on Rails and PHP instead of Lisp.

In other words, maybe the secret weapon to building ViaWeb wasn't "Lisp" but instead, "PG's brain". (https://en.wikipedia.org/wiki/Confounding)

This is why language discussions focusing on "superior syntax" don't really move the needle when it comes to mass adoption. Yes, Lisp's "code-as-data" is powerful, etc., but there may be other factors that make that syntax feature not so important in the bigger picture.

[1] "Beating the Averages" : http://www.paulgraham.com/avg.html


Wild guess here (not a Lisp programmer). When I hear people talking about the practical benefits of Lisp, not just the nice theoretical stuff like homoiconicity and macros, I hear about interactivity. In Lisp it sounds like you can be pretty successful with a more experimental, on-the-fly development process. That might help a crack team push out features faster, but they might produce code that is proven to work more by experiment than by logic, which might mean it's harder to understand and keep building on. This leads to small Lisp shops that beat their competition in a niche but have trouble scaling it to Amazon size.


That is remarkably accurate for a wild guess.

Lisp code tends to be built from the inside. Working in a REPL, modifying the system state as you go, storing code into a buffer then recompiling and loading it into the image with a keystroke. Recompiling functions as you modify them, without recompiling the rest of the file, updating the code behind some classes without taking the system down, and the interactivity you get from the tooling for debugging and error handling. It all adds up.
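A minimal sketch of what that looks like at a Common Lisp REPL (GREET is a made-up function for illustration):

    ;; Define and call a function in the live image:
    (defun greet (name)
      (format t "Hello, ~a!~%" name))
    (greet "world")    ; prints: Hello, world!

    ;; Later, redefine it without restarting or reloading anything else;
    ;; every existing caller picks up the new definition immediately:
    (defun greet (name)
      (format t "Greetings, ~a.~%" name))
    (greet "world")    ; prints: Greetings, world.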


> In Lisp it sounds like you can be pretty successful with a more experimental, on-the-fly development process.

That’s a side effect of having a REPL.


By itself this doesn't seem like a distinctive advantage of Lisp anymore. Lots of languages have REPLs. But I'm told that Lisp is more advanced in some way. Better debugging, breakloops, everything can be modified in-flight, etc.


Interactive interfaces (command line interfaces, etc.) are long available: BASIC, Smalltalk, Prolog, APL, UNIX shells, etc. Some of them also used source level interpreters.

One simple difference is that the READ-EVAL-PRINT loop of Lisp works with code and data in similar ways. It makes it easy to process lists which are data and also to process lists which are programs. READ reads lists, EVAL evaluates lists & other values (and can be rewritten in itself), and PRINT prints lists. Code is lists and data is lists, too.

In the moment where the REPL uses a source level interpreter, debugging can go very deep, incl. modifying the running source code. That's kind a second nature when one is interactively developing Lisp code: every error stays in the error context and provides a break loop to explore/change/repair the current state - including that each additional error in a break loop just gets us into another break loop level, another level deeper.
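Concretely, in standard Common Lisp:

    ;; READ turns text into a list; that list is ordinary data we can
    ;; inspect and transform, and EVAL runs it as code.
    (defvar *form* (read-from-string "(+ 1 2 3)"))

    (first *form*)    ; => +        (the operator, as a symbol)
    (rest *form*)     ; => (1 2 3)  (the arguments, as a list)
    (eval *form*)     ; => 6        (the same list, run as code)

    ;; Because code is a list, we can rewrite it before evaluating:
    (eval (cons '* (rest *form*)))    ; => 6, i.e. (* 1 2 3)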


> A Lisp virtuoso will build tools which are completely alien in design and operation than tools built by a Java virtuoso

Crossing with the "build what, exactly?" thread: https://news.ycombinator.com/item?id=36195055 - does the special alien-ness translate into either greater user satisfaction or greater commercial success?

It is possible that there are Lisp virtuosos. It is also possible that you have to be a virtuoso to write Lisp, as is definitely the case for J. Every J program is a tiny work of art. That's why there aren't very many of them.

> you simply can't replicate a Lisp-wielding Paul Graham in any other language

You can't replicate Tolstoy in any language other than Russian, either. I'm not sure that proves anything other than the existence of unique talents that are also bound to their circumstances?


Perhaps it has something to do with winner takes all, or maybe it has no connection at all


FORTH is (was?) absolutely fantastic in its niche. As an alternative to assembly when you were developing for, and probably more importantly WITH, underpowered computers, it was great. Even better, you could roll the tools together yourself with minimal effort.

Today you're rarely in a position where you have to bootstrap your development environment from almost nothing, so there are better alternatives.

But if you're stuck on a desert island with a computer and a CPU databook, bootstrap a FORTH compiler and then use that to make your LISP. :)


As someone who has been learning forth over the last like... 2 weeks... yeah. Somehow my brain was like 'this', and it just sorta clicks really nicely in a way that no other language has so far. I also really like Lisp fwiw, so if you see me in a room, you've seen a Lisper and Forther in the same room ;)

What's interesting to me is that I see Lisp and Forth as extremely similar-in-spirit languages, though FORTH is definitely "lower level" (pointer arithmetic everywhere) than Lisp. Depending on your implementation, I bet you could squeeze out nearly every bit of performance to be had on a given system with forth (given enough time), but I'd be really surprised if you could with any lisp.


I came to a similar conclusion about the split between people who strive to reduce what they call "visual noise" in their programming languages, removing things like semicolons, parentheses, and curly braces, and others, like myself, who like the punctuation.

I think I read code visually. I understand its literal 2D-visual shape first, and use punctuation as visual markers to help me along. Then I backfill the gaps in the structure with the actual "content" of the code.

For the longest time I was baffled by the other camp's attempts to get rid of punctuation. What are they thinking? I now believe what they're thinking is of code as linear text. If you read the code linearly first, as text, and build the structure in your head as you go then yeah, all; of(that) seems {kinda} pointless && annoying.

Now guess which camp is more likely to write blog posts about how theirs is the One True Way? Ah, the humanity.


I've actually gone one step further and I'm pretty sure I can adapt to anything now.

I coded in Java for 20+ years and then switched to Kotlin. Semicolons were gone, it was a breath of fresh air.

And then I learned Rust, and I faced what I thought were two absolutely insurmountable obstacles: semicolons AND snake_case. I considered Rust ugly for these reasons and really dragged my feet. But I was curious about the language, so I persevered.

One week later, I wasn't noticing any of these any more. I still do think that semicolons are a manifestation of a language developer who favors their time over mine, but it's become more of a pet peeve than a deal breaker.

The human mind is wonderfully flexible.


If I ever do my "pet crank opinions" programming language, it will use "." as the statement terminator as it is in English. And COBOL.


> Certain people find certain metaphors and modes of computing much easier to work with than others.

Absolutely.

I took a couple of Lisp classes in my computer science program, and I saw this almost instantly. Some people who had a very hard time catching up in most other languages suddenly shone. However, some students who were otherwise very successful had a harder time with functional languages.

I always wondered how this worked. Is it that we're just wired differently?

> ...concludes the weird scissor advocates are mad.

I just wanted to clarify I don't think this. I appreciate when people are passionate about something like Lisp. At least, I'm happy for them!


> Is it that we're just wired differently?

Very much so. Richard Feynman once said that even the simple act of counting is perhaps completely different between different people. Some people count by visually seeing a count, some use a voice, and maybe there are others who have a different system.


I like this explanation.

Every year or so, I'll design a programming language on paper to work through my "latest thoughts" as impacted by languages and ideas I've picked up since. Each time the design is different, with different priorities etc.

I think what I'm doing is exactly what you describe: clarifying my mental model of programming languages -- so that I arrive at something which "feels right".

I can then come back to actually-existing languages and express my "prelinguistic" ideas using whatever syntax they make available.

Absent this activity, I think I gradually end up too conceptually confused -- blending a mixture of whatever languages I'm working in.

The power of a single radical paradigm solves this problem for people, like me, who require a theory to feel comfortable, but without all the effort I go to.

(Though for me, of course, it's a hobby --- I like to see how close I can get to designing a language which, at that moment, is the one I'd wish to program in).


> people, like me, who require a theory to feel comfortable

Reminds me of the paper "Programming as Theory Building" https://pages.cs.wisc.edu/~remzi/Naur.pdf


Naur is probably the closest computer scientist to my world view -- on many fronts.

I'm reminded of his genuine attempt to go into neurobiology and, via William James, take seriously biological adaptation and plasticity.

I think over the last 20 years, engineers and mathematicians seem to have "taken over" computer science -- against the tradition of "scientific philosophy" which Naur represents.


> Every year, or so, I'll design a programming language on paper ...

Where do I subscribe?

> Each time the design is different, with different priorities etc.

Classic trilemmas. Nice.

Scott McCloud's triangle for styles of illustration was a eureka moment for me. The tips are ideographic, textual, and realistic (IIRC). All comics lie somewhere within that triangle (solution space). Mind blown.

There are many either-or tradeoffs: closer to the metal vs abstractions, explicit memory management vs garbage collectors, etc.

But there are probably also a few trilemmas, like functional vs imperative vs declarative.

Anywho. Just a casual notion. I'd love to see a big list of design decisions for programming languages.

Today's embarrassment of riches has reduced the enthusiasm for language jihads. But it'd still be nice to have something more than esthetics and vibes to guide our choices.


Your description of your work process sounds like descriptions I’ve read of language driven development.

Could you expand on why this is so powerful for you?

Not doubting it’s power, but I’m having a hard time understanding _why_ it’s so powerful for some people.


This morning I was sketching how I'd do syntax for affine and linear types, which are (very basically) types where variables can be used "at most once" or "only once".

In sketching I iterated through various designs; my notepad is below. It began with trying to think about how to express scopes visually, or using tags/labels -- then moved into how that integrates with other language features etc.

By doing this I understand much more about what a language feature is really trying to do -- I understand the tradeoffs, etc.

  program a:

  let x:a = 10

  if 10 < 5 b:
    let x:b = x
    print(x)



  program UseyTypes:
    x : once  = new Resource() # at most once -- rust's default
    y : never = ...
    z : once! = ... # only once
    q : many = ...  



  program MoveSemantics:
    a = new 10
    x = 10
    y = move x 
    z = copy y
    
    i0 = new(auto) 10
    i1 = new(memo.alloc.heap) 10 # default
    g1 = new(heap) Graph.init({...})
    g2 = copy(Graph) g1 # uses Graph's (deep)copy method
    g2 = copy(Graph.copy) g1 # eqv to above

  program : 
    global heap = Allocator.tmp(...)

    if True:
      local xs = Vec.new(heap) { 1, 2, 3, 4 }

      memo.lifetimeOf(x)      # eg., local scope a = lineno 10 - 20
      memo.lifetimeOf(x.data) # eg., global via heap allocator

      repeat x <= xs:
          x = 10 # error
          print(x)
          
      repeat i <= int.range(10):
          x[i] += 1
          
      repeat:
          const answer = input("What's 2 + 2?").to(int)

    print(compiler.scope)
    print(compiler.lineno)


    const int z = 10
    const ref(int) y = ref(z)

    print(y, deref(y))

    const x : global  = 10

    if 10 < 5:
      int x : local = 10
      int y : parent = 10



  # polymorphic .to

  fn str.to(int):
      parseInt(.)    



  program EffectSystem:
    fn[IO] println(const ...data: ...any):
      repeat d <- data:
          IO.writeline(d.to(str))

    pure fn calc(): # pure = no effects
      return 10 + 20 

    # by default, all fns are impure, ie., have IO auto-passed ?
    

    println("Hello", "World")

    with IO = open("output.txt", "w"):
      println("Into some file!")

    println[open("output.txt")]("hello world")


Thanks!

Out of curiosity, have you ever tried developing DSLs in Racket? One of its explicit reasons for existence is to enable fast development of custom DSLs.


The art of designing a language is expressing semantics in intuitive syntax -- it's an art because "intuitive" is essentially a psycho-social, cultural condition. (i.e., I reject Lisp)

C was "intuitively mind-expanding" to assembly developers and PDP-machine programmers -- and so on.

My aim is always to express a semantic concept in syntax so obvious that its originating language's developers will be shocked.

You can do that both with, eg.,

    map fn over collection
and

    xs/fn

and

    repeat x from xs: fn(x)

and

    { fn(x) st. x <- xs & st. x > 0 }

etc.

In that each syntax resonates with a certain programming culture.

For novices, I suppose the following might be "consciousness raising",

    set result :=
      repeat:
        set x := next xs:
          save fn(x)


Chuck Moore was a student of John McCarthy, so the Lisp inventor and Forth inventor have been in the same room many times.


I did have to use Lisp, C and PostScript in the same project once...


> I don't think I've ever seen a FORTH programmer and a LISP programmer in the same room; I can't imagine the discussion terminating.

The ultimate halting problem?


I'm a Lisper. One of my good friends at Apple was a hardcore FORTH guy. I had other friends there that were Lisp or Smalltalk enthusiasts.

We got along great.

The early meetings at Apple for the design of the Dylan language definitely had both Lisp and Smalltalk folks participating. I wouldn't be surprised if some of the participants were FORTH folks, too.


The discussion would indeed recurse over whether to reimplement Lisp on top of Forth or Forth on top of Lisp.


I'm a FORTH programmer and a LISP programmer, and of course a PostScript programmer, which is actually a lot more like LISP than FORTH.

https://www.donhopkins.com/home/catalog/text/interactive-pro...

https://donhopkins.com/home/archive/forth/forth-postscript.t...

David Singer at Schlumberger developed a Lisp-to-PostScript compiler in 1987 called "LispScript".

https://news.ycombinator.com/item?id=21968842

https://donhopkins.com/home/archive/NeWS/NeScheme.txt

Arthur van Hoff at the Turing Institute in Glasgow developed an object oriented C to PostScript compiler called "PdB" around 1990-1993:

https://compilers.iecc.com/comparch/article/93-01-152

We used PdB to develop HyperLook for NeWS, integrate The NeWS Toolkit components into HyperLook, and implement the SimCity user interface on SunOS/SVR4/X11/NeWS.

https://news.ycombinator.com/item?id=22456471

ChatGPT Summary of the above thread:

This discussion thread revolves around the concept of implementing Lisp-like macros in PostScript for creating more efficient drawing functions. The user "DonHopkins" highlights their work on the Open Look Sliders for The NeWS Toolkit 2.0, where they leveraged a Lisp "backquote" like technique to optimize special effects drawings. The user explains that this approach accelerates drawing at the expense of increased space utilization. They also propose a potential solution to space conservation by only expanding and promoting macros during tracking, then demoting them upon tracking completion.

DonHopkins shares several resources on NeWS, LispScript, and the PostScript compiler, and also refers to window management tasks in Forth and PostScript for comparison. Additionally, they discuss a paper on syntactic extensions to support TNT Classing Mechanisms and share a demonstration of the Pie Menu Tab Window Manager for The NeWS Toolkit 2.0.

Another user, "gnufx", appreciates the shared resources and brings up the metacircular evaluator in HyperNeWS or HyperLook as a potential speed bottleneck in the system.

DonHopkins responds by explaining the use of a metacircular evaluator (ps.ps) they wrote for debugging. They clarify that speed was not a concern as the evaluator was not used in any speed-critical situations. DonHopkins also discusses the technique of "PostScript capture," likening it to partial evaluation of arbitrary PostScript code with respect to the imaging model. They relate this concept to Adobe Acrobat's "Distiller" and Glenn Reid's "PostScript Distillery".


I honestly love posts about Forth (or Factor) and Common Lisp (or Lisps in general). I love both languages. On top of that, I use C a lot, along with OCaml, Lua, and Erlang (and rarely Ada). I find each one of them beautiful. :)


> I don't think I've ever seen a FORTH programmer and a LISP programmer in the same room

I have! There was a knife fight and they both stabbed themselves, horrific.


The Forth programmer walked into the room backwards, like a moron. The Lisp programmer already in there had a perfect opportunity to get him in the back. Stupidly, his weapon was lying in a heap of stuff, and was boxed; he couldn't get it out in time. Moreover, it blew up in his face because he imported it, he had falsely declared it to be of toy type.


I suddenly stopped worrying about both LISP and FORTH when my CS professor mentioned (around 1995) that it would be trivial to write a transformer between LISP and FORTH.


Lispers get a mildly bad rap for some historical, unpersuasive "holier than thou" evangelism, but the truth is, almost all programming languages du jour have louder and fiercer evangelists. In fact, you can be paid to evangelize Rust, Python, and other languages.

As far as the last decade is concerned, the most productive open-source and professional Lispers have done some advertising of Lisp, but mostly through a combination of technical blog posts and actually making things.


Thankfully ADHD for adults is being taken more seriously now.


I like Lisp a lot, especially Scheme. It's simple, it's pure, it's expressive, it's powerful, and it's fun to write. If I could, I would use it for most things.

I can't convince anyone to use it. They just don't like the look of it, they stumble at the brackets. I know that I can't write something in it and expect other people to use it, build it or maintain it.

So I don't use it. And nor does anyone else.


Lisp advocates think that "everything looks the same" is an advantage, whereas most people strive desperately to make things look different in informative ways with syntactic features (e.g. different brackets) and syntax highlighting.


That's one of the advantages of Clojure: it's lisp-inspired but with different brackets for different data structures.


One of the reasons it has those data structures is that it sits on the JVM, and to maintain compatibility with Java apps it must use them and provide a syntax for them. That's a practical choice, of course, but also less of an idiomatic lisp.

In (pure) lisp there is only one main data structure and it’s everywhere.


JVM compatibility doesn’t force literal syntax for maps or whatnot, as proved by the many other Lisp dialects written for the JVM.

Clojure’s decision was of practical importance and a very good one at that (not everything is a list, especially not semantically)


If they had an actually good IDE for it that was LispWorks I would give it a go.

Forcing the overhead of learning Common Lisp, which is actually a fairly big language despite the simple syntax, on top of learning how to use Emacs is a big ask.


Today, Atom/Pulsar, Vim, Jetbrains (new in 2023), Sublime, even VSCode and Lem work fine for CL: https://lispcookbook.github.io/cl-cookbook/editor-support.ht...


I'll check it out. It half worked last time I did it


VS Code appears to have serviceable support now too.


But other than a Skyscanner predecessor (VIA), a computer algebra engine, and Grammarly, are there any other modern high-profile Lisp-powered products?


Okay, other than Skyscanner, Grammarly, HackerNews, Emacs, CircleCI, Metabase, Crash Bandicoot, Nubank, other Clojure projects etc, what has Lisp done for us?


Or, to lose the Monty Python snark, "Aside from the same handfuls of projects, counted on the digits of two hands, and always reiterated anytime somebody asks for high profile Lisp projects, what other high profile code is there from a language whose proponents always advertise its huge productivity gains?"

"Oh, there's also Crash Bandicoot".

"Wow, color me impressed".


Crash Bandicoot was Lisp, but the greatest PSX game (Metal Gear Solid) was written in C, so I’ll use that to justify my programming opinions to others.


[flagged]


As a diehard liberal, I find this an extremely strong assertion with a downright conspiratorial tone. What is the association of Lambda with gay rights?

Also, LISP is almost always pitched as "the real programmers' language". Your comment is a completely ridiculous-on-its-face assertion.


Seriously, you really aren't aware of the association of Lambda with gay rights? Are you a native English speaker or an American? It's easy to google, widely known, and well documented. I'm glad for the opportunity to educate you!

I've also heard conservatives try to implausibly deny they ever heard of such a thing as the "gay lisp", too, but that ignorance-based excuse doesn't hold any water, either.

But I suppose there are some home-schooled Fred Flintstone conservatives living under a rock in Bedrock (or Florida or Texas) who have carefully cultivated their ignorance about gay history and culture, and who have never met any gay people (or are so openly homophobic that most gay people refuse to come out to them out of fear), and that their deep ignorance untainted by the facts is part of the basis for their rampant homophobia and terrified moral panic.

Lambda Legal Defense and Education Fund:

https://en.wikipedia.org/wiki/Lambda_Legal

>The Lambda Legal Defense and Education Fund, better known as Lambda Legal, is an American civil rights organization that focuses on lesbian, gay, bisexual, and transgender (LGBT) communities as well as people living with HIV/AIDS (PWAs) through impact litigation, societal education, and public policy work.

LGBT Symbols: Lambda

https://en.wikipedia.org/wiki/LGBT_symbols#Lambda

>Lambda: In 1970, graphic designer Tom Doerr selected the lower-case Greek letter lambda (λ) to be the symbol of the New York chapter of the Gay Activists Alliance.[5][6] The alliance's literature states that Doerr chose the symbol specifically for its denotative meaning in the context of chemistry and physics: "a complete exchange of energy–that moment or span of time witness to absolute activity".[5]

>The lambda became associated with Gay Liberation,[7][8] and in December 1974, it was officially declared the international symbol for gay and lesbian rights by the International Gay Rights Congress in Edinburgh, Scotland.[9] The gay rights organization Lambda Legal and the American Lambda Literary Foundation derive their names from this symbol.

Lambda as a symbol of gay/lesbian rights:

https://www.cs.cmu.edu/afs/cs/user/scotts/ftp/bulgarians/lam...

>The Encyclopedia of Homosexuality has the following entry on Lambda:

>In the early 1970s, in the wake of the Stonewall Rebellion, New York City's Gay Activists Alliance selected the Greek letter lambda, which member Tom Doerr suggested from its scientific use to designate kinetic potential, as its emblem. (Curiously, in some ancient Greek graffiti the capital lambda appears with the meaning fellate, representing the first letter of either lambazein or laikazein.) Because of its militant associations, the lambda symbol has spread throughout the world. It sometimes appears in the form of an amulet hung round the neck as a subtle sign of recognition which can pass among unknowing heterosexuals as a mere ornament. Such emblems may reflect a tendency among homosexuals toward tribalization as a distinct segment of society, one conceived as a quasi-ethnic group.

>In More Man Than You'll Ever Be by Joseph P. Goodwin (Indiana University Press:Bloomington, 1989) on page 26, Goodwin writes:

>The lowercase Greek letter lambda carries several meanings. First of all, it represents scales, and thus balance. The Greeks considered balance to be the constant adjustment necessary to keep opposing forces from overcoming each other. The hook at the bottom of the right leg of the lambda represents the action required to reach and maintain a balance. To the Spartans, the lambda meant unity. They felt that society should never infringe on anyone's individuality and freedom. The Romans adopted the letter to represent "the light of knowledge shed into the darkness of ignorance." Finally, in physics the symbol designates and energy change. Thus the lambda, with all its meanings, is an especially apt symbol for the gay liberation movement, which energetically seeks a balance in society and which strives through enlightenment to secure equal rights for homosexual people.

Sterling Silver Lambda Gay Pride Symbol Charm:

https://www.amazon.com/Sterling-Silver-Lambda-Pride-Symbol/d...

And then of course there's the purple (another classic gay color) cover of Structure and Interpretation of Computer Programs, with the two magic dudes dressed in drag with a lambda symbol floating between them.

https://en.wikipedia.org/wiki/Structure_and_Interpretation_o...


> I've found that a lot of social conservatives tend to be unconsciously afraid and ashamed of Lisp out of moral panic due to its implicit associations with homosexuality (the gay lisp stereotypical speech attribute, and lambda being associated with gay rights).

This is hardly believable to me. How did you arrive at that conclusion? Isn't the community of Rust, a language I think that you can safely call a lot more popular than Lisp these days, also very vocal about supporting LGBTQ+ rights?


But Lisp has a much longer tradition of terrifying social and linguistic conservatives since 1959.

And look at all the social conservatives desperately fighting against the inclusivity of the Rust and other communities, which kind of proves my point that it terrifies and threatens them.


What is an example of when conservatives have taken a big stance, or been terrified by Lisp?



And GIMP and AutoCAD (ok, not anymore). But I would also like to ask: is "how many well-known projects use it" a solid/sound metric? There were extremely popular junk languages for decades that are mostly regarded as crap today.


Isn't it a little telling you're lumping Common Lisp and Clojure projects in together? These languages share parentheses, but are extremely far apart.


They’re all Lisps, which was the question.


That's not what I mean. It's up for debate whether Clojure is a real Lisp, but that aside, they're farther apart in paradigms than most languages are. One is an immutable functional language running on the JVM and the other is a hodgepodge of OOP concepts and low-level programming capabilities.

I don't mean to say this as a dig at Lisp, but the reason you didn't just list Clojure projects or just Common Lisp projects is that we would then see transparently how few examples there actually are for either of them. So when you lump them together, it comes off like we're scraping the bottom of the barrel for examples, and that's not even considering how much these companies actually use Lisp or whether they have continued using it.

It's fair to say Lisp is powerful in a niche.


Fairly certain Raytheon uses CL in their signal-processing pipeline for simulating ICBM missile defense, so if you don't melt down in WW3, you've got Lisp to thank for that, at least partially.

I fully expect the first HN thread as we climb out of the ruins to be 'what has lisp ever done for us, it doesn't even run my web app'.


The first multi-user combat training system (SIMNET) had a Lisp system creating the virtual training worlds.

https://news.ycombinator.com/item?id=12040601

https://en.wikipedia.org/wiki/SIMNET


>it doesn't even run my web app

By that time, it would be: "It doesn't even run my magic sauce prompt for GPT42"


The big one that I always remember is Crash Bandicoot - Naughty Dog I think had their own version of Scheme and then switched to Racket at some point. Nubank are also a semi-high profile company who use a lot of Clojure.

Of course, if you ditch the "modern" requirements, I'm sure there is more web infrastructure and scripts supported by Common Lisp than people would want to admit. . .


According to Andy Gavin, the cofounder of Naughty Dog and an MIT AI Lab alumnus, Crash 1, 2, and 3 were written in GOOL/GOAL[0], a home-brewed Lisp. According to Franz themselves, the language was hosted on Allegro Common Lisp[1]. The language gave him the ability to push the PS1 platform to its limits by leveraging the kind of thinking that's part of Lisp lore: incremental recompilation of a running PS1 instance using a remote little language written in and hosted on a Common Lisp dynamic environment. The previous sentence describes a poorly understood practice, that of dynamic-environment-leveraged development, which was part of Lisp machine and Smalltalk machine culture and a handful of other now-forgotten approaches. In a sense Crash was not just "written in Lisp"; it was written leveraging Lisp-machine-like thinking that Gavin would have been familiar with from his MIT AI days.

When Naughty Dog got sold, all the remaining Gavin Lisp systems were eventually stripped, so that the company for all intents and purposes became a standard C++ shop. Some time later some hipsters wired in PLT Scheme[2] as a scripting language for the Naughty Engine 2.0. Unlike the original Gavin approach this is not some deeply leveraged architectural decision, and its being Lisp is pretty irrelevant to the sort of capabilities it provides. IMHO selecting a scripting language for a game engine is a lipstick-on-a-pig kind of process, as demonstrated by the various BASIC-like potato languages that came with legendary triple-As.

[0]https://all-things-andy-gavin.com/2011/03/12/making-crash-ba... [1]https://franz.com/success/customer_apps/animation_graphics/n... [2]https://www.gameenginebook.com/resources/gdc09-statescriptin...


It's the Reddit story all over. Lisp devs know Lisp + X, but everyone else only knows X, so we'll use X instead -- even if it's inferior and causes issues down the line.

This isn't really limited to Lisp though. It applies to quite a few languages with the excuse of "market forces", where "market forces" really means "we want to make sure our devs are easily replaceable cogs" (using a niche language actually pressures both sides to stick together, because the dev can't easily find a new job and the company can't easily find a new dev).


It's slightly different: Naughty Dog had proven that they could deliver commercially successful applications (novel platform games on the PlayStation with excellent content) using Lisp. They had their own IDE on top of Common Lisp and, as a delivery vehicle, a Scheme-like runtime.

They were bought by a much larger C++ shop (Sony) and were trying to get the benefits of a larger ecosystem. In the end they were bought for their talent, their experience, their brand - but not for their technology.

For Naughty Dog it could also have been the right moment, since from a certain point in time the game platforms are getting so complex that making custom inhouse game runtimes may no longer make sense for smaller game studios.

Reddit OTOH had never delivered anything commercially successful in Lisp and had little experience with Lisp, but heard that it could be cool. They used unproven tech. Naughty Dog used proven tech and had enough experience to do low-level systems programming for a brand-new game platform. Which is really an outstanding achievement. Reddit had only a rough prototype in Lisp; Reddit then switched in-house to other technology.

The early Reddit in Lisp: https://github.com/reddit-archive/reddit1.0


Naughty Dog only switched because they were bought out by Sony who then demanded that they change languages.

Reddit was merged with another YC company. That company used Python, so they switched everyone to Django. Last I knew, most of Reddit’s outage woes were still due to the outdated ORM they are stuck with. In any case, Common Lisp is hardly “unproven tech”.


> Naughty Dog only switched because they were bought out by Sony who then demanded that they change languages.

To reuse a larger code-base, instead of working on their own new platforms for the next systems.

> In any case, Common Lisp is hardly “unproven Tech”.

Common Lisp is a language. Software runs on implementations, and SBCL was relatively new then (2005).

They used SBCL, which at that time was not being used to implement such websites.

Naughty Dog used Allegro CL, which was already used in a bunch of 3d/OpenGL applications. Their own runtime was custom-built and required deep expertise in implementing a GC-less Scheme for an embedded platform.


Reddit could have switched to a paid Common Lisp variant without any trouble if they'd actually had issues. The people there said they moved to Python because that's what the other team knew. I don't see a reason to argue otherwise.

The argument behind the Naughty Dog switch was also pretty clear. Sony wanted to be able to move devs between projects easily and everything else used C++, so they'd rather force Naughty Dog to use C++ than tell everyone else to learn Common Lisp. To my knowledge, there was zero discussion on the merits of one or the other and it was a business call.

Further, the reams of custom code Naughty Dog has since written in Racket point to them still loving lisp and not minding if they have to invest a lot of effort into being able to use it in their designs.


> Reddit could have switched to a paid Common Lisp variant without any trouble if they'd actually had issues.

I thought they had issues. Didn't they?

Paid Common Lisp variants tend to get expensive, and even for those, the main applications were rarely high-traffic websites with UI frameworks.

Take ITA Software / Google: they were developing the core business logic of the flight search engine in Lisp - the product that ran/runs on SBCL. They had a team of 100+ people and a bunch of the top Lisp talent of that time. They also invested a lot into improving SBCL.

> Sony wanted to be able to move devs between projects easily and everything else used C++, so they'd rather force Naughty Dog to use C++ than tell everyone else to learn Common Lisp. To my knowledge, there was zero discussion on the merits of one or the other and it was a business call.

A business call is based on assumptions: larger ecosystem, more libraries, shared runtimes, etc. That's all much more economical than doing it alone as a small studio.

> Further, the reams of custom code Naughty Dog now has written on Racket points to them still loving lisp and not minding if they have to invest a lot of effort into being able to use it in their designs.

Of course they love Scheme, and they were then back to creating their own content-delivery tools. But they stopped implementing runtime things like core 3D graphics/animation frameworks for new CPUs/GPUs, etc.


Things were more complicated than that with Reddit from what I've read (and from a now defunct blog post they wrote not to mention various talks and interviews from devs who were there at the time).

Their devs were using Macs in 2005 which ran on PowerPC. Their servers were x86, but running FreeBSD (honestly, that was a tall ask for most languages in 2005). They had an issue finding threading libraries that worked on that OSX/PPC and FBSD/x86 combo. They further complained that there weren't tons of libraries available for use either. Finally, they also made some bad architecture decisions unrelated to Lisp.

The switch is still a weird one if you set aside the new team not knowing or wanting to learn Lisp. Python's threads can't run in parallel anyway (the GIL), so they could have stuck with non-threaded CL and still have had 100x faster code. Likewise, they wound up rewriting all the Python libraries they were using, because those libraries turned out to have tons of issues too.


Naughty Dog continued using Lisp for game development throughout the Uncharted series at least, I'm not sure about The Last of Us but I would be very unsurprised if that changed.

They just stopped having the game written nearly purely in a custom Lisp dialect, instead only mostly - effectively switching from the GOAL setup of the Jak & Daxter runtime to an approach similar to GOOL in Crash Bandicoot: a core written in C/C++ running a considerable portion of the game logic in Lisp (a variant of PLT Scheme in Uncharted).

Uncharted dev tools were also built in PLT Scheme (aka Racket).


To clarify, Naughty Dog has still been using LISP-based scripting and asset definitions in their recent games[1]. (Though I count The Last Of Us as recent so I guess that shows how often I play games.)

[1] https://www.youtube.com/watch?v=gpINOFQ32o0


I’ve heard that CircleCI uses Clojure extensively; not sure if that counts as a true Lisp.

There’s also Metabase, also written in it.


Hacker News. Emacs.


+1 for Emacs.

Some people do not realize this, but the thing you build when you compile an Emacs distribution is not a text editor. It's just a specialized lisp interpreter that has primitives for handling things like frames, buffers, and other UI-type stuff like detecting key chords, plus some math and stringy stuff, and a few other bits and bobs. The editor itself is written in Elisp.

To illustrate, this is what I get when I run `sloccount` on the source download of Emacs 28.2 from gnu.org:

    Totals grouped by language (dominant language first):
    lisp:       1253948 (75.66%)
    ansic:       368126 (22.21%)
    objc:         17115 (1.03%)
    sh:            7261 (0.44%)
    cpp:           1851 (0.11%)
    yacc:          1566 (0.09%)
    perl:          1442 (0.09%)
    php:           1035 (0.06%)
    pascal:        1011 (0.06%)
    python:         993 (0.06%)
    awk:            781 (0.05%)
    cs:             770 (0.05%)
    ada:            725 (0.04%)
    ruby:           405 (0.02%)
    erlang:         153 (0.01%)
    java:            65 (0.00%)
    tcl:             16 (0.00%)
It's basically 75% elisp, 25% C, and more or less all of that C code is either implementing the lisp interpreter itself, or interfacing to system libraries. Using Emacs is really the closest you can get these days to working on one of the old LISP machines from the 80s, except it's more fun.


16 lines of Tcl! I'm guessing a script somewhere for tests?

And 65 lines of java? That these have not been replaced by lisp is a travesty, I say.


Lol, I think most of that randomness is somewhere in the test directories.


> It's just a specialized lisp interpreter that has primitives for handling things like frames, buffers, and other UI-type stuff like detecting key chords, plus some math and stringy stuff, and a few other bits and bobs.

One day, it may actually include a decent text editor too! But for now, it's the end-luser's responsibility to cobble one together from the provided parts.


Yep. As the saying goes: great operating system, just needs a decent text editor :P


Might not count as modern, but the original Reddit and HackerNews codebases:

- https://github.com/reddit-archive/reddit1.0

- https://github.com/wting/hackernews


> then we'd both go back to whatever language we were actually using at the time.

Which well known projects do use lisp? I think Google Flights uses lisp [1]; there must be more?

[1] https://www.youtube.com/watch?v=mbdXeRBbgDM&t=835s



This very site does as well.


The blub paradox strikes the lispers hardest, for they have been reassured by PG and each other that there is no language better than lisp. Those of us using more powerful languages should learn from their hubris.


I can see a lot of axes of "better" that involve removing capability from Lisp. Lisp abstracts over data representation, lifetime, and code layout, where other languages force you to make choices which may fit the domain better. There are also a lot around programmer ergonomics - compile-time detection of various error-prone constructs.

The only one that comes to mind for more powerful is abstraction over control flow. Instead of delimited continuations, one can go with unification and implicit control flow.

What are the increases in power you consider lisp users blind to?

(p.s. first class environments, first class macros are missing from common lisp and scheme, but not from all lisps)


Lisp programmers tend to think that every problem is best solved by layering abstractions. Macros, homoiconicity, quasi-quoting, and s-expressions facilitate abstraction. Some even think programming is abstracting.

But abstracting has a cost. Sometimes in terms of performance (perhaps mitigated with great effort or a sufficiently smart compiler), but always in terms of cognitive load. An elegant lisp expression is meaningless in isolation. When reading lisp code, you must recursively find and remember the definitions for each token on the page before you can understand what it does. Hopefully the author knew Naming Things is Hard and chose good names. Hopefully the names point you in the right direction. But you can't be sure unless you traverse each definition to its leaves. Lisp programmers are blind to the power and clarity of thought that comes from direct expression - all definitions visible at once, with no indirection.

A lisp programmer might look down on a Java programmer's reliance on IDEs. An IDE is a powerful tool for shoveling mountains of code, but a lisp programmer might say "I can solve this problem without mountains of code". Likewise an apl programmer might look at a lisp solution and say "I can solve this problem without defining any new terms". A forth programmer might say "I can solve this without parentheses". A C programmer might say "I can solve this without a heap". An assembly programmer might say "That entire program is equivalent to a single x86-64 instruction".

By crafting different bespoke DSLs for each new problem, lisp programmers lose the opportunity to hone one DSL to perfection. The knowledge that any problem could be solved with lisp lures them toward the fallacy that every problem should be solved with lisp.


> Lisp programmers tend to think that every problem is best solved by layering abstractions.

Nah, when I write Lisp programs, I have more of an appreciation for appropriate abstractions, not needless abstractions.

> An elegant lisp expression is meaningless in isolation.

Elegance is usually in relation to something. For example,

    (list 1 2 3)
Is more elegant than

    var list = new LinkedList<Integer>();
    list.add(1);
    list.add(2);
    list.add(3);
> When reading lisp code, you must recursively find and remember the definitions for each token on the page before you can understand what it does.

You need to do this for any language. In Java and many other languages, having a "Go to definition" IDE function is very useful.

> Lisp programmers are blind to the power and clarity of thought that comes from direct expression - all definitions visible at once, with no indirection.

Not sure what this is referring to. Do you have an example?

> A lisp programmer might look down on a Java programmer's reliance on IDEs.

Plenty of Lisp programmers use Emacs, which has a great many tools to help developers including jumping to definitions, showing documentation, running and using a step debugger for code, etc. Not sure why Lisp programmers would look down on Java programmers because of an IDE.

> By crafting different bespoke DSLs for each new problem, lisp programmers lose the opportunity to hone one DSL to perfection.

This seems to be a Lisp meme. Just because Lisp can be used to make a DSL doesn't mean all Lisp programs are macro-implemented DSLs. There's plenty of Lisp code that looks just like Java code, with function, struct, and object definitions and calls, except it uses parens.
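For instance, here's a perfectly ordinary bit of Common Lisp with no macrology at all (POINT and DISTANCE are made up for illustration):

    (defstruct point x y)

    (defun distance (a b)
      (let ((dx (- (point-x a) (point-x b)))
            (dy (- (point-y a) (point-y b))))
        (sqrt (+ (* dx dx) (* dy dy)))))

    (distance (make-point :x 0 :y 0)
              (make-point :x 3 :y 4))    ; => 5.0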


> You need to do this for any language.

Fair, but if your language keeps growing, this task is never done.

> Not sure what this is referring to. Do you have an example?

Array programmers sometimes avoid abstractions because they prefer "idioms" like these[1]. So rather than curate a library of words like:

    barchart: {x>\:!|/x}
and have the programmer use the word "barchart", they instead prefer to use the definition itself. The word "barchart" has a specific meaning (here, an ascii "bar chart" of 0s and 1s, showing the relative sizes of the values of input array x), but "{x>\:!|/x}" might be useful for more than just bar charts. This idiom contains smaller idioms like "count til max" (!|/) which in turn contains "max" (|/).

Being able to see the code makes it easier to explore and tweak to your specific needs. But more importantly, there are no "official" names for concepts like "count til max". That's just my personal name for it. A python programmer would call it "range". You could come up with your own name for (!|/) that makes perfect sense to you. But that name will probably be longer than its definition, and less flexible.

[1] https://github.com/JohnEarnest/ok/blob/gh-pages/examples/idi...


Ok, but Lisp doesn't force you to curate a library of words or require dealing with abstractions.

Maybe this example is relevant. Racket defines procedures `filter` and `map` for `list`. Also provided is `filter-map`, which I assume may satisfy your not-being-direct-expression concern about Lisp. But, `filter-map` exists for a particular reason:

> Like (map proc lst ...), except that, if proc returns #false, that element is omitted from the resulting list. In other words, filter-map is equivalent to (filter (lambda (x) x) (map proc lst ...)), but more efficient, because filter-map avoids building the intermediate list.

So it exists for performance concerns. It also exists in the "Additional List Functions and Synonyms" so it's not like it's being confused for a core, important function like `filter` or `map`. I still write Racket code where I just explicitly have something like:

    ;; assumes (require threading) for the ~> macro
    (~>
      (filter (lambda (x) (equal? 'some-value x)) some-list)
      (map (lambda (x) (symbol->string x)) _))
without always jumping for a more concise abstraction.

And, coming from other languages, DRY (don't repeat yourself) seems quite popular, so I am not sold on this being a Lisp-specific issue.


The thing is, a compiler could easily recognize the pattern (filter identity (map proc lst ...)) and rewrite it for efficiency (perhaps by using filter-map).

Writing it that way in the first place reduces the verbosity of the code, making it look better.
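A sketch of how such a rewrite could look in Common Lisp, using the standard DEFINE-COMPILER-MACRO facility; FILTER and FILTER-MAP are hypothetical helpers defined for the example, not standard functions:

    (defun filter (pred list)
      (remove-if-not pred list))

    ;; Single-pass fusion of filter + map:
    (defun filter-map (fn list)
      (loop for x in list
            for y = (funcall fn x)
            when y collect y))

    ;; Teach the compiler to rewrite (filter #'identity (mapcar f xs))
    ;; into one pass with no intermediate list:
    (define-compiler-macro filter (&whole form pred list)
      (if (and (equal pred '#'identity)
               (consp list)
               (eq (first list) 'mapcar)
               (= (length list) 3))
          `(filter-map ,(second list) ,(third list))
          form))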


> What are the increases in power you consider lisp users blind to?

Static typing, mutable values with unique ownership.


Static typing is a _decrease_ in expressive power in exchange for compiler diagnostics.

Linear types are probably not representable in Lisp (bad interaction with reified continuations, not-so-great interaction with environment capture); that is something I miss.


> Static typing is a _decrease_ in expressive power in exchange for compiler diagnostics.

This sentiment is at the heart of the blubbiness in question. Just like homoiconicity (relative to something like C) lets you say more about code in exchange for a reduced ability to implicitly move back and forth between syntax and operational semantics, a good static type system lets you say more about computations in exchange for a reduced ability to implicitly move back and forth between denotational semantics and syntax.

Lisp seems less expressive than C until you start thinking of programs as more than just sugar for machine code; ML-family languages seem less expressive until you start thinking of programs as more than just their implementations.


There is something here, I think. The Lisp/C example is useful in that there are things C lets you express that Lisp usually does not. Garbage collectors for Lisp tend to be written in C, even when the Lisp in question can directly munge machine code if you wish. For that matter, whether C is useful sugar or obstructive depends strongly on what you want the machine code to be.

It's tempting to handwave away compile time detection of missing cases in pattern matching as nice but inessential. However it is something which gets leaned on very heavily where it is available. It takes a collection of failure modes out of the mind of the programmer to encourage thinking about other things.

Expressive power is probably inconsistent as a concept. Whether introducing a constraint or removing it increases expressivity depends on what one is trying to express. For example, should code that does not typecheck prevent running code which does? Depends on context - it's deeply annoying during development, but helpful to avoid checking in code which no longer works on the paths you weren't looking at.


Lisp isn't any one thing. There are some with static type systems. PreScheme is a Scheme subset developed in the 90s with static typing via Hindley-Milner type inference. There's a wide world of PLT that has been explored with various Lisp implementations.


Yeah. All languages other than LISP are Blub. :(

Relatedly, I was just looking at this, mostly to see what the fuck possessed someone to do it: https://macropy3.readthedocs.io/en/latest/index.html

Origin of the phrase "Blub Paradox": http://www.paulgraham.com/avg.html


More powerful?


Poetic.

I am a big fan of "functional programming languages", including Lisp to an extent. That said, I appreciate the teachings they can impart - abstract structures, design sense / "design patterns", and ways of thinking etc. - more than any sort of dogma associated with "their way". Although, to be fair, I did go through my own period of foolishness in ... say, "venerating" this particular "methodology".

Your comment did prompt thought of Pratchett's book "Interesting Times". Specifically, passages like "... It had come as a revelation to Lord Hong when he looked at the problem the Ankh-Morpork way and realized that it might just possibly be better to give the job of Auspicious Dog-maker to some peasant with a fair idea about metal and explosive earths than to some clerk who'd got the highest marks in an examination to find the best poem about iron. In Ankh-Morpork, people did things."

The tricky bit in everything, and the difference between the true expert and not quite is often found in forms of "executive function" ... Knowing what to apply and when ... when to change up strategies, what the value is in some methodology and how to bring that into what you are doing ... all these sorts of things.

I always know when I'm at more of an "advanced intermediate" level (a sort of "first black belt" level) when I know too many ways to go about doing something but have no idea which is likely to produce solid progress in a relatively time-efficient way.

So, bit of a digression off of your comment, but stimulating from my perspective and hopefully of some use to anyone who might read this.


In my long journey following the cult of Lisp, I haven't seen a better description of the situation.


In all fairness we need to note that it speaks of the most vocal followers. There are likely many others who are evangelizing it much less, but that's exactly the source of the bias: we don't "see" them.

I in fact agree with the observation, but I also think that the same can often be said of many passionate people. Scala? Blockchain?

Continuing the observations, mainstream approaches seem to have less passionate followers. Very few people evangelize Java as passionately as others do Lisp. I wonder what makes the mainstream less attractive in this sense.


I've come across several classes. Sadly, a thing most have in common is derision towards any other class. Be it language choice, agnosticism, editor choice, paradigm choice, toolchain, whatever. It is frustrating, as so many of us seek to find where and how we disagree with each other, when we'd get far more mileage out of understanding where we agree.


Honest question: have you ever done something more than trivial in Lisp? In my experience, people who have used many languages end up with some favorite. Sometimes which one they think is best depends on the task at hand.


spanning several decades too, i have a version of your observation: i think the class of geeks is reminiscing, because "back in the day" common lisp was (like everything now) a world in which there were libraries for all the hip (at the time) buzzwords.


I've been a hard core lisper for almost 50 years, and have coded in pretty much every other language in that time, either professionally or for fun. But I always return to lisp, because Lisp isn't a programming language. There are no parens. There is no syntax. There is no spoon. Lisp isn't a programming language; it's hard to describe what it is with a simple term. "Meta-language" is the closest I can come, but that doesn't get to it deeply enough. You don't write a program for some problem in lisp; you think beyond the problem to its meta-problem-space and design a language in which you naturally express prose in that (meta)problem space. Then, given that you now have a natural language (and I mean that in the true sense of "natural language", only not for a human problem space) for the problem space you are working in, the problem you were after to begin with is a natural expression in that new language. And you're done.


While the barebones aspect of Lisp is appealing, there is no way I will commit again to a language that's not statically typed.

There is a difference for a language between "readable" and "understandable".

To me, languages that don't have type annotations (ideally with type inference) might look pleasing to the eye but they are a nightmare to maintain and modify because so much vital information is missing.


Racket goes a step further than types and gives you contracts. The whole language is defined by these contracts. For example,

    > (list 1 2 3 4)
is a way to create a list consisting of elements 1, 2, 3, and 4. This procedure is defined as returning a `list?` (a contract checking for a list) and the values provided are `any/c` (a contract representing any value). But many other languages can support something like that just fine. Where contracts further help is with composition and specifying allowable values or ranges of values. For example,

    > (round 2.4)
    2.0
as we expect. But looking at the definition of `round`, we see that `(round x)` accepts an `x` satisfying contract `real?` and returns `(or/c integer? +inf.0 -inf.0 +nan.0)`. If you try to `round` Infinity or NaN, you get what you "expect" and not some new value. If you try to `round` POSITIVE_INFINITY in Java, you'll get

    > Math.round(Double.POSITIVE_INFINITY)
    9223372036854775807
because, with types, Math.round() must return an int or a long.
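
You can hang the same kind of boundary checking on your own functions. A minimal sketch with `define/contract` (the `safe-round` name is made up here; `infinite?` and `nan?` come from `racket/math`):

    #lang racket
    (require racket/math)

    ;; Reject infinities and NaN at the call boundary instead of
    ;; passing them through; the caller gets blamed on violation.
    (define/contract (safe-round x)
      (-> (and/c real? (not/c infinite?) (not/c nan?)) exact-integer?)
      (inexact->exact (round x)))

    (safe-round 2.4)    ; => 2
    ;; (safe-round +inf.0) raises a contract violation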


contracts are still runtime checks and thus different from a static type system, which gives you a proof, ahead-of-time, that your program will be well-typed no matter the input (well, at least in theory - in practice, many type systems are unsound, but it takes some effort to actually break the system).

you're right that a contract allows you to specify more precise properties than a "regular" type system, but it comes at the aforementioned cost. If you want both, dependent types would be your best bet, but those are... rather complex in practice and to my knowledge, not really "production-ready".

as for your example though, it would be possible to implement `Math.round()` in such a way in Java that it doesn't just return an int, but some sum type that encodes whether it's an actual result or some invalid value. The reasons why this isn't done are probably a) it's awkward, though possible, to express sum types in Java (this would be easier in some other languages), b) it would force the caller to always have to explicitly consider the case that the value is invalid even when they know it's not, making mathematical calculations tedious and hard to understand.


You can write a lisp that uses static typing. The book "The Little Typer" develops one such language as (I believe) a dialect of Scheme. Of course, that's just a toy language (and it doesn't have type inference, for example, but as far as I'm aware that's a hard problem when you add dependent types), but it shows it can be done.


The language in The Little Typer supports not just static types but dependent typing. A much more interesting capability.


Well yes, but those are still static types. :)

And of course, you're not forced to use dependent types, so "regular" statically typed programs are just a subset of dependently typed ones.


I find s-expressions useful, of sorts, but I also find them horrible to read and write. For my long-languishing Ruby compiler project I added syntax to let me embed small chunks of "pretty much AST" directly in the code to "escape" to lower level code when implementing the lower levels of the standard library (like Object, Class etc.).

For that kind of limited use, I find s-expressions great because it's trivial to add simple s-expression parsers to things where you suddenly need to support complex nested structures with minimal syntactic noise, but I'd claw my eyes out if I had to read s-expressions all day every day.

(And since I prefer Ruby: just prepending a "%" to the example s-expressions in the article makes them valid Ruby literal strings, so now I'm kinda tickled at the thought of taking a little s-expression parser and building a tiny Ruby Lisp implementation that lets you just intersperse lisp code in your Ruby. Surely someone must have already done so for fun)


Couldn't resist, so a lightly golfed s-expression parser and test of using %() here:

https://m.galaxybound.com/@vidar/110493036393779755


Correct me if I'm wrong but it seems you roughly get "lisp-ness" by taking any textual data format and defining a programming language syntax in terms of that data format. I could make a JSON-based language that looks like:

    [
      // define some_fun(x) => let y=2;let z=y*3;return x+z
      {"some_fun": [["x"], [
        {"y": 2,
         "z": ["*", "y", 3]},
        ["+", "x", "z"]
      ]},
      // call it
      ["some_fun", 100]
    ]
Here I'm using JSON dictionaries to bind variables, and lists for everything else (instruction sequences, argument lists, etc.). Now I can quote stuff to define JSON data, or otherwise define programs.

I'm not a lisper but it seems what's special about lisp is it defines its programs in a generic data format.


But JSON is somewhat impoverished in that it only has floating-point numbers, strings and a handful of symbols (true, false, null, ... ?).

The most obvious thing lacking from the point of view of interpreting in a Lisp-like way is an ergonomic syntax for symbols.

People have experimented with computation in JSON, pretty much like how you have it; but it's lame to work with. Nobody wants to write quotes around basic identifiers like "+". And it is inefficient, to boot.

Interning of symbols is a key advantage of Lisp, and it was there in 1958 already; so by using JSON this way, you're regressing before that.

Lisp converts symbols into small values, which are atoms. Two or more occurrences of the same symbol are the same atom. An atom is a small word-sized quantity. Often a pointer, but not necessarily so; it could be a numeric index into a table of objects. Anyway, it is very efficient to test whether two atoms are the same object.
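
For example, in Common Lisp:

    ;; Reading the same name twice yields the identical interned object,
    ;; so the comparison is a single pointer/word test:
    (eq 'foo 'foo)                    ; => T
    (eq (read-from-string "foo")
        (read-from-string "foo"))     ; => T (both intern to the same symbol)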


> I'm not a lisper but it seems what's special about lisp is it defines its programs in a generic data format.

One of the things that still makes Lisp special. It makes metaprogramming "easy" in contrast to metaprogramming in almost every other language out there.

The other big thing is its high degree of interactivity and dynamic capabilities.


You're essentially describing homoiconicity: https://en.wikipedia.org/wiki/Homoiconicity


Thanks!


Similar argument from this lengthy but enjoyable essay: https://stopa.io/post/265


If a kid were to learn Lisp purely for the joy of recreational programming, with no intentions of ever building production software with it, what Lisp would they pick?


Racket would be my suggestion. As much as I like Common Lisp, it lacks material targeting kids (other than particularly motivated ones). Scheme would be a second suggestion, but Racket is descended from Scheme and so is mostly (with a bit of effort in some cases) compatible with any Scheme learning material you might want to use. Scheme also requires you to decide which implementation to use, which can impact available libraries and tools.

https://racket-lang.org/

It's also very easy to get visual results in Racket which often helps to keep kids motivated (seeing results of their work quickly and easily, versus needing to jump through hoops or read lots of documentation just to render a single image).
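
For example, with the 2htdp/image teaching library, a few lines are enough to draw pictures right in the DrRacket REPL:

    #lang racket
    (require 2htdp/image)

    ;; Each expression evaluates to an image that DrRacket renders inline.
    (circle 30 "solid" "red")
    (above (triangle 40 "solid" "green")
           (rectangle 40 60 "solid" "brown"))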


uLISP, an arduino and a bunch of peripherals for it.


your own obscure lisp of course!


I find the claim about image-based development being required for macros somewhat unfounded.

The most trivial counterexample is an interpreter - it can simply evaluate the macros just like ordinary functions.

A step up in complexity is a compiler that - during compilation - compiles macro definitions by emitting code and dynamically loading it (Goo does this http://people.csail.mit.edu/jrb/goo/goo.htm , and I have also put a toy implementation of this together using dlopen, and there are probably many other impls that do this.)
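
To sketch the interpreter case (the `*macros*` table and `expand` function below are made up for illustration, not any particular implementation's API): a macro is just a function from forms to forms that the evaluator applies before evaluating, so no image is involved:

    ;; Macros stored as ordinary form -> form functions.
    (defvar *macros* (make-hash-table))

    (setf (gethash 'unless* *macros*)
          (lambda (test &rest body)
            `(if ,test nil (progn ,@body))))

    ;; Expand the head position repeatedly. (A real expander would
    ;; also walk subforms and respect local shadowing.)
    (defun expand (form)
      (let ((m (and (consp form) (gethash (car form) *macros*))))
        (if m (expand (apply m (cdr form))) form)))

    (expand '(unless* done (print "working")))
    ;; => (IF DONE NIL (PROGN (PRINT "working")))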


Yeah, that part made little sense to me.

Clojure is one of the weakest lisps when it comes to macros, but you can still define and use them at run-time, and it has no image-based development.


I rather want to master lisp as my last programming language, because I want to experience the pain first in other languages, to understand why, what and how lisp could help.


The main problem I find with Lisp syntax is that everything looks the same: a giant soup of parentheses. You have to read Lisp closely to know the context. A list of statements (function calls) visually looks different from a list of parameters in C-style languages, for example. Similarly, you see square brackets and immediately know it's a subscript/index to an array-like structure. And parentheses (usually) mean something different than curly braces.

Visual Cues Help! Maybe your eyes are different, but most of the programming world has agreed with me over the years and voted C-style over Lisp. I have even proposed a syntax style called "Moth" that attempts to meld the best of C-style, Lisp, and XML in terms of composability and roll-your-own control & block structures. I welcome better takes on the goal.

Moth link: https://www.reddit.com/r/ProgrammingLanguages/comments/ky22d...


I think SQL is an unfair example because the grammar is so poorly constructed. Designers of the language seemingly cared little for ease of extensibility in favor of ridiculously specific constructs. Compare that to Java's much simpler grammar https://docs.oracle.com/javase/specs/jls/se7/html/jls-18.htm.... I can't find a good reference, but I suspect even Kusto/KQL has a much simpler grammar than SQL despite functioning in a very similar problem space.

One of the first rules of making a new programming language should really be to do as little as possible in the parser. By making your parser extremely complicated, you also make it extremely difficult for anyone to write tools for your language. Instead, it's much better to push that out to the rest of the compiler and/or the type system.


It's called prefix notation and it's the most primitive form of representing ordered computation. It was the easiest thing to implement in the glorified calculators of the 1950s. It's not the "Lisp syntax" that works, it's the lack of syntax that enables this uniformity and flexibility but irks the eye.


re: "lack of syntax"

I agree (though it doesn't necessarily irk the eye after some getting used to). Rather than saying it has a "uniform" syntax as the article author does, I am fond of saying lisp has "no syntax," because it's essentially true.


Minor nitpick, but weren't the best of these calculators (i.e. the HP ones with RPN) postfix instead of prefix?


>> weren't the best of these calculators (i.e. the HP ones with RPN) postfix instead of prefix?

Yes. RPN is more like Forth than Lisp.


This is a good article and I think it hits the nail on the head. There is one point I don't agree with though:

> Syntactic uniformity also means there is a lot less room for syntax bikeshedding. Really the only thing you can bikeshed is the naming convention for identifiers, and how S-expressions should be indented.

This might be true in theory, but in practice I find it to be effectively false. Lisp makes it very easy to add your own macros that implement DSLs with custom "syntax". Common Lisp, for instance, has a bunch of these built in, e.g. for iterating over lists. While this is technically not syntax in the traditional sense that the parser is aware of, it has the same characteristics for a user. You need to learn the "syntax" that the macros implement, and _that_ syntax you can certainly bikeshed about.
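
Standard Common Lisp's LOOP is the classic case; its keyword mini-language is "syntax" you have to learn even though the reader never sees it as anything but symbols in a list:

    (loop for x in '(1 2 3 4 5)
          when (evenp x)
            collect (* x x))
    ;; => (4 16)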


You can see that in even ancient Lisp discussions. The article shows a macro that transforms to a Common Lisp-style LOOP, but LOOP is often criticized as having too much of a DSL syntax, and there are plenty of old debates on whether it is better to use things like REDUCE or other predecessors of LOOP instead of learning the complexities of LOOP. (On the other hand, LOOP is very powerful, and its mini-query language can relatively succinctly describe a lot of usefully optimized loops.)


The LOOP facility is also often more efficient than stringing together map, reduce, and other functions. Unless you use something like SERIES (or a similar package) which can perform stream fusion for efficiency, but that's not the norm (I highly recommend exploring it though if you want to use the map/reduce style and want efficient results).
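
For instance (these SERIES calls are written from memory against its documented scan/choose-if/map-fn/collect operators, so double-check them against the docs):

    ;; With the SERIES package loaded, this pipeline is fused into a
    ;; single loop, building no intermediate lists:
    (collect (map-fn 'integer (lambda (x) (* x x))
                     (choose-if #'evenp (scan '(1 2 3 4)))))
    ;; => (4 16)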


Most of the SQL grammar shown is, in practical terms, semantics more than syntax. Sure, there's syntax woven into it, so the points being made are quite valid, but my point is that covering the scope of that grammar would take just as much space in Lisp, just in the form of things like function and parameter definitions (probably just as complex) instead of syntax. So it's not just 118 lines of stuff Lisp wouldn't need at all, which is a way I can see some people misinterpreting that section. SQL puts a lot of its complexity into syntax-space; a similar Lisp system puts it into library-space, but the complexity is still there (just probably easier to work with programmatically).


Well, I like Lisp, too. But the macros are not really good: they are basically buggy ('non-hygienic'), and it took a few years to realise and repair that. Look at Scheme or Racket instead.


Hygienic macros are a waste of time. I've never once had a macro bug from CL.


Hygienic macros are a bigger issue in Lisp-1s due to the higher density of namespace collisions.


CLers will be along to tell you that macro bugs rarely happen in CL because of separate function and variable namespaces, and if they do you can always use gensyms.
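
The classic capture-avoidance sketch (a toy SWAP macro, not from this thread):

    ;; GENSYM makes a fresh uninterned symbol, so the expansion's
    ;; temporary can never shadow a caller variable (even one named TMP).
    (defmacro swap (a b)
      (let ((tmp (gensym)))
        `(let ((,tmp ,a))
           (setf ,a ,b)
           (setf ,b ,tmp))))

    ;; (let ((x 1) (y 2)) (swap x y) (list x y)) => (2 1)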

CL is the "worse is better" of "the right thing".


As a CLer, I have never understood the draw of hygienic macros. They solve what looks like a non-problem, straining mightily to achieve little of actual value.


It turns out that it isn't terribly hard to implement hygienic macros on top of "non-hygienic" macros. I recently saw a paper that had an implementation. The code itself was fairly short.


That sounds like an interesting read. Can you post a pointer?


I wasn't sure I could find it again, but here it is!

Embedding Hygiene-Compatible Macros in an Unhygienic Macro System: https://www.jucs.org/jucs_16_2/embedding_hygiene_compatible_...

    Abstract: It is known that the essential ingredients of a Lisp-style unhygienic macro system can be expressed in terms of advanced hygienic macro systems. We show that the reverse is also true: We present a model of a core unhygienic macro system, on top of which a hygiene-compatible macro system can be built, without changing the internals of the core macro system and without using a code walker. To achieve this, the internal representation of source code as Lisp s-expressions does not need to be changed. The major discovery is the fact that symbol macros can be used in conjunction with local macro environments to bootstrap a hygiene-compatible macro system. We also discuss a proof-of-concept implementation in Common Lisp and give historical notes.
The code is in Figure 1, on page 10 of the PDF (p. 280 in the actual journal).


Great read, thanks!


sure, disregard that one word and suddenly it all falls apart. what is homoiconicity anyways? doesn't matter, you say?

it is the feature of the code being represented the same way as the data structures it manipulates. i know, what a completely unnecessary and unimportant point. now let's all forget this and turn our attention back to the compiler, it wants you to fix those bugs now... back to work.


> it is the feature of the code being represented the same way as the data structures it manipulates.

Every language can do this. You just don't get QUOTE. But any compiler can implement QUOTE.


Maybe? Many languages can possibly expose an AST abstraction that you can use to achieve the same results, but you will have to resort to a new syntax to make that work. Compare with https://taeric.github.io/CodeAsData.html, where you can take the normal syntax of a function, but use it to get something else.

That is to say, the "quote" form in lisp is far more than just a string template, which is what it often turns into in other languages. That, or you build up a bunch of structs that you don't normally see.
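
Concretely (standard Lisp, nothing hypothetical):

    (+ 1 (* 2 3))            ; => 7, evaluated as code
    '(+ 1 (* 2 3))           ; => the list (+ 1 (* 2 3)), plain data
    (second '(+ 1 (* 2 3)))  ; => 1, same notation, now inspectable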


XD yeah. you can implement lisp in any language.


But my macros!


whataboutthem?


> Lisp’s unusual syntax is connected to its expressive power. [...] It’s because of uniformity.

Okay...

> Lisp (and XML) have a uniform syntax.

.. ookay. The rest of the arguments, and the conclusion, can then be equally applied to XML, specifically the:

> I maintain that Common Lisp is the gold standard of DX for macros

Does (s)he also maintain that XML is the gold standard of DX for macros? I guess that'd mean XSLT?

...

Or the article is just trying to be too clever for its own good.


I'm not a Lisp person. But...

XML and Lisp both have uniformity. What you can express in Lisp, you can express in XML. That's formally true.

But once you've seen S-expressions, XML is so verbose that you want to throw up. You could do what Lisp does in XML, but your fingers would fall off from all the typing, and your eyes would glaze over from reading all the verbiage. You'd fall asleep before you got to the point.

It's really hard to overstate how disgustingly ugly XML is after you've seen the same thing done in S-expressions. I knew XML, and then I learned what S-expressions are (though I don't program in Lisp), and the next time I saw XML I was shocked at how ugly, tedious, and verbose it was.

So, yes, they're equivalent, and no, they're not. The verbosity matters, and it kills the usefulness of XML as a programming language. You could do it, but nobody wants to.
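
To make the comparison concrete, here is one small expression in both notations (the XML element names are invented; there is no standard encoding):

    ; S-expression:
    (+ 1 (* 2 3))

    <!-- One possible XML rendering: -->
    <apply fn="+">
      <int>1</int>
      <apply fn="*">
        <int>2</int>
        <int>3</int>
      </apply>
    </apply>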


I can intellectually see the power of Lisp, but in practice I find it's just really hard to read down the road, especially other people's Lisp. The visual cues provided by the "ugliness" of other languages do help reading (I gave examples nearby). Lisp is kind of like new tract homes: they look all the same such that it's easy to get lost. An old neighborhood with funny bent mailboxes and a mish-mash of house and yard styles is just easier to remember.


XML can be parsed and transformed more easily than a language with a more heterogeneous/specific syntax. This is obvious.


XML is absolutely not easily parsed, if you want to do it correctly.

Unless you mean to say you can easily parse XML if you're using an XML parser library, in which case you can just as easily parse any other language using its parser: for example Python using its `ast` module.


Most of the complications for parsing XML come down to entity resolution, and dealing with namespaces. Take those out, and it is back to trivial. That is, surface level parsing of an XML document is somewhat easier than other documents.


Parsed easily? Sorry, have you tried parsing XML?


> I guess that'd mean XSLT?

XSLT is amazing. And XSLT scripts are really hard to read - just like Lisp is (at least, it is to me).


The idea behind XSLT (match/patch) is amazing.

Xslt itself is complete trash.

- developing an entirely new language in XML was a meh idea, but worse, the language was severely lacking, making it difficult to use unless you were allowed to extend it via custom functions and elements

- the template-as-transform and template-as-function confusion was completely unnecessary and made the language much harder to learn and grok than it needed to be

- but the worst mistake by far was the implicitness and confusion of `.`: the thing is verbose as all hell, yet the implicit passing and transformation of the context was completely unwarranted, not unlike Perl's magical side effects; unless you're deep in the sauce it makes the entire thing a lot worse, with even less payoff given the general verbosity of XSLT and XML (in Perl all the magic at least provides amazingly terse CLI commands, it's just a shame people then used that in Perl files)


XSLT is a lot harder to use than Lisp.

Trying to debug some XSLT right now; every implementation processes the script differently. I have reached the point of just declaring that it is broken and using something else.

I should be writing Common Lisp, I get paid for that.


> Trying to debug some XSLT right now

Heh! I haven't touched XSLT for decades. I can't remember it enough to discuss its strengths and weaknesses; all I can remember (weaknesses) was that it dealt badly with namespaces. I think it was invented before XML namespaces came along, and the terseness of XSLT became a horrible mess in the face of namespaces.

I remember being told in about 1996 that XML was the future; the guy that told me was my boss, who was a manager, not a techie. We worked for a document company, and in that sense I think he was right. So I put a fair bit of effort into learning to use XSLT to translate XML data notation into HTML presentation notation.

XML never made the grade as a document notation, and literally nobody I knew wanted to know anything about XSLT.

But yeah, XSLT was a lot like regular expressions - pretty much a write-only notation. Hmm - I just realised that I'm using the past tense, as if it was my dead uncle or something.


[Sorry, commenting to self]

I remember now. XSLT was indeed crap, all XML syntactic sugar. But the "XSLT" expressions were XPath, which existed before XSLT came along, and it's XPath that had no support for namespaces. And it's XPath that was concise, and resembled regexes, and shared with regexes the write-only property.

[Edit] So it's XPath that I think was amazing. Unfortunately it was bound to XML, which wasn't amazing.


As a backseat lisper, I wondered, when cakelisp hit the front page, if what makes one lisp different from another really only is “the standard library”, because the syntax is so barebones, there really isn’t anything to play around with.

And then cakelisp turned out to not really be a lisp according to the author.


Semantics. CL is extremely dynamic and has separate symbol namespaces, Scheme has more rigid types, Clojure has a lot of stuff built around immutability and replaced the general use of lists with arrays, Fennel has lua types and semantics, Janet is very small and portable, and adds a lot of small nice things.

Each of these feels completely different to program in, and it's mostly not about the standard library.


Aside but related: Borretti's Austral systems programming language looks interesting. Early stage. I just read the one example so far, but the syntax looks nice and clean to me.

https://github.com/austral/austral


There are some nice Lisp related comics on xkcd, e.g. https://xkcd.com/224/ and https://xkcd.com/297/.


I expected yet another tutorial on how to implement lisp in 100 lines of code.



