How Lisp Became God's Own Programming Language (2018) (twobithistory.org)
151 points by Qaphqa on May 13, 2020 | 117 comments


Gerald Sussman, co-inventor of Scheme and author of SICP, was my undergraduate advisor. The last several times I visited his office, he was usually with Jack Wisdom either programming or deep in thought or discussion about differential geometry and the differential geometry Scheme library they were writing. One time when he wasn't so occupied, I brought up SICP, and asked if he was aware that a lot of people think of reading the book as a sort of magical, enlightening experience. He said, "Yes, I'm aware." I asked if he had any idea why. He said, "The main reason is that it tells a good story. It also has a complete, coherent narrative."


Am I the only person who read the whole of SICP and didn't feel enlightened in the least? I felt far more enlightened when I read the whole of John Baez's This Week's Finds. Maybe it's a book you need to read when you're younger.


Curious: did you complete all the exercises in the book? Not doubting you in any way, just wondering if that’s a possibility. I know I’ve gone through books without doing the work, while on others I have done the work — and it’s usually a pretty different experience.

But, not everyone is gonna connect with everything, regardless.


I think the impact it has depends on what you're doing at the time, what languages you've seen. For me, I'd done lots of assembly, C, Pascal, and Basic, so it was super fun. 20 years later I re-read parts and remembered the excitement, but, in the intervening years, had used Ruby and Python, and noticed Scheme/Lisp ideas had been imitated more widely, and it felt, in that 2013 context, less invigorating to reread SICP.


It probably depends on if you already have tried all the stuff in the book. I was not young when I read it and I did find it quite enlightening, but I had never tried implementing an interpreter or a logic language before.


I love John Baez's writing but have not read any set of his posts as a whole. Are there threads that develop through time and build on one another?

I wouldn't say that I felt enlightened by SICP, but I was and still am excited about what it covers. A couple of aspects of it that excite me, recursion and homoiconicity (the property of a programming language whose programs look like a data structure in the language), are, in my mind, related to many other topics that I had encountered before SICP, such as (in no particular order): paradoxes such as the chicken-and-egg problem, self-replication, feedback mechanisms, quines, fractals, philosophical inquiry into ontology and consciousness, the anthropic principle, Russell's paradox, Gödel's incompleteness theorems, the halting problem and universal Turing machines, and the list goes on. I initially included Douglas Hofstadter's strange loop in the list, but I think he would say that all of these are tied together by the concept of strange loops.

For a while, I was disappointed with Sussman's answer to why people feel enlightened by SICP. I think I was hoping that he'd fill me in on the secret or give me some clues. I felt like I /almost/ knew why people felt enlightened by it, but couldn't quite put my finger on it. I think what I was really hoping for was an explanation of why SICP and all of these related topics are so exciting to me, or that he'd share a similar awe or experience and add another piece to the puzzle. Currently, I find a lot of meaning in his statement, "it tells a good story," whether it's the meaning he intended to express or not. The topics excite me because I find relations between them. Finding relations between them is a story that I have created, and it's exciting because the story is far-reaching, and I don't know what character might join the web of relations next...it could seemingly be anything!

I think many people feel empowered and enlightened by SICP as a stand-alone entity because the meta-circular evaluator pulls back the curtain on some of the magic behind programming and computing. The exercises show how computers can touch a wide range of topics and can both be molded as a mind sees fit and mold the mind in return. Instead of being given a labyrinth of a language created by someone else to learn in an intro programming course, and having to spend a lot of time learning all the syntax, types, tools, etc., Sussman uses a very simple Scheme and shows how to create from scratch a plethora of things encountered in other languages that are useful across many domains in a very flexible way. After accomplishing so much, the reader even creates the magical thing they were given in the beginning: eval. In our world of computing, doing that without much difficulty is empowering and enlightening.


I have never minded using a so-called niche language. Since 1982 I have been fortunate enough to be paid for a good fraction of my time to use Common Lisp - I consider this to be largely good luck, and I am grateful for how things turned out. I also like Scheme, but literally no one has ever paid me for Scheme development. In the USA, we have a saying that "you dance with the person you go to the party with." My dance partner has been Common Lisp.

I feel like I am still a student. I started reading Let Over Lambda (a reference to closures) a few months ago, which has reinforced my realization that there is so much about a programming language I have used for 38 years that I still want to explore.

All that said, I have often totally enjoyed building systems with C/C++, Java, Python, Prolog, etc. Designing and writing code can be fun in any language.


Didn't you work at CCSO back when Prof. Kaplan was just down the hall (unless you were in another building)? Have you read Guy Steele's original papers on Scheme? They're actually awesome.


No, that was not me. I do like Scheme (and a long time ago I wrote a Scheme book for Springer-Verlag), but I was just saying that I haven't used the language professionally.


The version I heard was "you dance with who brung ya."


Reading "Lisp 1.5 Programmer's Manual" when I was a post-graduate student at Waterloo (around 1977) was a revelation to me. In particular, "Appendix B: The Lisp Interpreter" gives a version of the Lisp interpreter in just 39 lines of code! (The appendix includes notes, and is 3 pages long.) I remember using this code to figure out how Lisp evaluated recursive functions. Until this point in time I had programmed in Fortran, IBM 360 Assembler, Cobol and Pascal. They all required much more documentation, and, in many cases, experiments to figure out what would actually happen in certain cases.

I wish the idea of a definitive high-level semantic guide had become a thing. SwiftUI, for example, seems to be wonderful, but learning it, as far as I can tell, requires watching hours of talks or working through many tutorials. What a contrast with McCarthy et al.'s 1962 book.


I'm surprised that this didn't bring up Paul Graham's "Blub" concept, in which non-Lisp languages are thought to be objectively less powerful than the universal language of Lisp. That's been largely to blame for the mystical reverence of Lisp in the 21st century. While I'm a Lisp fan myself and agree that it is more powerful than a lot of mainstream languages, the idea that it is the "most powerful" blinds a lot of hard-core Lisp devotees to things like Haskell that are worth exploring as well.


The "Blub" parable is really clever because it says that when other people doesn't use Lisp, it is simply because they are incapable of understanding its power - not because of any practical or technical reason to chose another language. So any argument the "Blub" programmers might use to justify "Blub" is automatically invalid.

Of course it can be used for any non-mainstream language, and I have seen it used for Haskell, where Lisp is the "Blub" language.


Hum... The Blub parable is really clever because it surfaces the very real problem that a programmer who doesn't know a language cannot predict the benefits or problems of using that language.

I don't see how people (you're the third I've seen here) get the impression it's a Lisp-only comparison. PG very clearly says he has no idea which language should be on top, or even whether there is a top at all. And even if he didn't, that would just make him a practical example of the paradox, and wouldn't make the idea any less valid.


In context (http://www.paulgraham.com/avg.html) he specifically states that he thinks Lisp is at the top of the power continuum. The point of the Blub parable is to explain why Lisp isn't that popular when it is so amazing.


Ok, after reading it again, "very clear" was an overstatement, but there is this note:

> [4] Note to nerds: or possibly a lattice, narrowing toward the top; it's not the shape that matters here but the idea that there is at least a partial order.

And nowhere does he say Lisp is the end-all best language. At most he compares it with the other languages his competitors used in the context of Viaweb, which, by the bottom of the article, means C++, Java, Perl, and Python.

He does go on to argue that any language more powerful than Lisp would be Lisp, because macros work that way. If you don't notice that he is simplifying things into a power continuum to make a point, that can mislead you into thinking the message is that Lisp is the most powerful language. But he puts some effort into making the message language-agnostic.


Is Forth Lisp? Given that you can control the reader, everything is possible syntactically, and semantics is entirely what you make it.


I would say it fits his description (but well, I'm also quick to dismiss it all because ergonomics matter), so it's at least equivalent in power. I have no idea what he would say, I imagine a serious lisper would be offended by the suggestion (because, well, ergonomics matter), but would take a while to even understand why.


> I imagine a serious lisper would be offended by the suggestion (because, well, ergonomics matter)

Curiously, it did come up in the context of ergonomics in another discussion thread:

https://news.ycombinator.com/item?id=23178953


I’ve had this recently. Was writing Python but missed the ease of concurrent functional programming. Used F# but missed type classes. Used Haskell but missed dependent types. Use Idris but miss the build environment of Python.


Yes, once you learn enough languages, I feel like you just find yourself constantly wishing you had aspects of another language, pretty much regardless of what language you're using. Sometimes it's directly related to the language, sometimes it's something like the ecosystem surrounding it.

This is why I'm wary of "right tool for the job" when it comes to languages. In my experience usually there isn't a singular obvious right language. Maybe one is 35% right, another is 38% right, and the golden ticket language is actually just 45% right. And sometimes you won't really know until you're halfway through the project.

If you wanted a 100% singular obvious correct language, you would have to make a custom language with traits from a dozen different ones. But in the real world, the differences between languages you can actually choose from end up being not that large.


I think of languages as a multi-dimensional tree, with branches extending in different directions. I think that the trick is to figure out what the yak-shaving aspects of the project are going to be (which you can think of as a vector), and picking the language that goes the furthest in the direction of that vector (and thereby does the most to minimize the yak-shaving). This requires that you be able to fairly accurately determine what the yak-shaving will be up front (which can be problematic).


> Used Haskell but missed dependent types.

They're coming :)


Oooh is there a link to the work/who’s working on it?


https://gitlab.haskell.org/ghc/ghc/-/wikis/dependent-haskell

^ this has a bunch of info + external links. Richard Eisenberg is the one owning it afaiu.

I think there's also a GHC branch somewhere but idr its name.


I think the problem is looking through the prism of the most "powerful" or "expressive" language and ignoring everything around it. Go and Java programmers don't use their languages over Lisp because they are masochists who enjoy the exercise of programming in a less expressive language. They choose their languages for ease of use, large healthy ecosystems (compilers and libraries), and excellent IDE support.

I sometimes see similarities in the D crowd. Because D has such advanced metaprogramming capabilities, it also suffers from parts of the Lisp curse. Too often, any criticism of D or comparison is rebuked with "but you can do it in D using this template magic and there's a package that partially implements it".


I'm not saying your first sentence is wrong, but your second sentence is missing something fundamental:

1. "Ease of use" describes Go reasonably well (anyone who has used a compiled language can figure out Go tooling in minutes), but I've never found it to be true for Java. Whichever you think is better, I don't think there is a case for Maven or Gradle being an order-of-magnitude easier to use than ASDF.

2. The "large healthy ecosystem" from the libraries point of view is a silly argument for Go, since 10 years ago Lisp clearly had the larger ecosystem. Something is fundamentally different (socially, technically, or otherwise) from Lisp to Go that Go could grow the library ecosystem it has from zero in a decade, while Common Lisp could not in 25 years.

3. The "large healthy ecosystem" from the compiler point of view is silly; Lisp has 2 healthy commercial implementations (Lispworks, Allegro), 3 very actively maintained open-source implementations (sbcl, ccl, ecl), plus several other "somewhat maintained" implementations. Thats almost Java plus Go put together.

4. It's funny you mention IDE support, because tooling in general, and SLIME specifically is what keeps me using Lisp. Java actually has fairly solid IDE support, but outside of the Smalltalk world, I've never seen anything even close to SLIME in terms of being a useful IDE.


I think I read this a while ago and tried to learn Lisp (I didn't fully understand macros, only superficially). Anyway.

I think there are two kinds of 'powerful'. Yes, I can see that the macro system is powerful. But in a sense the language seems 'overqualified but underskilled'. Let me explain: I basically need arrays for my work, array functions, and Fourier transforms (FFT). I use Python, and I think it has much higher productivity than Lisp in these fields (correct me if I am wrong).
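For concreteness, the kind of workflow meant here is a few lines of Python with NumPy (assuming NumPy is installed; the signal and numbers are made up for illustration):

```python
import numpy as np

fs = 1000                                  # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)                # one second of samples
signal = np.sin(2 * np.pi * 50 * t)        # a 50 Hz sinusoid

spectrum = np.fft.rfft(signal)             # real-input FFT
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
peak = freqs[np.argmax(np.abs(spectrum))]  # dominant frequency
print(peak)                                # → 50.0
```

That it takes no setup beyond `import numpy` is the productivity point being made.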

Also, while the macro system easily allows (and probably requires) you to write your own DSL, this productivity boost does not scale easily if you have a lot of people working on the same codebase.

For Paul it didn't matter: it was only two people, and languages with comprehensive libraries didn't exist at that time. So if you have to build everything yourself (alone), then Lisp is probably superior.


For your first point: Yes, good-quality domain-specific libraries go very far in allowing one to be productive. Lisp does not offer a good, general-purpose, integrated numerical computing environment (plotting, scientific functions, etc.) and as such won't be good for belting out scientific code. If you like DIY, then Lisp is wonderful to write scientific code in, because it can compile to extremely fast code.

For FFTs in particular, Common Lisp has very good support for them [1, 2, 3]. For matrix arithmetic, there’s an up-and-coming library called MAGICL [4].

As for DSLs, there’s no problem using these on a team. Their utility does scale if the DSL is actually solving a problem.

[1] NAPA-FFT3 https://github.com/pkhuong/Napa-FFT3

[2] BORDEAUX-FFT https://github.com/ahefner/bordeaux-fft

[3] Patrick Stein’s FFT https://github.com/nklein/FFT

[4] MAGICL https://github.com/rigetti/magicl


Thanks. Just to clarify the DSL remark: DSLs work, of course, but usually it is a two-tier approach. First the DSL is designed and implemented, then the (other) team works with it on the main application. So far it has looked to me (especially from Paul Graham's article) as if in Lisp these two phases are usually mixed, but I might misunderstand this.


That's really a library problem. On a cursory glance it seems Racket supports DFTs.


Isn't the whole point of Lisp to make everything a library problem?


I would have thought God would write binaries directly in machine code. The only purpose of a language is to allow creatures of limited intelligence to create abstractions that hide complexity and allow complex problems to be more easily reasoned about.

Presumably God does not need to use abstractions, and can reason perfectly with an infinite number of variables, so a programming language would just prevent an omniscient being from creating as perfect a program as it otherwise could (as any given programming language doesn't let you create arbitrary binaries).


Machine code is an abstraction, too.

In a Judeo-Christian context, God seems to operate in very abstract terms. "Let there be light" is a highly abstract instruction.

In some other religions and mythologies, there isn't a single God giving the instructions from outside, so it seems difficult to make the comparison there.

Abstraction is like a lever, and, by first-order logic, there is no way to avoid using an abstraction whenever we communicate, whether internally via vocalized thoughts, via HN, or through some really old books. Maybe that's why humans want to believe that abstraction is so powerful. Thankfully, we're not totally wrong.

But thank you for making me imagine a world where the "Let there be light" statement has been meticulously explained in as much detail as possible.

"Then, He realized that adding an extra electron to Hydrogen was not such a good idea; the entire universe shattered, and, after a brief moment of embarrassment, he comforted himself with the fact that no-one will ever know of his folly.

He continued calculating the correct speed constants for a particle he made called a photon, which wasn't exactly a particle, but it was small and didn't carry a lot of weight, it was everywhere, and it was mostly directional -- so he figured it might be useful for some sort of massively parallel input apparatus, and his creations could use it to understand all sorts of things about their environment and themselves. Eventually, humans would suspect that light is a wave, too; but that wasn't quite right, either. God made it difficult to figure out for copyright protection reasons, but here's a hint, and get your notebooks out: ..."


> "Let there be light" is a highly abstract instruction.

The more I think about this, the more I realise it might be the most abstract instruction. It's actually kind of beautiful.


I once worked in a company that used a similar approach. So, for example, the "chief architect" would file a bug in the tracker titled, "Product X does not exist". It would then be up to the engineers to "fix" it by creating the product.

No wonder Satan rebelled. ~


That itch you're feeling is the itch I feel when I write in a LISP. Other programming languages make up too many of their own rules and get in the way.

But those are just feelings, and if you forget that, you'll bring yourself into a manic state. heh.


In python you can do

  pip install light
and it installs all the requisite dependencies. Awkwardly, god runs as root and doesn't bother with virtual environments, so there might be some bugs due to version creep


The only naturally-occurring programs we are aware of are written in genetic code and the way in which they run is arguably not even Turing-complete. Recent papers have suggested ways to construct a Turing-complete system using DNA but it seems that such a system doesn't exist in the real world.

So yes, God's apparent method is writing self-modifying spaghetti code straight to the metal.


Perhaps the universe is a LISP machine that interprets His code directly? No need to assume x86 derivatives when dealing with the Infinite.


If we're discussing what a divine being might write the universe in, my vote is not for direct code. Instead, imagine writing something as immense as the universe, you'd want the universe to emerge from simple rules such as basic IFS [1], L-Systems [2], or multi-dimensional cellular automata [3].

There are actually some preliminary research findings giving evidence for cellular automata underpinning the creation of the universe, as part of the Wolfram Physics Project [4].

[1] https://www.stsci.edu/~lbradley/seminar/ifs.html

[2] https://web.cs.wpi.edu/~matt/courses/cs563/talks/cbyrd/pres1...

[3] https://mathworld.wolfram.com/ElementaryCellularAutomaton.ht...

[4] https://www.wolframphysics.org/


The LISP machine now permits LISP programmers to abandon bra and fig-leaf.


"In the beginning was the Word, and the Word was with God, and the Word was God."

Clearly, the universe is written in Forth.

Or rather a dialect of it, which has only one fundamental word, and everything else is implemented in terms of that word. How exactly is it done? That's the ineffable part.


You're thinking about programs, not computation.


I have heard for decades how Lisp is transformative, how just having learned it, even if you leave, you will never look at things the same way again. Like having served in the Armed Forces.

Every so often I get interested in Lisp, but I always run up against the same conflicts. I look for something that I can run in Windows that has a reasonable set of libraries and then immediately stumble upon the Crusades. You know, the religious wars you saw with emacs vs vi, wars that used to be fought over various Linux distros and window managers, that you will now see about different flavors of agile or whatever. I am quite sure that there are religious wars being fought in the territories of Javascript frameworks that I've never heard of. These wars so often seem to leave the territories barren, the original objectives cloudy, and the participants scarred. What was this good for, again?

Anyway, every time I encounter these things I end up asking myself if I have the knowledge to pick a side in whatever war and if joining up is going to actually provide a solution to the problems I wanted to solve using programming, decide I am neither fit nor armed, and back slowly out of the room.


I dunno, Lisp is OK. I enjoyed my time with it. The problem with it is, while it may give feelings of transcendent power to the programmer, its real world deployed results are rather lacking, compared to such boring languages as golang, java and so on; or even more obscure ones like Ada or (trolling hard now, but true) Forth. From this alone, it is abundantly obvious its advocates are overstating its utility in solving problems.

The most obvious products of the Lisp world are computer algebra systems such as Reduce, Axiom and Macsyma, and these are all very old pieces of software, rather than something new and interesting used by millions. In fact, you might say the purpose of Lisp is to write the kinds of custom interpreter/compiler expert system/symbolic manipulation systems that people used to think were going to be "AI" in the 1970s.

Sure, our host made his fortune developing in Lisp; he might have done so using some other language. The most obvious lisp projects today are people .... developing new lisp dialects. I'm sure there are plenty of profitable projects in Lisp, just as there are in Delphi, APL and other older non-mainstream languages. But if Lisp were really all that amazing for building useful things, considering the number of people with a smattering of it, it would be used more often. Or at least you'd see more real world results from the Lisp community than you do.

Anyway, pick one up; it's a lot of fun, and it does change your views as to what is possible in terms of communicating with your computer.


I'm saddened by your story. Our programming communities can be more welcoming.

That said, I think these debates are necessary. Fixing problems starts with deciding what the problems actually are. The problem isn't that the debates are happening, it's that people like you are getting dragged into them.

You aren't obligated to participate in debates. Nor do you have to use the absolute best version of Lisp to get most of the benefits.

My advice is to pick up a copy of MIT Scheme (works on Windows) and then work through Structure and Interpretation of Computer Programs (SICP), and don't read anything else about Lisp until you're done, because you don't need to.

MIT Scheme isn't the best Lisp. It's just the Lisp that's used in SICP. You won't win any debates arguing that MIT Scheme is the best Lisp. You might even hear people say MIT Scheme isn't even a Lisp. I agree with some of that, but those debates are irrelevant to your learning.

Feel free to hit me up with any questions. I certainly have opinions on all the holy wars, but I won't bring them up, I'll just help you with the task you're working on. :) I'm not a Lisp expert by any means, but I've worked through most of SICP.


I have looked over the site once a long time ago, when I had a copy of SICP in hand. I just looked now.

What I do not see is a robust set of libraries that can help me accomplish the solving of real-world problems. As mercenary as it sounds, I program to solve problems my employer has in exchange for money. I solve problems that people have, rather than problems that books abstractly propose. While Lisp or whatever dialect might be lovely, it may as well be Logo for practical tasks. I do not want to re-implement JSON. I do not want to try to write my own ODBC. I need something beyond a language that lets me solve the problems written in a book that is divorced from real-world stuff, and that has, for the past couple of decades, meant libraries.

"The Lisp Curse" is a pretty good explanation of why I won't see those libraries and the situation hasn't changed, that I can see, since I first read it.

At the end of the day, if I want to learn a language, I want to have done it for more than the sake of having said that I have climbed that particular mountain. I need something up top that is valuable. Climbing it has to have real-world applicability to me.

Can I use Lisp to interact with these GIS formats and solve real-world problems? Not without building my own libraries, and so on. This is why I have liked the war metaphor: all of these folks skirmishing when they could be building factories.

I am not asking for Lisp to be Python or Perl or whatever. But it should have a great standard library. Where is this?


"What I do not see is a robust set of libraries that can help me accomplish the solving of real-world problems."

I think the suggestion wasn't to give you the One True Answer to which Lisp to use. The purpose of suggesting working through the SICP book is to give you in concentrated, curated form the insights that Lisp is supposed to bring, whereupon you should turn around and bring those insights back to whatever normal programming world you inhabit. To the extent it is divorced from real world stuff, yeah, that's on purpose, and the entire point of the recommendation of SICP.

Fortunately, the world has changed since SICP was written. At the time, there was a much larger barrier between Lisp and the "real computing world". While by no means do all languages look like Lisp now, there has been a lot of seepage, and now there are plenty of languages where you can bring the stuff in SICP into the language you use day-to-day.

The idea is this: You could learn a new language, a couple of frameworks, half-a-dozen libraries, fight through bugs in all of the above in some immature cutting edge library, and also fight through a lot of accidental complexity because you accidentally selected some task that the weird new language is not very good at, only to arrive after all of this with some new insights about how computation works and what languages can do after a year or two. Or, this suggestion is, learn a very small new language and read a guide book, get the concentrated insights in a few months at most, and then continue using the frameworks, libraries, and experience you already have.

(Personally, I recommend SICP as the perfect companion to any self-taught programmer. It is almost laser focused on the sorts of things that the self-taught programmer will find hardest to pick up on their own. Finish it and you really will be able to code circles around most college grads, beating them both practically and theoretically.)


I did this once with Prolog and did not come away with the benefits espoused. I was told that it would really change how I thought about things and so forth. That didn't happen. I didn't get anything out of it that I could bring elsewhere. I fear the same result after a similar investment in a similar situation.


*shrug* That's the risk you take whenever you learn anything new--maybe it won't be useful.

The alternative is, of course, never trying anything new or learning anything ever again. Your call!


Thanks for your comment; you really grokked what I was trying to say previously and said it better than I did.


Racket and Common Lisp probably both fit the bill for what you're saying. Racket is more "batteries included", while Common Lisp has a wider variety of third-party libraries. Both have mature JSON implementations.

That said, I think you're underestimating the value of learning Lisp with no intention of using it to solve real-world problems. There are some big benefits:

1. It shows you a different way to structure programs, which I think is simpler and more powerful than OO.

2. It's a lingua franca of academia, which allows you to read a lot of papers you wouldn't otherwise be able to read.

3. For better or for worse, Lisp is a secret handshake that will get you into a lot of clubs. I've gotten more than one full stack Python/Django/JS job offer after interviews where all we talked about was the Lisp implementation on my GitHub. And while you definitely don't need Lisp experience to write full stack Python/Django/JS, there are jobs which require FP experience for good reason, even if you aren't writing in a very FP-focused language, especially if you're solving hard problems.


Would Clojure fit your criteria? There are already a good chunk of Clojure libraries, and you can easily use Java libraries via interop, so all the libraries on Maven Central are at your disposal.

It was made to be a pragmatic Lisp for real-world use, and the syntax is somewhat modernized to make it a bit more readable and user-friendly. Vectors and maps look like they do in other languages. It also uses immutable data structures by default, making concurrency and parallel execution very easy.

It also has reach: it runs on the JVM, in the browser (ClojureScript), as native binaries (via GraalVM's native-image), and even as short-lived scripts via a number of Clojure interpreters like Babashka, Joker, and others.

I've found Clojure to be a very enjoyable language. Granted, I have not really worked with other Lisps.


If you're a Java shop you can sometimes use Clojure since it runs on the JVM. It has 100% access to Java classes which gives you a huge existing ecosystem despite the language being fairly niche.

From what I gathered when looking around some people don't consider it a Lisp, but it never bothered me much.


I’d advise using Racket to get up and going.


I did a lot of lisp bouncing, and am still what I’d call a mediocre novice. However the two lisps I found the most traction with were Clojure and Racket.

With Clojure, I liked the data structures, availability of libraries, and syntactic sugar. With Racket I feel like I got something almost as comfortable as Clojure, but without the baggage of the JVM (yes, I know the JVM is amazing—I still find it a pain).

The Racket documentation is pretty good. The guide, the reference, and the How to Design Programs introductory textbook all contribute something. I dabbled in How to Design Programs way back when I was learning R, and it certainly shaped the way I approached R... even infected my Python thinking a little.


Agreed, specifically the SICP #lang: https://docs.racket-lang.org/sicp-manual/SICP_Language.html


Ignore the war. Just pick a lisp, any lisp, and then learn it. It's not necessary to pick a side nor does it really matter which lisp you pick. You can pick a side later if you feel the religious fervor sweeping you up. But really Lisp is sufficiently different from pretty much any language you'll have encountered that it's worth taking a look.


No, I do not worry about feeling fervor. I tend to be intensely repelled whenever I detect that sort of thing in others.

I must select a side because I have to install something. The problem I have had with this approach is that, through the clouds of dust and smoke, through the constant shelling and the screams of the dying, I cannot seem to discern which side has that bare minimum set of properties that I want. Namely, can run on Windows without it being a ridiculous exercise in hackery, comes with a robust set of libraries that I can end up using it to solve actual problems I might eventually receive money for solving, and has enough study material to lead me through it.

None of these are negotiable to me and as far as I can tell, these objectives haven't even been considered by the various sides in this forever war. And so I will do what I always have done, which is take a peek, observe the chaos, and then wait another five years to see if anything has gotten any better.

I did the whole "enlightenment" thing once with Prolog. It ended up not having any impact on anything I wrote. The enlightenment wasn't portable. I could not use it to solve actual problems I had. I could use it to solve toy problems in a toy world, and maybe there are spots where it could have overlapped with a part of the world I wasn't in, but I wasn't going to migrate just to find a way to get paid using Prolog. At the end of the day, I was still chopping wood and carrying water.


>Namely, can run on Windows without it being a ridiculous exercise in hackery, comes with a robust set of libraries that I can end up using it to solve actual problems I might eventually receive money for solving, and has enough study material to lead me through it.

Racket seems to tick all of your boxes, IMO. Also maybe Clojure. I've gotten both to work without much problem on Windows, both have an ecosystem and plenty of tutorials.


Go with Emacs + Slime + SBCL, they're very stable on windows, then install Quicklisp: https://www.quicklisp.org/beta and you're good to go.


If the dialect of Lisp picked does not have full-fledged macros, then it's not really sufficiently different from many other languages.


All you really need is quoting and eval to get something close enough to macros. Most Lisp dialects will have that much.


If you consider that to be sufficient, then what difference is there compared to any other language with eval?


It's about ergonomics. Technically, if I'm willing to grab a library to parse the AST and hook into the compiler/interpreter backend, then I can replicate what Lisp lets me do in any language that exists. But there is a pretty wide gap between how easy that is in, say, C++ as opposed to Lisp. Some languages get a lot closer than others, but I would argue that, with the exception of Forth or perhaps Tcl, none of them make it anywhere near as easy as Lisp does.


The line between "language" and "library" can be very blurry - it certainly is in case of Lisp.

Suppose we take something like Python, which has a standard library module to parse ASTs, modify them, and eval them. It might look a tad more ugly than Lisp (esp. with quoting and unquoting), but I don't see why this should be considered a fundamental difference, enough so to justify its use as the sole determining factor of a "true Lisp".
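As a sketch of the "parse, modify, eval" workflow the GP describes, here is what it looks like with Python's standard `ast` module (the `SwapAddMult` transformer is purely illustrative — note how much ceremony is involved compared to a Lisp quasiquote):

```python
import ast

# Parse source text into an AST, rewrite it, then compile and evaluate
# the result -- roughly what a Lisp macro does, but via Python's ast module.
tree = ast.parse("1 + 2 * 3", mode="eval")

class SwapAddMult(ast.NodeTransformer):
    """Toy 'macro': rewrite every addition into a multiplication."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # transform children first
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        return node

rewritten = ast.fix_missing_locations(SwapAddMult().visit(tree))
result = eval(compile(rewritten, "<macro>", "eval"))
print(result)  # (1 * (2 * 3)) = 6
```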

And besides, what about popular languages that do have full-fledged macros? Say, Rust or D.


The point is still about ergonomics, though. The AST for Rust or D is complicated enough that you typically won't do the kind of thing you might do in a Lisp, just due to the startup cost. Lisp, because of its extremely limited syntax, makes it considerably easier. The sorts of things you'll feel free and capable of doing in a Lisp macro are ergonomically harder to do in a Rust or D macro.

You might argue, quite convincingly, that Rust and D have valuable ergonomic barriers to doing some of those things, since macros have a non-trivial cost in your codebase in terms of complexity and readability. But the ergonomics of a macro in Lisp are pretty close to optimal for ease of use.


I thought HolyC was God's own programming language.


I would have thought that the Lambda Calculus has a better claim to be 'God's Own Programming Language'.

Apparently McCarthy was aware of, but had not studied, the LC. One quote from the article stuck out:

"McCarthy invented an alternative, the “true” conditional expression, which returns sub-expression A if the supplied test succeeds and sub-expression B if the supplied test fails and which also only evaluates the sub-expression that actually gets returned."

This is how 'true' and 'false' are encoded in the LC, (\x \y x) and (\x \y y) respectively, and the final sentence indicates lazy evaluation.
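In Python terms (assuming nothing beyond plain lambdas; the names are just illustrative), that encoding and its lazy conditional look like:

```python
# Church booleans: 'true' selects its first argument, 'false' its second.
TRUE  = lambda x: lambda y: x
FALSE = lambda x: lambda y: y

# A conditional is then just application. Passing thunks (zero-argument
# lambdas) and calling the selected one gives the laziness described above:
# only the chosen branch is ever evaluated.
def cond(test, then_thunk, else_thunk):
    return test(then_thunk)(else_thunk)()

print(cond(TRUE,  lambda: "A", lambda: 1 // 0))  # A  (the 1//0 branch never runs)
print(cond(FALSE, lambda: 1 // 0, lambda: "B"))  # B
```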

The early (wrong) choice of dynamic over static (lexical) binding, since corrected, suggests the language was far from 'handed down on stone'.

Homoiconicity is very nice, though I suspect that the macros it has enabled are often perhaps too powerful a tool.


It was Einstein who said, “Everything should be made as simple as possible, but not simpler.”

I think the problem with Lisp is that it violated this principle.

It has a lot of strange little constructs. It came from a time when programmers tried to type as little as possible. In doing so, the language adopted all these little quirks. I’m not saying it’s bad, but it’s just different.

Whereas the human mind is a simple graphical machine, and we like to see associations. Like, the usage of an equal sign, to see that we’ve made a variable assignment. Maybe this reaches back to our childhood algebra days, where we associate equivalence with an equal sign. Who knows.

But Lisp did away with all that. It created its own style. It gave us parentheses to enclose our statements, which, to be honest, is actually a nice feature. But it forced us into knowing the specific ordering, sequence, and symbols in order to make a legal statement.

Anyways, I like Lisp, and have always been wanting to use it for something. But not quite sure what.

It’s great for writing short macros in Emacs though. You can write a multi-line function, then compress it back into a one-liner, because of the parentheses. This helps keep your config file short.

It doesn’t really work for video game programming, as it doesn’t seem to have the libraries for it. It’s not as fast as C for speed critical applications.

It kinda lives in that medium realm, where internal business applications can use it for processing that runs uninterrupted for decades. But this is a space where Python excels.

Anyways, one day, I’ll finally create that programming language idea of mine, and it’ll be some fusion of Lisp and Smalltalk, but can run almost as fast as C.


Some implementations (e.g. Common Lisp) have legacy oddities in them for backwards compatibility, but newer implementations like Scheme tend not to. Instead of CAR and CDR they have first and last, for example.

Emacs Lisp is quite slow but adequate for its purpose. You can write very high-speed numeric programs in Lisp — another book by Sussman was on HN the other day and it’s all about physics, all written in Scheme. The fact that code is data allows lots of complex optimizations that are harder or impossible to represent in C.


I think you got this backwards. Common Lisp does have `first` and `rest` as synonyms for `car` and `cdr`. As far as I know, Scheme does not. I believe you have to use `car` and `cdr` there (unless, of course, you define your own synonyms).

I could be wrong about Scheme: My Scheme knowledge is badly outdated, and was always incomplete.


Racket has `first` and `rest`. I just tried it. But I tried an online scheme interpreter and it did not have it.


first and rest are in racket, but racket has many things not in scheme (and doesn't have some things that are.) If you use the r5rs #lang (e.g. `racket -I r5rs`) you'll see that it doesn't have first and rest.

Also, first and rest in Racket aren't synonyms for car and cdr: car and cdr take pairs, while first and rest only take lists. Try this: (car '(1 . 2)) and (first '(1 . 2))


Strange, CL has first, rest, car and cdr.

Scheme R7RS small has only car and cdr.
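A toy Python model of the pair-vs-list distinction discussed above (all names here are illustrative, not any particular Lisp's API):

```python
# A minimal cons-cell model: car/cdr work on any pair, while a
# first/rest-style accessor can insist on a proper list, mirroring
# Racket's behavior described in the thread.
class Cons:
    def __init__(self, car, cdr):
        self.car, self.cdr = car, cdr

def is_proper_list(x):
    # Walk the cdr chain; a proper list ends in the empty list
    # (modeled here as None, standing in for '()).
    while isinstance(x, Cons):
        x = x.cdr
    return x is None

def first(x):
    if not is_proper_list(x):
        raise TypeError("first expects a proper list, not a dotted pair")
    return x.car

pair = Cons(1, 2)              # the dotted pair (1 . 2)
lst  = Cons(1, Cons(2, None))  # the proper list (1 2)
print(pair.car)    # car accepts any pair -> 1
print(first(lst))  # 1
# first(pair) raises, mirroring Racket's error on (first '(1 . 2))
```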


The fractal flowers and recursive roots: The most lovely hack I’ve seen.


"To iterate is human, to recurse is divine"

[not sure who I am quoting]


I’m pretty sure that was guy Steele and he would have written “recur” (like “to occur”)


Lisp has many parallels with Ayahuasca: Both are tough to 'swallow' and not everyone comes out on the other side 'enlightened'.

No doubt about the potency of both though ...


I like the Prolog story "PROably the Language Of God"


LOGO is the Language Of God, and we are His turtles.

    L: Language
    O: Of
    G: God
    O: Only God knows what the last "O" stands for.


A tail recursive acronym!


My first language (circa 1977!)


PROvably the Language of God




The responses I see to Lisp seem to vary on a huge range from idolatry to dismissiveness. It's interesting that Gerald Sussman's own point of view on Lisp seems to be much more moderate - that different programming styles and philosophies suit different domains, and ultimately, you should choose the right tool for the right job. Lisp is flexible in that it does not bind you to any philosophy, and is good as a general tool insofar as there isn't a specialized tool that would fit the problem better.



It is borderline blasphemous to think God can't handle a little syntax and would use barebones parse trees cum s-expressions. Do you see DNA chains cranking around without a higher layer phenotype? You do not. QED.


I thought that was Perl...


Obligatory XKCD: https://xkcd.com/312/


Wait, isn't this the obligatory one in this context? https://xkcd.com/224/


Yes, in the context of the OP. But in the context of someone mentioning Perl, the one that I quoted makes sense.

Here's the Frost poem alluded to: https://www.poetryfoundation.org/poems/44263/fire-and-ice


That would be satan's language.


Perl may be ugly, but it is not evil.

Malbolge (https://en.wikipedia.org/wiki/Malbolge) is evil.

Perl's unofficial mascot is a camel because Perl is a lot like a camel: a beast of burden that requires little and will get you through the desert, but that may not be as pretty as other languages.


You forgot to mention the smell :)


Obligatory xkcd reference: https://xkcd.com/224/


i was wondering what took so long for this to be posted


This, then, is a proof of the multiplicity of gods (there must be at least as many gods as there are Lisps).


Try Clojure, learn properly then we talk...


A lot of people, including some prominent faces in the programming world, have been writing about Lisp as secret sauce, silver bullet, God's language, source of programming enlightenment, yadda yadda; basically a set of mystic-sounding buzzwords that, other than causing some people to indeed try Lisp, have left a lot of people confused and amused by the wording - or just plain angry and disappointed at what I consider to be false marketing.

I can see the benefits of the comments that follow the "any publicity is good publicity" rule. In addition, the more-functional-in-approach Lisp dialects certainly follow the "breaking the long-built mental model" scheme that you mention, and so does Common Lisp, since it mixes programming paradigms (including the functional one) rather freely. Still, after seeing the effects of the aforementioned hyperbolization of Lisp in the long run, I'm not convinced that its execution went well at all.

Lisp is a very good language, but no, sorry, it's not the Magical Silver Bullet of Enlightenment™®© that some people like pg or esr or (to some extent) the author of this post claim it to be.


I think it is a silver bullet of enlightenment of a certain understanding of what programming languages are. It is not a silver bullet for all of your programming tasks or problems.

Most people I know who cast off Lisp are people who read the Wikipedia page, didn’t feel enlightened, then began to complain online about how they were disappointed by their Lisp experience. Or perhaps they went a tad further, got upset by Emacs being unfriendly to setup, and proceeded accordingly.

In the Modern Age (TM), programming Lisp is unlikely to convince you to change your usual dev stack to it. But if learned properly, it will enlighten you on the structure of a language and how syntactic malleability is a powerful abstraction for solving many kinds of problems.

Enlightenment usually comes from realizing that it’s not a feature bolted onto Lisp, but an exposed interplay between many otherwise ordinary aspects of programming languages: syntax, semantics, interpretation, and the runtime. At this point one typically “sees” how this interplay could (and perhaps even opaquely does) play out with other, non-Lisp languages.

These kinds of things could in principle be learned in a compiler course, but compiler courses tend to be extraordinarily opaque as to how such a course would help your day-to-day coding. Lisp provides a visceral, hands-on experience of many (though certainly not all) of the same principles.

If you happen to be the kind of programmer who likes absolute control over your environment, because that helps you work through gnarly problems more efficiently than duct taping a bunch of dependencies together, then you may actually end up switching to Lisp.


> syntactic malleability

this is the key point - not quasi/proto FP

most comments I see that are skeptical of the 'lisp as magic' claim seem to focus on the quasi-FP-ability of lisp, leave out the fact that the lisp family is pretty much unique when it comes to symbolic programming, and then optionally go on to talk about how some typed functional non-symbolic language is a better functional language.

this is ignoring the 'too many parens', 'no market share', and 'doesn't work well in my editor' people.


Syntactic malleability is actually a red herring. The key insight of Lisp is: Lisp code is not text. Lisp code can be serialized as text, and it can be parsed from text, but it is not text, it is S-expressions, and S-expressions are not text, they are data structures, specifically, they are trees of cons cells. And because they are trees of cons cells you can construct them without ever constructing any text, i.e. without any parsing.

Syntactic malleability is a consequence of this property of the language. It is not in and of itself the central idea.


good point..

though arguably if code is data we are somewhat saying the same thing :)

in any event, this unique overall property/combination of properties is often overlooked in these discussions


The "code is data" slogan also misses the point. Text is data, so all code is data, whether or not it's Lisp code. (The only code that isn't data is code that has been compiled down to hardware. In the olden days computers were programmed by plugging wires into plugboards. That code wasn't data. But any code that is rendered in the same medium as the data it processes is data, and nowadays that includes all code.)

What matters is that text is structured fundamentally differently from trees of cons cells. Text is a vector of characters. It is a fundamentally flat data structure. It is well suited for humans to read and write with pencil and paper, or chisels and stone tablets. It is not natively suited for describing hierarchies of abstractions, which, it turns out, is what you want when you're writing code.
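A minimal sketch of the point above, using nested Python tuples to stand in for trees of cons cells: the "program" is constructed directly as a data structure and evaluated without any text or parsing step.

```python
import operator

# Operator table for our tiny s-expression-style language.
OPS = {"+": operator.add, "*": operator.mul, "-": operator.sub}

def evaluate(expr):
    """Evaluate code represented as nested tuples, not text."""
    if not isinstance(expr, tuple):
        return expr  # a literal
    op, *args = expr
    return OPS[op](*(evaluate(a) for a in args))

# Built by constructing data -- no parser ever runs:
program = ("+", 1, ("*", 2, 3))
print(evaluate(program))  # 7
```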


Never forget: The Y-Combinator is an overly complicated hack that is only necessary because the language lacked a simple way for closures to refer to themselves.
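For the curious, here's a sketch in Python of the (eta-expanded, applicative-order) Y combinator next to the trivial alternative a named closure gives you:

```python
# The Y combinator ties the recursive knot for a function that has no
# name to refer to itself by. The inner lambdas are eta-expanded so this
# works in a strict (applicative-order) language like Python.
Y = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# An anonymous factorial: note that no definition refers to itself by name.
fact = Y(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
print(fact(5))  # 120

# With ordinary named closures, the "hack" disappears entirely:
def fact_named(n):
    return 1 if n == 0 else n * fact_named(n - 1)
```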


I tend to joke that lisp is a cult


And this is exactly the result of the aforementioned people actually marketing Lisp as a cult. IMO we can thank these Lisp "evangelists" for greatly contributing to the bad vibe that Lisp dialects have nowadays, along with a few Lisp-using haters such as Erik Naggum who provided the community with as much insight as toxicity.


Erik Naggum has been dead for 11 years. I’m not sure it’s useful to bring up his contributions to the culture if we are discussing what Lisp is today.


He's been dead for a long time, though I've seen his posts being brought up now and then on the Lisp forums. Honestly, I'm thankful that these situations are rare. Most of his Lisp wisdom has been distilled from his other qualities and integrated into many contemporary Lisp libraries. This saves us from the need to dive into the verbatim Naggum posts from the c.l.l archives.


That's your opinion and you're welcome to it. I wouldn't use words like 'us', just say 'me'.

I find considerable value in reading Naggum posts verbatim.


Good idea, thanks. The next time I'll replace "us" with "me, and plenty of others".


I edited my post, hope you do the same.


I cannot - the time is over, so I hope that whoever reads my original post also reads this series of replies.


It maybe had that ethos in its heyday, especially with pg/jwz/c2, but if you for instance join #lisp on Freenode, you’ll find the community is by and large just a bunch of people hacking and collaborating with no evangelical agenda.



