I have a suspicion that outside of the 10 people writing Lisp seriously — shirakumo, stylewarning, lispm, and some others — there are more characters of prose praising Lisp being written than Lisp being written. It's a great language. I really like it and am using it for some projects. I just want people to actually use it rather than talking about using it.
Edit: I’m wrong in the present case! Author has some cool projects on GitHub under themetaschemer. Still though I’m struck by the ratio of lisp project articles vs lisp praising articles hitting HN’s front page.
(I've done a lot of Lisp, for money, and for open source platform&community buildout.)
HN articles about exotic things, not just Lisp, do seem to get upvoted disproportionately.
Also, PG declared Lisp as something you're supposed to think is important or a superpower, and I've wondered whether that also contributes to some upvotes on HN.
But I'm not aware that a single one of the bajillion YC startups used Lisp.
Maybe it's like an admirable religious dogma that a congregation affirms each weekend, and then promptly forgets about for the rest of the week.
My impression is that the high salary is because Clojure is primarily used by small-ish, fairly senior, highly technical teams writing high-impact standalone services. E.g., it's my understanding that Walmart's route planning team writes its service in Clojure.
This actually meshes well with what you'd expect from a lisp being used well. Lisp's power and its downfall is that it lets you express abstraction at whatever skill level you're at with no built-in mechanisms to keep style consistent across large teams. This works really well for small teams of high powered developers.
Lisp is a great tool for learning FP paradigms for application in other languages. IMO, in practice, the metaprogramming that Lisp encourages isn't very good for the understandability of a codebase, which is one of, if not the, most important factors when building applications in a company.
The objective test for whether Lisp metaprogramming contributes to or detracts from the readability of a codebase is to take a sample of the code base and expand the macros, to see whether that is more or less readable.
Macros provide abstractions that often make programs easier to understand. I myself have dozens of use-package macro calls in my Emacs configuration; they hide and standardize the boilerplate I would otherwise have to write out. Nevertheless, I have serious reservations about macros in programming.
Taking a code base and expanding the macros to see whether that is more or less readable is a bit like expanding a C++ class hierarchy into assembly language and concluding that this program’s object oriented design is great because it is more readable than the resulting assembly language.
The real question is whether or not there are better alternatives than macros or metaprogramming in the first place. Functions, especially as found in modern languages[1], have proven to be very useful abstraction mechanisms and can often mitigate the lack of fancy macros.
I don’t want macros banned from Lisp, but I don’t miss them in other programming languages. Overexuberant use of fancy macros in Knuth’s brilliant TeX, in my opinion, contributes to the glacial pace of LaTeX development.
[1] By this I mean functions with circumscribed access to non local variables, provisions for choices of ownership of parameters, static scoping, first class functions, closures, and recursion.
> The real question is whether or not there are better alternatives than macros or metaprogramming in the first place. Functions, especially as found in modern languages[1], have proven to be very useful abstraction mechanisms and can often mitigate the lack of fancy macros.
The only relevant "better" in this debate is whether that would be more readable in those specific cases. People still make macros over functional solutions; e.g. the trick where a macro expands into just a function call to one or more generated lambdas which hold argument material. All macros expand to nothing but functions and special forms.
TeX macros are just preprocessor cruft that is more closely related to C macros than Lisp macros. TeX and LaTeX are fragile mainly because of the semantically impoverished target language which just has global variables and conditionals. LaTeX macros are leaky; use them in the wrong place and they misbehave.
Agreed. Metaprogramming does have its place in languages like Lisp, where it removes boilerplate or hides details that other programmers likely don't need or want to think about. However, it's easy to go too far, which is really what I'm talking about. I think as long as you have first-class and variadic functions you can probably live without macros in ~98% of cases without compromising on readability (as is the case with Python). Alternatively, if you have an easy way to identify, at develop/compile time, exactly what a macro/metaprogram is doing, I think it's OK, but that depends on tooling.
Python acquires new macros in its parser. For instance, it now has pattern-matching macros which weren't there before. A few years ago it got an f'...' reader macro which wasn't there before.
So if you're on an older Python, you can't backport the new syntax; you're forced to upgrade if you have some code which depends on it. In Lisp, if some code we would like to use depends on some newer macros offered by the Lisp dialect we are using, but we cannot move to a newer installation yet, we may be able to backport just the macros into our program.
You're stuck with the ill-designed crap that Python puts out. For instance, I don't agree with pattern matching that assigns to existing variables; from where I'm sitting, it looks like an incompetent clusterfuck. If someone did that in one Lisp program, the entire language would be blamed for allowing that sort of "curse".
so as far as i can tell it's not like a full switch per se but reddit has services written in both python and go.
r2 (old reddit, the reddit API, some parts of new reddit, some of their ML stuff), their monolith, is still written in python.
some services they have in go include some real time services (like the reddit talk stuff seems to be primarily written in go)
reddit kinda seems to have a SOA (not microservices). while there are some codepaths that go through r2 (and indeed this seems to be the case in a lot of cases with regards to old reddit), a lot of newer code uses services.
(note: their engineering blog (/r/RedditEng) calls a lot of their stuff microservices, and indeed they may be using that for some things, but a lot of their stuff has been described as more of a SOA vs microservices.)
someone like /u/ketralnis may have more information as well?
I think Lisp has a sort of cachet to it, "lost secret of the ancients scorned by foolish mortals," that draws a lot of people to talk about it endlessly and speculate about how important it is instead of just... using it. I used to see the same thing in the Plan 9 community where people loved to post on the mailing list about all their grand plans for doing stuff with Plan 9, but they never even installed the goddamn thing! If they had, they'd have realized it has some neat ideas but it's also just a software system, and a rough-around-the-edges research system at that. Similarly, Lisp is just a programming language, and regardless of how many blog posts you read about it, you'll still have to actually make the program in the end.
Lisp is a research language that lucked into great syntax by accident. I use it all the time to help me think through a problem and then implement it in something brain dead like rust so the average programmer can follow along.
From the outside my code bases look 0% lisp, from the inside they are 100% lisp with build artifacts in other languages.
> I use it all the time to help me think through a problem and then implement it in something brain dead like rust so the average programmer can follow along.
Is Rust a brain dead language anyone can follow along with? Did you mean Python?
> From the outside my code bases look 0% lisp, from the inside they are 100% lisp with build artifacts in other languages.
Can you expand on that? You generate code of other language than lisp in lisp?
No. Rust's warmed-over ideas from 40 years ago are just hard enough to make mediocrities think they are cutting edge.
>Can you expand on that? You generate code of other language than lisp in lisp?
I write the real program in Lisp in two weeks, then translate it to a brain dead language over a few months so the average developer can contribute to the code base.
It's rather impossible to get people used to Algol descendants to think about complex programs. It's rather like explaining color to the blind.
"If <FOO> is so amazing, why doesn't everyone use it?" has got to be one of my least favorite questions. Never mind that it's lazy (in a bad way), it presupposes that people automatically adopt the best available technology or whatever, which is obviously false. But I think the thing that bothers me the most is that, asked with a different attitude and intent, it's usually a good question.
E.g. Lisp is so much better than pretty much every other programming language (except maybe APL) that it truly is bizarre and therefore interesting that it doesn't get wider adoption. (I don't even use it, despite having such a near-worshipful attitude towards it.)
In a word, extrapolation. I hope I don't sound too ridiculous, here on Hacker News, but I've kinda devoted my life to computer programming (it's deeper than that, and there's a lot of other stuff going on, but to a first approximation, that's a true statement.) I used to joke that I was reserving a bank of brain cells for that day when it was time to learn Lisp. When that day finally came, I was well prepared and grokked it mildly, but enough to become angry when I saw all the time and energy that has been wasted due to non-use of Lisp. I literally stomped around the house for twenty minutes cussing!
Anyway, I know this is "argument from authority" by some rando on the Internet, so I don't expect you to take it seriously. :)
Erlang is largely a different kind of niche than Haskell and Lisp. It's more focused on distributed and reliable processing than general-purpose work; in effect it's the Actor model taken to the extreme. I'd categorize Haskell and Lisp as more general purpose, but they take different approaches: Lisp the "keep it simple" approach and Haskell the high-theory approach. I've yet to encounter someone who claims expertise in Haskell who isn't obviously lying or delusional, as it is probably the largest language feature-wise. I've been learning it for years but still have yet to find the "target use case" that it is the best tool for. It certainly is worth learning, even if only for the new perspectives on problems it encourages.
At least in my opinion:
Lisp is amazing for its simplicity and homoiconicity and the great powers that come with those.
Erlang is amazing for its approach to distributed computation and reliability.
Haskell is amazing for at least its theory, and probably more I'm not yet aware of.
That you think assembly is fundamental rather than an accident of architecture really shows how undereducated you are in computer science. Lisp isn't great because it's weird; it's great because it lucked into homoiconicity at its birth. It's weird because no other language family can do that.
It's not popular because the majority of programmers are mediocre and will never understand the point of homoiconicity.
Show me homoiconicity used "for real". Sure, it's useful in compilers and interpreters, but how often do people need them in a business application? You will most likely use a library, and/or a syntax designed for serialization instead of programming, like JSON.
I personally vastly prefer Python's syntax over Lisp's, because the parentheses require two buttons pressed (shift + 9) instead of one (tab). That may sound trivial, but it's why I jump to Python instead of a Lisp.
That said, I do suffer from Python problems: the GIL, clunky immutable data structures like pyrsistent, poor support for shared memory for multithreading and so on.
Edit: I just realized in Lisp you could replace the built-in data structures if you wanted, so libraries like pyrsistent would require little change in client code syntax. I guess that's one example of homoiconicity in action.
Can you come up with another one? It is not very often that I find myself wanting to redefine the language I'm using (which comes with its risks: other coders and/or their tooling might find my code hard to follow).
The thing that makes lisp special, IMO, is how simple the syntax is. It makes thinking about certain kinds of problems much easier than in other languages, which can do the same things. As strange as it may sound, the book that most helped me understand Elixir's macros wasn't the book devoted to teaching them, but instead it was this Clojure book: https://pragprog.com/titles/cjclojure/mastering-clojure-macr...
In reality, they work pretty much the same way in the two languages, but due to the syntax, it was easier (at least for me) to grapple with the ideas in a lisp first.
I very rarely write macros, but I sure use them all the time via the web framework and db-wrapper libraries that dominate the Elixir ecosystem and they've been useful for all the "business applications" I've worked on in the past several years.
Running with the json example you don't need a json library in lisp because you'd just dump an s-expression holding the data you want directly. You don't need a library to parse it because it's already in a format that lisp can understand.
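The parent's point leans on Lisp's built-in `read`, but the format is cheap to parse in any language, which is part of the appeal. A hedged sketch of a minimal s-expression reader in Python (atoms kept as strings for simplicity; no escaping or string quoting):

```python
def read_sexp(text):
    """Parse a simple s-expression into nested Python lists, atoms as strings."""
    tokens = text.replace("(", " ( ").replace(")", " ) ").split()

    def parse(i):
        if tokens[i] == "(":
            items, i = [], i + 1
            while tokens[i] != ")":
                item, i = parse(i)
                items.append(item)
            return items, i + 1          # skip the closing paren
        return tokens[i], i + 1          # an atom

    return parse(0)[0]

# Nested data comes back as nested lists, no serialization library needed:
assert read_sexp("(user (name alice) (ids 1 2 3))") == \
    ["user", ["name", "alice"], ["ids", "1", "2", "3"]]
```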
Hilariously enough the project that made me switch to lisp from python as my scripting language was writing a lightweight parser for ascii delimited files - https://en.wikipedia.org/wiki/Delimiter?useskin=vector#ASCII... - instead of csv files. After doing it in both I had the eureka moment of using nested s-expressions in the scheme version instead of special characters. All of a sudden I had access to a csv like file which could be arbitrarily nested and didn't require me to worry about escaping. The next mind blowing moment was when I realized I could embed the code of the parser as the header of the format as the type definition and use it to evaluate the format with the program that was used to create it.
You don't even need some deep insanity to do it just:
It also shows why I wouldn't use lisp for everything: if I wanted to ingest a file of a known csv dialect that won't fit in memory I'd do it in C after doing the prototype/master version in lisp. I also wouldn't trust running unverified source code from the internet. But for internal projects it's better than sliced bread.
Well, I wrote a genetic programming library, and it was fun to parse a Lisp-like representation from Python. You still have recursion and everything (albeit no tail call optimization).
Here, `_from_source` goes from a plain array of tokens to a nested one (tree), depending on their arity:
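The snippet itself isn't shown above; here is a hedged reconstruction of what such an arity-driven tree builder might look like (the operator table and names are invented for illustration):

```python
# Assumed example arity table; leaves (constants, variables) have arity 0.
ARITY = {"add": 2, "mul": 2, "neg": 1}

def from_source(tokens):
    """Turn a flat prefix-order token list into a nested tuple tree."""
    it = iter(tokens)

    def build():
        tok = next(it)
        n = ARITY.get(tok, 0)
        return (tok, *(build() for _ in range(n))) if n else tok

    return build()

# The flat list ["add", 1, "mul", 2, 3] becomes ("add", 1, ("mul", 2, 3)).
tree = from_source(["add", 1, "mul", 2, 3])
```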
S-Exps are almost valid Python. The exception is the single-element tuple, which needs a comma: (x,)
But I still preferred to use Python as a programming language, and Lisp as a sort of AST. It's just easier. I am curious what roadblocks you faced in your ASCII delimited parsing.
Do you by any chance still have the two parsers? I'd love to see them. If you are worried about your anonymity, you can find my e-mail on my blog, and my website on my HN profile. I promise not to disclose your identity publicly.
Add higher order functions, e.g. (λ (x) (x x)), and lisp notation is the simplest/only way to deal with the general case where you don't know ahead of time the arity of the function you'd be applying because of partial currying and data persistence.
As for the parsers this was 10 years ago at university. I've long since lost the source code. There weren't any problems with python, it's just that once I wrote the lisp version I realized just how useful the brackets actually are. There's a reason why every computer language is more or less context free. Lisp just takes that to its logical conclusion.
Well, thank you for your feedback. I don't know why I bothered with the flat list in the first place. I might rewrite this library in Clojure. And I might blog about my findings.
Some advice: move to Emacs regardless of what editor you use now. Evil mode is the best vi clone there is. Use paredit for any Lisp-like language; you are editing the s-expression tree directly, not its incidental textual representation. Use the mode for the dialect of your choice. You will never need to exit Emacs and will be able to experience the REPL as it was meant to be.
If you are referring to fastest in terms of wall and CPU time: without spending a lot of time optimizing your code yourself, I suspect SBCL would take the crown for the vast majority of cases. If you do hyper-optimize, the gap with Scheme largely closes (sometimes Scheme wins, sometimes SBCL). So I'd go with SBCL if all else is the same feature-wise to you. That said, the last serious benchmark I did of Lisps was ~2010, so optimization may have changed since then, and of course they were not very rigorous benchmarks (just my compute-bound workload at the time, which primarily boiled down to integer ops). The Debian benchmarks game has some benchmarks written for SBCL and Racket that may be good for comparing implementations yourself if interested. https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
SBCL, probably. Scheme would be a close second depending on the implementation (Chicken, Gerbil or other compile-to-C impls.) Racket and Clojure are closer in performance to scripting languages like Python.
But here it's significantly worse than the Clojure runs, likely due to the slow DB bindings (the other test types show it roughly in line with Clojure; though it uses a "Stripped"/unrealistic HTTP implementation):
There is a fun blog series about comparison and tweaks of the same problem between java, rust and CL.
I will link only the last part of the series[1], but the general take is that SBCL performs comparably to a heavily optimized java implementation, and can even hold a candle to rust and java on short runs.
It also could be that autistic people such as yourself are such a small minority that a programming language targeted at them won't ever have a large audience.
I actually just finally tried out plan 9 in a VM and liked it so much that I’ve been using acme for almost all my development on mac via plan9port for a few weeks now.
I never bothered checking it out before because I had always seen it described as unusable so I figured it would be a waste of time.
Which is a terrible shame and a lesson. Of the 10 or so Lisp-like languages I've used extensively, Elisp is by far the worst. It succeeds because it interfaces with something important, and it's carried along everywhere.
After much searching for the perfect languages and perfect tools (while using emacs as a simple text editor), it finally dawned on me that Emacs itself is as close as I'd get to a lisp machine right here in my own back yard. A lisp machine that sits quite nicely on top of linux/unix and plays very nicely with it.
There's much that I love about the lisp world, and the author does a great job of articulating the reasons why. There's also much that I love about the unix philosophy. Emacs is a bridge between these two worlds that I love, and I'm old enough now not to give a shit (or need to give a shit) what the rest of the world is doing. This is home.
It reflects the beauty of the Emacs design that something written decades ago is still being extended today to completely new use cases, because of the dynamic and flexible Lisp engine interacting with the C core.
What's the lesson? That a text editor's extension language needs the strong concurrency and immutability models of a Clojure or SBCL? Getting Emacs to sit and roll over doesn't exactly require an industrial-grade language.
Is this the best marker of Elisp being written? Perhaps I don't get what this is analyzing, but I suspect that it is the newcomers that write the least Elisp, and my impression is that packages are popping up and updating much quicker as of late
Lisp was probably the first widely available language to use garbage collection, as well as the first to grow an ecosystem.
As such, it damn near was a superpower compared to everything else available at the time in about 1984--think Assembly, C, Pascal, etc. Garbage collection meant that all the effort everybody else spent on memory management could be spent on your problem. And the ecosystem meant that you had actual data structures to work with right in the language library instead of having to build those from scratch every time.
Emphasis on WAS. Right at about 1988, the new languages demonstrated that everybody got the message.
Perl, Tcl, Python, etc. all had garbage collection and worked very hard to create a general ecosystem. This roughly negated the advantages that Lisp had.
At that point, anybody really well versed in Perl, Tcl, Python, etc. was equally as productive as if they were writing in a Lisp.
Lisp debuted powerful features (GC, lexical scope, closures, destructuring, reflection) back when they were still wildly expensive. Over the decades most of these filtered into everything we use. But I still haven’t seen any rivals for its convenience in providing domain-specific languages via macros.
I haven’t written any kind of lisp for years, though I used to do so professionally (Clojure) and have been moderately successful getting some reusable stuff open sourced in that context (every time I look back at the “biggest” of these contributions it’s always amusing because it was not much more than a relatively simple macro).
I’d love to write more lisp, and like the article author I’m drawn to Racket, but I’ve just got very little bandwidth for exploring it further. Even so, lisp generally is something I advocate learning especially to mentees, because learning it (in author’s sense pretty much exactly as expressed, but with no strong feelings towards static typing at the time) was so transformative for me. I don’t need to write lisp to reap most of its benefits now… I write code in whatever language with the same attitude the author describes: mostly operate on values, abstract and isolate mutations with particularly high value, enjoy reasoning about code easily. And I’d add: recognize things which are like sexprs for what they are as things you can treat like sexprs because that’s more or less what they are.
Granted I’m not posting anything to HN, front page or otherwise besides comments. But I’m still inclined to praise lisp even a few years out from the last time I actually wrote any. Because it’s bound to be as foundational for others as it has been for me.
A portion of it has to be because of Emacs. I read all these front page posts because I love Emacs, and I’m sure I’m not alone. I picked up Clojure because I got familiar with elisp. The gateway has to be Emacs for a lot of people. And it ends up being a very good first impression for the people who stick around so the cycle builds on itself.
The tricks my muscle memory does on its own accord with lisp syntax within Emacs makes programming more fun than any other context.
At least in Clojure's case, most dev teams are definitely just working on normal (domain-wise) business apps, leveraging existing stuff from the dev community, and not posting on the internet about it.
> Edit: I’m wrong in the present case! Author has some cool projects on GitHub under themetaschemer.
Maybe I am missing something but I visited their GitHub but could not find any cool projects. Clearly there are better examples of people who write Lisp seriously instead of evangelizing it. You have given some good examples of them yourself.
Somewhat to my surprise, I ended up writing some script-fu lisp when I couldn't get python-fu to do what I wanted in GIMP. I believe it got some use seven or eight years ago at work for batch resizing of images.
What? You see tons of Rust projects hitting HN advertised as Rust projects, and a very different ratio of those to "Rust is the enlightened choice" articles.
Every lisp variant is a bit different. Racket, Clojure, and Common Lisp are all excellent. Read a bit about the compromises and dip your toes in one and see if it sticks.
Racket has more didactic origins, but I'm given to understand it's a good, albeit slow, general-purpose Lisp. Common Lisp is basically the kitchen-sink programming language, and with a good compiling implementation like SBCL it's possible to write high-performance code. Clojure is the Lisp you're most likely to get paid to write. It's as performant as any JVM-hosted language.
I like Racket, but it is also a kitchen sink Scheme. It has a ton of features: GUI framework, drawing (although it's slow), event spaces, places, futures, OOP, mixins, traits, macros, #langs, etc. It can get quite complex very quickly. I have mainly used it as a Scheme.
No, but it offers a lot of escape hatches to get closer to Java-level performance: transients, mutable fields with deftype, atoms, transducers to minimize intermediate sequences, type-hinting, Java interop, etc.
I've seen Clojure sped up to within 20% of Java's speed, at the cost of very un-Clojure-y code.
Personally I feel that Scheme (Racket or some other implementation) is in many ways a better language than e.g. Common Lisp: simpler and (at least IMO) more intuitive, whereas Common Lisp is on the scale of Java or C++ when it comes to the size of the standard. This can of course be explained by the nature of Common Lisp: according to its standard's author, it was more about politics than art, since when Common Lisp was being planned it needed to accommodate features from many different Lisps before it. But despite that, when it comes to real-world code, in my own experience, Common Lisp and especially SBCL takes the cake.
Common Lisp has <1000 exported symbols in the "COMMON-LISP" package.
> when Common Lisp was being planned they needed to accomodate many features from many different Lisps before it
Common Lisp is mainly a slightly modernized & portable version of Lisp Machine Lisp, the feature influence of other Lisp dialects is not that big. The main difference is that Common Lisp provides lexical bindings and lacks a few features.
Yeah, Scheme will do. Racket especially, since it has everything to get you started. Once comfortable, try dipping into a proper CL like SBCL and then decide on a small self-designed project with a clear goal in mind. That way you'll see the struggles, or lack thereof.
It really depends on what runtime you're targeting. If you're targeting the JVM, CLR or Javascript then you might want to check out Clojure. It'll let you interoperate with libraries in each of those environments, which you'll find useful.
If you're looking to create a stand-alone executable and have C/C++ libraries you'd like to use then SBCL or Racket may be more useful to you.
I vehemently disagree with dynamically typed being a winning point of Lisp. SBCL's strong support for type checking is the main reason I was drawn from Scheme to CL, and Coalton (https://github.com/coalton-lang/coalton) is one of the most interesting Lisp projects I have encountered.
Type checking can remove an entire class of bugs from even being a consideration. Yes, it could be argued that type mismatches are a trivial class of bug, and yes, proper testing should catch any issues... but catching problems before you go to testing can save you precious seconds, especially when coding in the typical interactive style of Lisp. Lisp lets you code at amazingly high velocity, good support for type checking helps increase that velocity even further.
I think it would help the conversation along a bit if those who are against static type checking could articulate exactly how type checking gets in the way of writing correct programs.
Isn't making sure the types flowing in and out of your functions match, at some point, something you will eventually need to do anyway? Or are we trying to say the type system doesn't allow for perfectly safe things that should be allowed?
I'm not sure I understand it.
Then again, I'm also the person who wonders if the restrictions Rust puts on "valid" code aren't also too restrictive, so maybe we all exist on a gradient?
In almost every instance where I see a function in a statically typed language that should return a sum type, the developers inevitably collapse it into a single type, e.g. we're going to return the number of results and -1 is a sentinel value that means something special. This is a fundamental type error. In a dynamic language you tend not to need to embed a concept or value in a domain where it doesn't exist. If later I'm adding results together, I'll get an error when I try to add the Clojure keyword :something-special to a number, while in the static language you may have just folded the special -1 into a sum because you forgot it was unique.
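A hedged sketch of the failure mode being described, in Python for brevity (function names invented): the sentinel version folds -1 into arithmetic silently, while a distinct tag fails loudly at the point of misuse, which is the behaviour the dynamic-language keyword gives you.

```python
def count_sentinel(rows):
    return -1 if rows is None else len(rows)         # -1 means "no data"

def count_tagged(rows):
    return "no-data" if rows is None else len(rows)  # keyword-style tag

batches = [[1, 2], None, [3]]

# The sentinel quietly participates in the sum: silently wrong.
assert sum(count_sentinel(b) for b in batches) == 2

# The tag version raises the moment someone forgets the special case.
try:
    sum(count_tagged(b) for b in batches)
except TypeError:
    pass  # adding "no-data" to an int fails immediately
```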
Yeah, I've encountered real-life code that had such issues (was even running in production without anyone noticing). Type checking won't prevent every type of bug, but that kind of problem (returning a special value and forgetting that it should be handled differently) can occur in a dynamic language as well.
Just because keywords and symbols exist doesn't mean the hypothetical programmer who would return -1 as a special value in a static language will not do so in a dynamic language. But in the static language, when the sum type is used, it will prevent people from passing the result into an arithmetic function when more code is added in the future (in an ideal world tests would catch it, but in an ideal world the sum type would have been used from the very beginning).
There are trade-offs between static and dynamic typing, and while dynamic typing allows us to write code more quickly, things balance out when we include the time lost by type matching errors that static type would protect us from.
It's like the C vs Rust discussion but with less potential for leaking important customer data all over the web.
That is indeed an argument against static typing and in favour of dynamic typing, although the demerits can be mitigated to some extent through a combination of organisation and tooling.
For example, if our customer IDs used to simply be increasing integers but then we decide to change to a UUID variant it would be a pain if I had something like:
And then had a bunch of functions calling out to that which all require integers and I had to change all the declarations from integer to UUID. However, I could save myself some future headaches by doing something like:
(deftype customer-id () 'integer)
And using the customer-id type everywhere. Of course, this is a very trivial example and a real problem would be harder to manage. That said, with optional type checking like SBCL has it is entirely possible to fly by the seat of your pants until things start to take shape.
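The same centralise-the-ID-type move exists outside Lisp. For comparison, a hedged Python parallel using `typing.NewType` (all names invented): a static checker like mypy flags callers passing a bare int, and a later int-to-UUID switch touches one line.

```python
from typing import NewType

CustomerId = NewType("CustomerId", int)  # later: NewType("CustomerId", uuid.UUID)

# Hypothetical lookup table keyed by the wrapped type.
ORDER_COUNTS = {CustomerId(1): 3, CustomerId(2): 0}

def order_count(customer: CustomerId) -> int:
    return ORDER_COUNTS.get(customer, 0)

assert order_count(CustomerId(1)) == 3
```

At runtime `CustomerId` is an identity function, so there is no overhead; the protection is purely at check time, much like SBCL's optional declarations.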
Not that it matters too much if you go with the tried and true method of tossing out the first two prototypes, but that's another discussion.
Been doing Clojure for the past 5 years and while the language itself is great, the ecosystem really isn't. The tooling is subpar, editor support is clunky at best, everything is for some odd reason much harder to set up and explained in a more difficult way. A ton of libraries don't even provide documentation, they just expect you to REPL their functions to find out what they do.
The older I get, the more I realize that what makes something great to use isn't all about the language, but the surrounding environment, and how pleasant _that_ is to use. Personally I think I will most likely not write Lisp for much longer, because I'm seeing many other languages with much nicer ecosystems, even if they probably pay less.
This is the opposite of my experience: one of the main reasons I prefer Common Lisp and Clojure over any other language is that the tooling is so much better (Emacs + SLIME for CL, or CIDER for Clojure). The only real competitors, IMO, are Smalltalk and Prolog environments. However, part of this is that the paradigm is completely different, and it took me a bit of time to learn what makes the tooling great.
Yeah, for Emacs users I'm sure Clojure is great. For the other 99% of us who use VS Code or IntelliJ or something else entirely, it really isn't. IntelliJ + Cursive is probably the best, but it still pales in comparison to, say, Rust or TypeScript support in editors.
Are you suggesting that one has to forgo their favourite code editors, and learn what is arguably the most complex editor, in order to use Clojure? That's the kind of attitude that makes all beginners run away.
You must agree that if the first step in learning a language is "Learn Emacs first", a lot of potential learners are shooed away before they even start. The language has a high ceiling, but a high floor as well.
Have you tried Calva for VS Code? Although I don’t use it myself (I use vim-iced with Neovim, which I like very much), I have heard lots of good things about it and it seems very easy to set up and use.
I tried Calva a lot and suggested it to my coworkers a while back (2017ish) and it was nothing but trouble. Cursive seemed to work pretty well, though.
I love Clojure but do a lot more Rust now. I feel the language leadership comes from the information-systems universe - thinking about databases, schemas, that sort of thing - and the language and tooling are very good for those things, but there is really no energy or interest in expanding beyond those use cases, so it feels like things are settled now and won't change.
Rust, OTOH, feels highly malleable, and the tooling is always improving. The community seems to care about a good user experience across several domains - web, systems, networking, cryptography, desktop GUI - and over the last 2-3 years things have really come together as a coherent whole.
I haven't used a lisp for a few years, but I still remember after it finally "clicked" the weird _velocity_. Code just flooded out.
In other languages I hop around from class def to class def, across different files, just ricocheting all over the file system. Something about Lisp removes a lot of that ricochet. It's a weird sensation.
I didn’t do a lot of Lisp, but I remember solving Project Euler problems in Scheme, hitting a dynamic programming problem, and finding it extremely easy and natural to write a memoize macro with very little language experience.
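For what it's worth, a Common Lisp version of such a macro can be sketched in a few lines (`define-memoized` and `fib` are made-up names, not the original Scheme code):

```lisp
;; Define a one-argument function whose results are cached in a hash table,
;; so repeated calls with the same argument reuse the stored value.
(defmacro define-memoized (name (arg) &body body)
  (let ((cache (gensym "CACHE")))
    `(let ((,cache (make-hash-table :test #'equal)))
       (defun ,name (,arg)
         (multiple-value-bind (value present-p) (gethash ,arg ,cache)
           (if present-p
               value
               (setf (gethash ,arg ,cache) (progn ,@body))))))))

;; Naive recursion, but the cache makes it effectively dynamic programming.
(define-memoized fib (n)
  (if (< n 2)
      n
      (+ (fib (- n 1)) (fib (- n 2)))))
```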
As the author of #Script Lisp [1], I'd say Lisp really shines as an embeddable REPL language, where you can use it to script larger compiled code-bases like Unity3D games while they're running [2]. It's also been useful to open a TCP REPL on a deployed .NET App to inspect its running state and execute its configured dependencies [3].
But I don't use it outside of scripting .NET Apps anymore, other than when I need to perform quick calculations while already in the command line, where I can bring up a quick Lisp REPL with `x lisp`.
I've been digging into Common Lisp again lately. I'm really enjoying some parts: CLIM is weird but interesting, and the fact that I'm basically developing inside a debugger makes testing and iterating on stuff pretty straightforward. On the other hand, library documentation frequently feels more like the programmer was making notes to himself rather than illustrating how the code might actually be used. I was also pretty shocked when, after using a quick shell script to generate a file containing a big list (~80MB on disk, 800k items) defined in s-expressions, reading it into SBCL exhausted the heap. After telling SBCL it could allocate two gigabytes I was able to read the file, but doing anything with it was a sure-fire way to run out of memory again. I would never think twice about reading an 80MB file into a Go program!
I sort of feel like I'm walking through an abandoned alien city and wondering what size they were, how many limbs they had, and if we share the same 5 senses or not.
Lisp won't save you from yourself. I bet a Go program using an equivalent representation (assumption being you naively used READ) would exhibit similar memory consumption. However, that doesn't mean that you don't have other options:
- Use an array
- Load the entire file in a memory buffer and skip the reader
- Use mmap (for extra points, you can keep the entire file outside the Lisp heap but still manipulate it in Lisp)
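For instance, the buffer option can look like this (a sketch; `slurp-file` is a made-up name, and note that `file-length` on a character stream is only an upper bound for multi-byte encodings, which is why the result is trimmed):

```lisp
;; Read a whole file into one pre-sized string, bypassing READ entirely:
;; one big buffer instead of hundreds of thousands of cons cells.
(defun slurp-file (path)
  (with-open-file (in path :direction :input)
    (let* ((buffer (make-string (file-length in)))
           (end (read-sequence buffer in)))
      (subseq buffer 0 end))))
```

You can then parse out of that buffer on demand instead of materialising the whole structure up front.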
You can run into similar issues on the JVM if `-Xmx` is set too small. This has to do with how SBCL decided to manage the heap; other implementations won't have this issue. But I typically start SBCL with an 8GB heap (`--dynamic-space-size 8192`) and basically never run into this when consuming large files.
But, also, once you hit a couple megabytes or so, it’s almost always better IMO to consume a file incrementally rather than all at once.
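An incremental version might look like the following sketch (`sum-integer-lines` is a made-up name; it assumes one integer per line, just to show the shape of the loop):

```lisp
;; Fold over a large file one line at a time instead of slurping it:
;; memory use stays bounded by the longest line, not the file size.
(defun sum-integer-lines (path)
  (with-open-file (in path :direction :input)
    (loop for line = (read-line in nil nil)
          while line
          sum (parse-integer line))))
```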
Also, there's no problem reading an 80MB JSON file in CL. I suspect GP was using cl:load, which is for loading source files. SBCL definitely struggles with very large source files; I never bothered debugging that since I don't need 80MB source files.
I was calling read. I suspect the real problem is explained by another commenter, who points out that 80MB of ASCII text becomes roughly 320MB of UTF-32 in the hands of SBCL (four bytes per character), not to mention all the additional words of memory required for the cons cells.
Lisp feels more creative. You can bend the language, it's malleable.
Rust may be the new Engineer's best friend but to me, Lisp remains the tool for the artist.
Use Rust to Do It Right. Use Lisp to have fun.