It's a cultural issue with all of the pre-Internet languages. Even titans like C and C++ lack a well-defined repository outside of their stuffy standards-committee-defined standard libraries. CPAN showed how powerful a resource like that can be 25 years ago, and almost every language since has shipped with something similar, but the older languages never seem to have caught up.
Some of them might be considered production ready, but that would depend on your expectations. They all pale in comparison with CAPI though, which is really good and I'd say one of the main reasons to use Common Lisp today.
It came very early with a form-based GUI designer.
Allegro CL's GUI facilities explicitly address both MS Windows and Unix/Linux environments.
Perhaps a Lisp shop could force consistency, but that seems to fall apart after a while.
More on this viewpoint: http://wiki.c2.com/?GreatLispWar
1. It shipped with a substantial set of features, either in core from day one or as a function of being hosted on the JVM, that programmers would otherwise have been likely to build themselves in multiple, potentially incompatible ways.
2. The syntax is simple, and Clojure doesn’t allow user-defined reader macros, so libraries can’t introduce new syntax. There may be more than one way to do it, but you should at least know how to read all of them.
3. It’s only ever been under the stewardship of one cautious, opinionated developer.
Of course, from another perspective, all of these could be (and I’m sure have been) seen as weaknesses.
Objective C and Swift for iOS.
Java (and now Kotlin?) for Android.
SQL for relational databases.
Visual Basic and C# for Windows.
Python for scientific programming and machine learning.
Developers don't pick a language because it makes them feel productive. They pick a language that allows them to write for the platforms where they want to deliver their programs. Or because it has the best libraries for their problem domain.
(Back end web development is somewhat of an exception, as every language can talk HTTP.)
C++ for Mac OS, OS/2, and Windows; now games and GPGPU shaders.
Most people are in the middle of the bell curve. Smart persistent people are out at the edge of the bell curve.
So where should you aim your project?
There's cult kudos in being out on the edge, and there may even be real productivity and reliability benefits.
But realistically, most people aren't going to go there most of the time.
I can't speak to Clojure; I'm still learning it and haven't used it professionally. But it seems promising insofar as it seems to be quite opinionated. Groovy and Scala aren't. So they both have a lot of concepts to learn, and those concepts all overlap and interact in surprising ways. On my team, I've started banning certain parts of Scala from the codebase. Not because I think they're bad features, per se, but because carving the language down to some semblance of an orthogonal subset seems to be the only way to ensure that 4 programmers won't solve the same problem 6 different ways in the same codebase.
I run into the same problem with Haskell. Haskell's a great language and I think everyone should put some time into learning it. But it was designed as a language research platform first and foremost, and that's evident everywhere. So I'm hesitant to try using it on a large team. I'm kind of hoping that someday someone will produce the Haskell equivalent of Clojure. It doesn't need to try and be minimalist like Scheme. It just needs to have fewer than 5 string types, and save the senior developers from having to spend 85% of their time coaching the junior developers on when to choose a monad and when to choose an applicative functor.
Like I said, I don't know much about it other than that it's a pure functional language that compiles to Java bytecode, so I lump it in with Groovy and Scala, which are both in the same family. Maybe it is easier - Groovy and Scala (as much fun as they both are) definitely aren't. Isn't it as "pure" a Lisp as they can get away with and still run on a JVM, though? Lisp has sent many worthy men screaming for the hills.
Not saying that it won't improve, just stating the current state.
Even with reactive UIs as a way to get around those callbacks, relm isn't as easy as Fabulous, SwiftUI, or Elm, with the caveat that reactive UIs are heavier on the memory allocator.
This was one of the main goals of the Common Lisp standard, no?
Lisp has been around for over half a century. It's had plenty of opportunity to demonstrate its worth in helping people solve real world problems in production. I agree with the author there is probably a fundamental reason lisp doesn't see this kind of use (maybe with the exception of clojure), but I seriously doubt that reason is that it's "too powerful."
It only implies that the relationship between difficulty of development and developed ecosystem is not monotonic.
It's actually fairly normal for two parameters to be highly correlated with each other along most of their range & domain, but for the correlation to suddenly fall apart at the extreme. (I think there's a term for this, and I'm wishing I could remember it, so I could link to pages I've seen discussing it. If anyone knows, links solicited.) Of course, the real problem is that there are almost certainly a lot of such parameters, and having what may be merely a "just-so" story about Lisp's general failure to set the world on fire and its excessive power may not be particularly determinative of anything.
It does at least fit the facts. Most of the other possible answers have at least had the seed of a solution created (e.g., package manager solutions), and it still hasn't taken off, suggesting that wasn't the problem. (Though we can't eliminate the possibility it needs multiple such things.) The fact that Clojure seems to be settling into a C-list language position after its push also is suggestive of it not being any of the things that it fixed.
If anyone could make an equally high-performance version of those libraries covering 80% of the functionality/use cases, using high-level Python, in maybe a month, would that cause a fragmentation where we would get a lot of 80% libraries (with different 80%s) instead of one 95% library that is supported everywhere?
His argument is not about difficulty to develop on, but about difficulty to shape the language and its idioms.
All those packages you mention are libraries, and you use them with the same pythonic idioms you code with. Even Python's metaprogramming facilities hardly change the semantics much...
I like to think Brad Cox did a pretty good job and remained compatible with the base language.
> Once again, consider the C programming language in that thought experiment. Due to the difficulty of making C object oriented, only two serious attempts at the problem have made any traction: C++ and Objective-C. Objective-C is most popular on the Macintosh, while C++ rules everywhere else. That means that, for a given platform, the question of which object-oriented extension of C to use has already been answered definitively.
The point isn't that you can't do it. It's that, for a language like C, it's a big enough effort that it's only really happened a couple times, and nobody strays much away from those two options because the effort required to do so would be immense.
Whereas, I think that the difficulty of doing so in Scheme is overrated. We had to implement a fairly decent version as a Scheme homework assignment during my very first semester of my very first year as an undergraduate. It's so easy in Scheme that it's plausibly faster to write your own than it is to choose and then read the documentation of an off-the-shelf option.
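For what it's worth, the classic closure-based approach really does fit in a few lines. Here's a minimal sketch along those lines (names and messages are my own invention, not from any actual assignment): state lives in a closure, and the "object" is a procedure that dispatches on a message symbol.

```scheme
;; A minimal message-passing "object system" built from closures.
;; make-point returns a dispatch procedure; x and y live in the closure.
(define (make-point x y)
  (define (move! dx dy)
    (set! x (+ x dx))
    (set! y (+ y dy)))
  (lambda (msg . args)
    (case msg
      ((get-x) x)
      ((get-y) y)
      ((move!) (apply move! args))
      (else (error "unknown message" msg)))))

;; Usage:
(define p (make-point 1 2))
(p 'move! 3 4)
(p 'get-x)  ; => 4
```

Inheritance is just another few lines (delegate unknown messages to a "parent" closure), which is roughly the scope of the homework version described above.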
NeXTSTEP only survives because Apple decided to buy NeXT instead of Be, and NeXT effectively did a reverse acquisition once inside Apple.
And their pivot into OpenSTEP wasn't selling faster than they were bleeding money.
My thesis was to port a particle engine from NeXT to Windows, as my department was getting rid of their Cubes.
Had Apple not bought them, we would be remembering NeXT just like Amiga, Atari, Be, ...
Besides a couple of DDJ articles, I never saw it anywhere.
Even the famous Steve Jobs presentation I only became aware of years later, when someone uploaded the VHS recordings to YouTube.
Even those Cube workstations at the university seemed like pure luck, with someone happening to sign off on their purchase.
A purchase they repented afterwards, getting students like myself to port their 3D research to Windows as part of our graduation theses.
The old Cray has "died"; they just assemble Linux/Windows HPC clusters like everyone else.
Lisp people (like Paul Graham) say that a lot, but they also admit that there’s a pretty steep learning curve before you get to that point - at least, I’ve never heard anybody say that Lisp is both easy to use and easy to learn. I actually do believe them. I used to hear the same thing about vi: it’s quick and powerful, once you get over the learning hump. I actually did take the time to get over the hump and found that I _was_ faster and more productive with it, but it took some time to get there. The time spent was worth it, but it was slow going getting there. I’ve dabbled enough in Lisp and functional programming in general on my own to believe that there’s something very powerful hiding in there… but there’s work to be done, and unlike my choice of text editor, which only impacts me, I have to work in a language that everybody agrees on, and so far, that’s never been Lisp.
Seems more likely that the grass is not actually greener on the other side. If it were, we’d see small teams of the rare few who are willing to surmount the learning curve outcompeting the rest of the tech industry.
Here we are talking about people refusing to move beyond beginner phase.
Isn't that in fact the case, and why we have nurses, RPNs & medics? "Sorry, I can't give you this flu shot - I've gotta finish med school first"
I will say it.
Lisp has been used for introductory programming courses in the past, with much success. Beginners take to it just as well as, or better than, other more popular languages.
"This book is about learning to program in Lisp. Although widely known as the principal language of artificial intelligence research—one of the most advanced areas of computer science—Lisp is an excellent language for beginners. It is increasingly the language of choice in introductory programming courses due to its friendly, interactive environment, rich data structures, and powerful software tools that even a novice can master in short order"
Ironically it's been supplanted by Python in that role now.
Richard Stallman would like to disagree with you.
It was Bernie Greenberg, who discovered that it was (2). He wrote a version of Emacs in Multics MacLisp, and he wrote his commands in MacLisp in a straightforward fashion. The editor itself was written entirely in Lisp. Multics Emacs proved to be a great success — programming new editing commands was so convenient that even the secretaries in his office started learning how to use it. They used a manual someone had written which showed how to extend Emacs, but didn't say it was programming. So the secretaries, who believed they couldn't do programming, weren't scared off. They read the manual, discovered they could do useful things and they learned to program.
It's quite a statement that secretaries back then, without the internet and with just manuals, were able to learn and program Emacs in Elisp.
Speaks volumes about the current generation of programmers and their ability to learn or commit to any deep expertise.
That's why today I simply point people to the cookbook. If they keep the interest, they'll eventually jump to the classics.
If someone asks me for advice on how to get started with Common Lisp, I will point them to CL resources.
Racket itself wants to deprecate their entire language, toss everything, and start out with something totally new, all over again.
Oddly enough, I've been here for ages, and I don't remember ever seeing this!
The power of Lisp is its own worst enemy.
The power of _insert_prog_lang_here_ is its own worst enemy. Applies to Ruby, Smalltalk, C++, Haskell, and probably many others. It even applies to PHP! You see, power comes in many forms.
2nd order sociological effects often decide the fate of programming language communities. Unfortunately, many programmers have been less competent at navigating those forces. (This has changed with the effect of the Internet on world culture, of course.)
The sideswipe against the supposed venality of managers is unwarranted. Programmers are like mountaineers in a world where the terrain shifts radically and unpredictably. One day you are on top of a mountain, the next day, without having moved, you are in a deep valley. Under these circumstances teams of people in ATVs who can move rapidly across the terrain tend to have a higher average altitude than people with karabiners and ropes.
(More amusing than a barrel of Lisp macros)
1. C++ is not an extension to C. Most C code would be quite unacceptable C++, and considered inelegant, unsafe, and overly redundant.
2. C++ is a multi-paradigmatic programming language, not necessarily object-oriented.
Lisp forces coders to think in terms of prefix trees. These trees need to be carried around in the front of the mind. This mental model is very difficult unless your mind is wired to be able to process code this way. For Lisp aficionados this either comes naturally or with coding practice. The "parentheses" fade into the background. For most people this wiring never comes. It remains too difficult to keep the nested trees straight in their heads. To put it colloquially, it's too difficult to juggle all those parentheses.
If you're a Lisper you'll want to believe that with use comes the familiarity to overcome this hurdle. It doesn't. If you're a Lisper you're probably tempted to refactor a sample chunk of code into something really readable to show this isn't the case. However, that only works in the small. It's like code golf. It doesn't carry over into large scale applications. Lisp's mental model just won't become second nature to very many coders.
Haskell forces you to juggle "math." Forth forces you to juggle "stacks." Lisp forces you to juggle "trees." The popular Algol descendants are popular, in part, because they're closest to the way people think. The curse of unique brains is the curse of Lisp.
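To make the contrast concrete (purely illustrative, in Scheme): the same quadratic once as the infix expression schooling drills in, and once as the prefix tree the Lisp reader sees.

```scheme
;; Infix, as taught in school:  a*x*x + b*x + c
;; The same expression as an explicit prefix tree:
(define (quadratic a b c x)
  (+ (* a x x)
     (* b x)
     c))

(quadratic 1 2 3 2)  ; 1*4 + 2*2 + 3 => 11
```

Whether the explicit tree is a burden or a clarification is exactly what this subthread is arguing about.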
No, those languages are not close to "the way people think", unless those people are already experienced programmers using one of those languages.
Programming languages are artificial and do not work like human languages. Every programming language requires you to run an internal parser and build partial parse trees in your head to understand what the program will do. Whether it's Lisp, Java, or anything else.
Lisp is not any harder for beginners than any other programming language.
> It doesn't carry over into large scale applications.
It works far better and more powerfully in large scale applications. Lisp is unmatched in its ability to write large applications, in terms of power and functionality, with fewer developers and fewer lines of code than almost any other language. This is pretty much the whole point of the "Lisp Curse".
And no, it's not code golf. It is higher order, more powerful abstractions and coding techniques.
>> It works far better and more powerfully in large scale applications. ... This is pretty much the whole point of the "Lisp Curse".
That's the premise of the article. I believe that premise is incorrect. Lisp may be a higher abstraction but it's a more difficult abstraction for many programmers to think in. As I stated, I believe that's a direct result of its syntax and all that syntax entails.
All it takes is 10 or 12 years of schooling, and suddenly people realize this is the natural way to think! Except for all the exceptions like pow(x, 2) which nobody can figure out. Or x=x+1, which flies in the face of those past 10 years of schooling.
Algol, of course, still makes you juggle math, stacks, and trees, but they’re completely invisible so you have to keep them in your head, along with the bidirectional mapping to the complex syntax required by the compiler. But what you can’t see can’t hurt you, right?
>> Algol, of course, still makes you juggle math, stacks, and trees ...
Agreed. It just does so in a way more comfortable to the greatest number of people.
Comfortable to the most number of people? Following the same logic we can conclude that Mandarin is the most comfortable spoken language, because it's the most common language on Earth.
Absolutely! I'm not saying that Lisp is a bad language. I like Lisp (and Forth). I'm saying that its syntax (and the why of its syntax) are the primary reasons it hasn't caught on. I'm just not buying the "too powerful" and such arguments.
>> ...we can conclude that Mandarin is...
Most people pick their spoken language by their surroundings and what their family speaks. Children aren't making a choice. In the early days of computers, when the root languages were being created, everyone had a choice. Maybe not as much now, but still, anyone can pick up new languages during their careers.
That's where you got it all backwards. Lisp (as an idea) after six decades is still thriving. There is a Lisp dialect for almost any platform. Clojure has surpassed languages like F#, OCaml and Haskell in popularity, even though they are older than Clojure. There are more Clojure conferences, more meetups, more podcasts and more jobs. There are regular conferences in the US, Canada, Brazil, India and in almost every European country. Some of those countries have multiple Clojure-focused conferences. Emacs and emacs-lisp are still being developed and used by thousands around the world. Common Lisp is still in use. Racket is pretty popular in academia and beyond.
Again, Lisp is just an idea. If you grok the awesomeness of that idea you may someday find what makes it so powerful. Lispers often talk about "enlightenment" and call Lisp "a secret alien technology". They don't do it simply to justify their choices or to create an aura of elitist attitude to scare off outsiders.
Akin to the idea of Vim as modal text editing - it takes time to get it, to learn it, to understand it. But one day you may realize the power behind it. Most people, being spectators, don't get it. Sometimes people will learn just the basics of Vim, move on to something else, and be like "meh". But those who stick around soon may find an incredible new world that changes their subjective view and understanding of how things work. And they realize that there's nothing that can convince them to "unsee" this new world of possibilities. And just like there are many Lisp dialects, there are many ways to run Vim. You can do it in almost any modern IDE and editor.
Lisp is the same. It is a simple but quite powerful idea, if you don't see its power - perhaps because you haven't really looked.
Unfortunately yes, that's not the best way to "sell" it, but that's the truth. You have to be a Vimmer to see the power behind modal editing; you need to become a Lisper to see the true power of Lisp. That is why neither Lispers nor Vimmers have any illusions - they know their way of thinking will never become mainstream.
I have fond memories of a number of editors: VIM, EMACS, VEdit, and KEdit (Rexx scripting). When the mouse-centric editors appeared on the scene I found they worked better for me. As someone who has coded in Lisp, Forth, C, etc. and in a wide variety of editors, I guess our experiences differ.
Late '70s, on the Lisp Machine (which had a megapixel bitmap screen, mouse and GUI), with the Zmacs editor. The second Emacs ever written, and the first one written in Lisp.
Any language which PL enthusiasts consider elegant because you can do so much with few abstractions will be a really bad language in practice, since humans prefer different kinds of code to have different syntax and look different.
I still don't understand what kind of objects you're talking about and why they need to be separated? Why pencil.writeOn(paper) is a good UX and (pencil.write-on paper) is a bad UX?
> Our brains are very good at parsing symbols
> and lisp forces you to parse text to understand the structure of the code
> Any language which PL enthusiasts considers elegant since you can do so much with few abstractions will be a really bad language in practice
Is that why Lisp refuses to die? When Fortran is forgotten, COBOL completely dead, C replaced with Rust and Java with Scala and Kotlin, PHP gone and Ruby unpopular, there will still be at least a few dialects of Lisp very much alive and thriving.
Take Emacs, for example. For decades different IDEs and text editors have been trying to "kill" Emacs, yet it is still thriving and still being used; packages written in emacs-lisp are in abundance, and new ones are being developed all the time. Check GitHub stats for different languages; you will be surprised at how much emacs-lisp is published on GitHub alone.
"Really bad language in practice" turns out to be a very pragmatic choice; otherwise how would you explain that Clojure has been successfully used at companies like Apple and Walmart, at NASA, and at many others: https://clojure.org/community/success_stories. There are multinational companies like Funding Circle that have built their businesses on Clojure.
Many seasoned CS veterans repeat over and over: "Learn Lisp if you want to master the art of programming".
So if you don't get Lisp syntax - it's just you. You aren't used to it, it's unfamiliar, that's all. But there's nothing inherently "bad" about it.
"My brain cannot handle parentheses, therefore no brain can handle parentheses."
There's no sin in saying "I don't like parentheses." But please, please stop generalizing that to "most people" because you really, really have no idea.
"I don't like part of the first sentence therefore there's no need to consider the rest of what the comment says."
There's no sin in saying you don't agree with a comment. But please, please stop taking one point out of context just because it pushes personal "anger buttons" because you really, really have no idea.
It's not the hierarchy itself that's the problem. It's the difficulty of remembering all that the hierarchy is doing before setting a chunk of it aside that causes the problem. Especially with prefix trees.
Writing Lisp is more like writing poems without having to worry whether they rhyme or not. You can write code just as you would draw prototypes on a piece of paper. Except you can evaluate any part of it without any preceding ceremony. Some prefer to start with smaller pieces and compose them, some would just write it "imperatively" and then break it apart. You don't have to worry about operator precedence, about reserved words, semicolons, all the unnecessary garbage. At the end of the day you realize that visual garbage is just that: garbage. It's supposed to make code more readable, but in reality it does not; it distracts more than it helps. The human brain is fascinating, it plays tricks on you - once you see a dot in the middle of a visual illusion, you lose the ability to "unsee" it. But most people may still try to convince you that there is no dot.
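A tiny illustration of "evaluate any part of it": in a Lisp, every balanced subform is itself a complete expression you can send to the REPL with no scaffolding (the results in the comments are what a typical Scheme REPL prints).

```scheme
;; The whole form:
(* 3 (+ 1 2))                          ; => 9

;; Grab just the inner subform and evaluate it on its own:
(+ 1 2)                                ; => 3

;; Or wrap it in something new and try again, interactively:
(map (lambda (n) (* n n)) '(1 2 3))    ; => (1 4 9)
```

There's no statement/expression divide and no compile-run ceremony in the way; that is the "no preceding ceremony" being described above.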