The Lisp Curse (2011) (winestockwebdesign.com)
59 points by felipelalli 7 days ago | 98 comments




Maybe the problem isn't so much that Lisp is too easy, but rather that there's no central repository for Lisp modules? Everybody runs into the "I need a GUI environment" and doesn't have anywhere to search for the 30 other GUIs that have already been written and never published so they write their own. A centralized repository for these could help a lot, especially if it enforces documentation standards before accepting modules.

It's a cultural issue with all of the pre-Internet languages. Even titans like C and C++ lack a well-defined repository outside of their stuffy standards-committee-defined standard libraries. CPAN showed how powerful a resource like that can be 25 years ago, and almost every language since has shipped with something similar, but the old languages never seem to have caught up.


Quicklisp exists now and largely addresses this issue. It can still be difficult to discover libraries sometimes, and not everything is in it, but it’s been a major step forward.

The Ultralisp[1] Quicklisp distribution works quite well.

[1] https://ultralisp.org/


What GUI library? The only production-ready one is part of LispWorks :(

You can find a listing here https://awesome-cl.com/#gui

Some of them might be considered production ready, but that would depend on your expectations. They all pale in comparison with CAPI though, which is really good and I'd say one of the main reasons to use Common Lisp today.


I am sure there are several. Binding to GTK should be very easy through FFI. Myself, I have written a pretty large enterprise application based on Ltk.

Allegro Lisp also has GUI support.

A lot of very GUI-demanding programs have been written in Allegro CL, including CAD systems, 3D design systems, ...

It came very early with a form-based GUI designer.

Allegro CL's GUI facilities explicitly target MS Windows and Unix/Linux environments.


History has shown that if code must be maintained by multiple different people over time, then consistency and predictability override abstraction and meta-ability. Legibility for actual readers trumps linguistic or symbolic parsimony.

Perhaps a Lisp shop could force consistency, but that seems to fall apart after a while.

More on this viewpoint: http://wiki.c2.com/?GreatLispWar


Clojure seems to do pretty well in that regard

I can think of three things that may have helped Clojure avoid this issue, at least relative to other Lisps:

1. It shipped with a substantial set of features, either in core from day one or as a function of being hosted on the JVM, that programmers would otherwise have been likely to build themselves in multiple, potentially incompatible ways.

2. The syntax is simple, and Clojure doesn’t allow user-defined reader macros, so libraries can’t introduce new syntax. There may be more than one way to do it, but you should at least know how to read all of them.

3. It’s only ever been under the stewardship of one cautious, opinionated developer.

Of course, from another perspective, all of these could be (and I’m sure have been) seen as weaknesses.


I don't see it catching on in the "mainstream" yet. It still seems niche-specific. Maybe we'll know for sure in a few more years.

I haven’t had a chance to spend much time with Clojure, but I have spent time learning both Groovy and Scala which are similar. Both have a steep learning curve (which I can’t say I’m entirely over), but I have noticed that, in the nearly 3 decades I’ve been programming, I’ve never seen anything with a steep learning curve really catch on. The things that catch on like wildfire are things like Java, JavaScript, Cobol, VB, C#: things that take a moment to learn but a lifetime to master. The reason seems pretty obvious - you can produce results and feel productive (and demonstrate productivity) quickly even if you’re less productive in the long run. Short term results matter more than long term efficiency.

The languages that catch on are the ones tied to the most popular platforms.

Objective C and Swift for iOS.

Javascript for the Web.

Java (and now Kotlin?) for Android.

SQL for relational databases.

Visual Basic and C# for Windows.

Python for scientific programming and machine learning.

Developers don't pick a language because it makes them feel productive. They pick a language that allows them to write for the platforms where they want to deliver their programs. Or because it has the best libraries for their problem domain.

(Back end web development is somewhat of an exception, as every language can talk HTTP.)


C and UNIX

C++ and Mac OS, OS/2, Windows, now games and GPGPU shaders


This is one reason why "worse is better" happens so often.

Most people are in the middle of the bell curve. Smart persistent people are out at the edge of the bell curve.

So where should you aim your project?

There's cult kudos in being out on the edge, and there may even be real productivity and reliability benefits.

But realistically, most people aren't going to go there most of the time.


It's not even necessarily that they have a steep learning curve, it's that there's such a deep case of TMTOWTDItis in each of those.

I can't speak to Clojure; I'm still learning it and haven't used it professionally. But it seems promising insofar as it seems to be quite opinionated. Groovy and Scala aren't. So they both have a lot of concepts to learn, and those concepts all overlap and interact in surprising ways. On my team, I've started banning certain parts of Scala from the codebase. Not because I think they're bad features, per se, but because carving the language down to some semblance of an orthogonal subset seems to be the only way to ensure that 4 programmers won't solve the same problem 6 different ways in the same codebase.

I run into the same problem with Haskell. Haskell's a great language and I think everyone should put some time into learning it. But it was designed as a language research platform first and foremost, and that's evident everywhere. So I'm hesitant to try using it on a large team. I'm kind of hoping that someday someone will produce the Haskell equivalent of Clojure. It doesn't need to try and be minimalist like Scheme. It just needs to have fewer than 5 string types, and save the senior developers from having to spend 85% of their time coaching the junior developers on when to choose a monad and when to choose an applicative functor.


Sounds like you're talking about Idris, Elm (JavaScript), ReasonML ([JavaScript] https://reasonml.github.io), or F# (.NET)... Monads are for context-sensitive mini-languages ((>>=), (=<<)): they have a direction and sort-of sequential semantics; for lists, >>= is essentially concat . map. <$> (map) is for context-insensitive languages (and can be run in parallel)...
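The list-monad reading of >>= described above can be made concrete. A sketch in Python (an illustration added here; the names `fmap` and `bind` are just for this sketch, not a real library API):

```python
# The functor map applies a plain function inside a context (here, a list);
# monadic bind (Haskell's >>=) lets each step produce a new context,
# which is then flattened -- for lists, exactly concat-of-map.

def fmap(f, xs):
    # Functor map: context-insensitive, each element handled independently.
    return [f(x) for x in xs]

def bind(xs, f):
    # List-monad bind: f returns a list per element; results are concatenated.
    return [y for x in xs for y in f(x)]

print(fmap(lambda x: x * 2, [1, 2, 3]))        # [2, 4, 6]
print(bind([1, 2, 3], lambda x: [x, x * 10]))  # [1, 10, 2, 20, 3, 30]
```

Because `fmap` never mixes elements, the calls could run in parallel; `bind` is the sequential, "direction-having" one, since each step's output shapes the final structure.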

What? Clojure is not similar to Scala at all. Its learning curve is not that steep, because it allows you to learn things "à la carte" - you can pretty much learn it while building things. I have personally met several people for whom Clojure was the very first language they learned. I once gave a book to our test automation engineer, and a week later he had built a dashboard (for test runs) in ClojureScript. He said: "I am very much surprised myself how quickly I was able to build it." It was a wild mishmash of all sorts of libraries and weird-looking, non-idiomatic Clojure code, but it was a completely functional app.

> Clojure is not similar to Scala at all.

Like I said, I don't know much about it other than that it's a pure functional language that compiles to Java bytecode, so I lump it in with Groovy and Scala, which are both in the same family. Maybe it is easier - Groovy and Scala (as much fun as they both are) definitely aren't. Isn't it as "pure" a Lisp as they can get away with and still run on a JVM, though? Lisp has sent many worthy men screaming for the hills.


I don't believe so, because Armed Bear Common Lisp runs on the JVM.

Let's not forget Kawa Scheme.

This is the primary roadblock to Rust deployment IMHO. It's really hard to get over the hump at the start. You do all of the trivial examples in the manual just fine, but immediately run into issues when you write your own code.

Especially if one tries to build a GUI or game as the first step after going through the book.

The GUI is an issue for all the non-platform languages. It took a lot of effort for Java, with average results. Python's Qt wrappers seem to be more successful.

With Rust it goes beyond that, because borrow-checker semantics still aren't productive with typical GUI programming patterns, like self-referential structs from event handlers, leading to Rc<RefCell<>> and clone() calls everywhere.

Not saying that it won't improve, just stating the current state.

Even with reactive UIs as a way to get around those callbacks, relm isn't as easy as Fabulous, SwiftUI, or Elm, with the caveat that reactive UIs are heavier on the memory allocator.


I would include C and maybe C++ in that. Perhaps the biggest difference between Rust and C++ is that the C++ compiler doesn't bother you up front about memory correctness.

Clojure is already bigger than Haskell, F#, ReasonML, Elixir, and Elm. It has more podcasts and more meetups, there are more than a dozen books, and almost every European country now has its own regular Clojure conference; there are conferences in India, Canada, and Brazil. It probably will never become "mainstream", but it is slowly and steadily growing.

> Perhaps a Lisp shop could force consistency

This was one of the main goals of the Common Lisp standard, no?


But it was done poorly... Lisp-2; eq vs. equal vs. = vs. equalp; multiple-value-bind; destructuring-bind; remove-if-not; etc... Python's PEP process seems to be a better idea...
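As a rough analogy for the equality-predicate complaint above (an illustration added here, sketched in Python rather than Common Lisp): Python folds CL's several equality predicates into essentially two operators, which is arguably part of what the PEP-style curation buys you.

```python
# Common Lisp distinguishes eq / equal / equalp (and more); Python
# collapses this into roughly `is` (object identity, ~ eq) and
# `==` (structural equality, ~ equal).

a = [1, 2, 3]
b = [1, 2, 3]

print(a == b)   # True  -- structural, like CL's (equal a b)
print(a is b)   # False -- identity, like CL's (eq a b)
print(a is a)   # True
```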

I disagree with the core premise of this article. By this logic, the most developed language ecosystems should be those built from languages which are the most difficult to develop for. Python is a clear counterexample with a comparably low barrier, and tons of existing modules with ~80% capability hacked together by random individuals. In practice, many of those partially complete projects are picked up and improved by others, or serve as direct inspiration for more rigorous implementations. I would argue the language and community are stronger because of this.

Lisp has been around for over half a century. It's had plenty of opportunity to demonstrate its worth in helping people solve real-world problems in production. I agree with the author that there is probably a fundamental reason Lisp doesn't see this kind of use (maybe with the exception of Clojure), but I seriously doubt that reason is that it's "too powerful."


"By this logic, the most developed language ecosystems should be those built from languages which are the most difficult to develop for."

It only implies that the relationship between difficulty of development and developed ecosystem is not monotonic.

It's actually fairly normal for two parameters to be highly correlated with each other along most of their range and domain, but for the correlation to suddenly fall apart at the extremes. (I think there's a term for this, and I'm wishing I could remember it so I could link to pages I've seen discussing it. If anyone knows, links solicited.) Of course, the real problem is that there are almost certainly a lot of such parameters, and having what may be merely a "just-so" story tying Lisp's general failure to set the world on fire to its excessive power may not be particularly determinative of anything.

It does at least fit the facts. Most of the other possible answers have at least had the seed of a solution created (e.g., package manager solutions), and it still hasn't taken off, suggesting that wasn't the problem. (Though we can't eliminate the possibility it needs multiple such things.) The fact that Clojure seems to be settling into a C-list language position after its push also is suggestive of it not being any of the things that it fixed.


On the other hand, a lot of the most popular Python libraries became effectively standard and lots of people agglomerated around them instead of creating competing standards because it would be hard for any smaller team to replicate even their base functionality. Creating a numpy, scipy, sklearn, pandas, tensorflow or pytorch would involve lots of optimized C/C++ code and python wrappers, which is a considerably high barrier.

If anyone could make an equally high performance 80% functionality/use cases coverage version of those libraries using high level python in maybe a month, would that cause a fragmentation in a way that we would get a lot of 80% libraries (with different 80%) instead of one 95% library that is supported everywhere?


>I disagree with the core premise of this article. By this logic, the most developed language ecosystems should be those built from languages which are the most difficult to develop for. Python is a clear counterexample with a comparably low barrier, and tons of existing modules with ~80% capability hacked together by random individuals.

His argument is not about difficulty of developing in the language, but about difficulty of shaping the language and its idioms.

All those packages you mention are libraries, and you use them with the same pythonic idioms you code with. Even Python's metaprogramming facilities hardly change the semantics much...


Imagine adding object orientation to the C and Scheme programming languages. Making Scheme object-oriented is a sophomore homework assignment. On the other hand, adding object orientation to C requires the programming chops of Bjarne Stroustrup.

I like to think Brad Cox did a pretty good job and remained compatible with the base language.


The author seems to be well aware, given that, a couple paragraphs later, Brad Cox's efforts become one of the key features of his argument:

> Once again, consider the C programming language in that thought experiment. Due to the difficulty of making C object oriented, only two serious attempts at the problem have made any traction: C++ and Objective-C. Objective-C is most popular on the Macintosh, while C++ rules everywhere else. That means that, for a given platform, the question of which object-oriented extension of C to use has already been answered definitively.

The point isn't that you can't do it. It's that, for a language like C, it's a big enough effort that it's only really happened a couple times, and nobody strays much away from those two options because the effort required to do so would be immense.

Whereas, I think that the difficulty of doing so in Scheme is overrated. We had to implement a fairly decent version as a Scheme homework assignment during my very first semester of my very first year as an undergraduate. It's so easy in Scheme that it's plausibly faster to write your own than it is to choose and then read the documentation of an off-the-shelf option.
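The kind of object system that homework assignment produces can be sketched quickly. A minimal message-passing version, written here in Python rather than Scheme (the closure trick is the same; the names `make_counter` and `dispatch` are just for this sketch):

```python
# A minimal "object system" built from closures: the closure captures
# mutable state, and a dispatch function selects "methods" by message name.

def make_counter(start=0):
    state = {"count": start}          # state captured by the closure

    def increment():
        state["count"] += 1
        return state["count"]

    def value():
        return state["count"]

    def dispatch(message, *args):
        # The dispatch closure acts as the object itself.
        methods = {"increment": increment, "value": value}
        return methods[message](*args)

    return dispatch

counter = make_counter()
counter("increment")
counter("increment")
print(counter("value"))   # 2
```

In Scheme the same shape falls out of `lambda` and `set!` in a dozen lines, which is the point being made: the effort bar is low enough that rolling your own is always tempting.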


If you build your own you'll have to debug it. If you get it off the shelf and it's built well, it'll be readable and small.

Yeah, but I think Brad Cox deserves to be named. As a side note, he has two really good (if a bit dated) books: "Object-Oriented Programming: An Evolutionary Approach" and "Superdistribution: Objects as Property on the Electronic Frontier".

Ah yes, the speed of smalltalk with the memory protection of C.

Or, you know, the speed and compatibility with everything of C, with the higher level organization of Smalltalk...

Yeah, but Objective-C spawned NeXTSTEP, and C++ got us Taligent.

In a way both are market failures.

NeXTSTEP only survives because Apple decided to buy NeXT instead of Be, and they did an inverted acquisition once inside Apple.


NeXT was making money at the end, and powered OS X for a lot of years. Taligent released a framework and died. I love Be but it wasn't in the same league as NeXTSTEP, and I had both at the time.

What money were they making? Their agreement with Sun to create the future of the Solaris SDK failed, with Sun using their work as the genesis of J2EE.

And their pivot to OpenSTEP wasn't selling fast enough to outpace the bleeding.

My thesis was to port a particle engine from NeXT to Windows, as my department was getting rid of their Cubes.

Had Apple not bought them, we would be remembering NeXT just like Amiga, Atari, Be, ...


WebObjects was doing quite well.

In what markets?

Besides a couple of DDJ articles, I never saw it anywhere.

Even the famous Steve Jobs presentation I only became aware of years later, when someone uploaded the VHS recordings to YouTube.

Even those Cube workstations at the university seemed like pure luck, in that someone happened to sign off on their purchase.

Which they regretted afterwards, getting students like myself to port their 3D research to Windows as part of our graduation theses.


Dell's website used it. Just because you didn't see them doesn't mean things don't exist. How many Cray computers or iSeries machines have you seen in the wild?

iSeries: plenty of them in Portugal. Part of my high school internship required doing backups on OS/400; it used to be quite common across medium-sized companies.

The old Cray has "died"; they just assemble Linux/Windows HPC clusters like everyone else.


> Lisp allows you to just chuck things off so easily

Lisp people (like Paul Graham) say that a lot, but they also admit that there’s a pretty steep learning curve before you get to that point - at least, I’ve never heard anybody say that Lisp is both easy to use and easy to learn. I actually do believe them. I used to hear the same thing about vi: it’s quick and powerful, once you get over the learning hump. I actually did take the time to get over the hump and found that I _was_ faster and more productive with it, but it took some time to get there. The time spent was worth it, but it was slow going getting there. I’ve dabbled enough in Lisp and functional programming in general on my own to believe that there’s something very powerful hiding in there… but there’s work to be done and unlike my choice of text editor which only impacts me, I have to work in a language that everybody agrees on, and so far, that’s never been Lisp.


Learning curve is a weird reason for such an important and well compensated profession to be wielding subpar tools towards subpar outcomes. “Yeah, medicine seems really powerful, but med school is hard and I’ve got patients to treat.”

Seems more likely that the grass is not actually greener on the other side. If it were, we’d see small teams of the rare few who are willing to surmount the learning curve outcompeting the rest of the tech industry.


This occurs in medicine too. One of the primary problems we're trying to help solve for our customers is that they introduce medical devices (for orthopedic surgery) which in theory have many benefits over older devices but which have a learning curve for surgeons. Surgeons often don't feel they have the time to invest in learning the new device when they already have experience with an older device and are confident in their use of it, even if they are sold on the potential benefits of the newer device once they get past the learning curve.

A perfectly acceptable reason, as long as you know how to use the older devices well.

Here we are talking about people refusing to move beyond the beginner phase.


I wonder about that, too, honestly. Taking my comparison with vi again: I’m a vi true believer, having gone from a vi skeptic… but I don’t really have hard evidence that I’m truly, actually, indisputably more productive in vi than I would be with, say, notepad++ or sublime or atom or some other editor. I say the same things that other vi true believers do, and I’ve never seen or heard of anybody who was comfortable in vi who ever said that they regretted spending time learning it or that they didn’t think the time was worthwhile - but how do I (we) know that isn’t just confirmation bias? Or worse, how do we know we’re not just deluding ourselves to try to convince ourselves that we didn’t waste time learning - or justify time spent doing something we enjoy(ed)? I’ve been learning Scala on and off for a while and it _feels_ better and more correct than Java, but maybe I just want it to be better and more correct than Java.

"“Yeah, medicine seems really powerful, but med school is hard and I’ve got patients to treat.”"

Isn't that in fact the case, and why we have nurses, RPNs & medics? "Sorry, I can't give you this flu shot - I've gotta finish med school first"


> - at least, I’ve never heard anybody say that Lisp is both easy to use and easy to learn.

I will say it.

Lisp has been used for introductory programming courses in the past, with much success. Beginners take to it just as well as, or better than, other more popular languages.

For example:

https://www.cs.cmu.edu/~dst/LispBook/book.pdf

"This book is about learning to program in Lisp. Although widely known as the principal language of artificial intelligence research—one of the most advanced areas of computer science—Lisp is an excellent language for beginners. It is increasingly the language of choice in introductory programming courses due to its friendly, interactive environment, rich data structures, and powerful software tools that even a novice can master in short order"


"Although widely known as the principal language of artificial intelligence research—one of the most advanced areas of computer science—Lisp is an excellent language for beginners."

Ironically it's been supplanted by Python in that role now.


I wonder what would have happened if Guido had looked at Common Lisp and built his Python as a simplified form of Common Lisp/Scheme with infix syntax, but with multi-line lambdas, optional return values, CLOS, edit-and-continue, and fast compilers from the start... We should root for Julia - it can drag the Python and Lisp people together. Also, whatever happened to Dylan?

See [Hissp] and [Hebigo] for another attempt to drag the Python and Lisp people together.

[Hissp]: https://github.com/gilch/hissp [Hebigo]: https://github.com/gilch/hebigo


>> at least, I’ve never heard anybody say that Lisp is both easy to use and easy to learn.

Richard Stallman would like to disagree with you.

It was Bernie Greenberg, who discovered that it was (2). He wrote a version of Emacs in Multics MacLisp, and he wrote his commands in MacLisp in a straightforward fashion. The editor itself was written entirely in Lisp. Multics Emacs proved to be a great success — programming new editing commands was so convenient that even the secretaries in his office started learning how to use it. They used a manual someone had written which showed how to extend Emacs, but didn't say it was programming. So the secretaries, who believed they couldn't do programming, weren't scared off. They read the manual, discovered they could do useful things and they learned to program.

https://www.gnu.org/gnu/rms-lisp.en.html

It's quite a statement that secretaries back then, without the internet and with just manuals, were able to learn to program Emacs in Elisp.

Speaks volumes about the current generation of programmers and their ability to learn or commit to any deep expertise.


I don't think that's true. Sure, you need quite some experience with the language to actually appreciate its strengths. But given existing experience with another language, and literature that lets people jump quickly into solving problems (rather than texts that start slowly and focus on laying out the foundations), you can start building things in no time. Lisp is just like any other language.

That's why today I simply point people to the cookbook [0]. If they keep the interest, they'll eventually jump to the classics.

[0] https://lispcookbook.github.io/cl-cookbook/


Even famous long-time Lisp aficionados like Paul Graham today recommend Clojure instead of CL. That seems to be the more pragmatic choice.

I like Clojure, but whether it is a more pragmatic choice (or if it’s even a Lisp) is a bit off topic here.

If someone asks me for advice on how to get started with Common Lisp I will point them to CL resources.


I think a lesser version of the same problem is behind JavaScript's Framework Hell. And JavaScript projects in general: "idiomatic" is an ill-defined term in the context of JavaScript, so you rely a great deal more on conventions within any given organization.

Emacs is good enough, much more powerful than many "proper" IDEs. But it should be ported to a proper, modern Lisp or Scheme, instead of using its own subset/dialect. The dialect it uses is incompatible [1] with everything else. Too bad that the GuileEmacs[2] project died, probably because of the same curse. Porting it to SBCL would be a game changer too.

[1] https://www.gnu.org/software/emacs/manual/html_node/cl/Porti...

[2] https://www.emacswiki.org/emacs/GuileEmacs


Emacs is fine. Emacs Lisp is good enough for what it was made for. Every few years someone gets a "brilliant" idea that it should be something else, but that gets nowhere. Languages cannot exist in a vacuum. If there's a proposal to make it compatible with, say, Racket, then you'd have to create a strange Elisp-Racket hybrid, because you can't just make all the existing Emacs packages incompatible.

>>If there's a proposal to make it compatible say with Racket

Racket itself wants to deprecate its entire language, toss everything, and start over with something totally new, all over again.


Why should it be ported? What would be a game changer?

Because you can't use CL libs in EmacsLisp due to the incompatibility. And vice versa.

Update on October 6, 2017. N.B.: Please stop submitting this to Hacker News! Look at the Hacker News search results for this essay.

Oddly enough, I've been here for ages, and I don't remember ever seeing this!

The power of Lisp is its own worst enemy.

The power of _insert_prog_lang_here_ is its own worst enemy. Applies to Ruby, Smalltalk, C++, Haskell, and probably many others. It even applies to PHP! You see, power comes in many forms.

http://www.giantitp.com/forums/showthread.php?238385-quot-Po...

2nd order sociological effects often decide the fate of programming language communities. Unfortunately, many programmers have been less competent at navigating those forces. (This has changed with the effect of the Internet on world culture, of course.)


Haskell's type checker is not Turing-complete by default. This is a feature, not a bug, because it means that the type checker is guaranteed to terminate. Since this piece was written GHC has been extended so that non-termination can be enabled if you want it.

The sideswipe against the supposed venality of managers is unwarranted. Programmers are like mountaineers in a world where the terrain shifts radically and unpredictably. One day you are on top of a mountain, the next day, without having moved, you are in a deep valley. Under these circumstances teams of people in ATVs who can move rapidly across the terrain tend to have a higher average altitude than people with karabiners and ropes.


The submitted title ("The Lisp Curse by Rudolf Winestock (Again, Sorry)") broke the site guidelines by editorializing. Can you please review https://news.ycombinator.com/newsguidelines.html and not do that in the future?

Useless talk. If the Lisps were that powerful, every spec would end up with a Lisp implementation. We need better Lisps that make translation from spec to software easier. Lisp can improve. Lisp shall improve.

Speaking of Olin Shivers and reposts ... http://www.ccs.neu.edu/~shivers/autoweapons.html

(More amusing than a barrel of Lisp macros)


> Due to the difficulty of making C object oriented, only two serious attempts at the problem have made any traction: C++ and Objective-C.

1. C++ is not an extension of C. Most C code would be quite unacceptable C++, and considered inelegant, unsafe, and overly redundant.

2. C++ is a multi-paradigm programming language, not a necessarily object-oriented one.


"We found no items matching the lisp curse"

Yes! I found this text here: https://srfi-email.schemers.org/srfi-discuss/msg/12129771/ and I have never seen it before. It is brilliant.


The curse of Lisp is not its power, elegance, flexibility, etc. The curse of Lisp is its syntax! Both "all those parentheses" and what they represent are the curse of Lisp.

Lisp forces coders to think in terms of nested prefix trees. These trees need to be carried around in the front of the mind. This mental model is very difficult unless your mind is wired to be able to process code this way. For Lisp aficionados this comes either naturally or with coding practice. The "parentheses" fade into the background. For most people this wiring never comes. It remains too difficult to keep the nested trees straight in their heads. To put it colloquially, it's too difficult to juggle all those parentheses.

If you're a Lisper you'll want to believe that with use comes the familiarity to overcome this hurdle. It doesn't. If you're a Lisper you're probably tempted to refactor a sample chunk of code into something really readable to show that this isn't the case. However, that only works in the small. It's like code golf. It doesn't carry over into large-scale applications. Lisp's mental model just won't become second nature to very many coders.

Haskell forces you to juggle "math." Forth forces you to juggle "stacks." Lisp forces you to juggle "trees." The popular Algol descendants are popular, in part, because they're closest to the way people think. The curse of unique brains is the curse of Lisp.
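The "trees" at issue can be made concrete. A Lisp expression is literally its own parse tree; a tiny evaluator over nested tuples, sketched here in Python (an illustration added for this thread; the `evaluate` helper is hypothetical, not from any library), shows how little machinery that takes:

```python
import operator

# (+ 1 (* 2 3)) written as nested tuples -- the expression IS the tree.
# Infix languages build the same tree internally; Lisp just writes it out.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    # An atom evaluates to itself; a tuple is (operator, operand, ...).
    if isinstance(expr, (int, float)):
        return expr
    op, *args = expr
    return OPS[op](*(evaluate(a) for a in args))

print(evaluate(("+", 1, ("*", 2, 3))))   # 7, i.e. 1 + 2 * 3
```

Whether writing the tree out explicitly is a burden (as argued above) or a clarity win (as argued in the reply below) is exactly the disagreement in this subthread.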


> The popular Algol descendants are popular, in part, because they're closest to the way people think.

No, those languages are not close to "the way people think", unless those people are already experienced programmers using one of those languages.

Programming languages are artificial and do not work like human languages. Every programming language requires you to run an internal parser and build partial parse trees in your head to understand what the program will do. Whether it's Lisp, Java, or anything else.

Lisp is not any harder for beginners than any other programming language.

> It doesn't carry over into large scale applications.

It works far better and more powerfully in large scale applications. Lisp is unmatched in its ability to write large applications, in terms of power and functionality, with fewer developers and fewer lines of code than almost any other language. This is pretty much the whole point of the "Lisp Curse".

And no, it's not code golf. It is higher order, more powerful abstractions and coding techniques.


Programming languages were created in order to bridge the gap between our minds and computer hardware. Why wouldn't each language be created in a way that the language's author(s) believed best to bridge that gap? Most of these first era languages were available on every platform. People picked the language that worked best for them. The language that "spoke" the most like they did.

>> It works far better and more powerfully in large scale applications. ... This is pretty much the whole point of the "Lisp Curse".

That's the premise of the article. I believe that premise is incorrect. Lisp may be a higher abstraction but it's a more difficult abstraction for many programmers to think in. As I stated, I believe that's a direct result of its syntax and all that syntax entails.


> Algol descendants are popular, in part, because they're closest to the way people think.

All it takes is 10 or 12 years of schooling, and suddenly people realize this is the natural way to think! Except for all the exceptions like pow(x, 2) which nobody can figure out. Or x=x+1, which flies in the face of those past 10 years of schooling.

Algol, of course, still makes you juggle math, stacks, and trees, but they’re completely invisible so you have to keep them in your head, along with the bidirectional mapping to the complex syntax required by the compiler. But what you can’t see can’t hurt you, right?
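The "invisible tree" point can be made concrete. Here is a Python sketch (not from the thread) using the standard `ast` module to show that the infix expression `a + b * c` is parsed into exactly the tree that Lisp's prefix notation writes out explicitly:

```python
import ast

# The infix expression hides a tree; the parser rebuilds it for you.
tree = ast.parse("a + b * c", mode="eval")
print(ast.dump(tree.body))
# The same tree, written explicitly in Lisp prefix form: (+ a (* b c))

# Multiplication binds tighter than addition, so it sits deeper in the tree.
assert isinstance(tree.body, ast.BinOp)
assert isinstance(tree.body.op, ast.Add)
assert isinstance(tree.body.right, ast.BinOp)
assert isinstance(tree.body.right.op, ast.Mult)
```

In the Algol family the precedence rules that shape this tree live in your head; in Lisp the parentheses put the same tree on the page.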


C, Forth, Lisp: these were all "first era" languages. Each had a chance to be used, to grow, to spawn other languages. To become the common way of thinking. Nobody forced the paths each language took. They grew organically. As people taught themselves how to code, they picked the language that was closest to the way their minds worked.

>> Algol, of course, still makes you juggle math, stacks, and trees ...

Agreed. It just does so in a way more comfortable to the greatest number of people.


> C, Forth, Lisp: these were all "first era" languages. Each had a chance to be used, to grow, to spawn other languages.

Do you have any idea how much there is from Lisp in e.g. Javascript and Python? With the only exception of prefix notation, there is a LOT directly inspired by Lisp in many modern languages.
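A few of those Lisp-descended ideas, sketched in Python (first-class anonymous functions, closures from lexical scoping, and map/filter all trace back to the Lisp tradition):

```python
# First-class, anonymous functions: Lisp's lambda, by the same name.
square = lambda x: x * x
print(square(7))  # 49

def make_adder(n):
    # A closure over n, as in Lisp's lexical scoping.
    return lambda x: x + n

add3 = make_adder(3)
print(add3(4))  # 7

# map/filter over sequences come straight from the functional Lisp style.
print(list(map(square, filter(lambda x: x % 2, range(6)))))  # [1, 9, 25]
```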

Comfortable to the greatest number of people? By the same logic we could conclude that Mandarin is the most comfortable spoken language, because it's the most common language on Earth.


>> Do you have any idea of how much is there from Lisp in e.g. Javascript and Python?

Absolutely! I'm not saying that Lisp is a bad language. I like Lisp (and Forth). I'm saying that its syntax (and the reasons behind that syntax) is the primary reason it hasn't caught on. I'm just not buying the "too powerful" and such arguments.

>> ...we can conclude that Mandarin is...

Most people pick their spoken language by their surroundings and what their family speaks. Children aren't making a choice. In the early days of computers, when the root languages were being created, everyone had a choice. Maybe not as much now, but still, anyone can pick up new languages during their careers.


> it hasn't caught on

That's where you've got it all backwards. Lisp (as an idea) after six decades is still thriving. There is a Lisp dialect for almost any platform. Clojure has surpassed languages like F#, OCaml, and Haskell in popularity, even though they are all older than Clojure. There are more Clojure conferences, more meetups, more podcasts, and more jobs. There are regular conferences in the US, Canada, Brazil, India, and almost every European country; some countries host multiple Clojure-focused conferences. Emacs and emacs-lisp are still being developed and used by thousands around the world. Common Lisp is still in use. Racket is pretty popular in academia and beyond.

Again, Lisp is just an idea. If you grok the awesomeness of that idea you may someday find what makes it so powerful. Lispers often talk about "enlightenment" and call Lisp "a secret alien technology". They don't do it simply to justify their choices or to cultivate an aura of elitism to scare off outsiders.

It's akin to the idea of Vim as modal text editing: it takes time to get it, to learn it, to understand it. But one day you may realize the power behind it. Most people, being spectators, don't get it. Sometimes people learn just the basics of Vim, move on to something else, and go "meh". But those who stick around may soon find an incredible new world that changes their subjective view and understanding of how things work. And they realize that there's nothing that can convince them to "unsee" this new world of possibilities. And just like there are many Lisp dialects, there are many ways to run Vim. You can do it in almost any modern IDE and editor.

Lisp is the same. It is a simple but quite powerful idea; if you don't see its power, perhaps it's because you haven't really looked.

Unfortunately, yes, that's not the best way to "sell" it, but that's the truth. You have to be a Vimmer to see the power behind modal editing; you need to become a Lisper to see the true power of Lisp. That is why neither Lispers nor Vimmers have any illusions: they know their way of thinking will never become mainstream.


I'll agree that the ideas and power of Lisps have caught on and are even pervasive. Just that "thinking in Lisp" is what I believe has held it back.

I have fond memories of a number of editors: VIM, EMACS, VEdit, and KEdit (Rexx scripting). When the mouse-centric editors appeared on the scene, I found they worked better for me. As someone who has coded in Lisp, Forth, C, etc. and in a wide variety of editors, I guess our experiences differ.


> I have fond memories of a number of editors: VIM, EMACS ... When the mouse-centric editors appeared on the scene

At the end of the '70s, the Lisp Machine (which had a megapixel bitmap screen, mouse, and GUI) ran the Zmacs editor: the second Emacs ever written and the first one written in Lisp.


Please stop spreading nonsense. The syntax, which may be a bit unfamiliar to the uninitiated, stops being an issue within literally hours. It takes a few days to become completely comfortable with it. I have seen people learning Clojure as their first language, and I've seen people coming from other languages. But I have never met anyone who actually wrote Clojure for a few months and still couldn't get used to the syntax. Only people who are completely unfamiliar with the language sometimes complain about the syntax. Because again: to the uninitiated it doesn't look very familiar. Just like Hindu-Arabic numerals looked strange to Europeans in the 13th century.

The syntax of lisp is objectively bad from a UX perspective since it doesn't properly separate different kinds of objects. Our brains are very good at parsing symbols and very bad at parsing text, and lisp forces you to parse text to understand the structure of the code. That is great for computers to parse and generate since the structure is uniform, which makes lisp great for meta-programming, but it also makes it a horrible language for humans.

Any language that PL enthusiasts consider elegant because you can do so much with few abstractions will be a really bad language in practice, since humans prefer different kinds of code to have different syntax and look different.


>The syntax of lisp is objectively bad from a UX perspective since it doesn't properly separate different kinds of objects.

I still don't understand what kind of objects you're talking about, or why they need to be separated. Why is pencil.writeOn(paper) good UX while (pencil.write-on paper) is bad UX?
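To make the comparison concrete, here is a Python sketch (the `Pencil` class and `write_on` method are hypothetical names taken from the comment above): both spellings name the same operation on the same operands; only the surface syntax differs.

```python
class Pencil:
    def write_on(self, paper):
        return f"pencil writes on {paper}"

pencil = Pencil()

# Method-call ("Algol-family") spelling:
print(pencil.write_on("paper"))

# The Lisp spelling (write-on pencil paper) is the prefix form of the
# same call: same receiver, same argument, same call tree. Python can
# even spell that operand order explicitly:
print(Pencil.write_on(pencil, "paper"))
```

Both lines print the same string, which is the point of the question: the difference is notation, not structure.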

> Our brains are very good at parsing symbols

Are you saying we should all be using APL? Point me to a single academic source that says "from a neuroscientific point of view, Lisp syntax is not good for humans". For the same reasons that Sanskrit writings would not be very easy for me to read, Lisp is not very easy for some people to read. And the reason is unfamiliarity. But does that mean that written Sanskrit is bad UX, or that it can't convey information or be used effectively as a language? I will state again: I have never met a single person who, after writing Clojure for a few months, would still complain about its syntax. Not saying they don't exist; I just have never met any. Besides, after writing Lisp for a few years I find it a bit difficult to read code in other languages: Python, Javascript, C, Rust, etc. And I have been programming in non-lispy languages for over twenty years. I also personally know a few programmers who started with Clojure as their very first PL and then later tried to learn another, more traditional language, and they all struggled. It took time for them to adjust to the new, unfamiliar syntax. ALL of them said that they prefer Clojure's syntax.

> and lisp forces you to parse text to understand the structure of the code

How are Python, Javascript, or even SQL different? You still have to mentally parse the text to understand the structure of the code.

> Any language which PL enthusiasts considers elegant since you can do so much with few abstractions will be a really bad language in practice

Is that why Lisp refuses to die? When Fortran is forgotten, COBOL is completely dead, C has been replaced with Rust and Java with Scala and Kotlin, PHP is gone, and Ruby has become unpopular, there will still be at least a few dialects of Lisp very much alive and thriving.

Take Emacs for example. For decades, different IDEs and text editors have been trying to "kill" Emacs, yet it is still thriving and still being used; packages written in emacs-lisp are in abundance, and new ones are being developed all the time. Check the Github stats for different languages and you will be surprised at how much emacs-lisp is published on Github alone.

This "really bad language in practice" turns out to be a very pragmatic choice; otherwise, how would you explain that Clojure has been successfully used at companies like Apple and Walmart, at NASA, and at many others? https://clojure.org/community/success_stories. There are multinational companies like Funding Circle that have built their businesses on Clojure.

Many seasoned CS veterans repeat over and over: "Learn Lisp if you want to master the art of programming".

So if you don't get Lisp syntax, it's just you. You aren't used to it; it's unfamiliar, that's all. But there's nothing inherently "bad" about it.


You caught me. I'll stop spreading nonsense.

Sigh. Somebody always says this and it always sounds just as narcissistic.

"My brain cannot handle parentheses, therefore no brain can handle parentheses."

There's no sin in saying "I don't like parentheses." But please, please stop generalizing that to "most people" because you really, really have no idea.


Sigh. Somebody always responds without reading the full comment.

"I don't like part of the first sentence therefore there's no need to consider the rest of what the comment says."

There's no sin in saying you don't agree with a comment. But please, please stop taking one point out of context just because it pushes personal "anger buttons" because you really, really have no idea.


There are many different ways of thinking.

> Haskell forces you to juggle "math." Forth forces you to juggle "stacks." Lisp forces you to juggle "trees."

Javascript forces you to juggle semicolons, Python forces you to juggle indentation, Java forces you to juggle nested class hierarchies, XML forces you ...


Indentation and semicolons don't impact the number of things your mind has to remember at one time. Nested classes are a problem as we're finally seeing with a bit of a backlash against OOP these days. XML is just markup and not a programming language so there's no "action" that needs to be carried around.

It's not the hierarchy itself that's the problem. The difficulty is remembering everything the hierarchy is doing before setting a chunk of it aside. Especially with prefix trees.


> Indentation and semicolons don't impact the number of things your mind has to remember at one time.

I disagree. It didn't feel like an impeding factor back in the day when I was writing Python, C#, Javascript, Typescript, etc. But working in Lisp, I realized how many things I used to deal with, constantly shifting my focus to properly style, indent, and punctuate the code so it would work and be easier to reason about.

Writing Lisp is more like writing poems without having to worry about whether they rhyme. You can write code just as you would draw prototypes on a piece of paper, except you can evaluate any part of it without any preceding ceremony. Some prefer to start with smaller pieces and compose them; some would just write it "imperatively" and then break it apart. You don't have to worry about operator precedence, reserved words, semicolons, or all the other unnecessary garbage. At the end of the day you realize the visual garbage is exactly that. It is supposed to make code more readable, but in reality it does not; it distracts more than it helps. The human brain is fascinating; it plays tricks on you. Once you see the dot in the middle of a visual illusion, you lose the ability to "unsee" it. But most people may still try to convince you that there is no dot.


While I disagree with you, your comment is both valid and well stated.


