How Lisp Became God's Own Programming Language (twobithistory.org)
614 points by chwolfe on Oct 16, 2018 | 307 comments



... I thought that was HolyC :-)


Lisp programmers and Lisp braggers are distinct groups.


Lisp and Scheme are great. They would be my favorite programming languages, if they had a statically typed, Hindley-Milner type system.

As they stand, they are great learning tools, but I would never build something serious with them. That's before even getting into questions about parallelism, concurrency, available libraries, development tools, etc.

Any suggestions are welcome.


I think that part of the power of Lisp is its dynamic nature, but if you want typed options they exist for Clojure and Racket. There is also development on a language called Carp that aims to be a Clojure-style language for C: https://github.com/carp-lang/Carp.


There's an experimental Racket language called Hackett which attempts to give you the best of both worlds: Lisp's macros and Haskell's types.

https://github.com/lexi-lambda/hackett


Typed Racket? It's not Hindley / Milner, but it seems pretty powerful.


I'm interested to see how contracts (`clojure.spec` is a new implementation of them in Clojure) can be used to make reading, debugging and maintaining Clojure easier.


Everybody's just praising lisp like it's the best language ever, yet very few people are actually using it - and I think it's because it's really easy to write smart code which has to be explained over and over again to new people (and to your future-you).

I think this should be noted.


On SICP and Lisp, I was recently asked:

    "Thanks a lot for this insightful reply! I've read about how 
    powerful are Lisp languages (for example for AI), my question is: 
    does Emacs really use all this theoretically powerful functionality 
    of these languages? In what way is this metalinguistic abstraction 
    used? In the built-in functions of Emacs, the powerful packages 
    made by the community, or the Elisp tweaking of a casual Emacs user 
    to customize it (or all three of those).

    I've read a lot of people praising and a lot of people despising 
    Elisp. Do these people who dislike Elisp do it because they want a 
    yet more powerful Lisp dialect (like Scheme) or because they want 
    to use a completely different language?

    PD: Excuse my ignorance, I'm still learning about programming. As a 
    side note, would you recommend me to read SICP if I just have small 
    notions of OOP with Python and Java and I want to learn more about 
    these topics? Will I be able to follow it?

Let me start from the end: Reading SICP changed everything I thought I knew about programming and shattered any sort of non-empirical foundation - that I had built up to that point - regarding how my mind worked and how I interfaced with reality. It's not just a book about programming; there are layers of understanding in there that can blow your worldview apart. That said, you do need to make an effort by paying attention when you go through the book and (mostly) doing the exercises. The videos on YouTube are also worth watching in parallel with reading the book. The less you know about programming when you go through SICP, the easier it will be for you to "get" it, since you'll have no hardwired - reinforced by the passage of time and investment of personal effort - prior notions of what programming is and how it should be done.

* Metalinguistic abstraction

Short answer: all three.

Long answer: The key idea behind SICP and the philosophy of Lisp is metalinguistic abstraction which can be described as coming up with and expressing new ideas by first creating a language that allows you to think about said ideas. Think about that for a minute.

It follows then that the 'base' language [or interpreter in the classical sense] that you use to do that should not get in your way and must be primarily focused on facilitating that process. Lisp is geared towards you building a new language on top of it, one that allows you to think about certain ideas, and then solving your problems in that language. Do you need all that power when you're making CRUD REST apps or working in a well-trodden domain? Probably not. What happens when you're exploring ideas in your mind? When you're thinking about problems that have no established solutions? When you're trying to navigate domains that are fuzzy and confusing? Well, that's when having Lisp around makes a big difference, because the language will not get in your way and it'll make it as easy as possible for you to craft tools that let you reason effectively in said domains.
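
To make that a little more concrete, here is a minimal, made-up Common Lisp sketch (WITH-RETRIES and READ-SENSOR are hypothetical names, not from any library): a macro adds a word to the language, and the call site then reads in the vocabulary of the problem rather than in the vocabulary of loops and condition handlers.

    (defmacro with-retries ((&key (attempts 3)) &body body)
      "Run BODY, retrying up to ATTEMPTS times if it signals an error."
      (let ((try (gensym "TRY")))
        `(loop for ,try from 1 to ,attempts
               do (handler-case (return (progn ,@body))
                    (error (e)
                      (when (= ,try ,attempts) (error e)))))))

    ;; Hypothetical call site: the control-flow machinery has disappeared
    ;; into the new word.
    (with-retries (:attempts 5)
      (read-sensor))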

Let's use Python as an example since you mentioned it. Python is not that language since it's very opinionated and constrained by its decisions in the design space and, additionally, has been deliberately created with entirely different considerations in mind (popular appeal). This is very well illustrated by the idiotic Python motto "There's only one way to do it" which, in practice, isn't even the case for Python itself. A perfect example of style over substance, yet people lap it up. You can pick and choose a few features that superficially seem similar to Lisp features but that does not make Python a good language for metalinguistic abstraction. This is a classic example of the whole of Lisp being much more than the sum of its parts, and in reality languages like Python don't even do a good job of reimplementing some of these parts. This is the reason I don't want to just list a bunch of Lisp features that factor into metalinguistic abstraction (e.g. macros and symbols).

* Feedback loops

The other key part of Lisp and also something expressed fully by the Lisp machines is the notion of a cybernetic feedback loop that you enter each time you're programming. In crude, visual terms:

[Your mind - Ideas] <--> Programming Language <--> [Artifact-in-Reality]

You have certain ideas in your mind that you're trying to manipulate, mold and express through a programming language that leads to the creation of an artifact (your program) in reality. As you see from my diagram, this is a bidirectional process. You act upon (or model) the artifact in reality but you're also acted upon by it (iterative refinement). The medium is the programming language itself. This process becomes much more effective the shorter this feedback loop gets. Lisp allows you to deliberately shorten that feedback loop so that you _mesh with your artifact in reality_. Cybernetic entanglement if you will. Few other languages do that as well as Lisp (Smalltalk and Forth come to mind). Notice that I emphasized your mind and reality/artifact in that previous diagram, but not the medium, the programming language. I did that in order to show that the ideal state is for that programming language not to exist at all.
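
As a minimal sketch of what that shortened loop feels like in practice (assuming a live Common Lisp image, e.g. SBCL under SLIME; the function is made up):

    (defun score (x) (* x 2))   ; compile just this definition into the running image
    (score 21)                  ; => 42

    ;; Later, with the program still running - no rebuild, no restart:
    (defun score (x) (* x 3))
    (score 21)                  ; => 63 -- the artifact changed underneath you, live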

* Differences between Lisps

All Lisps allow you to express metalinguistic abstraction (they wouldn't be Lisps otherwise). Not all Lisps allow you to shorten the feedback loop with the same efficiency.

The Lisps that best do the latter come out of the tradition of the Lisp machines. Today this means Common Lisp and Emacs Lisp (they're very similar and you can get most of what Common Lisp offers on the language level in Emacs Lisp today). For that reason, I don't think Scheme is more powerful than Emacs Lisp, since Scheme lacks the focus on interactivity and is very different to both CL and Emacs Lisp.

As far as other people's opinions go, my rule of thumb is that I'd rather educate myself about the concepts and form my own opinions than blindly follow the herd. Which is why I also think that people who are sufficiently impressed by an introduction to Lisp (such as the OP article) to want to learn it and ask "Which Lisp should I learn? I want something that is used a lot today" are completely missing the point. You'll notice that most programming today is done for money, in languages that are geared towards popularity and commoditization. For me, programming is an Art or high philosophy if we take the latter to stand for love of wisdom. And as someone has said, philosophical truths are not passed around like pieces of eight, but are lived through praxis.

P.S. The talk by Gerry Sussman (https://www.infoq.com/presentations/We-Really-Dont-Know-How-...) that I saw mentioned in HN yesterday provides an excellent demonstration of metalinguistic abstraction and also serves as an amalgamation of some of my other ideas about Lisp.


What I love about Lisp, ML and logic languages is how they come down to the basics of CS, Algorithms + Data Structures.

Ideally the same solution can then be applied to a single core, hybrid CPU + GPU, clustered environments, whatever.

Yes, abstractions do still leak, especially if optimization down to the very last byte/ms is required, but productivity is much higher than if that were a concern for every line of code being produced.

And 90% of the time the solutions are good enough for the problem being solved.


Lisp is God's language. TempleOS is God's operating system.


In the XKCD comic "Lisp", God also says the universe was mostly hacked together with Perl.

Edit: I see I read the story but didn't click the first comic, just the second one. Sorry, disregard.

Another language that gets holy reverence is Smalltalk.



Thank you!


> a field as esoteric as “recursive function theory”

How can this kind of description persist when recursive functions are the very thing that a computer computes?


Are they? They are the very thing that lambda calculus models computation on. They are not the very thing that Turing Machines model computation on. They also are not the very thing that CPUs use, or that assembly uses, or even that most high-level languages use. (Unless you're going to say that most high-level languages use a recursive descent parser to generate the binaries that the CPUs run, and that's the basis for your statement...)


Neither lambda calculus nor Turing machines "model computation on" recursive functions. Rather, these two, and all the other models of computation, are equivalent in that they compute the same class of functions from the naturals to the naturals. And we call that class the class of recursive functions.

That being said, a computer is a physical device that is able to compute this entire class of recursive functions. It is what makes it a computer and not a pen or a chair. It's not some esoteric notion. It's the whole thing about it.


That's like saying that the basis of computation is the standard model, since all actual physical computers are made out of particles that are in the standard model. It may be true, but it's so far abstracted from what's actually going on that it's not useful at all.


Because recursive function theory is esoteric.

A lot of more applied CS departments don't require a CS theory course, and many programmers didn't major in CS or never attended university.

So there are far more computer programmers than there are people who know what recursive function theory is or how it relates to modern computers.


I thought god mucked around on quantum computers, and I think we can all feel a little bit better that god can't seem to write bug-free code either.


Either that, or it's all a feature.


And on the seventh day, he set the version number to pi and declared all bugs to be features.


If you don't believe in God, why speak so spitefully about him?


Because of the damage the concept has done?


Please keep religious flamewar far away from this site.


[flagged]


I must agree. "God's Own Programming Language" (not meant as satire) is certainly indicative of the psychological phenomenon of a superiority complex in the Lisp community.


Yeah, even MIT eventually gave up on using Scheme as a first programming language.

The Lisp community may have the worst ratio of self-regard to actual accomplishments in our entire industry. Not one of the great companies is built on Lisp. I don't think it's even a second-tier language at the FAANGs.


While some will disagree to an extent, I partially judge languages & tools on their amount of usage.

A robust community for a "bad" tool will see improvements made, and solutions for problems quickly disseminated.

Lisp is just not used enough for it to be the best language or the most useful tool.


Why do you think it's a circlejerk? I've yet to see an idea in the domain of PLs that's as effective as Lisp (dependent types come close). Lisp is a whole package; it solves a bunch of problems very elegantly. All "good" languages solve problems one by one by introducing exceptions and rules and features, whereas Lisp did it by introducing a totally novel, elegant concept. It's hard not to be mystified by Lisp. I'm not an old-school hacker, but there is certainly something fascinating about Lisp.


Sure, but calling it "God's language" is a little too much.


Better than the Haskell one.


Just because it has been going on for longer and the veterans actually know how to get someone other than themselves off ;)


I'm not familiar with any of these memes about lisp. If anything, the stuff you're claiming about lisp I've heard about Haskell.


Lisp - oh what it could have been. It had such potential, but then it got broken. I find it fascinating that those who are dedicated to the proselytisation of Lisp don't see the brokenness of the language. For them, all of the broken things are the features of the language.

Scheme was one attempt to fix some of those flaws.

More recently, we have seen the development of Kernel to fix other flaws.

So many second class citizens, so many exceptions to the rule.

I am going through the source code for Maxima CAS (written in Lisp) and in so many ways, it's a mess. I am not at all disparaging those who have been involved in writing the Maxima CAS system and its source. They have done an incredible job and what they have achieved is remarkable.

However, like any software system of any complexity in any language, it has lots of areas that are difficult to maintain, let alone advance. In that regard, Lisp has not been as advantageous a language as it could have been.

Lisp (as in Common Lisp and its add-ons) is not a simple language and it is not a consistent language (see CLHS - Common Lisp Hyper Spec docs).

When I first came across it in the late 1970s, I thought "wow". But its flaws quickly came to the fore.

So, there is no way that it would ever be God's own programming language. Especially since God doesn't need to program; that's just for us very limited mortals.


Maxima (née Macsyma) is a program dating back to the 1960s—1968 specifically—that has largely kept the style from the 1960s. Isn’t it a bit amazing you’re even able to read a modern, running, maintained program that’s lasted for 50 years? Five decades is an enormous amount of time that should really be appreciated, even if for a moment, especially in the era of extremely rapid technological growth. I’d estimate at least three, if not more, generations of programmer have worked on Macsyma/Maxima.

Maxima could be modernized with even the features of Common Lisp, but it’s very architecturally stuck in its ways. It’s rife with lists-as-data structures, symbol-plists, and other early Lisp baggage that isn’t found in modern Common Lisp code. The Maxima developer team is also not interested in gratuitous modernization.

I don’t think Maxima’s style is a reflection on Lisp so much as it is a reflection of the style of programming at the time, and a reflection on the style of maintenance.


> Isn’t it a bit amazing you’re even able to read a program that’s lasted for almost 60 years?

Not really. One would expect to be able to read most kinds of programs once you get into the language used. Though there are various languages that deliberately inhibit understanding - these tend to be somewhat esoteric in nature and were intended to be difficult to read.

Maxima uses Common Lisp and many of those who support the code base use Common Lisp to do so. The source code problems are a reflection of Lisp as is the source code of other major Lisp programs still being actively maintained. I have a number of these and my investigations have led me to believe that Lisp and Common Lisp, in particular, are in no way the panacea that the Lisp community portrays.

It is more about how you go about your development and how you document your code than about the language you use to write your code. I personally have a language of choice. It allows me to solve the problems I have in an easier manner than other languages. However, there are plenty of warts on it and it can quite easily get in the way of problem solving.

Regardless of whether you like Lisp or hate it, the language has inconsistencies and problems which will come back and bite you. If you love the language and it works for you, good. But be honest when talking to others about the problems with the language and what it cannot do. This will at least let others who have not been introduced to it to make a fair value judgement about whether or not to give it a try.

As a matter of course, when discussing programming languages with young people, I encourage them to look at all sorts of languages apart from the ones they are familiar with. These languages include Lisp, Fortran, Algol 60, Simula, Smalltalk, Forth, Icon/Unicon and OCaml, to name just a few.

I let them know that each language allows them to look at different kinds of problem solving techniques and that each is a welcome tool in their repertoire that will allow them to continue advancing in their skills.


> Maxima uses Common Lisp

Maxima is a direct port of Macsyma to Common Lisp and has been maintained in CL since the mid-80s. But it has not been re-architected to take advantage of the many improvements in CL. Macsyma took advantage of the backwards compatibility of CL with Maclisp.

> The source code problems are a reflection of Lisp as is the source code of other major Lisp programs still being actively maintained

Many other Lisp programs have radically different source code / architecture from Macsyma. Check out the architecture of Reduce or Axiom. Maxima itself implements a language on top of Lisp.

Reduce is written in Portable Standard Lisp (PSL). It uses an algebraic Lisp (without s-expression syntax), written in PSL.

Axiom implements an advanced statically typed language on top of Lisp and the original implementation makes EXTENSIVE use of literate programming.

If you want to see what actual CL-specific code looks like, you would need to look at code bases which have been architected for CL - from the late 80s / early 90s onwards, when CLOS and a bunch of other things were added to the language.

One of the early great code bases for CL was the prototype implementation of CLOS with a meta-object protocol: Portable CommonLoops.

https://github.com/binghe/PCL

There is a book which describes a simpler implementation - but with still quite nice code:

The Art of the Metaobject Protocol.

https://en.wikipedia.org/wiki/The_Art_of_the_Metaobject_Prot...

https://github.com/binghe/closette

> It is more about how you go about your development and how you document your code than about the language you use to write your code.

Lisp is called a programmer amplifier. Unfortunately bad programmers get amplified, too.

Software engineering research has had the goal of creating better programming languages. That led to Pascal, Ada, ... and a bunch of other languages.

Lisp is the exact opposite. Lots of freedom. With freedom comes responsibility. I have seen great Lisp code bases and also stuff I could not understand (basically write-once code).

For me the ability to write extremely descriptive Lisp code is a huge attraction. But I know that one can easily write large programs which are hard to understand - and not just because they lack documentation.


I am looking at Axiom and I find it to exhibit the same kinds of problems. As a programmer, I find Common Lisp requires a significant additional burden to understand what "tricks" may be in use. I don't like languages like Java, C#, C++, etc but I can generally follow without too much burden what is going on, and if I have to update some code in these languages, fine.

Common Lisp requires a much higher mental burden just to understand what the code may be doing and it takes somewhat more effort to ensure that you aren't screwing up your code base.

If you are immersed in Common Lisp and CLOS most of the time, you have already assimilated the knowledge that others will have to acquire to get to a level that they can safely modify the code base. This is the point that most Lisp aficionados miss. Steep learning curves are not helpful in "off the cuff" maintenance.

I have refused to update badly written programs when it would take huge amounts of time to learn all the gory details just to make a simple change - it is not worth the angst.

I am interested in the code bases for both Maxima and Axiom and am slowly working through resolving the underlying semantics and algorithms used in both systems. Both systems provide an alternative language in which you can write algorithms for each system. Why do that if Lisp is the "bees knees" so to speak?

You highlight that software engineering has led to languages like Pascal, Ada and others and that Lisp is the opposite. Pascal was nobbled by the implementation (by design), yet it too could have been so much more by removing the second-class status of its types and structures, as well as other areas of second-classness.

You raise the concept of a programmer amplifier, yet there are many languages that will allow this. Lisp is only one of them. I heard the same said of Forth, Smalltalk and others.

I have seen beautifully written code in all sorts of languages, including Cobol of all languages. I have seen absolutely awful code written in all sorts of languages - some of that code has been my own throw away and get the task done now junk.

The point is that there is no language so far ahead of all the others, irrespective of what anybody may believe. All languages have limitations that make them a pain to use in some way or another. This is one of the reasons that I like learning new languages: someone somewhere has given thought to solving some problem in a more amenable way. Lisp is just one of many that each of us should have a familiarity with. We have lost a lot over the decades by failing to teach each succeeding generation the wide range of languages that have been made available to us.


> I am looking at Axiom and I find it to exhibit the same kinds of problems. As a programmer, I find Common Lisp requires a significant additional burden to understand what "tricks" may be in use.

I wouldn't expect that you can understand something like Axiom from 'looking' at it. Axiom is easily one of the most capable programming systems ever created.

If anyone is interested: scroll down here http://axiom.axiom-developer.org/axiom-website/documentation... to User documentation and there are programs written in Lisp and languages on top of Lisp.

> I am interested in the code bases for both Maxima and Axiom and am slowly working through resolving the underlying semantics and algorithms used in both systems. Both systems provide an alternative language in which you can write algorithms for each system. Why do that if Lisp is the "bees knees" so to speak?

I don't think Lisp is the "bees knees".

Lisp can implement a full new programming system for the domain of mathematics. Many other programming languages have been developed in Lisp (for example ML) or have implementations written in Lisp (from C to Prolog and Python). Computer Algebra systems like Reduce, Axiom and Macsyma target mathematicians and their special notations. There are other systems for mathematics, which use Lisp syntax: for example Kenzo https://github.com/gheber/kenzo

> If you are immersed in Common Lisp and CLOS most of the time, you have already assimilated the knowledge that others will have to acquire to get to a level that they can safely modify the code base. This is the point that most Lisp aficionados miss. Steep learning curves are not helpful in "off the cuff" maintenance.

Same for any JavaEE software... any sufficiently complex programming system requires lots of education.

> You raise the concept of a programmer amplifier, yet there are many languages that will allow this. Lisp is only one of them. I heard the same said of Forth, Smalltalk and others.

There are a bunch. Smalltalk isn't so much a programmer amplifier because of its language the way Lisp is - that's more a factor of its original interactive Smalltalk IDE, which might also benefit from how the Smalltalk system works.

The Lisp idea of a programmer amplifier is based on the observation that one can write extremely complex software with a small team (example: Cyc) and that one can write large amounts of code in a much more compact way. As I mentioned, the AT&T/Lucent team reported a productivity advantage over a C++ team of up to 10 - and we are talking about a 100+ person Lisp team compared to a much larger C++ team - both shipping products in the same domain: enterprise telco switches. We are talking about code that was measured in MLOCS.

Nowadays this is a bit more difficult, since there are large eco-systems like J2EE/JEE, which can easily dominate productivity.

> The point is that there is no language so far ahead of all the others, irrespective of what anybody may believe.

Lisp is still different from most other 'mainstream' languages in its ability to program itself easily, its style of interactivity, and its direct support for linguistic abstraction. It's not so much about having a feature - Java has lambda expressions now - but about the whole integration and how it supports the developer. Java has fantastic IDEs, but they support a different workflow from how I develop code - and I have seen many Java developers attempting to interactively write code... it's not pretty.

We can use that as an advantage and we can shoot ourselves in the foot. But the fact that coding in Lisp and the resulting software looks radically different from Java development remains. It's not about 'being ahead' - it's more about supporting certain styles of development and certain styles of code well. I think it's absolutely okay when people prefer other tools and even that these are widely used - but I would have to use them differently and the result is different.


> I wouldn't expect that you can understand something like Axiom from 'looking' at it. Axiom is easily one of the most capable programming systems ever created.

Let me ask you a question - What do you mean by "looking at it"? I am getting into reading the code, finding the definitions, analysing what has been written and taking notes as to what various parts actually mean. This is in part dependent on the SPAD code that the Lisp code is supposed to implement.

In terms of that, Lisp is being used to provide a basic system and another more appropriate language to do your work in. In effect, providing the same services that C is used for. I have available to me a number of Lisp systems that were implemented in C.

In regards to C++, I find it a "monstrous" language and having Lisp be more productive than C++ is no surprise. But is Lisp more productive than all other languages?

> We are talking about code that was measured in MLOCS.

Then we must ask the salient question - How much of that code was actually necessary? One of the "features" of Lisp is the ability to write code generating macros. The code using those macro calls looks small, but in reality the code is many times the size that it appears to be. Okay, my question to you is this - instead of using macros, could the code have been restructured in other ways that would mean far less code being written?
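
To make that concern concrete, here is a minimal, made-up Common Lisp sketch (the macro and all names are hypothetical): one short macro call can stand for a great deal of generated code.

    (defmacro define-readers (prefix &rest slots)
      "Define one plist reader function per slot, e.g. (POINT-X obj)."
      `(progn
         ,@(loop for slot in slots
                 collect `(defun ,(intern (format nil "~A-~A" prefix slot)) (obj)
                            (getf obj ,(intern (string slot) :keyword))))))

    ;; One short call site...
    (define-readers point x y z)

    ;; ...but MACROEXPAND-1 reveals three full DEFUNs behind that single line,
    ;; and a maintainer has to know the macro to know they exist.
    (macroexpand-1 '(define-readers point x y z))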

Lisp has its uses and a knowledge of the language enhances your ability to be a better programmer. This is true of all languages though. Each language provides an insight into a class of problems as a better solution language.

> Lisp is still different from most other 'mainstream' languages in its ability to program itself easily, its style of interactivity, and its direct support for linguistic abstraction.

You use the word "mainstream" now. Too often it is declared that Lisp is better than "every" other language. It has its advantages, no problem with that at all. It also has its problems as a language. When the problems with the language are glossed over by the Lisp aficionados then Lisp loses the war.

> But the fact that coding in Lisp and the resulting software looks radically different from Java development remains.

No issue with this and this is true of other languages as well. I am no fan of Java.

> It's not about 'being ahead' - it's more about supporting certain styles of development and certain styles of code well. I think it's absolutely okay when people prefer other tools and even that these are widely used - but I would have to use them differently and the result is different.

I agree with you here. Different languages give rise to different styles of development and are applicable to different problem domains. I would really like to see our education systems make more provision for exposing new generations to all sorts of languages, and seriously encourage investigating all sorts of languages for different problem domains. But first and foremost, there needs to be a serious push towards properly documenting code.


> But is Lisp more productive than all other languages?

Not really, but you can make Lisp very productive.

https://www.cse.iitk.ac.in/users/karkare/courses/2010/cs653/...

See the table for the development time. Relational Lisp (a Common Lisp with support for relations as an extension) had by far the shortest development time... since the particular language has a high-level look&feel, there is also less documentation needed - it might be more like an executable spec.

> instead of using macros, could the code have been restructured in other ways that would mean far less code being written?

By a custom code generator, perhaps. But that's a tool on top of C++. Another stage. Another tool...

> Too often it is declared that Lisp is better than "every" other language

That's a nonsense belief. 'Better' is not a quality we can measure or even argue about.


> then it got broken

Got broken? I think of it more as having failed to obtain/coordinate the resources needed to progress.

What it means to have a healthy language ecosystem has advanced. 1970's Prolog implementations couldn't standardize on a way to read files. 1980's CommonLisp did, but had no community repo. 1990's Perl did, but few languages then had a good test suite, and they were commercial and $$$$. Later languages did, but <insert-your-favorite-thing-that-we-still-suck-at>.

And it's not easy for a language to move on. Prolog was still struggling with creating a standard library decades later. CommonLisp and Python had decade-long struggles to create community repos. A story goes that node.js wasn't planning on a community repo, until someone said "don't be python".

The magnitude of software engineering resources has so massively ramped that old-time progress looks like sleep or death. Every phone now has a UI layout constraint system. We knew it was the right thing, craved it, for years... while the occasional person had it as an intermittent hobby project. That was just the scale of things. Open source barely existed. "Will open source survive?" was a completely unresolved question. Commercial was a much bigger piece of a much smaller pie, but that wasn't sufficient to drive the ecosystem.

The Haskell implementation of Perl 6 failed because the community couldn't manage to fund the one critical person. It was circa 2005, and the social infrastructure needed to fund someone simply wasn't the practiced thing it is now.

And we're still bad at all this. The JavaScript community, for all its massive size, never managed to pick up the prototype-based programming skills of Self and Smalltalk. The... never mind.

It's the usual civilization bootstrap sad tale. Society, government, markets, and our profession are miserably poor at allocating resources and coordinating effort. So societally-critical tech ends up on the multi-decade glacial-creep hobby-project-and-graduate-student installment plan. Add in pervasively dysfunctional incentives, and... it becomes amazing that we're making such wonderful progress... even if it is so very wretchedly slow and poor compared to what it might be.

So did CL get broken? Or mostly just got left behind? Or is that a kind of broken?


The critical person behind the Haskell implementation of Perl 6 got ill. It had nothing to do with funding.


You raise interesting history and it's a good thing to see the perspective as you've given.

I don't know if Common Lisp got left behind or just took a completely different path. From my perspective, it got broken with its macro system decisions, its dynamic/static environment decisions and its namespace decisions. It created too many second-class citizens within the language, which means that you have to know far more than you should to understand any part of the programs you are looking at.

Every choice a language designer makes affects what the language will do in terms of programmer productivity, not only for the original developers of programs using that language, but also for all those who come later when maintaining or extending those programs.

I have come to the conclusion that a language can be a help when writing the original program and become a hindrance when you need to change that program for any reason. It is here that the detailed documentation covering all the design criteria and coding decisions, algorithm choices, etc, become more important than the language you may choose.

Both together will enable future generations to build upon what has been done.

All the points that you have highlighted above are important, but the underlying disincentive to provide full and adequately detailed documentation will work against community growth. No less today than in centuries past, knowledge is hidden away: individuals are not willing to pass on the critical pieces unless you are part of the pack, or do not think it is important enough to write down because it is obviously obvious.

To understand a piece of Lisp code, one has to know what the special forms are and how they interact, what macros are being used and what code they generate, and what the various symbols might be hiding in terms of their SPECIALness. These things may help in writing the code, but they work against future programmers modifying the code. Having had to maintain various code bases that I did not write, in quite a variety of different languages, I have found that "trickily" written code can become a nightmare when bringing about required changes. I have found that Lisp code writers seem to like writing "trickily" written code.
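
A small, made-up Common Lisp example of that hidden SPECIALness (the names are hypothetical):

    (defvar *base* 10)            ; *BASE* is now globally special

    (defun show-base ()
      (format t "base is ~D~%" *base*))

    (defun demo ()
      (let ((*base* 2))           ; this LET rebinds *BASE* dynamically ...
        (show-base)))             ; ... so SHOW-BASE, defined elsewhere, prints 2

    ;; Reading SHOW-BASE on its own, nothing says its behaviour can change at a
    ;; distance; you have to know *BASE* was DEFVARed special somewhere else.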

Now, that is only one person's perspective and someone else may find something completely different. That is not a problem as there are many tens of .... programmers in the world. Each one having a perspective on how to write good code.


> namespace

Nod. I fuzzily recall being told years ago of ITA Software struggling to even build their own CL code. Reader-defined-symbol load-order conflict hell, as I recall. And that was just a core engine, embedded in a sea of Java.

> second class citizens

I too wish something like Kernel[1] had been pursued. Kernel languages continue to be explored, so perhaps someday. Someday capped by AI/VR/whatever meaning "it might have been nice to have back then, but old-style languages just aren't how we do 'software' anymore".

> detailed documentation covering all the design criteria and coding decisions

As in manufacturing, inadequate docs can have both short and long-term catastrophic and drag impacts... but our tooling is really bad, high-burden, so we've unhappy tradeoffs to make in practice.

Though, I just saw a pull request go by, adding a nice function to a popular public api. The review requested 'please add a sentence saying what it does.' :)

So, yeah. Capturing design motivation is a thing, and software doesn't seem a leader among industries there.

> enable future generations to build upon what has been done.

Early python had a largely-unused abstraction available, of objects carrying C pointers, so C programs/libraries could be pulled together at runtime. In an alternate timeline, with only slightly different choices, instead of monolithic C libraries, there might have been a rich ecology. :/ The failure to widely adopt multiple dispatch seems another one of these "and thus we doomed those who followed us to pain and toil, and society to the loss of all they might have contributed had they not been thus crippled".

> To understand a piece of Lisp code [...struggle]

This one I don't quite buy. Java's "better for industry to shackle developers to keep them hot swappable", yes, regrettably. But an inherent struggle to read? That's always seemed to me more an instance of the IDE/tooling-vs-language-mismatch argument. "Your community uses too many little files (because it's awkward in my favorite editor)." "Your language shouldn't have permitted Unicode for identifiers (because I don't know how to type it, and my email program doesn't like it)." CL in vi, yuck. CL in Lisp Machine emacs... was like vscode or eclipse, for in many ways a nicer language, that ran everything down to the metal. Though one can perhaps push this argument too far, as with Smalltalk image-based "we don't need no source files" culture. Or it becomes a "with a sufficiently smart AI-complete refactoring IDE, even this code base becomes maintainable".

But "trickily" written code, yes. Or more generally, just crufty. Perhaps that's another of those historical shifts. More elbow room now to prioritize maintenance: performance less of a dominating concern; more development not having the flavor of small-team hackathon/death-march/spike-into-production. And despite the "more eyeballs" open-source argument perhaps being over stated, I'd guess the ratio of readers to writers has increased by an order of magnitude or two or more, at least for popular open source. There are just so very many more programmers. The idea that 'programming languages are for communicating among humans as much as with computers' came from the lisp community. But there's also "enough rope to hang yourself; enough power to shoot yourself in the foot; some people just shouldn't be allowed firearms (or pottery); safety interlocks and guards help you keep your fingers attached".

One perspective on T(est)DD I like is that it allows you to shift around ease of change - to shape the 'change requires more overhead' vs 'change requires less thinking to do safely' tradeoff over your code space. Things nailed down by tests are harder to change (the tests need updating too), but they make surrounding things easier to change, by reducing the need to maintain correctness of transformation, and simplifying debugging of the inevitable failure to do so. It's puzzled me that the TDD community hasn't talked more about test lifecycle - the dance of adding, expanding, updating, and pruning tests. Much CL code and culture predated testing culture. TDD (easy refactoring) plus insanely rich and concise languages (plus powerful tooling) seems a largely unexplored but intriguing area of language design space. Sort of haskell/idris T(ype)DD and T(est)DD, with an IDE able to make even dense APL transparent, for some language with richer type, runtime, and syntax systems.

Looking back at CL, and thinking "like <current language>, just a bit different", one can miss how much has changed since. Which hides how much change is available and incoming. 1950's programs each had their own languages, because using a "high-level" language was implausibly heavy. No one thinks of using assembly for web dev. Cloud has only started to impact language design. And mostly in a "ok, we'd really have to deal with that, but don't, because everyone has build farms". There's https://github.com/StanfordSNR/gg 'compile the linux kernel cold-cache in a trice for a nickel'. Golang may be the last major language where single-core cold-cache offline compilation performance was a language design priority. Nix would be silly without having internet, but we do, so we can have fun. What it means to have a language and its ecosystem has looked very different in the past, and can look very different in the future. Even before mixing in ML "please apply this behavior spec to this language-or-dsl substrate, validated with this more-conservatively-handled test suite, and keep it under a buck, and be done by the time I finish sneezing". There's so much potential fun. And potential to impact society. I just hope we don't piss away decades getting there.

[1] https://web.cs.wpi.edu/~jshutt/kernel.html


My point about "understanding the code" and the burden of additional information to retain is about the semantics applicable to the language itself, not about the tooling that we have build around it for development.

Lisp started with some simple core ideas, to which many others were added. For some of them, like dynamic scoping, simple as the idea is, there are complexity interactions with the rest of the language. These interactions increase the knowledge burden that must be retained at all times to be able to make sense of what you are reading. This burden is on top of any knowledge burden you need to carry in relation to the application you are modifying or maintaining.

This is about what are the things you design as part of your language, not the things you do with your language. This was what I was trying to somewhat humorously write in my first comment. As I look back over it, I failed to make that clear.

Lisp had the beginnings of "wow", but then it took a wrong turn down into a semantic quagmire. Scheme started to fix that and later Kernel was another attempt.


> failed to make that clear [...] the burden of additional information to retain is about the semantics applicable to the language itself, not about the tooling that we have built around it for development. [...] knowledge burden that must be retained at all times to be able to make sense of what you are reading

Not lack of clarity I think - it seems there's a real disagreement there. I agree about the burden, and the role of complex semantics in increasing it. But I think of bearing the burden as more multifaceted than being solely about the language. I think of it as a collaboration between language, and tooling, and tasteful coding. For maintenance, the last is unavailable. But there's still tooling. If the language design makes something unclear and burdensome, it seems to me sufficient that language tooling clarifies it and lifts the burden. That our tooling is often as poor as our languages, perhaps makes this distinction less interesting. But a shared attribution seems worth keeping in mind - an extra point of leverage. Especially since folks so often choose seriously suboptimal tooling, and tasteless engineering, and then attribute their difficulties to the language. There's much truth to that attribution, but also much left out.

Though as you pointed out, cognitive styles could play a role. I was at a mathy talk with a math person, and we completely disagreed on the adequacy of the talk. My best trick is "surfing" incompletely-described systems. His best trick is precisely understanding systems. Faced with pairs of code and tooling, I could see us repeatedly having divergent happiness. Except where some future nonwretched language finally permits nice code.


> These interactions increase the knowledge burden that must be retained at all times to be able to make sense of what you are reading. This burden is on top of any knowledge burden you need to carry in relation to the application you are modifying or maintaining.

That's why you have a Lisp where you can interactively explore the running program.
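
For instance (a minimal sketch of standard operators, nothing project-specific; MY-FUNCTION and *SOME-OBJECT* are placeholders):

    (describe 'mapcar)         ; what the system knows about a symbol/function
    (apropos "STRING")         ; find symbols whose names match
    (trace my-function)        ; log every call to and return from MY-FUNCTION
    (inspect *some-object*)    ; walk an object's structure interactively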

> Lisp had the beginnings of "wow", but then it took a wrong turn down into a semantic quagmire. Scheme started to fix that and later Kernel was another attempt.

I think that's misguided. Lisp is not a semantic quagmire that was tried to be fixed with Scheme or Kernel. As a Lisp user, I find it great that someone tries to revive Fexprs, but practically it has no meaning for me.

Lisp is actually a real programming language, and its users have determined and are determining what features it has.


node.js's API and require() was based on CommonJS [1], and server-side JS was a thing since the Netscape times around 1996.

Prolog's API (and syntax) became a formal standard only in 1995, but Edinburgh Prolog was widely used way before (early 1980's or earlier) [3].

[1]: http://wiki.commonjs.org/wiki/CommonJS

[2]: https://www.iso.org/standard/21413.html

[3]: https://www.cs.cmu.edu/Groups/AI/util/lang/prolog/code/tools...


> node.js's API and require() was based on CommonJS [1], and server-side JS was a thing since the Netscape times around 1996.

I'm sorry, I'm missing the point. Perhaps I should have said community code repository/database? CPAN, PyPI, npm.

> Prolog's API (and syntax) became a formal standard only in 1995, but Edinburgh Prolog was widely used way before

As was Quintus prolog, the other big "camp". A SICStus (Edinburgh camp) description: "The ISO Prolog standardization effort started late, too late. The Prolog dialects had already diverged: basically, there were as many dialects as there were implementations, although the Edinburgh tradition, which had grown out of David H.D. Warren’s work, was always the dominant one. Every vendor had already invested too much effort and acquired too large a customer base to be prepared to make radical changes to syntax and semantics. Instead, every vendor would defend his own dialect against such radical changes. Finally, after the most vehement opposition had been worn down in countless acrimonious committee meetings, a compromise document that most voting countries could live with was submitted for balloting and was approved. Although far from perfect," [...] "contains things that would better have been left out, and lacks other dearly needed items," https://arxiv.org/abs/1011.5640 The later took more years.

Similar 'incompatible dialects' balkanization afflicted other languages. CommonLisp and R?RS were the lisp equivalents of ISO Prolog. Acrimonious committees.

It's happily become increasingly hard to imagine. A general pattern across languages of nested continuums of incompatibility. No sharing infrastructure. Diverse tooling. Hand documentation of dependencies. Each company and school with their own environment, that had to be manually dealt with in any struggle to borrow code from them. Small islands of relative language compatibility around the many implementations, nested in less compatible camps, nested in a variously incompatible "language" ecology. More language family than language. To download is to port. With no way to share the NxN effort, so everyone else gets to do it again for themselves.

Perhaps an analogy might be Python 2 and 3 being camps that aren't going away, with largely separate communities (like OpenBSD and Linux), with no language test suites and variously incompatible implementations (cpython, jython, ironpython). No BDFL, and an acrimonious committee process struggling to negotiate and nail to the floor a CommonPython standard. And far far fewer people spread among it all, so each little niche is like its own resource-starved little language. Imagine each web framework community struggling to create/port its own Python standard library and tooling.

The opportunity to complain about left-pad was... like complaining about wifi slowness for your kindle, at your seat at 30 thousand feet over the atlantic. :)


> I'm sorry, I'm missing the point. Perhaps I should have said community code repository/database? CPAN, PyPI, npm.

Maybe I've missed your point, I just wanted to point out that node.js' API and core module system was based on community consensus.


> Maybe I've missed your point,

Ah, ok. I was pointing out there that something we now take for granted, being able to "yarn add foo" (ask a build tool to download and install a foo package/module from the central javascript npm database which collects collective community effort), didn't used to be a thing. It was once "search (on mailing list archives, later on web) for where some author has stashed their own hand prepared foo.tar file (no semantic versioning originally) on some ftp server somewhere (hopefully with anonymous login; and reading the welcome message go by often describing how/when they did/didn't want the server used; and often groveling over directory listings, and reading directory FILES and READMEs, to figure out which might be the right file); download it; check the size against the ftp server file listing to see if it's likely intact (no sums, and truncation isn't uncommon); check where it's going to spray its files (multiple conventions); unpack it; read README or INSTALL to get some notes on how it was intended to be built, and perhaps on various workarounds for different systems; variously struggle to make that happen on your system; including repeating this exercise for any dependencies; and then hope it all just works, because there's no or only very minimal tests to check that".

Python was originally like this. Then there were years of "we're trying to have a Perl-like central repository... again... but we're again not yet quite pulling it off as a community..." There's a story that the python experience was the cautionary tale which motivated having a single officially-sanctioned npm code repository to serve node. Instead of not, and hoping it would all just work out. Using 1990's python was a very different experience than using 2010's python, far more than the difference of pythons 1 and 3. And the 2020's python experience may become something where you can't imagine ever going back... to how you're handling python now.


Sorry, I've only now come to read your post(s). I guess if one were to compare npm with anything, Java's maven central is named as reference at multiple places in the npm source and on gh forums, and is also the point of reference for CommonJS modules, since many of the early CommonJs/node adopters were Java fall-outs.

I know very well how downloading packages and patches used to be like in the 90s ;) and I think it was Perl/CPAN who nailed the idea of a language-specific central repository and automatic dependency resolution, though it was practiced in Debian (and possibly pkgsrc) before that. Not that I had much success using CPAN to resolve old .pm's; these seem to be lost to bitrot.


This is a fantastic perspective, one that I hadn’t thought about as clearly as you’ve laid out.


Lisp indeed doesn't live up to its admirable ideals. Parent comment is on the mark.



