Yes, lots of programming consists of gluing together libraries, but in my experience languages vary dramatically in how well they support even this.
For example, consider logging libraries. It's useful to have statements like log.debug(someSlowFunction()) in your code. In LISPs it's really easy to create a (debug) macro that generates empty code when you don't want debug logging turned on. In other languages, you have to wrap the arguments in a function to avoid extra runtime costs, and even then you can't avoid the "if debug" conditional at runtime. All those anonymous function wraps add clutter, and that clutter accumulates. There are many other cases where having advanced language features greatly helps gluing together libraries.
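For comparison, here is roughly what that workaround looks like in Python (a hypothetical `Lazy` wrapper stands in for the anonymous-function wrapping; the function names are placeholders): the argument must be wrapped at every call site, and the enabled-check still happens at runtime inside the logging call.

```python
import logging

log = logging.getLogger("demo")
logging.basicConfig(level=logging.INFO)  # DEBUG is disabled

calls = 0

def some_slow_function():
    global calls
    calls += 1
    return "expensive result"

# Eager call site: the argument is evaluated even though DEBUG is off.
log.debug("eager: %s", some_slow_function())

# The workaround: wrap the argument in a function via a lazy wrapper whose
# __str__ defers the call; the runtime "if debug" check still happens
# inside the logging machinery.
class Lazy:
    def __init__(self, thunk):
        self.thunk = thunk
    def __str__(self):
        return str(self.thunk())

log.debug("lazy: %s", Lazy(some_slow_function))

print(calls)  # 1: only the eager call site paid the evaluation cost
```

That `Lazy(...)` wrapping is exactly the clutter that accumulates, and which a macro would erase entirely.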
Another aspect is the tooling. When I am considering a new library, I like to try it out in the REPL. In Clojure I can quickly start calling library functions, and use the (time) macro to get a sense of how long they take to evaluate. Not all popular languages are amenable to this kind of REPL-driven experimentation with libraries.
Not only does the language impact how you use libraries, but it also impacts what libraries may exist. Some libraries are simply not possible to write in less powerful programming languages. In LISP this would include any library that uses a macro. For example, Clojure was able to introduce core.async as a library, providing an async facility similar to what golang offers. But in most languages you wouldn't be able to implement that as a library.
Another major example is reagent vs. react. The concise hiccup representation supported by Reagent is only possible because of design decisions that went into Clojure. JavaScript users are stuck with JSX, which is less concise, and in my opinion far less good.
Another issue that arises when using libraries is whether or not the language has a static type system. Without getting into the age-old flamewar about static vs dynamic typing, I'll just note that popular languages differ in this dimension, and this has a big impact on what it's like to glue together libraries.
So overall, I think this essay undersells the benefits of LISP. Even if you spend all day gluing together libraries, LISP makes that much better by improving how you can call libraries, how you can quickly experiment with them, and even what kinds of libraries can exist.
> All those anonymous function wraps add clutter, and that clutter accumulates.
Another great example of this is feature flags -- adding/removing an entire chunk of code in most languages tends to be limited to (for example) the inside of a function.
... is not possible in most languages, where you end up intermingling conditionals into the body of definitions and functions all over the place, creating dead code and issues besides.
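A minimal Python sketch of that intermingling (the flag name and function are hypothetical): without macro-level conditional compilation, the check lives inside the function body, and the dead branch ships with every build.

```python
import os

NEW_CHECKOUT = os.environ.get("NEW_CHECKOUT") == "1"  # hypothetical feature flag

def checkout(cart):
    # The flag check is threaded through the body at runtime; both
    # branches remain in the shipped code and must be read around forever.
    if NEW_CHECKOUT:
        return {"flow": "new", "total": sum(cart)}
    return {"flow": "old", "total": sum(cart)}

print(checkout([1, 2, 3])["total"])  # 6
```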
> JavaScript users are stuck with JSX, which is less concise, and in my opinion far less good.
And a key aspect to it as well is that JSX is a fully-custom extension, not something that can be implemented in JS as a library -- whereas Hiccup is 'just another library' allowing for fast iteration and experimentation by the community.
If the code in the parens does what I think it does, Ruby has the same capability. Ruby allows you to redefine and even undefine... just about everything. Case in point: `binding.local_variable_set`. https://www.rubydoc.info/stdlib/core/Binding I think the only thing you can't do is undefine a local variable.
I've used environment variables to determine which modules extend a class. This allowed me to test the same code against two different APIs while we were migrating vendors. While this was done at "compile time", it could just as easily have been done at run time.
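An analogous sketch in Python (the vendor names and environment variable are hypothetical): pick the mixin that backs a class from the environment, so the same test suite can exercise either backend.

```python
import os

class OldVendorMixin:
    def charge(self, amount):
        return ("old-vendor", amount)

class NewVendorMixin:
    def charge(self, amount):
        return ("new-vendor", amount)

# Selected once at import time, but the same expression could run
# at any point during execution.
_mixin = NewVendorMixin if os.environ.get("VENDOR") == "new" else OldVendorMixin

class Billing(_mixin):
    pass

print(Billing().charge(100))  # ('old-vendor', 100) unless VENDOR=new is set
```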
One nightmare of a system I worked on had the initializer of a class conditionally load modules to completely redefine the class. I called it reverse inheritance because it did the same thing as inheritance, without any of the readability or simplicity. Just because you can do a thing doesn't mean you should do a thing. XD
But I think Ruby is the exception that proves the rule. It allows absolutely crazy things and probably deserves the label of being a weird language. It's just wrapped in the trappings of a non-weird language.
I am going to argue the case for the more normal languages against your points.
> For example, consider logging libraries. It's useful to have statements like log.debug(someSlowFunction()) in your code.
Java/C# have good debuggers and thus you can set breakpoints.
You often want debug printing available in the final release build anyway, activated through a --verbose option for example.
Thus I think this is less useful than it seems.
> Another aspect is the tooling. When I am considering a new library, I like to try it out in the REPL. In Clojure I can quickly start calling library functions, and use the (time) macro to get a sense of how long they take to evaluate. Not all popular languages are amenable to this kind of REPL-driven experimentation with libraries.
In the Python REPL you can do a reasonable level of experimentation with libraries. You can't do weird stuff like runtime monkey patching, but that's not always considered a good thing anyway.
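For instance, a rough stdlib analogue of Clojure's (time ...) in a Python session (the expressions are arbitrary placeholders): call a function, look at what it returns, then time it.

```python
import timeit

# Poke at a function interactively and inspect the result.
result = sorted(range(1000), reverse=True)[:3]
print(result)  # [999, 998, 997]

# Then get a sense of how long it takes to evaluate.
elapsed = timeit.timeit("sorted(range(1000))", number=1000)
print(f"{elapsed:.4f}s for 1000 calls")
```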
In Java too, there's jshell. (Importing Maven libs is a little roundabout, but there are tools for that, and in return you get a solid ecosystem with static typing and great compilers / runtimes.)
> Another major example is reagent vs. react. The concise hiccup representation supported by Reagent is only possible because of design decisions that went into Clojure. JavaScript users are stuck with JSX, which is less concise, and in my opinion far less good.
It's a matter of opinion. In my opinion, Jetpack Compose or Flutter (admittedly a fringe language, but one that can be learnt by anyone with Java or C# experience) are a better way of doing UI than JSX / FXML etc.
> Even if you spend all day gluing together libraries, LISP makes that much better by improving how you can call libraries, how you can quickly experiment with them, and even what kinds of libraries can exist.
I think the benefits of macros are way overblown; higher-order functions, plus the occasional well-documented API that uses compile-time or run-time reflection, are good enough. Macros are not worth the readability and compile-time sacrifices.
Overall, my point is that the blub paradox isn't one-dimensional. I place more importance on tooling and libraries than on the language itself, as long as the language isn't horrible.
I agree with your main points, but to be nit-picky, every C/C++ project I've ever worked on was/is able to 100% compile out logging code with a compile-time flag. No "if debug" is left in the compiled code; it just doesn't exist.
That's a good point. Even though this would weaken my argument (slightly), I now wonder if modern JITs can completely optimize log.debug(someSlowFunction()) into a NOOP once they realize the argument won't end up being used inside the log function.
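For what it's worth, CPython has a compile-time (rather than JIT) version of this: code guarded by `if __debug__:` is dropped entirely when running under `-O`. A small sketch, using `compile(..., optimize=1)` to simulate the flag:

```python
src = """
def handle(x):
    if __debug__:
        print("debug:", x)
    return x
"""

# optimize=1 mirrors `python -O`: the compiler removes asserts and any
# block conditional on __debug__ at bytecode-compilation time.
plain = compile(src, "<demo>", "exec", optimize=0)
optimized = compile(src, "<demo>", "exec", optimize=1)

def consts_of(code):
    # Recursively collect constants reachable from a code object.
    out = []
    for c in code.co_consts:
        if hasattr(c, "co_consts"):
            out.extend(consts_of(c))
        else:
            out.append(c)
    return out

print("debug:" in consts_of(plain))      # True: the branch is still present
print("debug:" in consts_of(optimized))  # False: compiled out entirely
```

This is compile-time elision like the C flag, though, not a JIT proving the argument dead at run time.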
The javadoc for slf4j's Logger.debug implies that the JIT cannot be relied on for this:
"This form avoids superfluous string concatenation when the logger is disabled for the DEBUG level"
I think that means if you call Logger.debug("value: " + value), the string concatenation happens even if DEBUG logging is disabled (a literal "a" + "b" would already be folded by javac at compile time). But maybe JITs have improved since that javadoc was written, or the author was not aware of how smart JITs are.
I would be curious to know if there is a JIT expert who could weigh in on this question.
It's handled in non-toy runtimes in general, to be honest.
It's one of the simplest and most commonly implemented peephole optimisations - Perl has handled it for as long as I can remember, I'm pretty sure CPython does too, etc. etc.
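Indeed, CPython folds constant string concatenation at compile time, which is easy to verify. Note this only covers literal operands, though; an actual someSlowFunction() argument would still be evaluated at the call site.

```python
def f():
    return "a" + "b"

# CPython's compiler folds the constant concatenation, so the function's
# constants already contain "ab" and no concatenation happens at run time.
print("ab" in f.__code__.co_consts)  # True
```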