Seibel's book is good, don't get me wrong; I just don't think it's the right pairing, in the culinary-arts sense of putting together a plate of well-paired books.
The Common Lisp people and the Clojure people fight viciously, as only siblings can. This will infuriate the Common Lisp tribe, but the ideal introductory Lisp book, in the general sense of the language family, is probably not Seibel's; it's probably the comedic Clojure for the Brave and True. I'm not aware of anything quite like that for Common Lisp.
I recommend this order:
1. Practical Common Lisp
2. On Lisp (kind of a big jump, so maybe check out ANSI Common Lisp)
3. Let over Lambda
4. Lisp in Small Pieces (especially if you want to implement a Scheme or Common Lisp)
With Common Lisp the Language (Guy Steele) and ANSI Common Lisp (Paul Graham) somewhere in the mix.
These are all excellent books and must-reads for anyone interested in programming languages and/or the Lisp family.
I think my favorite Forth introduction comes at it from the other direction, though: https://github.com/nornagon/jonesforth/blob/master/jonesfort...
- Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp (1991)
- Common Lisp Recipes (2015)
It's a great book for people coming from non-computing backgrounds!
Putting this here because I'm not aware of a lot of research being done in/with Lisp these days (the only other thing I know of is the quantum-computing DSL).
> This is our (thirdlaw.tech's) computational chemistry system Cando. Some might remember it from Lisp conferences or our previous videos. Here is a demo of going through a free energy perturbation calculation. FEP is very important in chemistry right now. https://www.youtube.com/watch?v=hZRuGt4TcD8&list=PLbl4KVdl9U...
If you have any questions I'd be happy to answer.
If you want to learn more about the programming side of our work (not so chemical), check out our other videos. They are from programming conferences:
Shorter one from the LLVM meeting:
(yes, the whole system is a compiler using LLVM as the last stage to make machine code from abstract assembly. The thing I always missed in SBCL)
There are some web frameworks in there that are fairly impressive.
There is very little advertising going on. It goes back a bit to what you folks mentioned here: a bit of dejection, and being condescending toward other languages. Common Lisp is the only language with reasonable compile-time computing right now, and it is hard to bring that concept over to people who still think about writing software, as opposed to living with changing software for 30 years. So the Lisp community is a bit in the mindset that those who get it do so on their own, and software is distributed among those who already know what they want.
The Cando chemical computation system (based on Clasp) in the video posted below/above is open source, BTW.
How does D compare, with its CTFE (Compile-Time Function Evaluation/Execution)?
Interestingly, it was originally written in Python, but the author decided to rewrite the entire thing in Common Lisp, mainly due to Python's poor performance and lack of native thread support. The resulting Common Lisp version is 30 times faster than the old Python one. More info can be found in the author's blog post.
I need to get cracking on backporting the changes we made to cl-jupyter for Cando/Clasp to portable Common Lisp, though. Sigh. So little time.
(It's open source but uses features of a proprietary CL implementation: https://github.com/franzinc/nfs )
... really? The most advanced language (?), are we back in the '50s again?
In Lisp you have the same language at your disposal at run time and at compile time. In C/C++ you have the preprocessor, which is a really lousy language, and in C++, templates. Remember the "template metaprogramming" hype about ten years ago? How far did it go? Computing some prime numbers? I mean, C++ templates are a compile-time language, but they are a language with only one type - types - and no real iteration over collection types. Wait, they have no collection types. The one and only type is types.
So, "macros" in Lisp are really a new capability, and they can make code very dense. But think about the alternative - rolling out all this code with generating it at compile time from a single point of origin. That is the real maintenance nightmare. Every time you make an assumption during programming you want to contain that assumption in a single place. So that when the assumption changes later (otherwise known as "always" when other people are involved in telling you what the program is supposed to do in the end) you change a single place, and the rest of the dependent code is generated at compile time - using the full programming language you chose.
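To make that concrete, here is a minimal sketch (all names here are invented for illustration): one macro holds the assumption about how records are laid out, and every accessor is generated from it at compile time, so changing the layout means changing one place.

```lisp
;; Hypothetical sketch: the field layout lives in exactly one place.
;; If the representation changes, only this spec changes; all of the
;; accessors are regenerated at compile time.
(defmacro define-fields (type &rest fields)
  `(progn
     ,@(loop for (name index) in fields
             collect `(defun ,name (,type) (nth ,index ,type)))))

(define-fields point (point-x 0) (point-y 1))
;; (point-x '(3 4)) => 3
```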
I started a web series on compile-time computing. Didn't get as far as I wished for now, but here is some:
I actually plan to use some of the chemical code I write (for the above mentioned Cando/Clasp chemical package) to make more parts of that series.
There is Boost.Hana and the older Boost MPL and brigand.
C++20 will have more constexpr improvements.
Although I do concede they require a C++ wizard hat, and aren't even close to the friendliness of Lisp's macros.
In Lisp, they say: "here is a tool that can do anything. Have fun." In Python, they say: "here are many tools, each specific and limited to its purpose. Also, you're going to make bad decisions, so we limited them some more."
The first approach is appealing to very good programmers. The second one, to anybody who had to work with the vast majority of normal programmers.
I prefer the second one. People abuse macros, write way too long lambdas, don't document their magic, and indent poorly if they are not forced to.
Since tooling is something people always learn too late, I'm going with the language with sane defaults.
And there is a limited pool of that.
We, humans, also have work to do.
Average Joes and Susis have spent 3+ decades happily programming lots of extensions to AutoCAD and its various clones in Lisp.
Generally I agree that Lisp programming is a bit more challenging due to the slight mathematical nature (lambda, ...), code as data and meta-programming.
I don't think that one needs to be a 10x programmer to use Lisp - many of those 10x programmers are not programming in Lisp, since the 10x programmer gets hired because he/she might be able to cope with enterprise Java architectures.
Since Lisp is a programmer amplifier, you might think that these Lisp users are 10x programmers, when in fact they are not.
But there are domains where people are using Lisp, who are not focused on programming. For example there are various Lisp-based music apps + literature which are used by composers. Example: https://opusmodus.com
Ha ha, I read somewhere that years ago, in some org or the other, Emacs Lisp was programmed by secretaries to help them achieve their routine tasks. Apparently they were not told that it was "too difficult for them" :)
I've heard about other tools like this before ( for composition,  for synthesis,  for mind-blowing :-p) but somehow never heard about Opusmodus before.
* Where did you learn about it?
* Are there other similar prog-lang music production tools with similar level of polish?
* Any cross-platform / Windows-supported tools? -- unfortunately Opusmodus only runs on Macs :-(
Opusmodus was supposed to have a Windows version in 'the future', but I don't know what plans are currently.
The "actively hostile to collaborative programming" claim seems particularly misguided. Even an extreme case like the code from hu.dwim, which is written in practically a new language built on top of CL and has barely any documentation, is still fairly easy to understand.
Which would easily make decades old and still maintained Lisp implementations unmaintainable, since they contain zillions of macros themselves.
I hope the author has matured by now. Seriously, don't buy this book.
As in arts and politics, some curmudgeons are needed in every field, if not for anything else, to show that there's another way than the mainstream.
(They don't have to be perfectly right either -- it's having a strong perspective that counts, nobody is 100% right anyway).
Of course for a technical book it's the technical content that matters -- whether it's presented politely or not.
Even if one does really think it's important to convey they are presenting superior technology, it can be stated clearly and concisely without sounding corrosive.
The words you're looking for are "writes poorly."
There's absolutely no rational reason for it not to be a thing.
"Politely condescending" might be a contradiction in terms, but some people just lend themselves to being treated condescendingly.
Same thing with how some people might deserve a sarcastic response -- sarcasm is not that remote from condescension.
Yeah, the sixth chapter sure bashes on Scheme a lot. :/
Scheme's simplicity is awesome if you're implementing it; it's awesome if you're learning it; it's not so awesome if you're trying to actually use it. As LoL notes, it's dumb to throw away information in conditionals. As LoL also notes, hygienic macros are toys and DEFMACRO is generally what you really want. I'll add that having only a single namespace for variables, classes, functions, types, packages & everything else means that you end up having to be extremely verbose: foo, fooClass, fooFunc, fooType, fooPackage. call/cc is incredibly cool & powerful, but also really hurts efficiency in a way that I don't think is acceptable in a production system (it's fine, of course, in a teaching language, because the experience of understanding it is really useful). dynamic-wind is broken compared to UNWIND-PROTECT.
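To illustrate the namespace point with a small Common Lisp snippet (standard behavior, nothing assumed beyond the language itself):

```lisp
;; Common Lisp is a Lisp-2: variables and functions live in
;; separate namespaces, so a variable named LIST does not
;; shadow the function LIST.
(let ((list '(1 2 3)))
  (list list list))   ; => ((1 2 3) (1 2 3))
;; In a Lisp-1 such as Scheme, the binding would shadow the
;; procedure, and the call would be an error.
```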
And on & on & on.
Steele also wrote the Java spec, and he wasn't wrong when he said 'We were not out to win over the Lisp programmers; we were after the C++ programmers. We managed to drag a lot of them about halfway to Lisp,' but I don't think in 2019 anyone thinks that his involvement with Java means it is anywhere near as good, powerful or acceptable a language as Lisp.
You can implement defmacro in 8 lines of syntax-case. The opposite is not possible. Anything you can do with defmacro is possible using whatever lower-level macro facility is available in Scheme (be it based on syntactic closures, implicit/explicit renaming, or syntax-case).
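For the curious, the usual sketch of that construction looks roughly like this (untested here, and real versions handle more edge cases, such as destructuring and optional arguments):

```scheme
;; Rough sketch of defmacro built on top of syntax-case.
(define-syntax defmacro
  (lambda (x)
    (syntax-case x ()
      ((_ name args body ...)
       #'(define-syntax name
           (lambda (y)
             (syntax-case y ()
               ((_ . rest)
                (datum->syntax y
                  (apply (lambda args body ...)
                         (syntax->datum #'rest)))))))))))
```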
The extra complexity of dynamic-wind is required because scheme supports call/cc. You can easily limit dynamic wind to be comparable to unwind-protect, but that would render continuations useless for other things than escapes.
The single namespace has roughly one common consequence: having to name your lists lst, which is easy. Types and classes are by convention <type>.
Continuations are cool and useful, although full blown ones are ugly. Delimited continuations are making their way into some schemes. In guile they are used to build guile-fibers, which are really cool.
The only thing Common Lisp really brings is cross-implementation compatibility. Compared to <your favourite Scheme> it is mostly a matter of taste and, compared to some implementations, maturity.
Even if that is so, that doesn't mean I want syntax-case. If I don't want syntax-case but do want defmacro, that requires a lot less code than implementing syntax-case (which I don't care for) and then eight more lines to get defmacro. (And are we talking about a defmacro that supports optional parameters and &environment? What about macrolet and destructuring lambda lists?)
Hygienic macros are basically wrong. A good way to express why they are wrong is by analogy. In visual art, as a joke, we can paint paintings that perpetrate deliberate errors of perspective: some feature obviously in a distant background reaches into the foreground in an incorrect way and overlaps or interacts with foreground objects.
There is precisely such an absurd situation between the lexical scope of a hygienic macro definition, and that of the macro call site.
The macro injects code into distant call sites, yet that code makes lexical references into the macro definition, like the silly character jumping out of an inner painting into an outer painting.
I don't ever want that; I don't want the code generating system (the macro) to be wrongly mixed up with the generated site. If the macro generates some symbol X, that should refer to an X definition that is lexically (or in any other manner) apparent at the call site, even if that is a bug. If I make that bug, I will own that bug.
I do want macro generated code to refer to a clearly defined set of run-time support symbols, if that is required, not coming from some random lexical scope enclosed in a macro definition, but from a global scope. Referring to a private scope of the macro definition is horribly wrong. That scope must completely die when the dust settles. It's not necessarily even on the same machine. When the macro is running and its scope is active, it's on some build farm. When the code is running, it's on a target system. Why would the target environment refer to the build environment?
No hacker interested in constructing practical systems from the ground up could have invented hygienic macros; they are an ironic joke coming from idle minds.
Consider a macro whose entire expansion is a quoted template:

  (defmacro foo ()
    '(a b c))
Of course, more usually we use a backquote, and insert interesting things into the template. The non-inserted template parts are data, just like in a quote.
We don't know whether b and c will ever refer. a will refer to some operator or function at the expansion site, which determines the meaning of b and c. In any case, none of them refer to anything while they are still inside the macro.
Hygienic macros implement a scheme whereby the quote is removed, and, whee, the datum does refer to the surrounding environment even though it's still supposedly a datum that will be inserted elsewhere, where it becomes code. Although this has a touted benefit (the hygiene), it has a nasty implementation.
Just think about what it takes to have a (list ...) expression in a macro template still refer to the global list function that is in scope at the macro definition, even though the expansion site has defined a local list. Somehow we have to "wormhole" around the shadowing definition to reach that list. Now let's think about it in the context of separately compiled modules. How about the requirement for a macroexpand function for debugging macros: will the output be understandable?
There is an obvious down-side. While we have made the macro expansion immune to local shadowing definitions, the macro is now susceptible to its own. If we redefine list around the macro, it's referring to the wrong one.
A non-hygienic macro doesn't have this problem:
  (defmacro foo ()
    (labels ((list (&rest args) ...))
      `(list ... ,@(list ...))))
I also don't have to re-export bindings that I happen to use (like the specialised let from SRFI-71 that supports multiple return values), and I don't change let for whoever imports my library or pollute their namespace.
Expanding the macro in Guile makes it clear in the generated code where things come from: a (let ...) in the generated code gets expanded to ((@@ (loops for-loops) let) ...), which is easily debuggable.
I just don't happen to see what you are describing about hygiene as a problem. It is the behaviour I want as a macro writer and the vast majority of time as a macro user.
The only big problems with it are: 1. the implementation sucks - the concepts are quite easy to grok, but hard to implement; 2. the hygiene has some edge cases where it might not protect you if you are writing very complex macros gluing quoted syntax objects together.
"We were not out to win over the Lisp programmers; we were after the C++ programmers. We managed to drag a lot of them about halfway to Lisp." -Guy Steele
Maybe the author was immature. That's great. Let it remind you what it's like to be young and so excited about something that you'd write a book about it. This book is great.
Something that might be worth noting: two minutes of googling didn't turn up the author's age, but I recall seeing this photo on his website when I bought it some time after it was published in 2008: https://hcsw.org/contact.php
Here's a more up-to-date photo: https://hoytech.com/about
It's possible he was quite young when it was published. If that's the case, then I think the book is all the more a towering achievement. It's abrasive, and it smells a bit of 2000s forum fanboy discussions, but mature people should be able to read past the more opinionated parts and find the gems in the technical treatment.
Or, as we used to call it: "It's all about having had a few conversations with actual people in the actual world".
Also, IMO, the defmacro/g macro defined in this book would have been better defined as a reader macro than as a codewalking macro.
I think CCL still works.
The code here should fix the problems : https://github.com/thephoeron/let-over-lambda
Note, it's been a while since I looked at it, so I could have some of the details wrong. The book is still well worth reading to help you understand the possibilities of a programmable programming language, but just bear in mind the various caveats.
I haven't read it, I have to confess. It would be useful if those assumptions were explicitly mentioned and explained in the book.
Some of the advanced macro stuff is well outside the standard (which, for example, lacks some of the environment features described in CLtL2) but can still be useful.
Many things in LOL are quite opinionated. I don't think it's a distillation of knowledge (like On Lisp), but rather an exploration of how far you can take some things (macros).
I thought Haskell surpassed it a long time ago already.
Doesn't need one.
> restart/condition system that Common Lisp has
Utility is questionable, but you could build one in the continuation monad.