If one were to learn a functional language, is Lisp a good choice today? Or is Haskell more appropriate?
If you are hacking on stuff or doing it as a hobby, learn Scheme. Something like Chicken Scheme is good: a small, concise language, but with a real scarcity of libraries. If you want a pretty IDE for Scheme, or want to write your own Scheme, use Racket.
If you want production code in Lisp with good library support, use Common Lisp. Clojure has better functional-programming features than CL, but debugging it requires knowledge of Java, since it runs on the JVM; the upside is that you can seamlessly call Java libraries, which are far more numerous than what CL offers.
OCaml is functional programming with objects. It is used where there is a need for performance, such as compilers, with the advantage of a functional paradigm. But OCaml is a pain to work with on any non-Unix system. F# is functional programming for .NET. Then there is Scala, which runs on the JVM like Clojure and so has the same pros and cons, except that Scala is closer to OCaml and Haskell (the ML family) than to Lisp.
Depending on your platform, make your choice. OCaml is horrible on Windows. Haskell works. Scala and Clojure just need Java, so platform makes no difference. Most Schemes have an interpreter for most platforms. Make your choice.
CL now has [shameless plug] an excellent functional collections library, FSet. Using FSet greatly expands the range of algorithms that can easily be written in a functional style. It's been a while since I looked at Clojure, but my impression is that FSet has a substantially richer API, including a bag (multiset) type and many more operations.
I don't know exactly which Clojure features you have in mind, but I suspect that FSet makes CL at least as good for functional programming as Clojure.
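For a concrete feel, here's a minimal sketch of FSet usage (this assumes FSet is loaded via Quicklisp; the function names are the FSet API as I understand it, so treat the exact signatures as assumptions):

```lisp
;; Sketch, assuming (ql:quickload "fset") has been run.
;; FSet maps are functional: WITH returns a new map; the original is unchanged.
(let* ((m1 (fset:with (fset:empty-map) 'a 1))
       (m2 (fset:with m1 'b 2)))
  (values (fset:lookup m1 'b)    ; => NIL -- m1 was not mutated
          (fset:lookup m2 'b)))  ; => 2
```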
It's an interesting coincidence: just yesterday I learned about FSet while searching for something else and thought it looked interesting. However, after arriving at https://common-lisp.net/project/fset/Site/Project.html and reading through the page, I left without installing it.
For the very wrong reason, too: I just couldn't find the source code. Obviously I didn't try very hard, but on the project page (which looks a bit too minimal, by the way; some CSS wouldn't hurt) I didn't find a link to the repo. I think it would be great if you added one at the top of the project page.
My only complaint about FSet would be map default values; not necessarily the fact they exist, but (1) when using defaults, that two maps can compare equal even though they have different defaults, and (2) when not using defaults, the way they nevertheless intrude upon the map-union protocol, necessitating handling of otherwise unused default values (usually NIL).
But I love FSet when using CL. Thank you for writing and publishing it!
Agreed on both counts about map default values. I've been wanting to change how they work, but the change I have in mind would be incompatible. Here's what I'm thinking. Instead of every map (and seq) having a default, which defaults to NIL, we would distinguish maps with defaults from those without. Doing LOOKUP on a defaultless map at a key it has no mapping for would signal a condition (similarly for an out-of-bounds lookup on a defaultless seq). Then the defaultless versions could be used more conveniently with UNION (and COMPOSE, another place where I've tripped over this problem).
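For concreteness, here is a purely hypothetical sketch of what the defaultless behavior might look like (none of this is current FSet behavior, and the condition name is invented):

```lisp
;; HYPOTHETICAL sketch of the proposal -- not real FSet code today.
;; A map created without a default would be "defaultless":
;;
;;   (fset:lookup (fset:empty-map) 'missing)
;;     => would signal a condition, say FSET:KEY-NOT-FOUND
;;
;; while supplying an explicit default would keep today's behavior:
;;
;;   (fset:lookup (fset:empty-map 'not-found) 'missing)  => NOT-FOUND
```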
I was thinking that creating a map without supplying a default should give you a defaultless map, and that's how I wish I had done it from the beginning, but making it work that way now would break a lot of code. I could release it as a new major version (and call it "fset-2" in Quicklisp so people don't pull it down accidentally). Or, I could leave the existing constructor behavior unchanged and just add a new defaultless map (and seq) constructor. I think this is considerably less elegant, but it wouldn't break existing code. What do you think?
I have no way of knowing how such changes would affect the wider installed base of FSet. All I can tell you is what I'd prefer myself.
I wouldn't mind, and would actually prefer, if the default SEQ signaled a condition on out-of-bounds access. (I'd also suspect that not a lot of existing code would break, but there's no way for me to know that.)
On the other hand, I don't think I'd get much out of map lookups signaling a condition, besides having to rewrite code. I looked over my uses of LOOKUP. For most lookups, the case of an element not being present in the map is normal behavior, expected and handled.
For a minority, an element not being present is an error, but for a majority of those, a specific error message is appropriate, so there'd still be explicit checking and no advantage from signaling within LOOKUP.
The smallest group of LOOKUPS expects the lookup to succeed without checking. In all of these cases, the lookup will succeed unless some code is buggy, and in all cases, a bogus NIL would be consumed immediately by a function that would signal a condition on getting a NIL.
So my personal favorite default kind of map would return (VALUES NIL NIL) for lookup failures but would not have a default for purposes of the MAP-UNION etc protocols.
An alternative (or additional) change that would work for me would be an additional optional function parameter to MAP-UNION and friends which, if present, would handle default values.
In 2017 the status of Lisp is about the same as it has always been, only more accessible than ever, given faster networks over which to download tools and find documentation (e.g. the Racket ecosystem). And a person who starts exploring one of the Lisps today will learn as much or perhaps more (due to generally better access to diverse information about programming) than at points in the past.
My opinion reflects my bias, but in general I think Lisps are more accessible than Haskell because there has been more time to smooth over the rough edges related to the more significant underlying mathematical/theoretical concepts (lambda calculus versus type theory) and perhaps because lambda calculus is a bit more easily understood than type theory (or perhaps not). There's also a much stronger pedagogical tradition of Lisps as teaching languages (forty years or so) at the undergraduate level than with Haskell.
But in the end, why not both?
Lisp is a multiparadigm language. It has closures and higher order functions and uses them where they make sense; it also has a lot of imperative constructs, and a very sophisticated object system. They all work together and make Lisp a unique, interesting language. If you're only interested in the functional parts, you'd probably be happier with Haskell or Clojure.
Common Lisp has an implementation, ABCL, which runs on top of the JVM. Access to the Java ecosystem is a shared trait, not a difference.
Being a multi-paradigm language, CL has a wealth of features that Clojure doesn't have, and they all play well together (except for the type system and the class system).
For example, it has places and the ability to define your own SETF forms.
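For instance, here is a user-defined place via a SETF function (standard CL; POINT-COORDS is just an illustrative name):

```lisp
(defstruct point x y)

(defun point-coords (p)
  (list (point-x p) (point-y p)))

;; Defining (SETF POINT-COORDS) makes (POINT-COORDS p) usable as a place.
(defun (setf point-coords) (coords p)
  (setf (point-x p) (first coords)
        (point-y p) (second coords))
  coords)

(let ((p (make-point :x 0 :y 0)))
  (setf (point-coords p) '(3 4))  ; our own SETF form
  (point-coords p))               ; => (3 4)
```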
Common Lisp makes iteration easy with the LOOP macro.
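For example, LOOP's declarative clauses handle filtering, collecting, and accumulating in a single form (standard CL):

```lisp
;; Collect the squares of the even numbers, and their sum, in one pass.
(loop for x in '(1 2 3 4 5 6)
      when (evenp x)
        collect (* x x) into squares
        and sum (* x x) into total
      finally (return (values squares total)))
;; => (4 16 36), 56
```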
Common Lisp has TAGBODY and GO, so one can compile FSMs efficiently.
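A toy illustration (standard CL): a state machine recognizing strings of the form a+b+, with one TAGBODY label per state:

```lisp
;; Each TAGBODY label is a state; GO is the state transition.
(defun a+b+ (string)
  (let ((i 0) (n (length string)))
    (flet ((next () (when (< i n) (char string (prog1 i (incf i))))))
      (block fsm
        (tagbody
         start
           (case (next)
             (#\a (go in-a))
             (t   (return-from fsm nil)))
         in-a
           (case (next)
             (#\a (go in-a))
             (#\b (go in-b))
             (t   (return-from fsm nil)))
         in-b
           (case (next)
             (#\b   (go in-b))
             ((nil) (return-from fsm t))   ; end of input in accepting state
             (t     (return-from fsm nil))))))))

;; (a+b+ "aabbb") => T, (a+b+ "ba") => NIL
```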
Common Lisp has an object system whose semantics are described using that same object system, so users can extend the object model according to their needs.
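For instance (standard CLOS): classes are themselves objects, instances of STANDARD-CLASS, which is the hook the metaobject protocol uses for extension (the class WIDGET here is just illustrative):

```lisp
(defclass widget () ((name :initarg :name)))

;; The class WIDGET is itself an object, an instance of STANDARD-CLASS...
(class-name (class-of (find-class 'widget)))          ; => STANDARD-CLASS
;; ...and STANDARD-CLASS is an instance of itself.
(class-name (class-of (find-class 'standard-class)))  ; => STANDARD-CLASS
```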
Clojure is designed to support one style of programming envisioned by its hypemaster. Rich Hickey is a brilliant person and has done an excellent job designing Clojure, but Clojure's approach to 'user errors' is: we don't guarantee to signal an error; we may do whatever.
By momentum you mean it is trending. We know mainstream programming is a pop culture, but we shouldn't encourage it by recommending one language over another based on a fad.
To further underscore this point: Common Lisp was created in 1984, incorporating elements of proven late-1970s Lisps like MacLisp; CL has been successfully used on spacecraft and other important complex systems, and the CL ecosystem continues to evolve and grow to this day. The specification has been unchanged since the '80s and probably won't change any time soon, since the language is extensible and can thus evolve as the state of the art evolves.
CL is largely free from any kind of fad; everything that is built into the language is there either because (a) it has been proven necessary and proven to work correctly in the best way possible, or (b) it was needed for backwards compatibility with earlier Lisps.
Clojure is a new language and tries to be nice to newcomers, but it's difficult to compete against technology with at least 33 years of continuous evolution. Also, if you need to interoperate with Java classes from CL, there's ABCL, which will do it nicely.
As a lisper said, "those who don't know Common Lisp are doomed to reinvent it."
Are you sure about that? RPLACA, RPLACD, NCONC, etc are present in the Lisp 1 Programmer's Manual dated March 1st, 1960: http://bitsavers.trailing-edge.com/pdf/mit/rle_lisp/LISP_I_P...
Or the more accessible explanation by Paul Graham
There are only a few primitives and no such things as RPLACA, RPLACD, and NCONC.
> Or the more accessible explanation by Paul Graham http://www.paulgraham.com/rootsoflisp.html
> There are only a few primitives and no such things as RPLACA, RPLACD, and NCONC.
That paper is from a month after the manual I posted. It's pretty open about not covering the entirety of Lisp, too, for instance in the section "The LISP Programming System" it says: In addition to the facilities for describing S-functions, there are facilities for using S-functions in programs written as sequences of statements along the lines of FORTRAN (4) or ALGOL (5). These features will not be described in this article.
No one's denying that Lisp's early history was wrapped up in early research on functional programming, but I haven't seen any evidence to support the claim that any programming language, or anything called Lisp, from that time period was purely functional.
As for McCarthy's original conception of Lisp, in 1979 he wrote: LISP also allows sequential programs written with assignment statements and go tos. Compared to the mathematically elegant recursive function definition features, the "program feature" looks like a hasty afterthought. This is not quite correct; the idea of having sequential programs in LISP antedates that of having recursive function definition.
I'm not the only one who has this view.
From Paul Graham, http://www.paulgraham.com/rootsoflisp.html
"There were already models of computation, of course - most notably the Turing Machine. But Turing Machine programs are not very edifying to read. If you want a language for describing algorithms, you might want something more abstract, and that was one of McCarthy's aims in defining Lisp.
The language he defined in 1960 was missing a lot. It has no side-effects, no sequential execution (which is useful only with side effects anyway), no practical numbers, and dynamic scope. But these limitations can be remedied with surprisingly little additional code.
If you understand McCarthy's eval you understand more than just a stage in the history of languages. These ideas are still the semantic core of Lisp today. So studying McCarthy's original paper shows us, in a sense, what Lisp really is. It's not something that McCarthy designed so much as something he discovered. It's not intrinsically a language for AI or for rapid prototyping, or any other task at that level. It's what you get (or one thing you get) when you try to axiomatize computation."
Since the word LISP is not mathematically defined and has different meanings in different contexts, there are different views of what LISP is and I understand your confusion.
The language LISP I in 1960 had numbers, side-effects and sequential execution.
Lisp I programmer's manual from 1st March 1960.
It had PROG -> sequential execution. chapter 4.5
It had floating point numbers -> chapter 4.4.
It had side-effects -> SET, SETQ. chapter 4.5
Yes, true. But in this context Paul Graham was talking about the language defined in the 1960 paper below and not LISP I.
"Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I"
In the paper McCarthy mentions also:
'6. A "program feature" allows programs containing assignment and go to statements in the style of ALGOL.'
'7. Computation with floating point numbers is possible in the system'
Basically the paper is more a collection of ideas around various ways to describe parts of LISP. The actual LISP language is described somewhere else. See the LISP I Programmer's Manual linked earlier, which gives a complete overview of the state of the LISP language in 1960.
This part is about the status of LISP I and not what Graham is referring to. Did you look at the article by Graham?
He is talking about the core of LISP that he found in the 1960 paper that is pure.
>Basically the paper is more a collection of ideas around various ways to describe parts of LISP
There is clearly a gem in the paper that is viewed as a pure LISP.
What Graham's paper describes is a back-formation abstracted from what they got running on that IBM 704.
I doubt that the theory had quote, either, which is the gateway to "code is data". Since code-is-data is the core of Lisp, a theory without code-is-data (like the vapid nonsense that is Lambda Calculus) isn't about Lisp.
Since the 1960 paper came after the LISP I manual, and Graham's article is based on that paper, the actual wordings are clearly influenced by LISP I.
>What Graham's Paper is describing is a back formation abstracted from what they got running on that IBM 704.
We viewed it as the core of LISP.
>Since code-is-data is the core of Lisp
Yes, another important part of LISP.
Since LISP I is the real first implementation, I don't mean "the original LISP" in the sense of the first LISP, but rather "the core of LISP defined in the original paper". That is what I called the original LISP. I have explained this view many times already.
>McCarthy's original idea of Lisp was not purely functional
At least McCarthy had some pureness in mind for LISP, and it is not a black-or-white argument.
>the original Lisp implementation was not purely functional.
> but what you and Paul Graham seem to be doing is saying that that pure Lisp is the real Lisp and what was called Lisp before that (in theory and implementation)
This is not my intention.
In his 1958 paper describing what would become Lisp, literally the first example of the use of car he gives is destructively modifying a cons, i.e., RPLACA. In his own words, which I quoted earlier and you didn't respond to: "Compared to the mathematically elegant recursive function definition features, the "program feature" looks like a hasty afterthought. This is not quite correct; the idea of having sequential programs in LISP antedates that of having recursive function definition."
It seems black and white to me.
> > but what you and Paul Graham seem to be doing is saying that that pure Lisp is the real Lisp and what was called Lisp before that (in theory and implementation)
> This is not my intention.
Could you explain how your intention differs? McCarthy's original conception of Lisp was not purely functional (according to McCarthy himself as well as the historical evidence), and the original implementation was not purely functional, so how would the purely functional subset described in a later paper be "the original LISP" if you aren't denying the earlier work?
"One mathematical consideration that influenced LISP was to express programs as applicative expressions built up from variables and constants using functions. I considered it important to make these expressions obey the usual mathematical laws allowing replacement of expressions by expressions giving the same value. The motive was to allow proofs of properties of programs using ordinary mathematical methods. This is only possible to the extent that side-effects can be avoided. Unfortunately, side-effects are often a great convenience when computational efficiency is important, and "functions" with side-effects are present in LISP."
Seems neither black nor white to me. This paper was also written by McCarthy. He had pureness in mind, but the real implementation had side effects for practical reasons.
>McCarthy's original conception of Lisp was not purely functional (according to McCarthy himself as well as the historical evidence)
Again, not black or white (according to McCarthy himself). The pure language is written up in the original LISP paper; Graham and I view it as the core, and it also shows the pureness of LISP.
>so how would the purely functional subset described in a later paper be "the original LISP" if you aren't denying the earlier work?
Again the definition. I have already explained what I meant with "the original LISP".
> Again the definition. I have already explained what I meant with "the original LISP".
I didn't ask what you meant by it, I asked how using that definition is different from claiming that McCarthy's earlier work on Lisp wasn't Lisp. In the same paper you just quoted, McCarthy says that Lisp was planned to have sequential programs before it was planned to have recursive functions; the part you quoted does not say that destructive functions were added later or that he ever planned not to have them. In the 1958 paper, he was already describing a language with mutable objects.
I didn't say this. I have already explained the difference between the first LISP implementation and the core language written in the 1960 paper. There is no one true definition of LISP, and I have already talked about this. People have different views of what LISP is.
You said it was black and white, and that is not true, per the paper I have shown.
>In the 1958 paper, he was already describing a language with mutable objects.
I am not talking about the 1958 paper. Nor am I talking about which idea came first; it is not written clearly in the papers shown so far which idea came first, but I have shown that it is not black or white. Graham's claim and mine are based on the so-called "original LISP paper" of 1960, and the pure version of LISP is written in it.
You and I have different views of LISP, and that is OK. But please don't force your view on us, since there is no one true definition or interpretation of LISP.
> LISP is understood as the model of a functional programming language today. There are people who believe that there once was a clean "pure" language design in the functional direction which was compromised by AI programmers in search of efficiency. This view does not take into account that, around the end of the fifties, nobody, including McCarthy himself, seriously based his programming on the concept of mathematical function. It is quite certain that McCarthy for a long time associated programming with the design of stepwise executed "algorithms".
> On the other side, it was McCarthy who, as the first, seemed to have developed the idea of using functional terms (in the form of "function calls" or "subroutine calls") for every partial step of a program. This idea emerged more as a stylistic decision, proved to be sound and became the basis for a proper way of programming - functional programming (or, as I prefer to call it, function-oriented programming). We should mention here that McCarthy at the same time conceived the idea of logic-oriented programming, that is, the idea of using logical formulae to express goals that a program should try to establish and of using the prover as programming language interpreter. To come back to functional programming, it is an important fact that McCarthy as mathematician was familiar with some formal mathematical languages but did not have a deep, intimate understanding of all their details. McCarthy himself has stressed this fact (23). His aim was to use the mathematical formalisms as languages and not as calculi. This is the root of the historical fact that he never took the Lambda-Calculus conversion rules as a sound basis for LISP implementation. We have to bear this in mind if we follow now the sequence of events that led to LISP. It is due to McCarthy's work that functional programming is a usable way of programming today. The main practice of this programming style, done with LISP, still shows his personal mark.
> I didn't say this.
Again, I was asking how what you did say was different from it. Your answer seems to just be that your definition of Lisp means the subset described in the 1960 paper, and doesn't include the original implementation or McCarthy's earlier writing on the subject.
> You said black and white and it is not true from the paper I have shown.
The paper you quoted said that Lisp was originally conceived as an impure language (the part I quoted) and that that was unfortunate, but true (the part you quoted).
> I have already explained about the difference between the first LISP implementation and the core language written in the 1960 paper. [...] I am not talking about the 1958 paper.
So you meant to say that the 1960 paper was the first description of a pure language. Calling the 1960 paper's subset of Lisp "the original LISP" and disregarding the already-existing implemented Lisp system which the paper acknowledges, as well as McCarthy's previous writing on the subject, is misleading at best.
> I am not also talking about which idea is first or not.
Then we have different definitions of "original," too.
We viewed it as the core of LISP and this is just one view of LISP.
>The actual LISP wasn't pure. Not before this particular paper and not after.
LISP I is not pure and I agree.
The white paper doesn't reproduce the entire reference manual; what are the odds?
The Lisp 1 Programmer's Manual is dated 1960 and ... bears McCarthy's name.
The paper you cited has a section titled The LISP Programming System which says:
"The LISP programming system is a system for using the IBM 704 computer to compute with symbolic information in the form of S-expressions."
The Programmer's Manual's Preface begins:
"LISP 1 is a
Looks to me like they are about exactly the same thing: an implementation for a specific computer (the IBM 704), described in the same year by the same author, with a nearly identical sentence in the intro.
Lisp was always as much an idea as a real programming language, and both were developed together. Lisp was developed as a practical tool for AI research, especially McCarthy's own AI research.
"One mathematical consideration that influenced LISP was to express programs as applicative expressions built up from variables and constants using functions. I considered it important to make these expressions obey the usual mathematical laws allowing replacement of expressions by expressions giving the same value. The motive was to allow proofs of properties of programs using ordinary mathematical methods. This is only possible to the extent that side-effects can be avoided. Unfortunately, side-effects are often a great convenience when computational efficiency is important, and “functions” with side-effects are present in LISP."
Reading the above: in his original idea he clearly had mathematical pureness in mind for LISP, but for practical reasons the real implementation became one with side effects.
DrRacket looks a little dated in its vanilla configuration, but the layout of the editor / REPL parts can be changed around a bit, the color schemes swapped out. There's even a paredit mode for DrRacket. Overall, if I were somehow coding more exclusively in Racket, I could easily see using and extending DrRacket rather than my current emacs.
Common Lisp may be old, but it is extremely powerful relative to Clojure, IMO. Clojure startup times are awful, it takes a very opinionated approach to programming (immutable data), its debugger is buggy, and it produces unsightly error messages.
Sure, these problems may be fixed, but for me, I want a powerful tool that I can use now.
I've tried several of the hot new languages of the time (Rust, Dart, Clojure, Go).
I have come to a point in my career where I want to stick to highly complex but slightly dusty artifacts defined by a standard (C++ & ANSI Common Lisp).
The SBCL implementation of Common Lisp does not give me all of the headaches of Clojure. I also don't have to restart the REPL to load libs.
On using another dynamic language that isn't a Lisp... forget about it.
...Python, maybe, for machine learning, but I will probably just wrap Caffe in CL or something.
PS. I am 25. Seems relevant to this discussion.
This made me LOL. I figured you were 50 or older based on the top of your post. You are wise to pick languages that are proven and "just work", though. I would say Go, while young, is incredibly stable and predictable.
Sadly, the only Lisp environments that resemble in any way the old Lisp Machine or Xerox PARC ones are commercial.
Emacs is kind of OK with SLIME, but it still isn't the same as using Allegro Common Lisp or LispWorks.
 - https://franz.com/products/allegro-common-lisp/
 - http://www.lispworks.com/
Most Common Lisp IDEs had a collection of these tools: Editor, Listener, Inspector, Debugger, Stepper, Preferences, ... Additionally there were various browsers, an interface builder, profiler, ...
The Lisp Machine IDEs had most of these tools, too. But they weren't only styled for development, but also for general use. Thus a Lisp Listener was not only a REPL, but also a general command line interface to interact with the machine. They were also more oriented towards group work and had additional features like versioned file systems, software versioning systems, databases, support for multiple programming languages (like C, Pascal, Ada, Fortran, Prolog, ...), source repositories, ...
Over the years (80s onwards) a multitude of these IDEs on Workstations and PCs were developed. CMUCL comes with one for X11 (debugger, editor, ...), Macintosh Common Lisp, ExperLisp, Procyon Common Lisp had one for Macs and Windows, Golden Common Lisp had one for Windows, Corman Lisp for Windows, SPE from SUN, Lucid CL had one (called XLT, their editor was called Helix), Star Sapphire Common LISP had one, Clozure CL has one for macOS (which one can run on a current Mac), ...
Then there were always IDEs based on GNU Emacs or XEmacs. Ilisp was once popular, Allegro CL has ELI, SLIME currently is the most popular with Sly as a fork. This is not so bad, since there are decent versions of GNU Emacs for Windows, Unix/GTk+ and macOS.
Of the commercial ones only Allegro CL (IDE for Windows and Unix/GTk+) and LispWorks (IDE Windows, Unix/Gtk+, deprecated for Unix/Motif, macOS) survived with regular releases.
Of the non-commercial ones SLIME/GNU Emacs is the most popular and it is also used by some users of Allegro CL and LispWorks. McCLIM lingers in the background with a collection of tools which would make an IDE, but it lacks one or more better GUI backends.
There are also a bunch of other ways to use Common Lisp with other editors (vim, atom, ...).
Now 'darkmatter' is another one. Let's see what people think after using it...
I got mine from endless hours reading the manuals, books and papers from those days, so I sometimes miss a few details.
Worth noting that McCLIM is again under active open-source development, so there are hopes for a decent GUI backend here.
The IDE screenshots for Windows show off Windows XP.
Maybe they could improve their marketing message, but I doubt most wannabe Common Lisp developers are willing to pay for comercial tools, with enterprise prices, even if they had the best marketing of all times.
The IDE is based on their native GUI backend. Thus it picks up much of the looks of the current OS automatically (windows, buttons, dialogs, ...).
Though I'll propose it to them for the next LispWorks release...
Missing 5 Windows releases is just sloppy: Vista, 7, 8, 8.1, 10.
I'm not saying it's a bad product, just that first impressions matter. There's a reason this site is full of discussions about conversions and funnels.
As a side note, Windows is also 30+ years old. Hardly new tech these days ;)
I can bet that LispWorks' customers (big companies and experienced Lisp users) are largely removed from the mainstream desktop world and wouldn't care too much about how the current Windows UI looks.
It has been left unaddressed because no one thinks it is enough of a problem. That is how free software works.
Tl;dr: web sites can send requests to localhost TCP sockets despite origin restrictions using a trick called DNS rebinding.
Many hobbyists and professionals use Lisp to solve their problems and make money.
IDEs are in a sorry state. If you're an Emacs user, you'll be happy. Otherwise, unless you spend hundreds of dollars for a professional IDE, you won't have much. Editing Lisp requires a good IDE.
Is there anywhere one can read about such cases? It would be interesting to read both about how well they are doing and about what sort of apps they use Lisp to make.
Also, aren't there any other OSS or free Lisp IDEs outside of Emacs? Or are they all crap?
Really, for a good experience you just need something with Swank support and the ability to highlight matching parens. Eclipse would even work, though I haven't tried any of the plugins for it. IDE features like introspecting values, breakpoints, tracing, profiling, stepping, and recompiling are all part of the Common Lisp standard, so you'll get them for free just by running the Lisp REPL in a terminal. What you use on top of that really just makes it more convenient to interact with the Lisp environment, versus typing everything in by hand. I like slimv with its auto-paren-infer stuff turned off (matching and rainbow parens are fine for me), but I've also used a tool that uses two screen sessions to send data back and forth without using Swank (or even Lisp; I use it sometimes for Python, but that's trickier because I can't just send a class to (re)define if I have newlines between methods, and I can't casually change my namespace).
And [Portacle](https://shinmera.github.io/portacle/) is an Emacs shipped with SBCL, Slime and Git in a portable and multiplatform way, so it's straightforward to begin with.
Just grab the latest release from https://github.com/slime/slime/releases
Install atom-slime and sbcl (something like brew install sbcl on osx)
Point atom to the directory you extracted slime to and then hit packages > slime > start
The Common Lisp Cookbook
Not so much about CL vs. functional languages, but it gives a sense of the current scuttlebutt about CL.
Is Lisp a good choice (for a functional language)?
If you want to be free to do whatever you need to do, to apply the paradigm you need at every moment -- be it procedural, imperative, logic, aspect-oriented, object-oriented, functional... then Common Lisp is for you. If you need the full power because you know what you are doing, then Common Lisp is for you.
Status of (Common) Lisp in 2017
- Free books and resources are available, for example "Practical Common Lisp"
- Package ("system") installation is damn easy with Quicklisp. Download a package from the web, compile it, and load it... with only one command: (ql:quickload :package-name)
- There are libraries out there for anything you need: REST services, web frameworks (many), JSON, machine learning, math, anything really.
- There are many ANSI-compliant CL implementations out there, most of them free: SBCL, CLISP, Clozure CL, SICL, ABCL, LispWorks (commercial), Allegro CL (commercial)
- Interoperability with C is easy with CFFI (there are other ways as well)
etc etc. Lisp in 2017 is better than ever!
- Interoperability with Java is easy with the ABCL implementation.
- IMO the documentation state is not great, but it's being worked on (it's said to be non-existent for Haskell :p). CL has many books to learn from (Common Lisp Recipes from 2015 being the most interesting to me: http://weitz.de/cl-recipes/), but the online documentation is not brilliant. Often the web page isn't attractive at all, not every project provides good documentation, and the "official" websites are bad (CLiki, others). It's being fixed by https://github.com/CodyReichert/awesome-cl, http://lisp-lang.org/, https://lispcookbook.github.io/cl-cookbook/, http://quickdocs.org/ and http://phoe.tymoon.eu/clus/doku.php
- One can feel the language is dated. It has, IMO, too many weird semantics: no plain map and filter, but a MAP that takes the result type as an extra argument, or MAPCAR and friends, REMOVE-IF[-NOT], etc. Creating hash tables is too verbose and not consistent, etc. These and a lot more are fixed in CL21 (http://cl21.org/; its wiki: https://github.com/cl21/cl21/wiki).
CL21 also offers more functional-oriented stuff like shorter lambdas, lazy sequences, generic functions, string interpolation, regexp literals, and more. It's a bit under-documented, but its wiki is OK.
- Web dev in CL is another topic. I suggest https://stackoverflow.com/a/42838145/1506338, but what's doable now is very elegant.
- CL has a big ecosystem; we can use Emacs/Vim/Atom, and Portacle is portable and thus straightforward to use (https://shinmera.github.io/portacle/),…
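To illustrate the naming complaint above, here are the standard CL spellings of map/filter:

```lisp
;; MAPCAR maps over lists; MAP needs the result type as its first argument;
;; REMOVE-IF-NOT plays the role of filter.
(mapcar #'1+ '(1 2 3))                  ; => (2 3 4)
(map 'vector #'1+ '(1 2 3))             ; => #(2 3 4)
(remove-if-not #'evenp '(1 2 3 4 5 6))  ; => (2 4 6)
```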
The fact that it runs on the JVM and makes it easy to interop with existing Java code makes it practical to integrate in many organizations already using this stack.
Can anyone comment on the benefits of this vs. integrating a Lisp kernel into a Jupyter notebook?
Personally I've been wanting to do the ProbMod course, which uses a Lisp dialect called Church, but have been wishing for a notebook environment to complete the course in, so this looks very interesting!
I really like the notebook format but I've yet to come across a browser window so good that I'm happy to give up an actual editor program for it.
You can reference data in a table elsewhere in your document.
You can access the "session" of a code block in a repl. Useful for putting credentials or other sensitive data in.
You can copy from a result or a code block with the same ease.
Any tags you have set up will jump as expected. Same for completions and documentation.
Arguably, the way images are referenced externally is better for post processing.
Finally, being able to source-control the document easily is huge. JSON is a terrible format. :(
I'm convinced: you just pushed me over the edge to finally start learning org-mode.
And the examples sound lovely.
But I do get a stack trace: missing "libev.so", which is provided by libev-dev.
My TXR Lisp actually has that function. :)
Oops, I mean accessor.