Disclaimer #1: I've been working professionally as a Common Lisp programmer---not as a contractor!---for the past decade. I have a vested interest in the language and hiring for it.
Disclaimer #2: I am going to ignore commercial implementations of Lisp here, which provide very useful and advanced features, like GUI development, a user-friendly IDE, paid support, etc. [1,2]
So let's get started. Common Lisp's best feature is that it allows you to be insanely productive at the "raw programmer" level. You can write, edit, and debug code very quickly and incrementally, and end up with safe & performant code.
There's a price to pay: currently the best-in-class experience is still Emacs and SLIME (which come nicely packaged here). As an Emacs fan, that's the best news, but to my fellow PyCharm/VSCode/vim users, it's terrible and alienating news. My colleagues who aren't Emacs users managed to learn just enough Emacs to be productive in a week, but they still frequently fired up their editor of choice in times of need.
It really is worth underscoring that the Emacs+SLIME experience truly fits Common Lisp development like a glove, and the experience, in my opinion, is better than almost every mainstream editor environment out there.
Common Lisp's worst feature is that it feels like just about everything imaginable has a catch. I don't mean "there's no free lunch", I mean that things just plainly don't feel cohesive or "100%" most of the time. To name a few examples:
1. GUIs: If you want to make a traditional, native GUI using open source solutions, you're stuck with really goofy libraries that are non-obvious to get working. As the article points out, you have options. Lisp actually has a renowned framework called CLIM, but I consider the open-source implementation McCLIM currently useful principally to hobbyists and hackers.
2. Deploying applications: Almost every implementation of Lisp has some way to create an executable. But very important aspects that real people care about in production are missing, inconsistent, or poorly documented. For example, almost no open source implementations of Lisp have first-class support for signing binaries on macOS. Almost no open source implementations have a "tree shaker" to remove unnecessary cruft from the executable. Almost no open source implementations make building a shared library practical.
3. Libraries: Many libraries don't do even the usual things people might want to do. The linear algebra library MAGICL, for example, doesn't at the time of writing have a way to solve the matrix equation Ax=B. This isn't due to laziness of the authors or lack of foresight, but rather that the library just isn't used by enough people to see regular, high-quality contributions as an open-source project. I'm sure MAGICL solves problems for the authors, but the authors haven't taken it upon themselves to make a general, useful, and quasi-complete library for matrix programming in Lisp.
These examples are just examples, maybe not even the top examples.
There are many things I wish for Common Lisp, but there are two I think I wish most.
First, I wish Common Lisp implementations put in the work to play nice with other programming languages. Google recently came out with support for protobufs in Lisp, which is nice, but I feel something deeper is needed. I think Common Lisp implementations supporting building C ABI-compatible shared libraries would be an insanely big step forward; it'd mean that Lisp could feasibly be used from every language out there. Right now, the closest we've got is Embeddable Common Lisp, an implementation of Lisp which makes embedding Lisp within C relatively painless, but as usual, it has many catches.
The way I've coped is to produce stand-alone command-line applications, or to build servers with HTTP APIs. But it feels icky, especially if you're working with Python programmers who want to `import` stuff and not run servers just to get some code to work.
Second, another thing that I constantly hope for in the Lisp world is for more "hyper productive" programmers to join it, or programmers whose livelihood depends on it. Of course, since Lisp is used by hobbyists, you see tons of hobbyist code. To be sure, a lot of this hobbyist code is perfectly fine. Usually it works, but it's just a tad incomplete. However, in my opinion, the worst thing about hobbyist code is that it usually doesn't do something useful.
What does "useful" even mean? I won't claim to be able to define this term in a one-size-fits-all fashion, but "useful" to me is about getting practical computing work done. The further away from being concrete the library is, typically the less useful it is. For example, a typical Lisp programmer will have a penchant for writing a domain-specific language for parsing binary files (cool!), will open-source that code (cool!), but then nobody, including the author of said library, will actually use it to, say, write a parser for GIFs. When somebody does come along to write a GIF parser, they're likely not going to use this general binary parsing framework, but hand-roll their own thing.
In Lisp, it seems popular to solve meta-problems instead of problems, which is partly due to the fact that Lisp lets you think about problems at very high levels of abstraction using its advanced object system, the meta-object protocol, and macros.
(One of my biggest "pet peeve" projects in Lisp, second only to "utility libraries", is documentation-generator libraries. As soon as somebody figures out that documentation strings can actually be programmatically queried in Lisp, they invariably write a baroque "generator" that spits out HTML. I've never, not a single time, ever, used a documentation generator for real, paid work. The one Lisp programmer I know who uses one nicely is Nicolas Hafner, aka Shinmera, who uses a documentation generator simply to augment his long-form documentation writing. Staple is one example library of his, where you can see some generated documentation at the bottom.)
"Useful" also has to do with how a library is consumed. In the Common Lisp world, a library like this is typical. It's a bare page (be it on GitHub or otherwise) that provides no examples, no indication of dependencies, etc. Not all libraries are like this, but you run into it frequently enough.
The Common Lisp ecosystem lacks a certain "go-getter" philosophy, needed to forge through "boring" work, that some other language ecosystems seem to have. To cherry-pick one example, though I don't use it, Zig comes out with interesting stuff all the time that's genuinely useful. Andrew Kelley, its main developer, is putting tons of hours into getting details around deployment right (e.g., cross-compilation). Little about Common Lisp prevents a motivated person from making equally productivity-enhancing strides with the language, but I find that either (a) the interest isn't there or (b) the interest is there, but it's interest in developing weird, esoteric stuff in Lisp.
(My favorite example of a "productive stride" that happened in Lisp is the following. For context, people talk all the time about how difficult it would be to port a Lisp compiler to a new architecture. I myself have clamored for documentation on how to do it with SBCL. But, out of nowhere, a grad student named Charles Zhang came out with a port of SBCL to RISC-V. Not only did he port it, he's maintained it with hundreds of new commits, making it more performant and less buggy.)
Common Lisp is an amazing language purely from a practical point-of-view. As I said, to me, it's bar-none the best and most productive language to use if you want to "sit down and write code". The implementations of Lisp, like SBCL, are marvels. Lisp code, once you write it, will work forever (seriously, decades). The #lisp channel on Freenode is nice and helpful, and there are so many amazing people in the community. In Lisp, it's seamless to inspect assembly code and work with the world's most high-level, meta-object systems all at the same time. But the ecosystem mouthfeel is still off, and Common Lisp would greatly benefit from programmers obsessed with making the language more useful to themselves and others today.
LispWorks: http://www.lispworks.com/
Allegro CL: https://franz.com/products/allegro-common-lisp/
Portacle: https://portacle.github.io/
McCLIM: https://common-lisp.net/project/mcclim/
There is a GIF parser, though, called SKIPPY! https://www.xach.com/lisp/skippy/
MIDI: http://www.doc.gold.ac.uk/isms/lisp/midi/
Zig: https://ziglang.org/
MAGICL: https://github.com/rigetti/magicl
Staple: https://shinmera.github.io/staple/
Charles Zhang's SBCL commits: https://github.com/sbcl/sbcl/commits?author=karlosz
CL-PROTOBUFS: https://github.com/qitab/cl-protobufs
On ECL's catches: poorly documented, performance isn't very good, it's maintained by essentially one person, the rituals needed to use ECL-built libraries are more extensive than necessary, build times are insanely slow, ...
> Doug Katzman talked about his work at Google getting SBCL to work with Unix better. For those of you who don’t know, he’s done a lot of work on SBCL over the past couple of years, not only adding a lot of new features to the GC and making it play better with applications which have alien parts to them, but also has done a tremendous amount of cleanup on the internals and has helped SBCL become even more Sanely Bootstrappable. That’s a topic for another time, and I hope Doug or Christophe will have the time to write up about the recent improvements to the process, since it really is quite interesting.
> Anyway, what Doug talked about was his work on making SBCL more amenable to external debugging tools, such as gdb and external profilers. It seems like they interface with aliens a lot from Lisp at Google, so it’s nice to have backtraces from alien tools understand Lisp. It turns out a lot of prerequisite work was needed to make SBCL play nice like this, including implementing a non-moving GC runtime, so that Lisp objects and especially Lisp code (which are normally dynamic space objects and move around just like everything else) can’t evade the aliens and will always have known locations.
Also, ASDF's main author spent his career at Google.
> the best-in-class experience is still Emacs and SLIME
I really want to mention again that Atom's SLIMA is getting very good. The Sublime and VSCode plugins are coming close.
I attended the Vienna conference where he discussed such matters, and he has great ideas, but many of his play-nice-with-UNIX tools are still very bleeding edge and require you to be SBCL-developer-level in-the-know to use them.
I hope later this year that I'll be able to open up a full-time role for SBCL development as a part of my own job, in a similar way that Google employs dougk to work on SBCL. The only trouble will be finding a qualified applicant who can do the full-time work...
You've put into words something I've struggled with saying for a while, and the exact reason I could never really enjoy working with Common Lisp.
The example I always return to is the lack of pervasive use of CLOS throughout the standard. Something like generic-cl feels like it should be a baseline part of the language. As it is, it's a great library. But people are hesitant to even use it, because the result of making everything generic is a loss in performance!
Of course, the generic-cl documentation talks about optimizations they've done. I don't know how it compares to using the methods in the standard as opposed to the library. Regardless, these libraries only help when writing code, not reading it. Though don't get me wrong, that's a big deal! It's very helpful! But you wish everyone would play by those rules.
But the language is essentially frozen in the existing standard, and can only be extended by community consensus right now.
Is there a CLOS implementation with inline caches for generic method dispatch?
Even if you know all the types, you still have polymorphism that can’t be efficiently monomorphized without an exponential space penalty.
> you still have polymorphism that can’t be efficiently monomorphized without an exponential space penalty.
Wouldn't it be linear at each call site in the number of types that actually reach that call site? Plus you don't have to cache _all_ types at the call site, only the most frequent ones.
In Java and JS for instance, dynamic dispatch is JITed away at hot spots, and can be un-jitted later to reclaim space if the spot is no longer hot.
I generally don't care if a dynamic dispatch is slow if I'm only calling it a few times, but in a tight loop I want it to be inlined as much as possible.
FWIW, in my computing career there have been two crucial pivot points. Exposure to Unix (SunOS/HP-UX and later Linux), programming on it, and its powerful mode of working was the first epiphany. The second was Common Lisp: the degree of freedom it enables in expressing code and ideas, as well as its speed and its interactivity model (REPL, SLY, Emacs, etc.). I have become a much better programmer because of my exposure to this mode of working.
When I was first starting, though, my frustration with the language was large, and it was compounded by learning Emacs (which I love and curse at on a daily basis). Libraries seemed poorly and very inadequately documented for someone new to the language, especially compared to Python, for example. But the newish culture of hosting open source projects on GitHub is improving things, and READMEs/project descriptions are getting better. And doc strings are in the code themselves, mostly...
Common Lisp gets criticized for not changing. Sometimes it can be a bit verbose (setf (gethash mykey my-hash-table) my-value). A lot of programmers will instantly gravitate to "that's a lot of words to set a value in a hash table". But it turns out not to be a problem in real life, because you learn to read it pretty damn quick, and you save keystrokes elsewhere (for example using loop or CLOS). So it balances out. However, I have come to learn that having a rock underneath that has no problem running years-old code is a HUGE benefit. Things generally continue to work with updated compilers, and that reduces the cost of maintaining the code base. I trust the language to stay solid underneath. You can extend it, but you generally don't need to use reader macros and other features to solve most problems. So while the world is changing around you, you aren't updating your python2 to python3. Instead you can stay focused on app-level changes.
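For anyone who hasn't written CL, here is that idiom in context (a hedged sketch; the variable names are made up). The point is that the same setf-on-a-place pattern reads and writes every kind of container uniformly:

```lisp
;; A hash table keyed by string; *TABLE* and *VEC* are made-up names.
(defvar *table* (make-hash-table :test #'equal))

;; Writing is "a lot of words", but uniform:
(setf (gethash "answer" *table*) 42)

;; Reading returns two values: the value and a found-p flag.
(gethash "answer" *table*)   ; => 42, T

;; The same SETF-place pattern works on arrays, slots, etc.:
(defvar *vec* (vector 1 2 3))
(setf (aref *vec* 0) 99)
```

Once (setf (place ...) value) clicks, gethash, aref, slot-value, and user-defined accessors all read the same way.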
And completely agreed on stability. I love that all of the code samples from all of my lisp books still work. Seems that number approaches zero for all the other languages. :(
The result of working in Lisp is nice code, and that code continues to be nice as the months and years go by, long after the interactive session in which that code had been developed has been forgotten.
I glanced at CLIM, and it's very much still traditional verbose, imperative GUI code. Someone should really take a look at doing a Lisp version of the SwiftUI concept. It seems to be made for s-expressions.
I mean, for instance, oh, Windows resource files.
Glancing through CLIM, I also don't see anything like that sort of separation between GUI logic and form.
SwiftUI provides a fundamentally different way of thinking about UI composition and managing data flow in your app. Much of the GUI logic simply disappears. Syntactically and functionally, it seems like something structured like it would be a great fit for a cross-platform Lisp UI framework.
It's never going to happen.
Lispers were saying this same tune back when I last was using Common Lisp in 2003.
The last standard for Common Lisp is ANSI Common Lisp that came out in 1994. With no new movement in modernizing the language, there is simply no point. It's dead, and the effort isn't worth it. Why modernize a crufty Lisp when there is Clojure? Or many other Lisp-ish languages. There is even R6RS/R7RS Scheme today.
> First, I wish Common Lisp implementations put in work so that they could play nice with other programming languages.
Same song, 20 years later. I remember painfully hacking together "alien" calls (CMUCL's foreign function interface) to use Gtk+. Yes: there really were no Common Lisp bindings for the most popular open source GUI toolkit in 2003, during the era that was probably Common Lisp's last great chance at making it out alive. CL was going through a resurgence period and you still couldn't get Lisp to talk to anything. My god, it sucked. And the only documentation was the HyperSpec.
If anyone is not aware of HyperSpec, I implore you to Google it now and see what the state of Common Lisp was around 2001-2005. I have nothing against Kent Pitman. But the HyperSpec was not doing Lisp any favors.
The other thing that burned Lisp was that everything was a secret society. The HyperSpec was free-ish. Sort of. But not really. It was not a community document. If you were just getting into Lisp, it was already recommended to get the proprietary Allegro CL or LispWorks. CLISP and CMUCL were the only open source game in town and they weren't receiving much love. Even SBCL was not going anywhere fast. A good segment of the old Common Lispers were against open source at the precise time they needed to embrace it. They were stuck in a proprietary world.
I could go on, but I'm going to end this rant here.
I suspect I'm not alone in being burned by Common Lisp.
I've never been burned by Common Lisp, and I don't know how you could be. I can see how canceling Python 2 could burn someone, but I don't see how an evergreen standard like Common Lisp's could. I'm not hurting from there not being a new standard. Functionality is de facto standardized just fine and we live with it just fine. Lisp is about the only language with a useful web-accessible standard. If you ask a Lisper how frequently they visit Stack Overflow or random tutorial websites, you'll find it's much less than others.
Lisp isn’t a lost cause. It has outlived just about every other thing out there. It just doesn’t outlive these other things with tons of fervent enthusiasm.
'Why modernize a crufty Lisp when there is Clojure?'
is a strange comment. Those two languages have more differences than they have common ground. "It's got parentheses, and all languages with parentheses are equal" is a poor basis for critique. Now, you also speak about Perl in the same way, so ... ?!
The CL folks have done exactly this.
A serious way to measure the strata layers is to write exit(2) portably without using external packages.
I'm not sure what you're getting at with exit(2). The CL standard does not include an exit function because it would be limiting; the meaning of "exit" differs across operating systems. But most implementations include a simple exit function. You can do what exit does more simply in CL than in C, using unwind-protect. You can use the FFI built into every implementation to write an exit that exactly duplicates what the C stdlib exit(3) and _exit(2) do. With a very small set of macros, it is portable to all CL implementations. You can even execute the processor's abort instruction if it has one, though of course not portably. But exit written in C is not actually portable either; it only looks that way thanks to a bunch of macros.
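To make the "very small set of macros" concrete, here is a hedged sketch of the usual shim, built from read-time feature conditionals. The per-implementation entry points shown are the ones I believe each implementation exposes (check your implementation's manual); in practice uiop:quit, shipped with ASDF, does essentially this for you:

```lisp
;; A portable-enough EXIT: each branch calls that implementation's
;; own exit function. The #+ reader conditionals select exactly one
;; branch at read time, so unknown symbols in other branches are
;; never even read.
(defun my-exit (&optional (code 0))
  #+sbcl  (sb-ext:exit :code code)
  #+ccl   (ccl:quit code)
  #+clisp (ext:exit code)
  #+ecl   (ext:quit code)
  #+abcl  (ext:quit :status code)
  #-(or sbcl ccl clisp ecl abcl)
  (error "MY-EXIT not implemented for this Lisp"))
```

This is also why "portable" C exit is in the same boat: it is one name in the standard backed by platform-specific machinery underneath.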
Threads cannot be implemented as a library
Of course there's also a chicken-and-egg element
Things written for public consumption from the outset are cleaner.
> The Common Lisp ecosystem lacks a certain "go-getter" philosophy, needed to forge through "boring" work, that some other language ecosystems seem to have.
"Boring" work that creates something actually useful almost always happens in the crucible of real, pressing problems. Hobby projects tend to focus on what's fun or interesting. People don't want to take the time to bring something the rest of the way into "useful" territory if they don't actually need to use it for some purpose.
With on-the-job code, whether Lisp or not... I'm more used to managers preventing us from writing anything more complete or documented, except for once every few months forcing everyone to write BS low-quality docs because "documentation" is a line item on the contract.
My pleas to set aside time to document and polish things are usually ignored, my pleas for a technical writer on staff to help us keep docs current and well written have never been listened to so far.
We're both speaking from limited sets of experiences, and neither is likely to be representative of the whole industry.
> There's a price to pay: currently the best-in-class experience is still Emacs and SLIME
Does viper-mode work well with this setup for vim users?
viper-mode is terrible for me, as it's a "vi like" layer rather than a true vim emulation; it hits a sort of "uncanny valley" where my fingers try to use vim keystrokes and then fail horribly.
The good news is evil-mode actually is a vim emulation layer and I can switch back-and-forth between evil-mode and vim with very few issues. There are only 3 things that ever trip me up, and none of them are major:
1. Find and replace is different (but IMO much better) in evil-mode, since it preserves case when doing case-insensitive matching (:s/foo/bar will change "Foo" to "Bar"). Since Emacs shows the substitution live, and I like this better, I haven't bothered to fix it.
2. C-a and C-x don't increment/decrement numbers in evil-mode like they do in vim. I mainly use this in macros; I suspect those keys would interfere with other Emacs keybindings, and sometimes the feature doesn't "do what I want" in vim anyway.
3. Yanking to the default register also yanks to the clipboard (+ register) and primary selection (* register). This behavior was changed after I adopted evil-mode, and I hate it when it matters, but it doesn't matter too often.
I'm sure there are obscure vim features that are corner cases similar to the above 3, but I still give evil-mode a firm "recommend" for vim users wanting to do lisp dev.
Can be hard with Lisp due to the long history and strong backwards compatibility. I've started using Screamer for the first time today and that was written around 30 years ago...
For context: I'm tracking all links posted on Hacker News and Reddit. With that data in place, you can find out the projects people are linking to and their alternatives: https://www.libhunt.com/l/common-lisp
Scheme - Great amount of good learning materials. Smaller language, gets to the core of writing in both a lisp and functional style quickly. Pure Scheme code should trivially work across implementations. If you ever deal with external libraries your choice of implementation becomes more critical.
Racket - You're using this already. Also a good language, and you can reuse the Scheme resources on top of it because it has language modes that are compatible with Scheme. Like with Clojure, you're dealing with a singular implementation so at least that choice is gone for you. Libraries should "just work".
Common Lisp - Like with Scheme, implementation choice does matter somewhat. Multiparadigm, so you're a bit freer to choose how to implement your program (you can adopt a more procedural, functional, or OO style to suit your needs or wants at the moment). It's less opinionated than Scheme and Clojure. Some good learning resources out there, some of the Scheme materials translate well (in that they work) but don't teach good CL style.
To answer your specific question:
I first learned Scheme helping college mates who were a year or two behind me as GT switched to Scheme as its first CS course language after I took it. I really learned the lisp family with Common Lisp in grad school from an AI course that was using Norvig's Paradigms of AI Programming, and I kept using it as a hobby language ever since.
Which I am using matters very little.
The destruction of the distinction between your code and the compiler, and the recursive relationship between the reader (and CL has reader macros, another dimension), macroexpander, and evaluator, is what is so mind-opening about lisp to me, and CL embodies that trifecta with the most ideological purity (imo).
However, all schemes include a lower level macro system that allows you to break hygiene when you want to, most of them by asking explicitly (explicit renaming macro systems being an exception).
r6rs, a standard that was disliked by some, standardises a lower level macro system called syntax-case that superficially looks the same as syntax-rules but gives you all the power of unhygienic macro systems (in guile for example, the old unhygienic defmacro is implemented using syntax-case. A very trivial thing to do, I might add).
So, to re-hash: syntax-case macros are procedural. They allow for the full power of scheme at expansion time, and allows you to introduce bindings unhygienically using datum->syntax.
With either hygienic or unhygienic macros this can create problems as in either case you may think you're using a particular defun when you're really using the other.
NB: My CL macros are written carefully so they'd be roughly equivalent to the hygienic macros of Scheme, but with more boilerplate. At least when it comes to most variables, but I don't take care to ensure the functions used in my macros are not replaced in the local context where they're used.
Heck, implementing an unhygienic defmacro over a regular lambda is something like 7 lines of syntax-case. In guile you could even use lambda* to get optional and keyword arguments.
Given

(defmacro my-not (x) `(not ,x))

and a call site like

(flet ((not (x) x))
  (my-not t))

How would you write a hygienic macro which would make use of the local redefinition of not without also having to pass the redefinition to the macro?
In syntax-case that would be (datum->syntax syntax-object-where-i-want-to-introduce-binding 'not)
(and in practice the error is "Lock on package COMMON-LISP violated when binding NOT as a local function", you can lock your own packages too if you want).
Also, you generally mind your own symbols, you don't rebind symbols from other packages (hygienic macros are more useful when you only have one namespace).
Edit: barring code-walking macro of course. Once you start using those, the bets are off.
Edit2: of course, I don't think overwriting core forms count either. That is poor bedside manners :)
(KMP was on both Common Lisp and Scheme standard committees; Erik Naggum was an eloquent, if controversial, Lisp expert)
1. are at least a little bit wrong.
2. Never use FLET or LABELS
I think we can rule out #2. Function capture is a legitimate problem that does (admittedly rarely) happen in real-world code.
The thing that makes writing hygienic macros possible in CL today is the package system. It's disallowed to rebind functions from the CL package, so if your macros only rely on non-exported symbols from your package and symbols in the CL package, then you are free from any hygiene problems that aren't caused by code that is clearly bad locally (that is code that binds internal symbols of other packages; any "::" in use should be immediately flagged by a code review.)
Not quite true: recursive macro invocations that create bindings can step on themselves and force you to use GENSYM, even if your criteria are satisfied.
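GENSYM's usual job, guarding against capture by macro-introduced bindings, looks like this (the textbook SWAP example, not from the thread):

```lisp
;; Broken: the macro's own binding TMP can capture a caller variable.
;; (let ((x 1) (tmp 2)) (bad-swap x tmp)) expands into a LET whose
;; TMP shadows the caller's, giving the wrong answer.
(defmacro bad-swap (a b)
  `(let ((tmp ,a))
     (setf ,a ,b
           ,b tmp)))

;; Fixed: GENSYM makes a fresh, uninterned symbol that no caller
;; code can possibly name, so no capture is possible.
(defmacro good-swap (a b)
  (let ((tmp (gensym "TMP")))
    `(let ((,tmp ,a))
       (setf ,a ,b
             ,b ,tmp))))
```

The recursive-invocation case is the same disease: each expansion needs its own fresh binding, which only GENSYM can guarantee.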
Erik says that gensym and packages solve the same issue that hygiene solves, something I said all along. His main gripe seems to be with the single namespace, which most people don't see as a problem. If you really really need to call your variables `list`, then by all means: use multiple namespaces.
The real problem, which Naggum includes, is the lack of GENSYM (if scheme indeed lacks it), and lack of first class symbols, as he mentioned.
Bingo. Lisp-2 and GENSYM are attempting to solve the same issue in two different use cases. And, if I may add, GENSYM looks a bit lazy and half-assed next to the scorch-the-earth multiple namespaces of Lisp-2. It's like they blew up Lisp with dynamite and then sat down and said "whatever" when the same problem appeared in macros.
> The real problem, which Naggum includes, is the lack of GENSYM
Guile Scheme has both GENSYM and DEFMACRO (unhygienic). I think quite a few Scheme systems err on the side of practical concerns.
But that doesn't erase the fact that GENSYM is an ugly hack to get macros to work.
Now this is… post-hoc: Lisp-2 is the original form of LISP and predates the concept of macros.
> And, if I may add, GENSYM looks a bit lazy and half-assed next to the scorch-the-earth multiple namespaces of Lisp-2.
I would agree. However in reality we have a system that is half-assed vs a system that nobody likes to use.
Though LISP macros are quite old, from around 1962...
His spirit lived (lives?) on for a long time, and it made starting common lisp a much shittier experience.
And from GP's own site (presumably, name is the same):
There are two ways to look at this. One is that they encourage better coding practices, the other is that they are restrictions that diminish the power of the language and make you jump through unnecessary hoops to get stuff done. Basically, “guard rails” vs “training wheels.”
I can’t say much more since Clojure is the only lisp I have played with (and I don’t see this changing, I quite like the language design). Code written in Clojure is not trivially portable to other lisps, and vice versa. I think there are differences in what you can do with macros too.
No value judgment here, I just think it’s important to know that these non-trivial differences exist when choosing which language to explore.
I think you will find that Clojure has the largest community and the widest use for commercial applications but, depending on your specific interests, either racket or common lisp could be a better fit.
In summary, I think we need more information about your goals.
For the (browser) front-end, there are lots of neat idiomatic-clojure libraries like https://reagent-project.github.io/
For command-line scripting, your program will ultimately mostly access the filesystem or other OS functionality via interop with the host platform (either Java or Node).
Otherwise, and especially if you want to use C library interop, I'd suggest Racket. Racket is also relatively close to Clojure in philosophy.
Thanks for the effort and the information you put together.
Generic-functions are, in general, more useful for some types of functional programming than the typical single-dispatch style used by most other languages.
Haskell and Common Lisp are at nearly opposite ends of the spectrum on many design decisions (Haskell is a lazy, curried, typed language with lots of syntax and custom infix operators; Common Lisp is a (usually) eager, mostly untyped language with little syntax and only prefix operators)
1: I say mostly because CL does have type declarations, but it's not defined how (or even if) the declarations are enforced in the CL standard. SBCL is arguably a typed implementation of CL, but even then, the fact that you can't in any useful manner describe a type that is "A list of items of type X" in CL puts it in strong contrast to Haskell's rich type system.
2: You can use a "satisfies" type, but satisfies type declarations are ignored at compile time, so even in SBCL they are largely just syntax-sugar for assertions.
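To make footnotes 1 and 2 concrete, here is a hedged sketch (the type and function names are invented for illustration):

```lisp
;; "A list of fixnums" can only be expressed via SATISFIES, which
;; names a runtime predicate; compilers, SBCL included, treat it as
;; an opaque check, not as static type information.
(defun all-fixnums-p (list)
  (and (listp list)
       (every (lambda (x) (typep x 'fixnum)) list)))

(deftype list-of-fixnums ()
  '(and list (satisfies all-fixnums-p)))

;; The declaration below is legal, and SBCL will use the FIXNUM
;; return type, but the SATISFIES part is at best a (possibly
;; elided) runtime assertion on the argument.
(declaim (ftype (function (list-of-fixnums) fixnum) sum-fixnums))
(defun sum-fixnums (xs)
  (reduce #'+ xs :initial-value 0))
```

Contrast this with Haskell, where `[Int] -> Int` is checked entirely at compile time; that gap is what the footnotes are pointing at.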
If immutability is good then it would seem it is best if you can enforce immutability at the language level.
I'm just wondering whether mutability is a downside of Common Lisp?
CLisp supports but, unlike Scheme, does not really encourage programming based on functional purity. Only the CLisp implementations that support full tail recursion are OK choices for pure functional programming.
It was also implemented originally with variables, functions, types, system-specific data types, numbers, symbols, and a bunch of other things, including itself.
> CLisp supports but
Note that CLISP is an implementation of Common Lisp. The language itself is abbreviated as CL, not CLISP. If one says CLISP, then one only means CLISP, the specific implementation. The name CLISP comes from the implementation being implemented using C.
Quicklisp has nothing to do with documentation. That’s a distribution mechanism.
This was after having it work on a previous MacBook. Then I changed laptops, tried installing it, and got stuck. That led me to give up on learning it at all.
I'd like to add better onboarding to the list of things to improve.
Python is starting to look like the Lisp of the second Winter.
Between 1988 and 1992 I worked for a UK company participating in a multinational project to use Common Lisp to build an expert system building tool. After creating the thing, we worked with clients (internal and external) to try to solve real customer projects. Our conclusions matched others, and contributed to the AI winter:
* the rule-based expert systems were extremely brittle in their reasoning
* once you got beyond toy problems with small rule sets, you needed some programming skills in addition to the domain expertise that you were supposedly encoding
* you sometimes spotted the actual algorithms that could be coded in a conventional language rather than rules + inference engine.
We eventually abandoned the AI side and kept going with the underlying Lisp, and started to get real leverage in the company, rapidly becoming a prototyping / risk reduction group who delivered usable functionality.
[Edit] We were using Lisp processors embedded in Macintosh hardware, with outstanding (for the time) IDEs and thanks to the Mac interface, we could create some really slick apps for the end users. One of our Lisp systems that got rave reviews internally was a drag-and-drop network modelling tool that replaced a frightening mass of Fortran and data entry spreadsheets. No AI/ML at all, but it really improved the throughput of our network modelling group. As we were a comms company, this got positive reaction from senior management, offsetting the non-return on investment in the rule system.
By the late '80s and early '90s you had venture-backed companies, big institutional efforts, grifters who had honed their pitch in academic tenure-track positions before moving to richer waters, and the first real claims being made about just-around-the-corner deliverables that would change everything. Maybe I am jaded from experiencing that same winter, but from what I recall the prior decades were more consumed with people making broad claims to establish intellectual primacy than with claims about what could be delivered.
There might be some kind of contraction when people realize they're not going to get HAL 9000, or that you can't use deep learning to replace every white-collar employee, but the results this time are much more "real".
I don't think we will get another AI winter where the entire world completely gives up on the research area.
Some of it was due to the non-delivery of hyped predictions, some of it to the sudden disappearance of the military funding that had helped spread the hype.
The 5th generation project was part of the hype but, at least in the West, a lesser part of the winter.
The original AI winter was, iirc, related to a UK report in the early 1970s (the Lighthill Report, 1973).
The second AI winter killed a lot of advancement in computing in general. It was compounded by new waves of programmers coming from very limited environments who had never been exposed to more advanced, capable techniques - by the time they got to play with them, the AI winter was in full swing and those techniques were dismissed as "toys".
While expert systems in their naive form were definitely not the answer to many generic problems, they are often part of the answer, and the number of times I hit cases of "this could be done as a rule system and be clearer", or even "why can't I depend on the features of 1980s XCON in 2020??? It would have saved me so much money!", is simply depressing.
(n.b. I suspect a significant part of why many modern shops don't even look toward features like XCON - which ensured correct configuration of computer orders - is that it became the norm for the customer to pay for anything like missing necessary pieces)
Lisp was the language people like Richard Stallman or John McCarthy (the inventor of Lisp) used at the MIT AI laboratory.
Everybody used Lisp there, and the young people who learned from the masters learned Lisp too.
But that was AI 1.0. Then came the AI winter, and the 2.0 spring with GPUs that were programmed in C dialects and gave incredible levels of raw power.
So, as high-level access to low-level C code, Python was picked by most people.
((((((((((((((((( by the way
Typing )))... is easy; just hold down the ) key to repeat, and watch the cursor jump back and forth due to parenthesis-matching. When it flicks back to the correct target, release, and backspace over any overshot parens.
or in JS: