How innovative is Clojure as a Lisp dialect from Lisp expert's perspective? (reddit.com)
150 points by myth_drannon 9 days ago | 134 comments





Outside of applying the research on persistent immutable data structures (which I think comes from the Okasaki paper[0]), I think the most innovative part of Clojure is the fact that it's actually used -- they've managed to build a community and ecosystem with high-quality packages and buzz around a Lisp.

I don't pick Common Lisp/Scheme/Racket, despite desperately wanting to, because IMO Haskell has a better package ecosystem, which is saying a lot. For example, try to find an HTTP server (by far the most common use case for new projects IMO) that is both multi-threaded and async-IO capable (i.e. maximally using every core) -- it's surprisingly difficult. I found decent-seeming libraries that did either-or, with varying levels of polish and paradigms involved.

I personally prefer CL to Clojure, but I just can't imagine trying to convince an enterprise to use Common Lisp these days, especially since I can't even really convince myself... Clojure would have the same problem, but the ecosystem + buzz is the differentiator in my mind.

[0]: https://www.cs.cmu.edu/~rwh/theses/okasaki.pdf


The persistent data structures that Clojure uses are actually Rich's own invention, building on Okasaki's and Phil Bagwell's work.

Phil Bagwell then also started working in that direction; see:

https://www.youtube.com/watch?v=K2NYwP90bNs

Some of Clojure's data structures are pretty much from Okasaki, but not Vector or HashMap.


There's a talk from Bodil Stokke where she explains (w background) and live-codes the core of Clojure's HashMap algorithm, in TypeScript: https://youtu.be/cUx2b_FO8EQ

Indeed there's novel stuff there.


You should also read Phil Bagwell's paper on Ideal Hash Trees. I believe it was also highly influential.

http://lampwww.epfl.ch/papers/idealhashtrees.pdf


By the way, Okasaki has not just a paper but a whole book on that topic. It has code examples in ML and Haskell. It's really nice if you want to learn to design persistent data structures.

I just double-checked and I think the link I posted is to the book -- I thought it was just one of the papers, but that PDF is the actual 160-page book that was his thesis.

Unless I'm misunderstanding, and book != thesis.

[EDIT] - Looking through the Amazon listing, it looks like the book is certainly not just a republished version of the thesis... Looks like I have even more reading to do.


+1 Okasaki’s book is a great reference. I’ve personally given out a few as gifts.

https://smile.amazon.com/Purely-Functional-Data-Structures-O...


An async HTTP server? What about Woo and Wookie? https://github.com/orthecreedence/wookie, seen on https://github.com/CodyReichert/awesome-cl#network-and-inter...

Also, for reference, some companies keep betting on CL today: https://github.com/azzamsa/awesome-lisp-companies (and I suppose that's just the tip of the iceberg)


> An async http server ? What about Woo and Wookie ? https://github.com/orthecreedence/wookie

Wookie is one of those that does either-or -- its main competency is async IO, and it offers multi-threaded performance with some plumbing (see the Threading guide[0]).

I don't mean to put down the libraries out there (Lack/Clack, Hunchentoot, Wookie, Woo, Ningle, etc.), but for languages that aren't single-core by design or circumstance (i.e. Node, Python, Ruby), I don't expect it to be this difficult to find a library that handles both multi-threaded operation and async IO at the same time, abstracted cleanly so users don't have to think about it or add plumbing.

Performant web servers that take advantage of what your language is good at are table stakes these days; if things like this are not standardized/obvious choices at this point, it tells me that I won't be able to find other things down the line. It's a circular problem, though: someone has to build all the libraries that I'm expecting to find, and it could be me -- I just don't want to join the ranks of those committed to the CL ecosystem, right now at least.

> for reference some companies keep betting on CL today: https://github.com/azzamsa/awesome-lisp-companies (and I suppose that's just the emerging iceberg)

I didn't say that companies don't bet on CL; I just personally couldn't imagine recommending it to anyone as the language to do a new project/startup/whatever in. Lisp has value, but for me that value is slanted much more toward the way of thinking than toward practical use cases, even more so than Haskell, which is weird.

[0]: http://wookie.lyonbros.com/docs/threading


>> “I think the most innovative part of Clojure is the fact that it's actually used

Popularity is not “innovative”.


IMO it depends on the lens through which you're looking. If "Clojure" in that sentence to you means the language and the features, then no, popularity is clearly not an innovative feature and is basically orthogonal. It obviously doesn't matter how popular the language is if what we're talking about is whether it supports <language feature x>.

However, if "Clojure" in that sentence is interpreted in a wider, more general and necessarily less concrete sense -- to represent the language, ecosystem, branding, etc -- good marketing/advertising, community management, and general operations-level can be rife with places to innovate.

An excellent language that never gets adopted has effectively zero value IMHO, until its corpse is ransacked by a language that does get adopted. People rediscover things that older languages did long ago almost every week, it seems -- creating a language and the requisite community to actually drive adoption can be considered innovative if the rest of the landscape is sufficiently similar. Clearly the Clojure project is doing something different from the Common Lisp/Scheme/Racket crowd.


It's too early to say. Clojure's still very young in Lisp terms, so it's natural that it has more hype around it now than languages that were cool 30 years ago. It's also not exactly taking over the world, and I think you're forgetting the buzz Lisp had in the 80s. It will be an innovation in the sense you mean if it has wide adoption and hype in 2050, but it's certainly not anything unprecedented yet.

IMHO It is.

It is like Windows. When it appeared it was significantly inferior to Unix options, but it let a hundred or a thousand times more people experience computers.

If 1 to 10 percent of people are naturally innovative and creative, then the platform with the most users ends up innovating far more.

Lisp use at first was restricted to academics; it was so inefficient that you even required a very special (and expensive) purpose-built computer to run it. Those academics were not paying for the machine out of their own paychecks, so they didn't care.

More affordable solutions like C became popular and forced Lisp to improve (creating compilation pathways and so on), once it was obvious everyone was using the C toolchain (C++, Objective-C, Java).


> Lisp use at first was restricted only to Academics

Lisp was spreading in the 60s to everyone who had a mainframe or minicomputer. In the 70s the first applications were built, and companies like BBN, Xerox, and IBM started to use Lisp. In the 80s Common Lisp was standardized upon a DARPA initiative to have a single Lisp language for the applications they were prototyping or deploying in the military. The military had huge applications for battle management, battle simulation & training (like SIMNET), planning, and design & construction developed by contractors.

> it was so inefficient that you even required a very special(and expensive) purpose computer to run it

Lisp developers required a personal workstation and thus they co-invented them in the 70s. Machines with 1 Megabyte or more RAM for a single user. Machines where one could develop and work with programs like Macsyma, an early computer algebra system widely used in maths and physics.

> creating compilation pathways and so on

Efficient execution of Lisp on non-special-purpose hardware was available in the 70s, for example with Maclisp, and in the 80s with Common Lisp.


Definition of "innovative":

    Characterized by the creation of new ideas or things
Popularity isn't innovation, by definition. These two issues are orthogonal.

>> "It is like Windows ... but it made a hundred or a thousand times more people to experience computers."

Right, but that's completely irrelevant and not innovation.


"Innovative" is a great example of an aircraft-carrier-clearance word. The way this word is used in e.g. startup context, popularity can absolutely be innovative - after all, if Lisps were not popular, and now one suddenly is, it's a new thing therefore innovative (send funding rounds this way please).

Honestly, I think the word "innovation" needs to be tabooed from the discussion to get meaningful questions and answers.


>>> “I think the most innovative part of Clojure is the fact that it's actually used”

> Popularity is not “innovative”.

But the way to reach popularity is innovative, at least for a lisp.


As a Lisp dialect it isn't that innovative besides a few syntactic improvements [0]. I'd gladly use Common Lisp instead if it could do something like ClojureScript. But by design it'd be hard to pull off.

Like a good politician, Clojure is pretty unimpressive in and of itself; for any given characteristic, someone has a better take. But as a whole it's one of the few sensible choices out there.

[0] I feel like they're generally underrated. Adding [] {} and #{} might not seem like much, but when writing DSLs they really do make all the difference.
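As a tiny illustration of what the literals buy you (the keys and route shape here are invented, not from any particular library):

    ;; vector, map and set literals keep different roles visually distinct
    (def route
      {:path    "/users/:id"               ; {} map literal
       :methods #{:get :put}               ; #{} set literal
       :handler (fn [req] [200 {} "ok"])}) ; [] vector literal for [status headers body]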


I appreciate that CL doesn't have reader macros for those characters: it allows me to do things like repurposing [] for Objective-C method calls.

https://github.com/fiddlerwoaroof/objc-lisp-bridge


Well, Clojure doesn't have programmable reader macros either way.

As long as you're not dead set on specific characters you could do:

   (defun extract-from-objc (obj)
     (objc-typecase
      obj
      (%@NSDate [[[[%@NSISO8601DateFormatter @(alloc)]
                   @(init)]
                  @(stringFromDate:.) :pointer obj]
                 @(UTF8String)]s)
      (%@NSString [obj @(UTF8String)]s)
      (%@NSNumber (parse-number:parse-number
                   (objc-runtime:.:extract-nsstring
                    [obj @(stringValue)])))
      (%@NSArray (map-nsarray #'extract-from-objc obj))
      (%@NSDictionary (fw.lu:alist-string-hash-table
                       (pairlis (map-nsarray #'extract-from-objc [obj @(allKeys)])
                                (map-nsarray #'extract-from-objc [obj @(allValues)]))))
      (t (or (funcall-some (cdr (objc-pick-by-type obj *objc-extractors*))
                           obj)
             obj))))
The reason I say they're good for DSLs is exactly this kind of re-purposing to give them special meaning. Yes, it's nowhere close to being as powerful as reader macros. But the nice part is that everyone can, and knows how to, work with these primitives.

I do wish we had reader macros like CL, but I'd still totally want the built-in [] and {}.


I really like how most of the non-sexp syntax is optional in CL. Anyways, I'm also a bit ambivalent about the merits of [] vs. #(): I like how the latter syntax is more "standard", since the # dispatching macro character is used for a bunch of things. E.g. multidimensional arrays #2a() #3a(), complex numbers #c(1 2), character literals #\space, pathnames #p"/foo/bar", etc.: it adds to the uniformity of the syntax, IME.

Anyways, I guess I like CL's approach where #[] #{} [] and {} are all specified as reserved to the user and then, using named-readtables, you can pick an appropriate implementation as you see fit.

> do something like ClojureScript

I'm betting on the ongoing Weblocks rewrite: http://40ants.com/weblocks/quickstart.html There's no need to transpile to JS; you just always stay in CL, in the CL environment, state is preserved on edits, etc.


> I'd gladly use Common Lisp instead if it could do something like ClojureScript

It does. Parenscript has existed for years.

And there's also JSCL.


Parenscript is pretty close, as is js-cl. The issue is mostly that no one has written something like figwheel or shadow-cljs yet.

Never heard of js-cl, but Parenscript definitely isn't. It's a glorified s-exp notation for JavaScript. ClojureScript doesn't compile to plain JS. It maintains a lot of its characteristics at run-time. And this is possible because Clojure does something very similar on the JVM. This in turn enables you to write code targeting both platforms without too much effort.

WISP is a little more like the Parenscript of Clojure (though it's written in javascript itself) https://github.com/Gozala/wisp/blob/master/doc/language-esse...


js-cl is a (still-partial) Common Lisp implementation in JavaScript. https://github.com/jscl-project/jscl/blob/master/README.md

Parenscript is a bit more than just a glorified s-expression wrapper for JavaScript: it also implements a lot of the core CL macros (defun, let, lambda, etc.) in ways that are close to the CL semantics. It's limited by the fact that it doesn't want to have its own runtime: so, while there is some degree of source-level compatibility, it doesn't have things like restarts and CLOS that would require runtime support (although there is a library called the Parenscript object system that provides some degree of source compatibility with CLOS).


Yea, it looks like js-cl is more like it. You'd want to implement some reasonable subset of CL. And ClojureScript really didn't have to compromise on much, which is why it's been embraced so easily.

Most of the time you just have a .cljc file and pretty much the same exact code just works on both platforms. In a few places you'll use the #? reader conditional, which lets you do different things per platform, and that's it.
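For example, a .cljc namespace might look something like this (the namespace and function name are made up for illustration):

    (ns my.app.time)  ; hypothetical .cljc file, compiled for both JVM and JS

    (defn now-ms
      "Current time in milliseconds on whichever platform we're on."
      []
      #?(:clj  (System/currentTimeMillis)
         :cljs (.now js/Date)))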

Well, you basically get simple macros for free as long as you have an s-exp-based notation for something. Which is why I do in fact like them. Something like Parenscript or Wisp is really great for lightweight websites.

I remember way back around 2010 or 2011 playing around with CL and Parenscript a bit and it looks like the landscape isn't too different today.


>Parenscript definitely isn't. It's a glorified s-exp notation for javascript.

It is not. Parenscript is a subset of Common Lisp that then gets translated to JS.

ClojureScript is also a subset of Clojure that runs on javascript.

> ClojureScript doesn't compile to plain js

How does it run on a javascript-enabled browser, then?


Yes, you could call both of them subsets. But one is so small that for most purposes it is incompatible with anything but the simplest uses, while the other lets you run an almost identical codebase on two platforms. There are plenty of people who only ever write ClojureScript and couldn't care less about JVM Clojure, because it really is that close.

And yes, as others mentioned, js-cl is trying to do something similar, and I hope it works out!

Sorry, bad wording on my part; I was just reiterating the point that Parenscript maps closely to regular-looking JS, while cljs abuses JS quite heavily.


Someone has, a long time ago:

https://github.com/johnmastro/trident-mode.el

It works like a charm.


The most innovative part: Clojure is a functional programming language based on relational database theory.

```
               Clojure -> DBMS, Super FoxPro
                   STM -> Transaction, MVCC
Persistent Collections -> db, table, col
              hash-map -> indexed data
                 Watch -> trigger, log
                  Spec -> constraint
              Core API -> SQL, built-in function
              function -> Stored Procedure
             Meta Data -> System Table
```

I don't care about static or dynamic types, nor about FP, LP, or OO. For me, they are overly complex, unreliable, and unscientific. I think they are very bad, and they upset me.

The production methods and business management ideas of large industries are also more mature than FP&OO. I have used them as programming ideas.

I think that RMDB is the simplest and most reliable in theory and practice, and it has undergone the most rigorous, long-term, high-stress testing in critical situations.

Before using Clojure, I was a FoxPro programmer. I use Clojure as a super FoxPro, and I have also used it successfully for web apps and for mixed programming with the R language. I will continue to apply this approach to the AI field in the future.

The main development goal in Clojure is to write the database. The development ideas actually come from the database, not from FP.

https://github.com/linpengcheng/PurefunctionPipelineDataflow


>I don’t care about static or dynamic types,(...) I think that RMDB is the simplest and most reliable in theory and practice

But... RDBMSs are often statically typed (you specify the column type when creating it) and strongly typed (they often don't do auto-conversions of data types).


By analogy with a database: as long as I use spec to strictly define (standardize) the core data model, I can ensure the correctness of the system.

I've shifted the traditional work of ensuring code correctness from in-code type-system validation to data self-validation: the work becomes verifying the core data model instead of validating the input and output types of every function.

A system requires only a few core data models, and development is built around the core data model.

Persistent data structures ensure that modifications to the immutable big data model are high-performance and memory-efficient.
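A rough sketch of what that kind of data self-validation can look like with spec (the keys and values here are illustrative, not taken from the linked repository):

    (require '[clojure.spec.alpha :as s])

    ;; define the core data model once, like a table schema
    (s/def :part/id string?)
    (s/def :part/qty nat-int?)
    (s/def :app/part (s/keys :req [:part/id :part/qty]))

    ;; validate data at the "warehouse" boundary instead of typing every function
    (s/valid? :app/part {:part/id "r-100" :part/qty 5})    ;=> true
    (s/explain :app/part {:part/id "r-100" :part/qty -1})  ; prints what failed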

In addition, using my pure function pipeline data stream (https://github.com/linpengcheng/PurefunctionPipelineDataflow) makes debugging, parallelism and extension very simple.


As in industry: verify that all finished products meet the standard before they enter the warehouse.

Similarly for databases: verify compliance before data enters the database.

That is, "data as a service; the warehouse is the core, and everything operates around it."


I think their point is that it's the ability to express relational algebra in a natural way that's important, and the static/dynamic types thing is orthogonal to that.

There are a lot of interesting points of view in that thread; unfortunately, they're all from people who've spent very little time with Clojure.

The biggest innovation, I think, is putting Clojure on the JVM, allowing for tight host integration. This has been my way of introducing Clojure in a number of places it wouldn't otherwise have been able to go.

In terms of writing better code, the performant immutable data structures yield more power than I can sum up in a single comment.


The JVM is really nice. Operationally, it's kind of a pain. You can upgrade the installation on bare metal, you can play the Packer game and ship new VMs, or you can have crazy fat Docker containers. K8s can sort of deal with the VM model, but it's beta.

Go is pretty nice in this regard. I should look into Racket and Chicken again for static binaries. I gotta say, "FROM scratch" is a fun way to start a Dockerfile.


If you want to write a program in a Lisp and ship it as a compiled binary, your best bet is to use Common Lisp and compile it with SBCL. A stripped SLAD (save-lisp-and-die) binary will be way smaller, with way less overhead, than running a whole host of bytecode on a VM like Racket or Clojure.

But few people care about overhead or executable size much anymore, outside of embedded systems.

What most people care about is computational performance, and Clojure performs OK by this metric (certainly SBCL can still beat it in many use cases, but often only by a modest margin).


SBCL also has the ability to statically link all your C dependencies using CFFI's STATIC-IMAGE-OP.

Clojure can also do it, via the Java Abstracted Foreign Function Layer: https://github.com/jnr/jnr-ffi

I don't think that's exactly the same: CFFI has the ability to bind C libraries with minimal boilerplate, just like jnr-ffi does. But, in addition, with CFFI (and SBCL built with a particular set of flags) you can generate a CL image that only depends on glibc, because you've statically linked all your .so dependencies into the resulting image.

If you are writing software in an enterprise then Java / JVM is a well understood quantity with a whole lot of deployment options. Being something that looks exactly like every other Java program from the point of view of those deploying it is a huge win.

Can you expand on the comment about Kubernetes and VMs?

I don't know much about Kubernetes but AFAIK people just put a jar file and JVM in the container image and run it just like any other server app. The jar files can get big (100+ MB) if you have lots of deps, but not to the extent that container image size would be problematic in normal server app environments.


Well, JAR files are that big because many Java devs pull in libraries that they never use, and you end up with 100+ dependencies when you really need 10. Clojure (via Maven) has a way of dealing with this using :exclusions. My typical JAR files are 10-50 MB, which is quite manageable today.
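In Leiningen that looks roughly like this (the library coordinates below are placeholders):

    ;; project.clj fragment
    :dependencies [[org.clojure/clojure "1.10.0"]
                   [some/library "1.2.3"
                    :exclusions [commons-logging org.unused/transitive-dep]]]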

You can compile a Clojure project with GraalVM, or you can compile a ClojureCLR project (or a Clojure project translated by IKVM.NET) with the .NET Core CoreRT.

It's debatable how innovative that was because ABCL was running on the JVM before Clojure. Clojure was definitely marketed better, though.

+1 for mentioning Armed Bear Common Lisp. I usually use SBCL but ABCL is great when you need Java interop (I use it with DL4J).

I have used Clojure when requested by customers (about 18 months total) since its inception: I donated to Rich when he asked for community support because I liked the Common Lisp to Java bridge he wrote many years ago.

Clojure is a good language that has a uniform API for anything that is a sequence and I (usually) like working with immutable data.

That said, I enjoy CL and Scheme more.


Not marketing, just a bunch of different core ideas, immutable/concurrent first. It was a lot more attractive than yet another CL port.

There have been a few Lisps targeting the JVM since the early 2000s, back when the JVM arguably had a lot larger mindshare.

Given the number of embedded devices with a JVM running on them, and the OSes that have someone selling a JVM for them, I would say the mindshare is even bigger nowadays, just not as hip as it once was.

Unless you count Dalvik, there are probably more embedded devices running Node than the JVM nowadays. The share of Java-related technology has definitely declined on most platforms.

SIM cards, electricity meters, Cisco phones, Ricoh printers, factory control systems, IoT gateways, Blu-ray players -- there are tons of invisible JVMs running in the wild.

You could have said that in 2003, and it would have been even more true. Especially about Blu-ray players ;)

My local library has started replacing DVDs with BluRay discs.

It's kind of a trick question. Clojure pulled together a bunch of ideas from other languages and packaged them on the JVM. I'm not sure any single feature is particularly innovative.

Lisp is well known for its ability to adopt ideas from other languages, and most of the important Clojure features (immutable data, transactional memory, etc.) were already available for Common Lisp and Scheme, either built-in or as libraries.

My guess is that most Clojure users came from Java rather than another Lisp. Moving from Java is an obvious win (IMO), but as a CL user I don't see much reason to use Clojure.


Scheme and CL have immutable data structures, but not persistent data structures.

But the paper that introduced the term 'persistent data structure' said they already existed in Lisp and gave the example of a persistent queue built by sharing the tail of an existing one with a new head?

Doesn't have to be built-in to exist in the Lisp ecosystem.

Sorry I don't understand what that means?

A Lisp list is a persistent data structure.


Sort of.

First, conses are mutable so lists are persistent as long as they are used in a persistent way -- it has to be enforced through the codebase (and deps).

Second, lists' big-O access/update costs are not as good as those of persistent maps and vectors.


A Clojure seq offers much more than that. But you can re-create it in CL, it's just not built-in.

I was replying to

> Scheme and CL have immutable data structures, but not persistent data structures.

And I said they do. You've said Clojure persistent data structure offer more. Ok, but I just said they existed in Lisp and they do.


Not built-in, but there are libraries.

But AFAIK most immutable, persistent data structure libraries in other Lisps either (1) had poor performance or (2) were developed after Clojure and borrowed from Clojure.

Isn't changing the head of a Lisp list by referring to the same tail from a new head an example of a persistent data structure?

Strictly speaking: yes. But whether conses (aka pairs), lists, or other derivative structures support pervasive FP in a language is probably most influenced by whether conses are mutable or not.

In Common Lisp, they are. In Scheme, they are not.

Mutable conses are useful at least because values can be accumulated in a list from the end by retaining and mutating the tail.

However, the presence of mutable conses detracts from a language's ability to support FP, since it must be done by convention and with much copying.

I see Clojure's innovation in this area not in the fact that its lists are immutable (Scheme did this already) but in the fact that its lists are immutable and there's never a need to mutate the tail, because lazy sequences are supported throughout the language. The icing on the top of the story is that concrete lists and lazy sequences both inhabit the "Seq abstraction", and so don't require different APIs most of the time.

Of course, I'm pretty sure lazy sequences existed before CL was standardized (they're in SICP), but I can imagine how the designers would have preferred a simple, well-known tool (mutable conses) over admitting a whole other sequence abstraction. Plus, immutable conses would have been a breaking change.

So, it's not that anything that Clojure does with regard to lists couldn't have been done before, just that it wasn't -- while also being packaged in a vehicle that was wildly compelling for many other unrelated reasons.
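A small sketch of both points -- lazy sequences and the shared seq API -- in plain core Clojure:

    ;; an infinite lazy sequence; nothing is realized until it's asked for
    (def naturals (iterate inc 0))
    (take 5 naturals)        ;=> (0 1 2 3 4)

    ;; the same seq functions work on lists, vectors, maps, strings, ...
    (map inc [1 2 3])        ;=> (2 3 4)
    (first {:a 1 :b 2})      ;=> [:a 1]
    (take 3 "hello")         ;=> (\h \e \l)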


Scheme's conses are mutable: R7RS (and the previous standards) defines set-car! and set-cdr! to mutate the parts of a cons. One of the reasons why Racket is not a Scheme is that Racket's conses are immutable.

Thanks for clarifying. I remembered it as a point of contention in the Scheme community.

> because lazy sequences are supported throughout the language.

Most of the time Lisp implements these data structures itself - many/most Lisps are largely implemented in themselves. As such, Lisp exposes and provides very low-level data structures like conses, which are simply two-element cells. Basically Lisp here is on the level of assembler - which is reflected in the fact that the historical name CAR means something like 'contents of the address register'.

Clojure's persistent immutable sequences are a very different data structure. Clojure does not expose its implementation and the implementation does not easily map to hardware, especially since quite a bit of the language is implemented in terms of the JVM and the Java language.

Example:

https://github.com/clojure/clojure/blob/master/src/jvm/cloju...


"lazy" and "immutable" are orthogonal. "lazy" happily co-exists with "mutable". TXR Lisp has mutable lazy conses:

  1> (take 12 (range 1))
  (1 2 3 4 5 6 7 8 9 10 11 12)
  2> (let ((r (range 1)))
       (inc [r 4] 15)
       (take 12 r))
  (1 2 3 4 20 6 7 8 9 10 11 12)
There is no reason to prevent mutation; all we have to do is assume that programmers are grownups and treat them as such.

Remember, mutate responsibly; if you mutate, don't derive.

Even if someone has been mutating "harmless" local variables, do not jump into the same lexical scope with them.


I think you're referring to "association lists", which definitely predate Clojure, are definitely a persistent data structure, and are very cool in their own right. However, their performance profile is pretty poor (i.e. O(n) lookups), so they are rarely used in modern Lisp programming anymore, to the best of my knowledge.

"Persistent" generally means something like "survives process restart and machine reboot".

Not in this context. Persistent here means "stable". It is like having a versioned database; any reads of that version are stable. Once a version is created, it cannot be written to. All further additions create newer versions, but those changes are not visible via a handle to an earlier version.
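In Clojure terms (ordinary core operations, nothing library-specific):

    (def v1 {:name "a" :version 1})
    (def v2 (assoc v1 :version 2))  ; a "write" yields a new version, sharing structure

    v1  ;=> {:name "a", :version 1} ; the earlier version is untouched
    v2  ;=> {:name "a", :version 2}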

I see now that this term was introduced in a 1986 paper about versioned, immutable data structures (Making Data Structures Persistent, Driscoll, Tarjan et al).

I'm well aware of the techniques, just not the word.

Interestingly, they were aware of the connection between their work and Lisp:

"The node-copying method of Section 2 can be modified so that it is write-once except for access pointers. The main modification is to handle inverse pointers as discussed in Section 3. The write-once property is important for certain kinds of memory technology. Also, it implies that any data structure of constant bounded in-degree that can be built using the augmented LISP primitives cons, cdr, replaca, and replacd can be simulated in linear time using only the pure LISP primitives cons, car, and cdr. Thus the result sheds light on the power of purely applicative programming languages."

Or, rather, it seems they were made aware by a reviewer:

"We thank Nick Pippenger for noting the connection between write-once data structures and the power of pure LISP."

(Good old Nick didn't go as far as to point out that rplaca and rplacd can introduce cycles, which cannot be constructed with cons over existing structure.)

Sadly, no reviewer similarly piped up about "persistent" being taken already for durable storage.


Does it? I thought for data structures it was simply that it survived editing. That is, old versions are always accessible.

It's an overloaded term: in some contexts it means "survives reboots", but in this context it means something else.

I don't know. Outside of a few narrow use cases I personally don't think there's much advantage to using explicitly persistent data structures, so I haven't tried them, I just know the libraries exist.

How do you implement a persistent hashtable when your only abstraction is a linked list?

> only abstraction is a linked list

What language are you referencing here?


>> But afaik most immutable, persistent data structure libraries in other lisps either (1) had poor performance or (2) were developed after clojure and borrowed from clojure.

>> only abstraction is a linked list

> What language are you referencing here?

They are referring to Lisp.


You'd do that in roughly the same way as in any other language that allows you to heap-allocate stuff and pass references to it.

Does it have to have the performance characteristics, or just the semantics of a hashtable?

I found the conversation to be sub-par and disappointing. I also find it discouraging that these arguments inevitably boil down to a Clojure-vs-Common-Lisp argument. What about Racket, Guile, Chicken Scheme, etc.?

Probably because Clojure feels the closest to Common Lisp, so it's the best comparison.

Vsevolod Dyomkin, a known name in the CL community, especially for his work on NLP in CL, has posted a detailed article on his views about Clojure [1]. Worth a read.

FWIW, when I worked in Clojure in its early days, I disliked when it dropped into Java stack traces, and dealing with Java exceptions made the Clojure REPL not a great experience. I don't know the state of things today.

[1]: http://lisp-univ-etc.blogspot.com/2011/11/clojure-complexity...


It is still far away from CL conditions, but Clojure 1.10 has improved error messages A LOT through spec (a sort of optional typing system), and the plan is to improve them further (cryptic error messages have for some time been the top issue in the yearly community feedback poll).

I think people here are making a lot of good points, but I want to be a bit more specific. I don't have a lot of experience with other Lisps, so please correct me.

- The EDN (https://github.com/edn-format/edn) data format is a further development of the code-is-data idea. 'Tagged literals' are an interesting feature, used in Clojure to share code between different hosts, mostly JS/JVM.

- Transducers are a new, interesting feature that is fairly unique to Clojure (https://clojure.org/reference/transducers)

- Reducers (https://clojure.org/reference/reducers)

- Clojure multimethods are different from CLOS in CL, and one interesting feature is stand-alone hierarchies. Some people would maybe call this a step back from CLOS. https://clojure.org/reference/multimethods

- Concurrency primitives like Agents and Refs (featuring Software Transactional Memory) are fairly unique to Clojure.

- Metadata. In Clojure most types can have metadata attached, meaning data that flows through the program with your data but does not affect things like equality or size.

- Protocols are a dynamic single-dispatch system.

- spec. Clojure spec is a core part of Clojure now and is used internally as well. It's a type of dynamic specification system, unlike most systems of this kind (https://clojure.org/guides/spec)

Some more information here: https://clojure.org/reference/lisps


I think Clojure is certainly innovative in that it includes software transactional memory in its core language. Scheme doesn't, Common Lisp doesn't, and perhaps most other lisps do not. It's not very often used, but it's a welcome addition to the core language.

STM has nondeterministic performance characteristics and doesn’t live up to expectations.

STM is indeed non-deterministic in performance, but non-determinism is basically a given in all concurrency problems.

In the same vein you can also say locks have non-deterministic performance characteristics: I never know how long I'm going to wait when I'm trying to acquire the lock! In other MVCC scenarios such as databases you also don't know whether your final COMMIT will fail due to concurrent writes and you have to retry.

Furthermore Clojure's STM in fact has an extra primitive called commute that is not present in Haskell's STM. It makes commutative changes even faster.
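A minimal sketch of the difference (the counter here is just for illustration):

    (def hits (ref 0))

    ;; alter: the transaction retries if another one changed the ref first
    (dosync (alter hits inc))

    ;; commute: for commutative updates; may be reordered, so fewer retries
    (dosync (commute hits inc))

    @hits  ;=> 2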


Can you elaborate? I have used it successfully in many projects. People keep saying it has bad performance but I haven't seen any proof yet.

"Nondeterministic" doesn't mean "bad", just that things can be OK until suddenly they aren't, and you can't really predict it a priori.

MVCC, which is based on the same principle as STM, has been widely used by RDBMSs.

My comments from Reddit threads:

"I have not used Clojure professionally, but I welcome it, and applaud the effort that has gone into its creation and development. It's being used to solve real problems for real customers, and supplying jobs for those wanting to work in a Lisp family language. I also have a personal fondness for the immutable data structure work.

Is it perfect? Nothing is perfect. I'd rather focus on the wins than lament blemishes."

and

"Not all of us CL people are like that, even the language lawyers who read the standard for entertainment. :)

I find a lisp that explores a different part of design space to be inherently interesting, btw. The constraints surrounding Clojure are different from those around CL, so design choices have been different. The lessons learned when these choices collide with real world users cannot be obtained in any other way."


I think I am qualified to answer this: I wrote reasonably complex systems in Common Lisp before I gradually moved to Clojure several years ago. I have written several fairly large apps and systems in Clojure and Clojurescript (this includes https://partsbox.io/ as my current project and apps like an e-commerce search engine processing hundreds of requests per second for many years).

I used to think Common Lisp was a fantastic language, but I always tried to keep an open mind remembering the Blub paradox. Then I started working with Clojure and after the initial friction (some things were just plain annoying in the beginning) things clicked for me. It was, quite simply, a much better language.

The major improvements over Common Lisp for me were: 1) immutable data structures: these days I can't even imagine dealing with code that mutates data in-place, 2) a great approach to concurrency and parallelism (it's really difficult to get yourself deadlocked in Clojure), 3) Rich Hickey coming out every year or so with a solution to yet another problem (core.async, transducers, spec), 4) really flexible data structures.

I know there are many CL advocates who will try to point out that CL has all these things. But these will be the same people who will be annoyed when Java programmers point out that Java has all these things, too. It's just a matter of friction, elegance and ease of use.

One additional factor which is often glossed over is Clojurescript. Most software these days exists in some form in the browser, too. Being able to use pretty much the same language (cljc files can be compiled both server-side and client-side to Javascript!) on both sides brings many advantages.

People sometimes ask me if I miss CLOS or the MOP. Not at all. It's a great set of concepts, worth learning (the green book makes for great reading). But I found that larger systems using CLOS were really difficult to understand and follow (just following the code flow is a nightmare if you have a lot of multimethods), and systems that used the MOP in any but the most basic ways were intractable without a debugger. Besides, please see above what I wrote about immutability: I'd much rather have immutable data structures.

Clojure's emphasis on functions and data, while it may be based in the 60s, is one of the really good ideas. And Clojure is really, really good at tackling complexity, it's just not apparent in the language design. While other languages focus on syntax sugar, Clojure's focus is on getting things done on a large scale. You don't see that, but things like namespaced keywords, clojure.spec, pre/post conditions, maps, sets, all play together to allow you to create extensible and reusable code.

Also, it's easy to discount certain solutions in Clojure as "also existing" elsewhere, but without actually using them in real systems one often doesn't see the full advantage. Transducers are a good example: I think they are under-appreciated and under-utilized. Common Lisp veterans will laugh and say that SERIES did the same thing back in the 80s. But SERIES was all about optimization and anyone who actually tried to use it will have stories to tell about how easy it is to use. Transducers, on the other hand, if used to build data-processing pipelines processing Clojure maps with namespaced keys, are an elegant and efficient tool for building composable systems. Plus, once you've coded your pipeline using transducers, it's a snap to parallelize it.
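A tiny sketch of such a pipeline over maps with namespaced keys (the :order/* keys are made up):

    (def orders
      [#:order{:id 1 :paid? true  :total 10}
       #:order{:id 2 :paid? false :total 99}
       #:order{:id 3 :paid? true  :total 5}])

    (def xf
      (comp (filter :order/paid?)
            (map :order/total)))

    ;; the same pipeline composes with reduction, collection building, or channels
    (transduce xf + 0 orders)  ;=> 15
    (into [] xf orders)        ;=> [10 5]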

Looking at it differently: to "get" Clojure one has to understand that there is no silver bullet: no single revolutionary "feature". Instead, the gains come from tiny improvements all over the place: immutable data structures, good built-in library for dealing with them, core.async, transducers, spec, Clojurescript, not having to convert to JSON or XML on the way to the client, re-using the same code on server-side and client-side, and more.

In practical terms, I would never be able to write and maintain by myself the system I'm currently working on (https://partsbox.io/) without Clojure and Clojurescript. It isn't impossible, but I can't imagine tackling the complexity in a comparable time frame.

I can't imagine going back to Common Lisp. I sometimes have to look at the code I've written in CL, and it looks clunky. Especially place-oriented programming (setf all over the place) is something I really really don't want to go back to. It's a nightmare, particularly if you intend to have any kind of concurrency.

I still try to keep an open mind, remembering about the Blub paradox. I make it a rule to regularly check and learn about new languages. There are some which I found really good for certain types of applications and which I would use instead of Clojure in certain cases (Elixir is a good example). I still haven't found anything comparable which is a general-purpose language, especially if you consider the client-side story — but I'm checking all the time

See also my answer to a similar question posted on Quora several years ago: https://www.quora.com/What-do-veteran-Lisp-programmers-think...

Also, as a general observation: most (not all, but most) criticism of Clojure seems to be written by people who did not write anything significant in it, and much of that by people who simply do not write large software systems at all. Reading many discussions, I gloss over most comments, as they are simply beside the point and not even worth addressing.


This is really interesting. I write mostly Common Lisp for hobby projects (at varying levels of scale; I've submitted patches for SBCL and McCLIM and I've written a bunch of libraries), and at work I help maintain a data pipeline written in Clojure. I've generally found that I much prefer the "feeling" of writing Common Lisp, and have come to think that immutability in the core plus being hosted on the JVM forces some compromises on the interactive development front that make writing Clojure always feel like something of a chore.

I had similar feelings in the very beginning, but I was quickly won over (I didn't see compromises in interactive development, though, unless you mean difficult debugging because of lazy sequences). Debugging in general was something I missed the most (after all, even in CL, I implemented return-from-frame in CMUCL to get better interactive debugging). And to this day I think CL has better interactive debugging than Clojure.

But Clojure has so many advantages that it isn't even a close contest.

Also, most of the code I write today isn't hosted on the JVM, it runs in the browser. Or gets compiled to both JVM bytecode and highly optimized JavaScript.


The more I've used Clojure, the less I've liked it: protocols don't work correctly, multimethods are badly designed, core.async isn't source-compatible between CLJ/CLJS, unless you know what to avoid, etc.

As far as interactive development compromises go, I'm mostly thinking about how in Clojure, it seems like the only practical way to load your new code is to use something like tools.namespace/refresh (or whatever it is). And, that essentially involves throwing away everything and rebuilding it from scratch: while that has some nice properties (e.g. you don't end up with an inconsistent image state), it's also really annoying because you have to restart all your servers, etc.

And, the other thing that makes CL standout from every system I've used except Smalltalks, is that Quicklisp loads third party systems into the current image without restarting: there are a bunch of tools I've seen floating around that claim to be able to do this for Clojure, but I've never really been able to make any of them work consistently.


Much of what you describe reminds me of when I started playing with Kotlin and the fresh feeling it gave me that made me not want to use other languages again. Simplicity and just getting stuff done. I see Rich Hickey had kind words for Kotlin on his last conference talk.

It's really cool to see that you built a BOM system in Clojure. I worked with BOM data for electronics a lot in my last job, and while I was mostly confined to using SQL to do it (recursive CTEs, ugh), I often thought about how much easier it would be to manipulate the BOM hierarchy structure in a Lisp-like language.

I think the sequences abstraction[1] is a pretty important one. I'm not sure it's unique to Clojure, though; I have no other practical Lisp experience.

[1] https://clojure.org/reference/sequences


I've never used Lisp, but I'm a huge Clojure user; the real innovation is the immutable data structures:

RICH HICKEY: Ok, so, problem number one on that list was place oriented programming. Absolutely, this is the problem. Almost all the programs I wrote, lots of the things on that list were multi-threaded programs, you know, they're crazy hard in C++. Just impossible to get right, when you adopt a normal mutability approach, you know, mutable objects. So, this is the number one self-inflicted programming problem. It seemed, you know, clear to me that the answer was to make functional programming and immutable data the default idiom. So the challenge I had was, were there data structures that would be fast enough to say, "we could swap this for that". And the objective I had, the goal I had was to get within 2x for reads and 4x for writes. And I did a lot of work on this, this was actually the main research work behind Clojure, was about these persistent data structures. Eventually I found...you know I looked at Okasaki's stuff and, you know, the fully functional approach and none of that gets here. And then I found Bagwell's structures were not persistent, but I realized could be made so, and they just have tremendously great characteristics combining the persistence with the way they're laid out, the way memory works. They made it. They made this bar, and I was able to get my friend to try my programming language. You know, we (?) have large library of pure functions to support this, and, you know, immutable local bindings. Basically if you fall into Clojure, your first hurdle is not the parentheses, right? It's this, this functional paradigm, everything is gone, there's no mutable variables, there's no state, there's no mutable collections and everything else, but there's a lot of support, there's a big library. You just have to, you know, sort of learn the idioms. So I think this was straight-forward, the critical thing that's different about Clojure is, by the time I was doing Clojure, the people who invented this stuff had adopted a lot more, right? I think most of the adherents in the functional programming community considered functional programming to be about typed functional programming, statically typed functional programming is functional programming. And I don't think so, I think that this is, you know, this falls clearly in the 80/20 rule. And I think the split here is more like 99/1. The value props are all on this side, and I think Clojure users get a sense of that, they get a feel for that. This is the thing that makes you sleep at night.

https://github.com/matthiasn/talk-transcripts/blob/master/Hi...


On top of persistent data structures, I find the main innovation of Clojure is the way it mixes functional programming with dynamic typing into its unique paradigm:

a) Data is just data, divorced from behaviors, and functions (not methods) manipulate data. This is really just old-school functional programming.

b) Data is represented by hashmaps and is extensible - if the data gets richer over time (= more keys in the hashmap), functions that used to work on the previous, poorer data still work on the new, richer data. In other words, each key of a hashmap is automatically an interface, and data satisfies interfaces depending on which keys it has (structural typing).
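A small illustration of point (b) (the map and function are invented):

    (defn display-name [person]
      (:person/name person))  ; only cares about one key

    ;; poorer data
    (display-name {:person/name "Ada"})                 ;=> "Ada"

    ;; richer data later on; the same function keeps working, untouched
    (display-name {:person/name  "Ada"
                   :person/email "ada@example.com"
                   :person/roles #{:admin}})            ;=> "Ada"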

Now, this interface mechanism is a bit sloppy if keys are just string identifiers, but this is overcome by namespacing keywords (which has its own syntax, ::keyword), thus removing the risk of the same name being used for different purposes. This idea is most clearly articulated in the clojure.spec library, which encourages you to establish a 1-to-1 relationship between a (namespaced) keyword and a spec (https://clojure.org/guides/spec#_registry).

Previous dynamic languages have always used hashmaps to represent data, either directly or indirectly. But once in the land of high-level abstractions, they would revert to "canonical" Java-style OOP (Python and ES6 being the most notable examples). Clojure seems to be the only mainstream-ish language out there that uses -- or at least aspires to use -- hashmaps as a real abstraction mechanism rather than a convenience.


Have you ever looked into Lua? Lua uses tables (hash maps) for everything. It is a prototypal language similar to JavaScript, and everything is a key in a table. The language provides syntactic sugar for certain accessors, such as `table.x` translating to `table["x"]`, to make it seem like a more traditional OOP language.

None of what you say is new to Lisp, though.

I would go as far as to say that a language with a more traditional syntax (for instance, Python-style) and pervasive immutable data structures would do very well. Few of the benefits Clojure brings to day-to-day development seem to require a Lisp syntax.

Maybe – s-expressions have a killer app though, which is that they are so simple that the level of effort to port Clojure to other platforms (ClojureScript, Clojure CLR, etc.) is within reach of one person. I am irrationally bullish on Clojure, but this is the key to the bet – platform reach has network effects which I believe may outweigh, in the long run, the resistance to alien syntax. My Clojure codebase (http://www.hyperfiddle.net/) runs on the JVM, Node, browsers, lambdas, and inside a Datomic cluster as a stored procedure. Same code! Same library artifact! Simultaneously targeting all these different platforms!

> which is that they are so simple that the level of effort to port Clojure to other platforms (ClojureScript, Clojure CLR, etc)

I don't see why that would be at all relevant to porting to other platforms? Most languages with richer syntax would parse to a backend-independent AST first, and so porting to a new platform would not require touching the parsing code at all - you'd be working on the same level as S-exprs, effectively.


The s-expressions make the compiler simple. Clojure the language was designed to make portability simple. You need both.

A well-designed grammar isn't difficult to parse into S-expressions. And they were writing compilers for languages with rich grammars (e.g. Algol-60) back in 1960s - often in hand-written assembly, on punch cards! I think a claim that it makes the implementation overly complex strains credulity.

I'm definitely as hardcore a Clojurist as you can be, but I would NOT call myself "bullish" on Clojure: certainly, for those of us familiar with the language it's an awesome tool, but Clojure definitely is not the "hot new thing" anymore, and new-user adoption, I think, has been declining in recent years, for reasons that have little to do with the intrinsic properties of Clojure.

I track Clojure growth metrics; it is definitely not declining, and the killer apps have yet to reach market. Python was dominating Ruby at the 10-year mark, and then Rails happened. The metric I cherry-pick to try to measure disruption is how many edge innovators an ecosystem attracts, and Clojure aces this. React.js is one of the few great innovations that came out of JavaScript, and even virtual DOM I saw first in ClojureScript, at your talk Webfui at Conj 2013!

oops, Webfui was 2012, React.js was 2013

All platforms could share the same syntax->s expression desugaring code, so I'm not sure why that would matter.

The difference Clojure has is that the core language is tiny, but you could do this with an alternative syntax as well. You'd just need macro support, even if they're more clumsy to write.

(I literally have an editor open working on a Clojure compiler as I write this, so I don't disagree with your premise).


Better yet, a new syntax for this language, like Reason is to OCaml.

(or the original M-expressions for Lisp)


My experience starting to work with lisps was similar: I thought "this is a great language, if only it had a nicer syntax" and I played with things like https://readable.sourceforge.io to try to make writing lisp more like writing my favorite language at the time (Python). After a while (and especially once I learned how to use paredit effectively), something changed and I realized that the s-expression syntax was exactly the syntax that I wanted.

Nevertheless, there's a strong argument to be made that unwieldy syntax is at least part of why Lisp-like languages aren't as popular as competitors, with syntax being the only seemingly consistent difference. And it seems particularly relevant to Clojure, given that it's otherwise very un-Lispy in many respects (e.g. cons cells are not the end-all be-all), and closer to other modern languages.

Elixir?

Clojure is radically practical :)

Lots of user names with 4-6 characters in this thread- Definitely an HN topic of yore!

I wonder how good clojure is with soft real-time like graphics rendering/games, or as an embedded script engine.

See Arcadia - Unity 3d programming in Clojure https://arcadia-unity.github.io/

I've embedded Nashorn in Clojure, and the environment can be inspected with Java development tools at runtime. Objects can be put into the JS context, hypothetically with natively linked libraries, without hassle. JS is single-threaded, so there should be no issue if the native library is not thread-safe, and Java makes single-thread guarantees. IIRC one can tie the script runtime to a thread, create several per process, and put them in a scheduler like any other thread. I.e. probably as good as native.
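Roughly like this, using the standard javax.script API from Clojure (a sketch; assumes a JDK that still bundles Nashorn):

    (import '(javax.script ScriptEngineManager))

    (def js-engine
      (.getEngineByName (ScriptEngineManager.) "nashorn"))

    ;; put a JVM object into the JS context, then use it from JS
    (.put js-engine "greeting" "hello from Clojure")
    (.eval js-engine "greeting.toUpperCase()")  ;=> "HELLO FROM CLOJURE"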

For real-time graphics, I think there is an impedance mismatch with the GPU data structures, which are basically big mutable array buffers.

And yet Neanderthal seems to have no problem getting it quite right: https://neanderthal.uncomplicate.org

for graphics? show me a graphics tech demo using neanderthal!

maybe used as a dsl interface for gpu ... I don't know

I think you want the same data structures moving up and down all layers of the stack; otherwise you pay a marshalling penalty at each layer, which is terrible for perf. If this is true, then we'd need to see fast immutable memory models inside GPUs that could natively represent persistent trees (moving away from pointers to mutable array buffers). Otherwise, the imperative way will always be at least a constant factor faster, which seems a dealbreaker for the trailblazing researchers who forge the path the application layer follows.


Clojure compiles to Java bytecode, so its use as an embedded script engine will be limited to programs written in Java; otherwise you'll have a huge overhead from having to run a JVM next to your application.

It's a shame, too, because Scheme/Lisp has proven itself to be a very worthwhile scripting language in its own right.


I really wish people wrote these comments after spending more than three minutes looking at a language.

Don't like the JVM startup time? See Clojurescript and planck. Sub-second startup+compilation+execution on my old laptop.


Clojure is a language; it is not necessarily bound to the JVM (although the common implementation is the one using the JVM), like CPython is to Python.

The JVM also brings with it the entire JVM ecosystem which makes clojure very practical.

Finally, there is actually an implementation of Clojure for the browser (ClojureScript) - which is a better idea than anything I've seen on the front-end (react-redux is OK, but ClojureScript wrappers for React are just awesome). There is also an implementation for the CLR.

Clojure runs everywhere, and it has a rich ecosystem since it utilizes good hosts (JVM, CLR, JS engines (with Google Closure compiler optimizations)). It is, IMO, the most practical Lisp out there.



