It is the most explicit abandonment to date of the age-old Lispers' Dream, which was "Lisp All The Way Down." Clojure is the antithesis of the Lisp Machine. Rather than a crystalline pyramid of comprehensible, mutually interlocking concepts, behind every Clojure primitive there lurks Black Magic. The Clojure user who is missing some routine or other will run crying to Java for help, rather than implementing the feature himself correctly - that is, as a natural part of the entire language, built on the same concepts. Clojure pisses on everything I've ever loved about Lisp. It rejects even what little consistency and intellectual rigor there is to be found in an abomination like Common Lisp.
Clojure is the False Lisp, which Reeketh of the Cube Farm.
I don't care if everybody really is more productive in Clojure than in Common Lisp. The latter is not my standard of comparison (Symbolics Genera, the state of the art, is. Or, if you want something that is distinctly inferior but is currently available on the market: Mathematica.) Clojure is the gutted corpse of a dream.
I am tired of this abomination being hailed as the future of Lisp. Users of real Lisps, such as myself, will keep hoping, dreaming, and working on systems which do not betray the original virtues of the language.
But why is this a desirable dream? It seems to suggest no other languages or technologies have any merit, except Lisp. Unix and C are brilliant pieces of work, and Lisp's hang-up of refusing to work well with them really set Lisp back compared to, say, Perl, Python, and Ruby, which really embraced Unix concepts (pipes, scripting C programs, etc.).
I happen to think the JVM is a nice piece of work, too, and I'm happy to see a Lisp that works seamlessly with it and benefits from its strengths. Cooperation and interoperability are good things.
"which Reeketh of the Cube Farm."
Cube Farm coders deserve good programming languages, too.
None of these things are necessary when every piece of information on your machine lives in a Lisp world image. As it should. And would, if the machines we are now using were the descendants of real computers, rather than of early-1980s children's toys.
And your last link would be applicable, save for the "ocean" actually having been "boiled" in this manner. The Lisp Machine existed. You cannot un-create it. It proves that all of the supposedly impossible or impractical pipe dreams of the Unix haters are in fact possible.
"The fact that C produced faster code, was easier to master, was easier to use in groups, and ran well on less expensive hardware were not considerations that Gabriel found important. But others did. On those metrics, the dominance of C as a programming language was an example of better is better, not worse is better."
So, C is better than Lisp, depending on your metrics.
As for the ocean, it is no longer the machine on which you are typing your program. It is all of the information, data, and code available to you on Internet. That's a lot of ocean to boil. Any Lisp needs to work well with as much of it as possible to be considered a useful language today.
Out of genuine interest, please could you clear up what is meant by "Lisp All The Way Down"? I've implemented my own Lisp interpreter in Scheme before, and I looked at the interpreter from McCarthy's original paper, but in both of these, the interpreter still required some language (albeit one with list-processing procedures) to begin with. My experience would be that there's nothing special about Lisp in this regard, just that it's _really_ easy to parse, but I still hear this claim often enough - am I missing something here?
And to answer your post, I'd say that Clojure does give you what you need - it gives you the axioms, it gives you defmacro - the rest is history. All those other library functions you can think of in a more abstract sense ("George's problem"), and in the end these should be machine code no matter which HLL they were originally written in.
The Lisp way has always been very dynamic - you can change things, even when they're orbiting Mars - but Clojure is functional: you generally don't change things. So in that sense, Clojure is not a true Lisp, but only because it chooses a different set of trade-offs, and if you don't like that then that's fine, but it doesn't make Clojure evil, just different.
> please could you clear up what is meant by "Lisp All The Way Down."
"Lisp All The Way Down" is when you can select any part of the system - and the full source (written in the same basic concepts as the rest of the system, including the program you are writing) of the element will pop up. Any changes you make will take effect immediately and in real time. We had this in 1985. Why can't we have it now?
> it doesn't make Clojure evil, just different
The language is being promoted as the long-awaited, Messiah-like successor to Common Lisp. That is the main source of my annoyance.
"The language is being promoted as the long-awaited, Messiah-like successor to Common Lisp."
sigh
Rich Hickey explains what he thinks are the good parts of his language, but I would hardly say he has a messianic complex about it. A lot of people (myself included) like it because it combines interesting ideas with the pragmatism of fitting into the popular Java ecosystem. Perhaps a desire to have everything converted to a single language is more indicative of messianic thinking?
Common Lisp is a wonderful language, certainly worth learning. But it somehow seems to induce a bunker mentality in its users.
No, a language is a way of communicating ideas, and hence, of thinking about them. (I wouldn't call English or kuchi shoga or tensor diagram notation "tools", even though they can be incredibly useful.) A programming language implementation can be used as a tool.
I think this is the core of our differences. Lisp has traditionally been used mostly as a language -- as Dijkstra put it, it "has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts". Most other programming languages have been used as tools.
Clojure looks like a perfectly competent Lisp-on-JVM (with immutability!), but I don't see it changing how I think about anything. It's thoroughly on the "tool" side of things, which understandably leads to people in the "language" camp being miffed that anybody thinks it's comparable to the classic Lisps.
The Lisp engine in Abuse was a tool, too, and highly productive for its uses, but at least nobody claimed it was a successor to the Lisp line.
I dislike arguments over semantics. Saying something is a tool does not mean it is not also a language. For instance, mathematics can be considered both a tool and a language.
It's true that Clojure is a programming language designed to be practical, but I fail to see why this would make it any less useful as a language for communicating ideas, or any less able to change how people think about things.
Could you provide an example of an idea that is more easily communicated in another Lisp, rather than in Clojure?
"Clojure looks like a perfectly competent Lisp-on-JVM (with immutability!), but I don't see it changing how I think about anything."
The immutability is a non-trivial change in thinking about how to write a program. You can write a program without mutation in Common Lisp, and you can mutate things in Clojure, but the greater cost of implementing mutation in Clojure means you are less likely to do so. This leads to different programs.
No, that is not as big of a change in thinking as Lisp was compared to the languages of the day when McCarthy created it. But I find it a fine addition to the family of Lisps, nonetheless.
Clearly you have never met a tool which afterwards made you regard all other supposedly similar tools as stone knives and bearskins. Perhaps this day is still in store for you.
For the sake of your career as a programmer (assuming this is your trade), you'd better hope not.
You're correct that I haven't come across such a tool. But I have also programmed in Lisp, and whilst I like it a lot, I don't consider it the pariah of programming languages as you seem to do.
What makes you believe Lisp is so superior to other classes of programming language? For instance, what makes a Lisp superior to a stack-based language like Forth or Factor?
This isn't meant to refute your argument or insult anyone; it's just that pariah probably doesn't mean what you think it means, and you might like to know.
From Wiktionary:
1. An outcast
2. A demographic group, species, or community that is generally despised.
3. Someone in exile
4. A member of one of the oppressed social castes in India.
> What makes you believe Lisp is so superior to other classes of programming language?
Because I feel straightjacketed whenever I cannot redefine my language on a whim in every conceivable way, or am forced to write a whit of boilerplate. (Yes, none of the existing Lisps fully meet this ideal, but nothing else comes close.)
Hold on. Why is a stack based language any less able to do that? If anything, it's closer to your ideal, as it has less inherent syntax than Lisp.
And for that matter, why do you think Clojure is so awful? It has no reader macros, true, but it does provide a more succinct way of writing hashes and vectors than most other lisps. Other than that, well, it's a Lisp.
> Why is a stack based language any less able to do that?
Having to juggle the stack at every step (as in FORTH) is a mental burden. It forces you to organize your thoughts in a certain - not necessarily harmful, but very specific - way.
> Other than that, well, it's a Lisp
A Lisp unworthy of the name; one which encourages users to pepper their code with opaque routines which have no underlying Lispiness. A Lisp which barfs Java stack traces.
Forgive my bluntness, but I was rather hoping to peel back the emotive language and ideological rhetoric and have an objective and technical discussion.
I gauge programming languages by their effectiveness at solving practical problems. Do you believe Clojure's design makes it less effective at solving problems than other Lisps? Or are your objections to it purely ideological?
Yes, Clojure is less effective than other Lisps. And Symbolics Genera remains my standard of comparison. You cannot un-create it. The standard of comparison for a technology will always be everything previously achieved by mankind, rather than what is available on the market today.
But I will try to explain my real objections.
"Beauty is the first test: there is no permanent place in the world for ugly mathematics." (G. H. Hardy)
I was drawn into programming at an early age because it seemed beautiful and allowed me to run wild with creative ideas. Thus I insist on systems with a purely minimal amount of accidental complexity - so as to leave the most room in my head for the ideas I am trying to put into practice. I have zero interest in "programming as construction work" - implementing the ideas of others, or for that matter doing anything which isn't in some important sense original. Thus I want my programming environment to resemble pure mathematics, where absolutely nothing is present for which there is not an ironclad and inevitable logical reason, where everything is open to question, and the answer is never "because the drunken intern wrote it that way five years ago."
Most of the "practical problems" programmers supposedly "solve" shouldn't even exist. They are created through the idiocy of other programmers. In particular, I am talking about any situation where the output of one program requires "glue" to match the input requirements of another; or any situation where formats are converted; and many other cases. If the moronic mistakes of the past four decades were undone, most of the people who are calling themselves programmers would have to find honest work.
Screw "practicality." Programmers are the proverbial firemen who start fires when they aren't at the station-house, and gloriously put them out, hailed as heroes every time.
Clojure will never be beautiful because it promotes the use of indigestible foreign matter in Lisp code: primitives on which you cannot pop the hood to reveal more Lisp - no turtles all the way down. Its claim to the name Lisp is in an important sense fraudulent. I rest my case.
Your criticism of Clojure is based upon aesthetic ideals, then, rather than practical or technical considerations. However valid this criticism is, it's not something I have a particularly strong interest in, I'm afraid.
So aesthetics of any kind are foreign to you? Or just in relation to programming? Are you equally eager to sit down on a pile of bricks as on an Aeron?
No, but I'd rather have a fast ugly car than a slow pretty one.
I think I disagree with you on two points. I place a higher emphasis on efficiency than I do aesthetics; I want a language that allows me to do things quickly. I also have a slightly different idea of programming language beauty than you do. I like languages that have a small but expressive syntax and a concise core library. Clojure fits that definition better than any other Lisp I've found, so from my point of view, it's the most beautiful Lisp as well as the most practical.
But aesthetics are subjective, which is why I dislike arguments about them. I don't particularly care what language a library function is written in, just so long as it integrates elegantly with the language I'm working in. You might think that using external libraries is aesthetically displeasing, but that's a subjective opinion.
What I am interested in discussing are technical details about a language, as they are entirely objective. We can discuss whether a particular language produces more concise code than another, and come up with examples to prove it. I like those sorts of discussions, because even if I don't agree, I'll always come away with a better understanding of the favoured language of the other guy.
By contrast, arguments over aesthetics just seem pointless, as neither participant will learn anything or change their view.
I fully understand you -- and I don't understand you at all.
If you got all of the revolutionary Lisp ideas, you don't have to worry about the rest -- currently I'm more or less forced to use C++ instead of CL (which is still my favorite language) -- but this doesn't prevent my mind from still 'thinking' in Lisp.
If you have some really revolutionary idea, and you want to break it through the historically grown walls, you always need to introduce it in a homeopathic manner -- but since your new idea is really, really much more ingenious than all of the previous ones, it will finally and actually replace the more limited ones.
But this takes time.
If you limit yourself to think either black or white, you will never win...
There's been talk about integrating this functionality into Clojure. Incanter is an example of a project by someone who wanted primitive support for matrix math, and it has had quite a bit of success.
"How do you know I don't do my research by making provocative statements on public fora?"
There is something to be said for that. I would not have thought to Google for "R-like interactive environment, built with Clojure and Java charting and numerics libraries." But now that I know it exists, I'm pretty excited about the possibilities.
Wow, that looks potentially awesome. Certainly looks like it has the potential to compete with NumPy and R. Nice use of Clojure to tie together unrelated Java libraries with a useful syntax and REPL style exploration.
The impedance mismatch between Scheme and the JVM is a lot less than what the writer thinks. Here's the comment I just added to the post:
You don’t need primitive-static-method in Kawa as of several years ago. It’s just there for backward compatibility. Newer syntax for calling Java methods is much easier:
I can only remember two cases where I had to turn on the full-tail-call option to the Kawa compiler, and even then performance wasn't a problem. The SISC Scheme interpreter does reasonably well too, and it has full tail call optimization all the time.
Using Java libraries from a non-Java language on the JVM is one of the happiest parts of my day, every day.
That being said, the real reason why Clojure is going to become the most important Lisp is the same reason why Java has been displacing the theoretically-faster C++: multicore.
You're claiming that (a) Java is better at multicore than C++; and (b) Java has been displacing C++; and (c) that the former is due to the latter. I haven't heard this idea before. Can you elaborate please?
Threading and locks have been integrated into Java from the beginning, with every object having a monitor, functions like wait() and notify(), and a synchronized keyword. These are nice, and do help, but they don't solve the problem. I've heard that Java fixed up some problems with these in version 1.5 or 1.6, but haven't looked at the details.
My understanding is that none of the imperative languages are very good at multicore. Erlang is said to be effective at it (because of pure message passing); and some people claim FP will help because it overcomes the problem of shared state by making it immutable (while this is true, I'm not convinced it's particularly helpful in large projects, in practice).
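The monitor primitives mentioned above (every object a monitor, wait()/notify(), the synchronized keyword) can be sketched with a minimal one-slot handoff between threads. This is a hypothetical example, not from the thread; the class name and values are arbitrary:

```java
// A one-slot handoff built only from intrinsic monitors: "synchronized"
// takes the object's monitor, wait() releases it and blocks, and
// notifyAll() wakes the threads blocked on it.
public class Handoff {
    private Integer slot = null;

    public synchronized void put(int value) throws InterruptedException {
        while (slot != null) wait();   // wait until the slot is empty
        slot = value;
        notifyAll();                   // wake any waiting taker
    }

    public synchronized int take() throws InterruptedException {
        while (slot == null) wait();   // wait until a value arrives
        int value = slot;
        slot = null;
        notifyAll();                   // wake any waiting putter
        return value;
    }

    public static void main(String[] args) throws Exception {
        Handoff h = new Handoff();
        Thread producer = new Thread(() -> {
            try { for (int i = 0; i < 3; i++) h.put(i); }
            catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        producer.start();
        for (int i = 0; i < 3; i++) System.out.println(h.take());
        producer.join();
    }
}
```

Note the while-loops around wait(): re-checking the condition after waking is exactly the kind of discipline these primitives demand but do not enforce, which is part of why they "don't solve the problem."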
As limited as the threading model of synchronization is, it can be used to fairly easily write programs that scale up to some reasonable number of threads (10-100). This is less to do with the convenience of having keywords like 'synchronized' and more to do with the fact that modern Java has a proper memory model. The C++ people are trying to put together a memory model, but it's not here yet and there's some doubt about the ability to integrate it successfully with existing (pthreads etc.) code.
Without a memory model you're going to be writing more code that is less correct at a substantially higher cost. If you're interested in this, google for a Google Tech Talk by Josh Bloch on the 'Java Memory Model'. This basically comes down to my ability to get at architecture-specific stuff (CAS, fencing) in a platform-independent way and have it work. Cliff Click has a really neat tech talk on writing a lock-free hash table in Java; don't bother trying to write it in C++.
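A minimal sketch of the CAS idiom referred to above (this is a generic illustration, not Click's hash table): Java exposes the hardware's compare-and-swap through java.util.concurrent.atomic, with the memory model's guarantees, so the same retry loop works unchanged on every platform.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Lock-free increment via compare-and-swap: read, compute, and attempt
// to install the new value; if another thread raced us, retry.
public class CasCounter {
    private final AtomicInteger count = new AtomicInteger(0);

    public int increment() {
        while (true) {
            int current = count.get();
            int next = current + 1;
            // compareAndSet succeeds only if count still equals 'current';
            // on failure we simply re-read and retry.
            if (count.compareAndSet(current, next)) return next;
        }
    }

    public int get() { return count.get(); }
}
```

In real code `incrementAndGet()` does this for you; the explicit loop is shown only to make the retry mechanism visible.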
Ultimately we will need a better model for concurrency, which will be some combination of functional programming and lightweight processes (actors). Erlang/Scala/Clojure/Haskell are in the right neighborhood (maybe Gambit Scheme, don't know much about it), but Erlang and Haskell are really hard to teach to mere mortals like myself. I don't know enough category theory to understand 'hello world' in Haskell, but I can write Scala and Clojure programs that work (and leverage my existing investment in the JVM).
edit: i left off something important. if anything saves imperative programming it will be transactional memory, which is extremely hard. hardware TM and code written for TM are in a chicken-and-egg situation. the answer is hybrid hardware/software TM and the sun guys are way ahead on this too.
"but Erlang and Haskell are really hard to teach to mere mortals like myself."
No, just Haskell. Learning Erlang is roughly on par with learning Javascript well; it may involve a couple of new paradigms (like Javascript imposes event-oriented programming and prototypes, which may be new to you), but the language itself is straightforward. Different, but straightforward. The concurrency may blow your mind, but the syntax and basic mode of operation is unlikely to do so, in contrast to Haskell.
I definitely intend to give Erlang another try soon, if for no other reason than because I really want to hack on RabbitMQ. Last time I tried I failed though.
Thanks, I appreciate that. You've addressed (a), but what of (b) and (c)? Or is it more that sensible people would prefer C++ for multicore (who are largely recognizable as being sensible people by the mere fact that they hold this view)?
Incidentally, I have an entertaining alternative theory: that silicon will become specialized in the way that organs in the body are (and firms), perhaps mimicking the architecture of large programs which have intercommunication patterns that are established and predictable. A games example: a Lua CPU, a physics CPU, an AI CPU. The enterprise is even more ripe for this, as they have been trying to separate architectural components with SOA (which is a form of concurrency). It doesn't seem to have gone well, but something may have been learned there. Another example of large-scale concurrency is the internet: perhaps replicate a Google server farm on a chip?
These ideas are a bit horrific, because standardizing on large-scale components is horribly inefficient: we lose the opportunity to optimize between them. But they work.
My assertion that Java is displacing C++ is admittedly very anecdotal, this is a difficult claim to discuss in rigorous terms. My claim is based on personal experience, the experience of colleagues, and "some people I've talked to on the interwebs", so by all means take with a grain of salt and do some research yourself if you're skeptical.
For any long-running process the JIT is going to turn your Java (Clojure, Scala, ...) into extremely well-optimized machine code. In fact, the amount of knowledge that the JIT has about the actual run time behavior of the software allows it to do better optimizations than you ever could by hand in certain situations (an ever-growing number of situations).
GC is another point: the HotSpot GC is arguably faster at allocating small, short-lived objects than malloc is. It is reeeaaaaallllyyy good these days, probably the best GC that you'll find in production just about anywhere. The thing is a monster. And the new one, the G1 collector, was sent from the future by SkyNet to kill John Connor.
Add to this the fact that you can easily exploit common multiprocessing patterns (thread pool, consumer/producer) with almost no code, and that advanced multiprocessing code can be written in a platform independent way, and it's a pretty small subset of applications that still make sense to write in C++.
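The "almost no code" claim above can be sketched with the standard java.util.concurrent thread pool. This is a generic illustration (class name and workload are arbitrary): fan work out to a fixed pool and collect the results, with no explicit thread or lock management.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Parallel sum of squares via a fixed thread pool: submit one Callable
// per term, then gather the Futures. The pool, queueing, and handoff
// are all provided by the platform.
public class PoolDemo {
    public static long sumOfSquares(int n) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        List<Future<Long>> futures = new ArrayList<>();
        for (int i = 1; i <= n; i++) {
            final long k = i;
            futures.add(pool.submit(() -> k * k));  // one task per term
        }
        long total = 0;
        for (Future<Long> f : futures) total += f.get();  // blocks until done
        pool.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(sumOfSquares(10)); // 385
    }
}
```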
Add to all of that the fact that even Java (let alone all the other amazing JVM languages) is orders of magnitude more productive than C++ for your programming staff, and the set of situations in which it makes sense to use C++ rather than the JVM is looking pretty tiny.
Oh and then there's portability. C++ isn't even portable across compilers (versions of the same compiler!), and for all the early trouble with Java portability, it really is write-once-run-anywhere now. Really.
Good to see a fellow Java fan here. But C/C++ is still faster than Java (and assembly is faster still), if you really need performance. A big minus for Java is hard realtime performance (an illustrative example: it's hard to get regular framerates), which of course doesn't matter on the server you're discussing. A counter-argument is that if you need multicore, it's because you need performance; and if you need performance, you probably need C/C++.
Java has many benefits as you mention, especially ease of development, and I think these are what drive its adoption rather than any multicore benefit. So, I accept your (a); and (b) certainly has some truth, but I doubt the causality of (c).
The stats (terribly unreliable) show that C and C++ combined are ahead of Java, and the proportions are fairly static: C and C++ are steady, but Java is itself being javaed by even easier languages like Python (which incidentally supports my reasoning above):
http://www.tiobe.com/index.php/content/paperinfo/tpci/index.... (about as reliable as a slashdot poll, but it's probably basically right).
An interesting case of the deep-nested delay affecting a cool-headed and respectful discussion. Law of unintended consequences I guess.
I would respectfully contend that outside of micro-benchmarks (which kill Java because of long startup time and no chance to JIT) that Java is as fast as C++, and in the presence of lots of cores probably faster. This is again based in large part on my direct anecdotal experience, I would be very interested to hear about everyone else's experience in the matter.
But the real point is that with Moore's law now working on cores instead of mhz, single-threaded performance just doesn't matter that much going forward. Exponential increase and all that.
For a look at where this shit is going take a look at Azul.
I think that was its intended consequence (though it didn't affect my replies.) I'm glad I put you on the spot, because your reasoning was worth hearing.
You're making an interesting case. I haven't heard any claim that Java actually is faster than C++ in practice, just that it theoretically could be, because of JIT optimizations (then again, I don't think C/C++ is used that much in the server, anyway.) I also haven't heard of Java getting a comparative advantage due to multicore, just that FP would. But that's not to say it isn't true. You could be right. Interesting.
(The link seems to be commodity computing, like Sun blade servers).
I'm looking at JVM performance optimisations at the moment (I'm working on cache-friendly data structures for DBMSs, and interested in how running this stuff on a JVM affects things). As just a little test, I wrote an identical binary search in Java and C over a large array (with randomly increasing contents), with a very large number of iterations. I pregenerated all the search numbers to eliminate discrepancies with regard to time taken for RNG. It turned out that for completely predictable searches C was always a bit faster, and over smaller array sizes C also beat out Java. As the array got over about a million elements in size, Java caught up to and then got significantly faster than C.
On further investigation of a 150 million element dataset, Java was performing 1/3 again as many instructions, but missing cache half as much. I've been spending some time looking at all the JVM performance optimisations, but I still haven't worked out why this is - a random binary search isn't exactly a test case that's run time optimisation-specific. It's fascinating stuff, anyway.
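The kind of harness described above can be sketched as follows. This is a hypothetical reconstruction, not the original benchmark: an iterative binary search over a sorted array of randomly increasing values, with probe keys pregenerated so RNG cost stays out of the timed loop. Array size, key count, and seed are arbitrary choices here.

```java
import java.util.Random;

// Micro-benchmark sketch: time repeated binary searches over a large
// sorted long[] built from randomly increasing increments.
public class SearchBench {
    static int binarySearch(long[] a, long key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;     // overflow-safe midpoint
            if (a[mid] < key) lo = mid + 1;
            else if (a[mid] > key) hi = mid - 1;
            else return mid;
        }
        return -(lo + 1);                  // not found: encoded insertion point
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        long[] data = new long[1 << 20];   // ~1M elements, strictly increasing
        long v = 0;
        for (int i = 0; i < data.length; i++) { v += 1 + rng.nextInt(10); data[i] = v; }

        long[] keys = new long[1_000_000]; // pregenerated probe keys
        for (int i = 0; i < keys.length; i++) keys[i] = data[rng.nextInt(data.length)];

        long t0 = System.nanoTime();
        long hits = 0;
        for (long key : keys) if (binarySearch(data, key) >= 0) hits++;
        long t1 = System.nanoTime();
        System.out.println(hits + " hits in " + (t1 - t0) / 1_000_000 + " ms");
    }
}
```

For a fair comparison against C, the timed loop should be run several times first so the JIT has a chance to compile it, which matches the earlier point about long-running processes.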
Let's carry on this conversation via email, my address is in my profile. I can share with you some of the sources that have led me to form this opinion.
I would love to chat with one or both of you about my opinions on the JVM if you're interested. Please email me and we can meet up on IRC or something.
...there is no email address in your profile. But I don't think there's any problem putting the links here, from HN's point of view (unless you want to keep them private, of course).
Why do you say "don't bother trying to write it in C++" referring to Click's lock-free hash table? His algorithms are based on compare-and-swap, which is the same operation I'd use to implement a lock-free data structure in C/C++.
The algorithm requires a fence too. I found remarks at one point by Cliff about why he couldn't do it in C++; if I find them again I will post them here.
edit: on iPhone so I can't post the link, but the comments section of Cliff's blog has a discussion about Java vs. C for this data structure. summary: maybe you could do it but it would be tricky and extremely non-portable, i.e. almost definitely not worth it. hope I didn't overstate; wasn't trying to be dismissive (need to watch my language online).
It is possible to make it portable, you just need a portable API for synchronization. That's not too hard to do - take a look at atomic operations in the Linux kernel.
Click was probably talking about the fact that Java has a consistent and defined memory model. C and C++ have no such memory model, but there always is one. It's just the memory model of the underlying architecture - which is sometimes extremely subtle.
It's also that AFAIK, it's the first major lisp in a while to be completely redesigned without regard to compatibility with CL. There's a ton of stuff that was just waiting to be done, but couldn't because people kept using car and cdr instead of first and rest.
No, Common Lisp has both sets of operators; it has car, cdr, and all the combinations of c(a|d)+r you can handle, and it also has the ordinal number words first, second .. tenth, and rest.
The difference is that first, rest et al. are used when the data being handled are proper lists: lists of items terminated by nil.
Car, cdr and the others can be used with lists as well, but they are more primitive operations that work on cons cells: a pair data structure with two parts. You can view a proper list as "a primitive pair data structure that has the first item of the list as its first cell, and as its second cell a pointer to the rest of the list" .. and construct the rest inductively.
You can't use first and rest to operate on associative lists, graph structures cyclical or otherwise and a host of other data structures.
Some programmers decide to make their intent clear when operating on proper lists, while others prefer to stick to car and cdr out of habit, nostalgia, not knowing better, street cred, or as a cheap way to advance pointers with one compact operation.
CAR and CDR are less relevant now that people use CLOS, CL's object system, and to a lesser extent structures. Defstruct gives you the accessors for free: it constructs the accessor functions as classname-slotname for the reader and (setf classname-slotname) for the update/write function. With CLOS you can choose the names yourself with slot options (:accessor, :reader, or :writer).
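The cons-cell model described above can be sketched in any language; here is a hypothetical Java rendering (class and method names are my own, chosen to mirror the Lisp terms):

```java
// A cons cell: a pair with two parts ("car" and "cdr"). A proper list
// is a chain of cells terminated by null (standing in for Lisp's nil);
// a dotted pair is a cell whose cdr is not a list at all.
public class Cons {
    final Object car;   // first part: the element
    final Object cdr;   // second part: the rest of the list, or any value

    Cons(Object car, Object cdr) { this.car = car; this.cdr = cdr; }

    // first/rest only make sense on proper lists...
    static Object first(Cons list) { return list.car; }
    static Cons rest(Cons list) { return (Cons) list.cdr; }

    public static void main(String[] args) {
        Cons list = new Cons(1, new Cons(2, new Cons(3, null))); // (1 2 3)
        Cons pair = new Cons("key", "value");                    // (key . value)
        System.out.println(first(list));        // head of a proper list
        System.out.println(first(rest(list)));  // second element
        // ...whereas car/cdr work on any pair, e.g. an alist entry:
        System.out.println(pair.cdr);
    }
}
```

Calling rest() on the dotted pair would throw a ClassCastException, which is the Java analogue of why first/rest cannot be used on association lists and other non-list cons structures.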
Please update your Common Lisp prejudice check-list.
> but couldn't because people kept using car and cdr instead of first and rest
Huh? What do names have to do with it?
If I recall correctly, first and rest were introduced 6 months after car and cdr were. The difference between 50 years and 49.5 years is hardly great enough to have affected the subsequent evolution of Lisp.