Steve Yegge v. Rich Hickey re: "Clojure just needs to start saying Yes" (groups.google.com)
218 points by cemerick on Apr 20, 2011 | hide | past | favorite | 154 comments

One of my favorite things about Clojure is the things it's said "yes" to, as first-class language builtins: char, vector, set, and map notation; first-class "vars" (actually thread-locals); ways to manage and dereference state; Java interop that doesn't set your hair on fire; namespaces; keywords and namespaced keywords; and a whole bunch of features I'm probably forgetting.

Instead, Steve Yegge is asking for things that don't seem terribly important to me. Excluding loops from the language core is an obviously right call; Clojure makes it really easy to use the functional style instead, and loops would serve as a newbie trap. For the last two weeks I've been programming Clojure full-time and haven't used a single loop macro or loop-recur. He complains about the lack of circular dependencies, but disallowing circular dependencies across namespaces is A Good Thing, just as you shouldn't have circular dependencies across Java packages or C++ libraries. The guy he cites who declared macros evil is obviously not a part of 'mainstream' Clojure culture. And maybe someone can explain Yegge's anger about single-pass compilation, because I don't get it.

And of course, Clojure is a highly extensible language that has implicitly said Yes to a vast ideascape.

> The guy he cites who declared macros evil is obviously not a part of 'mainstream' Clojure culture

I think Yegge was referring to Christophe Grand's talk on DSLs and macros. Grand has written several notable Clojure libraries, so I do consider him part of 'mainstream' Clojure culture. But he didn't say "macros are dangerous and you should never use them". Quoting Phil Hagelberg's response in the linked discussion:

  The talk I saw was about how functions are generally more
  composable than macros, so they should be the first tool
  you reach for. I think it was more directed at folks who
  come to lisp and are drunk with the power of macros.
  (Assuming it was this talk: http://clojure.blip.tv/file/4522250/)
Towards the end of the talk, Grand discusses how macros can be useful for optimizing or adding syntactic sugar to DSLs.

This is not a "Clojure thing."

The advice to noobs not to go hog wild with macros is common among experienced Lisp programmers in all dialects. Paul Graham himself refers to it in _On Lisp_: Chapter 8 is titled "When to Use Macros" and the first section is "When Nothing Else Will Do."

I think the issue with single-pass compilation is that you can't have a function a in namespace A call a function b in namespace B which in turn calls a, unless you've first done a forward declaration of the var(s). I can't see how the hell 2-pass compilation could be reconciled with Clojure's macro system; either Steve or I have not thought this one through. Personally, I've run into this once [1] and it was easy to resolve with a

  (def missing-function)
at the top of the file. I'm used to worse than that from other languages, mind. (cough C++ cough)
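For the mutual-recursion case, `declare` (the purpose-built form that `(def missing-function)` approximates) is the idiomatic fix. A minimal sketch, with made-up function names:

```clojure
;; Forward-declare the var so the compiler can resolve the symbol
;; before its definition has been seen.
(declare my-odd?)

(defn my-even? [n]
  (if (zero? n) true (my-odd? (dec n))))

(defn my-odd? [n]
  (if (zero? n) false (my-even? (dec n))))
```

Without the `(declare my-odd?)`, compiling `my-even?` fails with "Unable to resolve symbol", and no amount of reordering helps, since each function refers to the other.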

I don't get what the fuss is about, either. The debugger is pretty much the only suggestion I like - I've used JSwat with Clojure and it's not very pleasant. Obviously some kind of continuation-based debugging facility would be nice, but it's basically not going to happen because of the JVM, not because Rich Hickey is an Evil Bastard (he's not). Either way, a usable debugger will materialise sooner or later, Yegge-tantrum or not.

I think Clojure is being managed very well. I admittedly haven't followed the community for a while; the mailing list got a bit too high-traffic for me, and I can't use the JVM for most of my projects. But whenever I am using Clojure and have a problem, it's usually because I haven't found the right bit in the docs, or because I missed/misunderstood a major feature that was added recently, or because the functional style is sometimes just a bit beyond the capabilities of my brain. (I'll write the code in imperative style, then translate it to functional, and usually learn something in the process.) I don't see that as a failing of the language, but as my inexperience with it. Doing simple things with it is simple.

I have a theory, though. Yegge talks about porting Java code to Clojure. I think that's the problem right there. Sure, if you're porting line-by-line, it's not going to be especially nice. It's a bit like translating German to English sentence-by-sentence. Most of the time it works, despite sounding a bit stilted. But sometimes there isn't a direct equivalent, and you have to change the surrounding sentences. The structure of your average Java program is so wildly different from that of your average Clojure program, it's no wonder it doesn't port directly.

[1] Okay, I'm not massively experienced with Clojure. I've written maybe a few tens of thousands of LOC of it, and no really big projects.

I can relate to the debugging part. Useless stack traces have been the reason I simply gave up on Clojure.

I wouldn't call them useless, but I guess I'm used to C++, or worse, just a hex stack dump (yay, game consoles). Clojure mangles names somewhat to make them Java-compatible; you can easily do the reverse translation in your head. Still, this could be automated rather easily.

As far as I remember, Clojure just spat out "This is not a function" or something like that, without any indication of where the error occurred.

There is a trace macro but it doesn't provide complete traces without explicitly adding every function you want to trace, which becomes tricky very soon. Manual tracing with printf()s like in non-functional languages isn't built-in either. There are half-baked macros for this on blogs if you google hard enough, but that is not the kind of thing that leaves a solid impression of a language.

Maybe I was just working in a non-clojuresque way.

I agree it's a problem. I use the REPL a lot and build up the functions and debug with prints, which is OK because you can print anything (not like in VB). I know that they are working on it, so I hope it will get better.

For me, however, it's not a reason to quit.

I have been working on a fork of Clojure off-and-on that tries to improve the debugging experience (https://github.com/qbg/clojure). How do you think the stack traces could be improved?

This is one example of how to remove uninteresting stack frames from Clojure stack traces. http://j.mp/h3xzdN

Suppose you accidentally put an extra pair of parentheses deep inside a function. The message you'll get is something like "could not cast to function", without any meaningful information about where this happened.

I'm astonished to discover Clojure does do single-pass compilation. It used to be done primarily because some intermediate representations of programs could not fit entirely in memory, but virtually every compiler now is multi-pass because so many optimization opportunities are lost otherwise.

I have minimal knowledge of Clojure, so is there some motivation for single-pass other than perhaps simplicity?

Edit: A brief outline of the tradeoffs: http://en.wikipedia.org/wiki/Compiler#One-pass_versus_multi-...

The issue is not single-pass vs multi-pass. It is instead, what constitutes a compilation unit, i.e., a pass over what?

Clojure, like many Lisps before it, does not have a strong notion of a compilation unit. Lisps were designed to receive a set of interactions/forms via a REPL, not to compile files/modules/programs etc. This means you can build up a Lisp program interactively in very small pieces, switching between namespaces as you go, etc. It is a very valuable part of the Lisp programming experience. It implies that you can stream fragments of Lisp programs as small as a single form over sockets, and have them be compiled and evaluated as they arrive. It implies that you can define a macro and immediately have the compiler incorporate it in the compilation of the next form, or evaluate some small section of an otherwise broken file. Etc, etc. That "joke from the 1980's" still has legs, and can enable things large-unit/multi-unit compilers cannot. FWIW, Clojure's compiler is two-pass, but the units are tiny (top-level forms).
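The tiny-compilation-unit property can be sketched in two top-level forms (the macro name is illustrative): a macro defined in one form is already available to the compiler for the very next form.

```clojure
;; Each top-level form is its own compilation unit, so this macro...
(defmacro unless [test then]
  `(if (not ~test) ~then nil))

;; ...is usable as soon as the next form is compiled.
(unless false :ran)
;; => :ran
```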

What Yegge is really asking for is multi-unit (and larger unit) compilation for circular reference, whereby one unit can refer to another, and vice versa, and the compilation of both units will leave hanging some references that can only be resolved after consideration of the other, and tying things together in a subsequent 'pass'. What would constitute such a unit in Clojure? Should Clojure start requiring files and defining semantics for them? (it does not now)

Forward reference need not require multi-pass nor compilation units. Common Lisp allows references to undeclared and undefined things, and generates runtime errors should they not be defined by then. Clojure could have taken the same approach. The tradeoffs with that are as follows:

1) less help at compilation time

2) interning clashes

While #1 is arguably the fundamental dynamic language tradeoff, there is no doubt that this checking is convenient and useful. Clojure supports 'declare' so you are not forced to define your functions in any particular order.

#2 is the devil in the details. Clojure, like Common Lisp, is designed to be compiled, and does not in general look things up by name at runtime. (You can of course design fast languages that look things up, as do good Smalltalk implementations, but remember these languages focus on dealing with dictionary-carrying objects, Lisps do not). So, both Clojure and CL reify names into things whose addresses can be bound in the compiled code (symbols for CL, vars for Clojure). These reified things are 'interned', such that any reference to the same name refers to the same object, and thus compilation can proceed referring to things whose values are not yet defined.

But, what should happen here, when the compiler has never before seen bar?

    (defn foo [] (bar))
or in CL:

    (defun foo () (bar))
CL happily compiles it, and if bar is never defined, a runtime error will occur. Ok, but, what reified thing (symbol) did it use for bar during compilation? The symbol it interned when the form was read. So, what happens when you get the runtime error and realize that bar is defined in another package you forgot to import. You try to import other-package and, BAM!, another error - conflict, other-package:bar conflicts with read-in-package:bar. Then you go learn about uninterning.

In Clojure, the form doesn't compile, you get a message, and no var is interned for bar. You require other-namespace and continue.
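A sketch of that behavior, made observable programmatically by routing the form through eval so the compile-time failure can be caught (foo and bar are the illustrative names from above):

```clojure
;; Compiling a call to a never-before-seen var fails immediately;
;; the exception (or its cause) reads roughly
;; "Unable to resolve symbol: bar in this context".
(def err
  (try (eval '(defn foo [] (bar)))
       nil
       (catch Exception e e)))

;; Crucially, no var was interned for bar by the failed compile,
;; so there is nothing to conflict with later:
(resolve 'bar)
;; => nil
```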

I vastly prefer this experience, and so made these tradeoffs. Many other benefits came about from using a non-interning reader, and interning only on definition/declaration. I'm not inclined to give them up, nor the benefits mentioned earlier, in order to support circular reference.


Most of the user annoyance vanishes if declare were to support qualified names: (declare other-namespace/symbol). Is there a technical limitation here?


One problem is what to do if the other-namespace doesn't already exist. It would have to be created, and that initialization is unlikely to be the same as the declared one. Possibly follow-on effects when it is subsequently required. The other option, with similar issues, is to allow fully qualified references to non-existent vars in code.

If it doesn't already exist, then you issue a compilation error. Better then than the way it works today, IMO.

Don't sign comments. Your username is visible.

CL is a language. Not an implementation. One may need to differentiate between a language and an implementation. CL is defined so that different implementation strategies are possible.

CL itself has two different compilation interfaces, COMPILE and COMPILE-FILE. A file compiler may implement a multi-pass strategy, where calls to functions defined later in the file are resolved.

A compiler may also provide different kinds of compilation units. IIRC CMUCL does that. See "block compilation".

Thank you, Rich, very informative.

What is the rationale behind not letting 'declare' declare vars in other namespaces? I've run into that when dealing with circular dependencies, and sometimes it seems that would help.

Such declarations are possible (as would be accepting fully-qualified references to not-yet-existing things), but the devil's in the details again - e.g. what if the other ns doesn't yet exist?

And the complexity/utility tradeoffs must be considered.

"CL happily compiles it, and if bar is never defined, a runtime error will occur. ... You try to import other-package and, BAM!, another error - conflict, other-package:bar conflicts with read-in-package:bar. Then you go learn about uninterning.

fwiw, you get a warning at compile time regarding the undefined symbol. And in the AllegroCL IDE, for example, uninterning is just a matter of hitting [Return] when you get the error message (dialog). I believe uninterning the conflicting symbol is a common restart in other implementations as well.

"what constitutes a compilation unit, i.e., a pass over what?"

So, how is this compilation unit different from all other compilation units?

One way is that Rich's example compilation units (forms input at a REPL) don't all exist at the same time, like compilation units that are files often do.

Consider how much more powerful macros can be if you don't have to deal with out-of-order definitions.

Macros are wonderful, but intuitively they seem very limited in scope with respect to compile-time optimizations.

For example, how would you do dataflow analysis at compile-time (which is necessary for a plethora of optimizations) with just macros?

The JVM does that for you. However, you're free to do all sorts of optimizations that the JVM simply can't. Think about an in-memory logic engine running during compilation - via macros, you can introduce relations which influence how code further down the line is compiled.

Please explain?

Christophe Grand is very much a part of mainstream Clojure culture. He is currently co-authoring a book on Clojure. Furthermore, he never said that macros are evil and should never be used. After reading Steve's comments, I'm tempted to say he may never even have watched the talk in the first place. I was there when that talk was given, and I did not even remotely get the feeling that he was telling me that macros were evil and should never be used. That is totally missing the point.

I am very new to Clojure and only semi-experienced in CL, but I ran into a problem today that left a pretty bad taste in my mouth. I had something like:

  (defn foo [bar]
    (let [whatnot (+ bar 3)]
      (frobnicate whatnot)))
  (defn frobnicate [quux]
    (println "Hello" quux))

And it threw an error, since frobnicate hadn't been defined while it was dealing with foo. But I didn't know that. And using and commenting out my (ns spam.core)--and trying various compilation incantations with SLIME--were confounding variables such that the call to frobnicate sometimes worked if the ns call was removed and you held your nose right. Finally I gave up and posted to Stack Overflow, thinking I'd jacked up my namespace somehow. Instead they told me to either move frobnicate's definition to be above foo, or else declare it.
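For reference, here's the `declare` fix they suggested, applied to the snippet above:

```clojure
(declare frobnicate)  ; forward declaration; the reference now resolves

(defn foo [bar]
  (let [whatnot (+ bar 3)]
    (frobnicate whatnot)))

(defn frobnicate [quux]
  (println "Hello" quux))
```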

It never occurred to me that a modern, well-received, not-low-level language would have this limitation. And I feel like I'm part of the target demographic for Clojure. I'm an experienced Java and Python coder. I know and like Lisp. I'm interested in learning a new language. And I did what I thought I was supposed to--I read the list of differences between Clojure and other Lisps on the official site. I read a tutorial that concisely addressed Clojure syntax, data structures, special forms and so forth in a way designed for people who already knew how to program.

And there's a lot of things I like--better data structures, the lambda reader macro, the judicious use of brackets to avoid paren soup, Java interop, and more things that I'm only discovering as I gain more experience.

But I'm still kind of in shock about having to declare methods. A completely unexpected wart.

There are two warts - having to declare methods, and the obscure error message. A smart error handler could have seen that frobnicate had been defined further down, and told you exactly how to fix it. And it would be OK if that error handler didn't work in the REPL.

Agreed. Here's the exact error message I got:

  error: java.lang.Exception: Unable to resolve symbol: frobnicate in this context
So I really thought I'd misspelled it or messed up my namespace or just broken the compiler :/

I don't know much about Clojure, but I know an argument from authority when I see it, and Yegge is making an argument from authority. He does a lot of "You need users and you don't know anything about language marketing and building an ecosystem", with the implication being that Yegge does know all about language marketing. I'm disappointed Yegge would go there...he's a smart guy. But we all have bad days.

Unfortunately for Yegge's argument, he's never built a sizable language ecosystem from scratch, while Hickey has. So, he's making an argument based on authority that he doesn't have and the person he's arguing with does.

Hickey has done a brilliant job stripping off all the "I know better than you" bits of Yegge's comments, and brings it back down to the discussion of the language and nothing else. Frankly, it was pretty devastating, and I'm surprised Yegge walked so cockily into it. If I had any dogs in this fight, I know whose side I'd be taking.

According to his post to the list, Mr. Yegge is responsible for the attitude transplant that the Python community has experienced. It's amazing what he can accomplish from his desk, dashing off memos to language communities. Jack Welch wishes he had that kind of power as CEO of GE.

The most puzzling thing he says is that Clojure's language adoption efforts have been a failure because Clojure is over three years old and hasn't broken the top ten in some pointless programming language survey.

How long did it take Python to get there? I'm guessing at least a decade. And does it even matter?

Beyond that, I'm pretty sure Clojure got rolled into Lisp on Tiobe, which is why it had such a big spike a while back.

The issue with Clojure is that it gained too much popularity before it was ready. "Ready" has a lot of implications: IDE support, debuggers, libraries, documentation, books, etc. While it is a good language with a number of things going for it, it seems that the books and community added too much hype too soon. This caused new users to come, then go.

I don't know if I agree with you that Clojure's too popular right now, but I completely agree with you that the phenomenon exists in general. Rails back in the early days experienced a huge influx of PHP refugees, who significantly lowered the signal-to-noise ratio in the community.

I use a language's Freenode channel as a general barometer of its community's health, and I notice no more questions on #clojure from bandwagon jumpers who have no clue whatsoever what they're getting into than I ever did on #scheme. I use those two channels for comparison because the sort of questions in question are similar: based on received assumptions that are counterproductive to being effective.

I didn't notice any arguing from authority. He claims to observe a pattern of rejectionism (for want of a better word) developing in the culture and argues that this is bad for adoption. Either could be wrong, but neither relies on authority.

If Hickey has a clear vision for his language he should certainly go for it. But I agree with Yegge that if the culture takes a turn toward you're-doing-it-wrong purism, its growth will suffer.

(As an aside, Yegge has achieved something noteworthy in language marketing: his writings about programming languages have a large following. Perhaps he exaggerates his influence, but for better or worse it's non-negligible.)

I feel obliged to quote Adam Chlipala of Ur/Web fame:

I also want to emphasize that I'm not trying to maximize adoption of Ur/Web. Rather, I'm trying to maximize the effectiveness of people who do choose to use it. This means that I'm completely happy if basic features of Ur/Web mean that 90% of programmers will never be able to use it.


I think that "rejectionism" could be quite justified. And I think that Adam point is applicable here.

"Either could be wrong, but neither relies on authority."

Since Yegge provides no evidence, and a lot of criticism of the current culture of Clojure, I must assume he is relying on his authority to make his case. There's nothing else to back up any of his assertions. It's either argument from authority, or argument for the sake of argument with no rhyme or reason at all. While argument from authority is weak, it'll convince a few people who think he has authority to speak on the subject.

But, his arguments are clearly not technical. He makes no case for why those languishing patches deserve to be in Clojure...just that they exist, and that's enough for him to believe they have merit.

"As an aside, Yegge has achieved something noteworthy in language marketing: his writings about programming languages have a large following. Perhaps he exaggerates his influence, but for better or worse it's non-negligible."

I don't disagree. I enjoyed Yegge's blog immensely over the years. But talking loud on a blog and building a language community aren't necessarily the same skills. I see people suggesting he had some hand in Python's rise to its current position of importance, but I was a Python developer on a major open source project before Yegge started talking about it...Python was doing just fine. It was actually entirely news to me that Yegge had anything to do with the Python community, in fact (it's been a few years since I was working in Python, so I haven't followed it since). Python made Python popular, not Yegge.

Good discussion. I agree with Rich in principle: a language's design should not say "yes" to everything, but I agree with Steve in some of the particulars.

After programming in Clojure for two weeks, I had a list of complaints about the language. It has now been two years, and nearly all of my objections went away as the language evolved and as I adapted to using it. Two years later, I just have two notable things to grumble about. Coincidentally, they seem to be closely related to Steve Yegge's concerns.

1. No condition system. This is a big deal. Clojure piggy-backs on Java's exceptions, and currently lacks a good mechanism for user-defined error handling. Unfortunately, Java exception handling also happens to suck, since it throws away data in all stack frames from the throw to the catch. Implementing better error handling is under discussion on the Clojure development page (http://dev.clojure.org/display/design/Error+Handling). (Steve complains about the lack of non-local transfer of control, and he has a point: it could be used to make an arbitrarily powerful condition system, see http://www.nhplace.com/kent/Papers/Condition-Handling-2001.h...).

2. No built-in debugger. The way CL code doesn't just crash and throw away stack information is amazing. The lack of this feature in other languages is appalling.

In addition, I sort of agree that being forced to forward-declare things is annoying. I got used to it, but I don't really like the restriction. I do understand the reason behind it, though: auto-interning symbols in the reader (as Common Lisp does) can be confusing and occasionally problematic.

This is a fascinating case study. I got sucked into reading the entire thread. Steve Yegge is talking about cultural and marketing issues that seem obvious to me. The responses on the list may not be representative of the community, but assuming they are, one can hazard a guess about the long-term trend: there's a clear failure to connect with what Yegge is saying. (Edit: I deleted an unnecessarily personal example here.)

Yegge isn't arguing for the abandonment of taste and rigour in a race to incorporate every kitchen appliance into the language. He's arguing that languages and communities that take a prescriptive (someone said "paternalistic") stance end up marginalizing themselves by their own rigidity, and that the antidote for this -- as well as the passageway toward wider adoption -- is to actively listen to and court new users. I couldn't agree more.

(Side note, this is why I like Common Lisp. Its loosey-goosey flexibility, which always assumes the programmer knows best, leads to an awesome fluidity that finds its way around any obstacle. CL is unpopular, but not because of its pluralism. Qua language it has a deep respect for the user.)

There's another point here. Whether you're a fan of Steve Yegge or not (I didn't use to be, but after nodding with everything he said here I am now), he has a proven ability to mobilize a significant body of programmer opinion. To ignore what this guy says about the marketing of programming languages itself already displays a foolish disregard for the market.

Funny you should mention Common Lisp, which has become the ultimate "no" language -- its spec has been cast in stone for a quarter century now. No threads, no module system (beyond packages), no new collection types, etc. etc. etc.

I'm not saying this is a good thing, but you seem to contradict yourself by liking CL at the same time you agree with Steve Yegge.

Common Lisp, which has become the ultimate "no" language

By historical accident. I'm talking about language design. Yegge says much the same thing about CL in the OP, by the way.

I think Python is both prescriptive and widely popular.

I have found Python to break down as I scale it into a larger system spanning multiple directories and modules.

It's fine for bashing out 3-4 file programs. It's a decently high-entropy language.

It's a bloody lousy hacking language because of the prescriptiveness.

I'd rather use Common Lisp, and I do, for personal stuff.

It's also one of the more readable languages in existence and it's implicitly championed by an IT juggernaut.

It was popular before Google started championing it.

Not for production code it isn't.

As compared to ? Python is very popular.

Good point. I would be interested to learn about the history of Python's adoption from this point of view.

I'm not sure why Yegge is talking about the Shangri-la of non-local exits, but... He is right that the culture of Clojure has some issues and often rejects ideas even when they're obvious, and it's not clear why.

Examples I've run into:

1. The current clojure lambda shortcut-syntax is atrocious, and we can do better. Why don't we do better?

2. Clojure could really benefit from a Scheme-esque macro-by-example library. A few exist, but they seem largely ignored by the community, despite the well-known benefits of such a system in the normal daily use of macros.

3. A strange hatred of macros. Yes, some people are reasonable and argue that functions should have primacy over macros because of composability (and they're right). But then there are people who will tell you macros are always bad, and if you show up in freenode #clojure to ask for help with them they will actively laugh at you.

I love Clojure and I feel like I know it pretty well, so I'm not trying to say Clojure is considered harmful, etc. But I do think that some of Yegge's criticisms—while poorly delivered and sometimes poorly expressed—have an element of truth to them.

* Full disclosure: I was involved in an effort to write a Clojure book for O'Reilly until I got involved with a new startup and had to terminate my involvement in the effort. I may not be the most unbiased judge of Clojure.

if you show up in freenode #clojure to ask for help with them they will actively laugh at you

Oh dear. That is a bad sign. It has always seemed to me that building a healthier community from the ground up was the biggest thing a successful new Lisp would have to offer. The behavior you describe is all too reminiscent. No one should ever be "actively laughed at" for asking an honest question, and Clojure experts (if they know what's good for them) ought to take an aggressive stance against that kind of behavior. These communities are delicate things. They can become diseased.

It makes me mad to see anyone get "actively laughed at", let alone by bullies who assuage the beast of their own insecurity by doing the intellectual equivalent of beating up children.

Edit: oh, and: they're being laughed at for asking about macros? This is a Lisp, right?

The best way to deal with trolls is to simply not feed them. My experience on Clojure IRC over the past 3 years is that it's orders of magnitude less condescending than the Python, Ruby, or Node.js IRC channels.

I won't name names, but you'd know the names I could name. It is true that, so long as you stay away from a few key third rails, the #clojure channel is generally a great place. One need only to go to #scala on a bad day to see what a few bad actors can do to make a place feel oppressive.

There's no value in such arguments though. It's an undeniable truth that the Clojure community (and its libraries) are far less macro-friendly and macro-centric than other Lisps. Even PLT Scheme seems more comfortable with defmacro usage.

There's a joke here about Scala and actors, I'm just sure of it.

Oh good. That's exactly what I was hoping someone who knows would say.

Edit: I'm not talking about trolls, though, but about respected members of a community acting this way. If bullying is confined to trolls, along with all the other bad things trolls do, that's a healthy sign.

> if you show up in freenode #clojure to ask for help with them they will actively laugh at you.

#clojure has consistently been one of the most helpful resources for me, and the most helpful IRC channel I've ever been in. I've never seen anyone laughed at there.

That is a silly thing to say. I've never seen a language that had such a friendly community. I don't think that in my two years of being active in the Clojure community, I have ever seen a single person get 'laughed at' for any reason at all.

Sure, there are trolls every now and then. You can't control everybody. The ratio of trolls to helpful participants is excellent and always has been.

1. I'm curious as to what you think would be better?

2. Would be nice, but I think a high level of macro-understanding is not broadly distributed yet in the Clojure community, so scheme-esque macro libraries for most people seem like "yet another thing to learn".

3. I don't find this to be the case in general at all.

1. Scala certainly has a better in-place lambda syntax. Way better.

     // Scala: someNumbers is a collection of numbers
     someNumbers.reduceLeft( _ + _ )
     ;; Clojure equivalent
     (reduce #(+ %1 %2) someNumbers)

     // A more complex example
     someNumbers.foldLeft(0)(_ + _ * 2)
     (reduce #(+ %1 (* 2 %2)) 0 someNumbers)
I'm fluent in prefix notation, but I still have to read the Clojure code that uses #() very carefully.

2. I think that macros-by-example would be a better tool for teaching macros than the raw compiler extensions that defmacro provides. MBE is easier, not harder. Defmacro is the machinery, not the interface, for most tasks.

3. I won't name names, but I've had high-profile people in #clojure do exactly this to me. And when I barked back, I was told to not be rude. It was a terrible experience caused by a prominent member of #clojure who otherwise seems like a smart individual. I've since received commiserations from people who have faced similar condemnations.

(note: added simplified example)

> I'm fluent in prefix notation, but I still have to watch the clojure code that uses #() very carefully.

I think part of this is to encourage you to think about whether another approach would be more appropriate.

Any time I have a #() form with more than three or four tokens in it, it could almost always be better expressed as a private defn or a for expression. And sometimes comp or partial is more appropriate, provided it's still only a handful of functions involved.

I agree. The wildcard syntax from Scala breaks down just as quickly (and in fact allows fewer possible expressions since it doesn't appreciate nesting).

I see these restrictions and simplifications as a feature.

What is so unreadable about '#(+ %1 (* 2 %2))'? Can Scala use arguments in an order other than the order in which they're given? If not, then Scala's simplicity is at the cost of less flexibility.

If you find that too hairy, just write '(fn [a b] (+ a (* 2 b)))'. The shortcut syntax is intended for simple cases.

This kind of attitude is what frustrates me.

The clojure version is obviously more noisy. Your argument is that the #() syntax is more robust (i.e., allowing multiple references to a single argument, and repeats of an argument). While strictly true, you then go on to invalidate it by saying, "The shortcut syntax is intended for simple cases." I feel like this is a resistance to change that is more a product of the rivalry between Scala and Clojure than anything else (and please forgive me if I am projecting onto you, but that's how arguing on the internet goes).

Additional complexity or capability at the cost of a perlesque show should not be the mission of a convenience syntax. Scala has taken a lot of good ideas from Clojure, it seems only fair that Clojure pull back a little. What's more, it's not very hard to write a macro that does most of what the wildcard syntax does... but without access to the reader's symbol macros it's very difficult to make that kind of change grow into the community.

I don't see any resistance to change; it's more like Clojure folks are generally going to demand that changes be unequivocally positive. I remember the fixed-position args in Scala function literals being particularly irritating in certain circumstances; whatever one's gripe about the chosen sigils, being able to write #(%2 %) is damn handy.

This widespread Scala/Clojure rift is a myth AFAICT, outside of various spitball fights on Twitter.

Oh, and if you want to have userland reader macros in Clojure, have at it: http://briancarper.net/blog/449/clojure-reader-macros ;-)

I've been learning Scala and after being initially excited by Scala's _ notation, I'm disappointed by how often I end up resorting to the long form for expressions that feel simple.

Also, I think Clojure's syntax for the 1st argument, 2nd argument, etc. is more intuitive. I mean, the second _ doesn't refer to the same argument as the first _? That violates all my instincts as a programmer. I was completely baffled by it until it was explained to me, and it still seems very clunky, because it forces me to declare names for arguments more often than I would like to. In my opinion, the Clojure way is immediately obvious and more concise (because it can be used in many cases where Scala requires named arguments.) It's just better all around, for any expression more complicated than { _ + _ }.

I am not an accredited Clojure style expert, but I wouldn't (defensibly) use the shortcut syntax in any way more complicated than calling a function, providing missing arguments, and rearranging those that have been passed. So, something like this:

    (frob #(mumble :foo %2 %1) bar)
Anything more complicated, I'd either use `fn' or define a free-standing function.

I'm not disputing that the Clojure community has assholes, though I've hung out on #clojure a fair amount and no one springs to mind. But the `fn' shortcut sugar seems like a completely reasonable design. The design choice, however, does encourage the adoption of certain conventions.

You're frustrated by people who have different priorities than you do? Who value things differently?

When you start using words like 'obviously', it really doesn't speak well of the comments to follow. Noisy? Beautiful? Elegant? These are all aesthetic judgements. For you to say something is obviously more noisy is to place your judgement above others' and to discount the validity of their view; that kind of attitude frustrates me.

> someNumbers.reduceLeft( _ + _ )

> (reduce #(+ %1 %2) someNumbers)

In math, you can write:

  f(x,y)= x+2y
  A= {1,2,3}
  reduce(f, A)
A more readable syntax for people who did math at school could be:

  reduce(x+2y, SomeNumbers)

Frankly, I'd prefer:

    %1 + %2 * 2

    _ + _ * 2
With the former syntax, it's much more obvious what the function is doing.


Because _ and _ refer to two different values using the same name. Adding a third _ would refer to yet another value, etc.

Not to be excessively glib, but I can't resist pointing out that children successfully comprehend this notation daily, it's called "fill in the blank."

It's not a matter of it not being comprehensible once you know what's going on; it's a matter of consistency with the rest of the language. Everywhere else, each symbol refers to one and only one value within a given lexical context. Breaking from that expectation would require some compelling benefits.

Well, does it mean:

    (x, y) -> x + y

    (x) -> x + x
If I didn't know the language, I'd assume the latter, so it fails the principle of least surprise (at least for me).
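The ambiguity can be made concrete with Java lambda notation (class and field names here are invented for illustration); Scala picks the first reading:

```java
import java.util.function.BinaryOperator;
import java.util.function.UnaryOperator;

public class Underscores {
    // Scala's reading of `_ + _`: each underscore is a fresh parameter, in order
    static final BinaryOperator<Integer> twoArgs = (x, y) -> x + y;
    // The reading many newcomers expect: one parameter used twice
    static final UnaryOperator<Integer> oneArg = x -> x + x;

    public static void main(String[] args) {
        System.out.println(twoArgs.apply(3, 4)); // 7
        System.out.println(oneArg.apply(3));     // 6
    }
}
```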

1. Come on, that's nitpicking. The Clojure one has more features (reordering, %&) and you have to write three more chars because of it. I think that's not a bad trade. Who cares about those three chars?

2. I don't know what that is, and I never saw an announcement of anything like that on the mailing list. If somebody writes something like that and puts it on the mailing list, I'm sure people will pick it up.

3. I'm in #clojure often and I've never heard anything like that. Sure, people often say "macros should not be used in that case", but often enough I've seen people heroically show new programmers how to write macros.

Re: 1. In your Scala example the two underscores are placeholders for two different arguments. The underscore is also used in Scala for catch all clauses with 'match'. How is this syntax then way better than %1 and %2 for arguments in shorthand syntax in Clojure?

How does Scala handle the case where the order of arguments to the lambda is not the same as the order in which they appear in its body?

It doesn't. Use a real lambda. Why? See: http://news.ycombinator.com/item?id=2468038

I've seen too much code where lambdas do little more than twiddle argument orders. I'd rather pay a tiny bit more visual weight for straightforward argument usage and be able to use the same tool to reorder them.

The Scala version seems (to me) like it praises its own purity over day-to-day utility.

I'm not a Clojure programmer but I disagree on point 1. My advice (if it's a huge sticking point) would be to allow both forms.

Sorry to pick nits but the clojure example could be:

    (reduce + some-numbers)

Sorry to pick nits, but you magnificently managed to completely miss his point.

Within any culture there are individuals who will act like dickweeds. Likewise, not every feature that you propose is likely to be accepted. Are these cultural problems?

I'm not saying someone in #clojure was a meanie, therefore clojure is doomed.

I'm not trying to give you a hard time, I really am curious about the answer. These kinds of discussions tend to be anecdotal (understandably so) or too nebulous and so I'm trying to dive deeper.


I'll be honest: the direction 1.3 went was not a direction I saw as valuable.

Similarly, I felt like the 1.2 work on protocols and datatypes was about 3/5 of what a person would need to use them in general programming. I've talked with some people I consider experts in Clojure and they suggested to me they have some similar feelings.

The community's dislike of macros, and a curious insistence on using the most difficult and bug-prone (not to mention least debuggable) version of a macro implementation, is another example of a troubling bias in the direction of Clojure.

I am not a genius, nor do I claim to have superior information on which direction Clojure should go. All I have is my biases and intuitions, but I don't think I'm alone in their current values.

What exactly did you not like about 1.3? It's faster, and a lot of numeric stuff no longer needs to be written in Java.

As far as 1.2, the only thing I find lacking around protocols and datatypes is reader support and a default constructor fn.

As far as community dislike of macros, I'm not convinced.

I also don't see anything stopping anyone from submitting their CA and pitching a friendlier macro front-end. But as far as I can tell, most people in the Clojure community are not familiar with Scheme-style macros. You'd have to come up with the code, write the tutorials, and market your approach. That it's a lot of work is the only reason I see that it hasn't been done yet - not because anyone is against the idea.

There is a spike of scheme-esque macros for Clojure here:


Not sure of the status/quality of that impl, but there you go.

Oh god I so hope that Rich ignores this. A "just say yes" mindset is incredibly dangerous; yeah, ok, a trillion people use C++ but that doesn't make it a good language. Clojure to date has been built with great discipline, and it would be tragic to see it go off those rails in the hopes of satisfying a huge mass market that a) Clojure is fundamentally unsuited for and b) will never be happy regardless. When I say that Clojure is fundamentally unsuited for a certain "mass market" I don't mean that it shouldn't or won't catch on and make it into, say, the TIOBE top 10 -- just that it can't be everything to everyone, and shouldn't try to be. It's a wonderful language that understands what it tries to be, and I hope Rich never forgets those underlying intuitions on state, identity, and functionalism. Abandoning that discipline won't make it any more powerful; it will just muddy the language to try to satisfy people who won't be impressed anyway.

Youngme Moon, in her book "Different", offers the idea that one should say Yes where others say No, and No where others say Yes to produce meaningful difference. Once you go this way, you iterate and improve on the ways you say Yes to accentuate the difference. There are similar suggestions in other recent books on differentiation (Moore's "Dealing with Darwin", for example)

Look at all the ways Clojure says Yes, where other lisps/languages have said no. Being a lisp, embracing the JVM, immutable by default and everywhere, first-class concurrent primitives -- clojure should probably say No if it compromises these core goals. It should say YES when it furthers these tenets.

I don't use it enough to really understand Yegge's gripes, but the best move for Clojure is probably to make these things possible as libraries if it doesn't want to embrace them in the core, and to iterate on the core so that it's the only possible choice when you are in a situation that requires it.

If you can't say no, what do you say to the users who are begging you to say no because they like the language as it is? Every language design decision is a compromise, and a person empowered to make those decisions is going to make people unhappy. Just look at Perl and Python: Perl said no to people who demanded orthogonality, and Python said no to people who wanted a free-for-all TIMTOWTDI language. Java said no to deterministic finalizers. C++ said no to exceptions that could be restarted. All those languages are doing fine.

A better idea would be to never turn your back on any class of users -- ignoring their specific requests, perhaps, but always making sure they can solve their problem using their language. Even that strategy doesn't require language support for everything. CPython does just fine with the scientific computing community by punting to C bindings, for example.

The examples Yegge provides don't make any sense to me. The need to port Java code to Clojure is questionable, since Clojure provides good Java bindings. If you want a more Clojure-y version of a Java library, then a straight line-for-line port is not much of an improvement. As for the LOOP macro, you don't have to add everybody's little helper function into the standard library. I'm a big fan of languages adding helper functions to standard libraries if there's one obvious way to write them and the act of adding them will save everyone else from including their own version in all their projects, but a LOOP macro is absolutely NOT that.

(Unless it's just a straight-up reimplementation of Common Lisp's LOOP macro, which would do exactly one job, which is allowing Common Lisp programmers to be more comfortable in Clojure. Hey, guess what -- Common Lisp programmers already have one of the easiest paths to learning Clojure, since they know a Lisp already. Many non-Lisp programmers are tackling the learning curve and embracing Clojure, so CLers can't complain that it's too hard. Plus, many Common Lisp fans regard LOOP as an abomination and never use it anyway. Writing control structures in a complex DSL that few programmers bother to learn completely is not one of Common Lisp's best features. Saying "no" to a Common Lisp-style LOOP macro would be the right thing to do.)

a complex DSL that few programmers bother to learn

What? Plenty of CL programmers learn LOOP. It's unbelievably convenient; as far as I'm concerned it is a masterpiece of usability. An abomination? Perhaps. Crack is an abomination and it's a masterpiece of usability too. But I don't care. I'm not a purist, and LOOP makes my work easier and more fun.

Complex DSL? Complex to specify, perhaps. Complex to implement? Not that bad. I wrote a pretty big subset of it for Parenscript and it wasn't hard. Complex to use? Not at all.

The reason many Lispers dislike LOOP is that it's un-Lispy. It's a foreign body implanted in the Lisp organism. Interestingly, though, CL's immune system doesn't reject it, meaning that it interoperates fine with everything else. If there were an impedance mismatch, LOOP wouldn't have lasted.

I use loop for simple things, but iterate is more general. For example, you can put your collect forms inside other forms, not just at the loop "toplevel". Also, editing larger chunks in an iterate form is easier, and automatic indenting works better.

ITERATE is one of those tastefully designed macros, very intuitive to a LOOPer.

"what do you say to the users who are begging you to say no because they like the language as it is?"

Java had this problem with Generics. A sizeable portion of the Java community had a massive allergy to them because they'd previously been burned by C++ templates. Of course there was the typical language war response "you're just not smart enough", but no matter how many times the people railroading it in protested that Generics weren't templates the facts remained:

They looked like templates

They had lots of funny little side effects and gotchas (just like templates)

Nobody really understood them, and even if they did, nobody could explain them. See also:

Class Enum<E extends Enum<E>>

In other words, it looks like a duck, it quacks like a duck (etc).
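For what it's worth, that `Enum<E extends Enum<E>>` signature isn't arbitrary; here's a hedged sketch (class names invented) of what the self-referential bound buys:

```java
// Sketch of why java.lang.Enum is declared Enum<E extends Enum<E>>:
// the recursive bound lets compareTo accept only the same concrete
// subclass, checked at compile time.
abstract class FakeEnum<E extends FakeEnum<E>> implements Comparable<E> {
    final int ordinal;
    FakeEnum(int ordinal) { this.ordinal = ordinal; }
    public int compareTo(E other) {
        return Integer.compare(this.ordinal, other.ordinal);
    }
}

final class Color extends FakeEnum<Color> {
    static final Color RED = new Color(0);
    static final Color GREEN = new Color(1);
    private Color(int o) { super(o); }
}
// Color.RED.compareTo(Color.GREEN) compiles and is negative;
// comparing RED against a different FakeEnum subclass is a type error.
```

Whether that payoff justifies the intimidating declaration is exactly the dispute described above.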

It is entirely possible that with different syntax (not the C++ style angle brackets) and different keywords (extends in the above example is clearly different from the normal extends in an object context) there would have been less cognitive dissonance from the C++ refugees and less pushback.

But ultimately what it boiled down to was this: the JCP (java community process) was a massive failure because you couldn't vote against things. You could only vote yes, no was not even an option, and abstentions were ignored (and therefore useless). It is like Communism - they call it democracy, but there's only one party to vote for...

Yeah and the major problem with Java generics is they aren't more like C++ templates.

Today it is 100% clear that the naysayers were not only wrong about Java generics; they are wrong about generic programming in general. Unfortunately they managed to cripple Java before this realisation became obvious.

Actually, I despise Java generics and have a much higher tolerance for C++ templates. Just more proof that you can't please everybody :-)

I like Common Lisp's LOOP, personally.

It is an abomination until you learn how to use it, then it is completely awesome.

same with `format', they're easy to bad-mouth until you've used them a couple of times, then you don't want to contemplate programming without them

Everything is an abomination until you learn how to use it.

The difficulty with loop is it takes something trivial and makes it more complex than it needs to be. Simple things are not simple. Complex things are rare and loop doesn't simplify them enough.

That might be true. It is indeed complex if you look at the BNF in the hyper-spec. (http://www.lispworks.com/documentation/HyperSpec/Body/m_loop...)

If you ignore the hyperspec, however, and get a feel for it by looking at examples and writing code using it, it becomes quite simple. It is very useful when you are translating C or Java code.


I find it quite readable in comparison to many of the Common Lisp alternatives for iteration. And I kind of disagree that simple things aren't simple.

For example: (loop for i from 0 below 10 do ...)

Is pretty much the simplest construct that you normally need. Granted, I would likely just use dotimes in that case... but let's say you are iterating between 5 and 15: dotimes becomes unwieldy, whereas the for loop is pretty much the same code with the numbers changed.

(loop for i from 5 below 15 do ....)

Now try incrementing by 2

(loop for i from 5 below 15 by 2 do ....)

compare to the similar do* code

    (do* ((i 5 (+ i 2)))
         ((>= i 15))
      ....)

Kind of a toss-up to me.
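Since LOOP is pitched above as handy when translating C or Java, here's a sketch of the Java equivalent of the stepped example (class and method names invented):

```java
import java.util.ArrayList;
import java.util.List;

public class SteppedLoop {
    // Rough equivalent of (loop for i from 5 below 15 by 2 collect i)
    static List<Integer> collect() {
        List<Integer> out = new ArrayList<>();
        for (int i = 5; i < 15; i += 2) {
            out.add(i);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(collect()); // [5, 7, 9, 11, 13]
    }
}
```

LOOP's `from`/`below`/`by` clauses map almost word for word onto the three slots of the C-style for loop, which is arguably why the translation feels so mechanical.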

Not that I don't understand your point... in fact, as little as a year ago, I felt that way too... but I have since changed my mind.

My current preference is to use do* if writing a macro that needs iteration, and to use loop when I am doing something similar to a list comprehension or array traversal.

I love Common Lisp's LOOP, it's very handy. For those who prefer something lispier, there's always ITERATE


(common-lisp.net seems to be down)

ITERATE is an excellent gateway-drug to LOOP.

I sometimes joke my programming language is LOOP instead of CL: http://www.reddit.com/r/science/comments/bc91w/on_the_8th_da...

Loop and Format are only useful for getting work done, but they fall flat for waxing poetic on LISP beauty.

Yegge points to the Tiobe language index as a metric to language adoption, and implies that "No" languages have low uptake. What about Lua? The Lua implementers don't accept public patches, rather they will rewrite public patches that are submitted, and only if the change makes sense (citation: http://lua-users.org/lists/lua-l/2008-06/msg00407.html). Yet Lua is #12, up from 20.

Saying "no" to language suggestions is not a bad thing, it all depends on the goals of the language and the philosophy of the language authors. Perhaps Yegge should try and understand the reasons why patches aren't accepted, rather than force their acceptance via his soapbox.

Lua is probably a bit of a special case, as it grew extremely popular due to its original niche (trivial integration in complex C and C++ projects): Lua was born in 1993, and started getting used in big-name games in the late 90s already (Baldur's Gate was scripted in Lua in 1998, so was Grim Fandango the same year). From there, it took over most of the gaming industry and could only leak out as a popular language.

The other point of view is that that makes it nothing more than the new Tcl, and Tcl managed to lose its popularity.

Tcl lost its popularity because it was displaced. By Lua. Because Lua had a lower dissonance with C and C++ (actual data types, among other things).

Could Lua itself be displaced? Sure, but there's no contender I see right now. And I don't think that's another point of view.

From what I remember, Tcl's popularity waned long before Lua caught on in a big way. People were looking to Ruby and Python as extension languages, even though they had a far higher impedance mismatch than Tcl, simply because they were vastly more popular as languages by then.

I'd say be careful about reading into this thread too much. It's a public, ad hoc discussion, and won't be as carefully phrased or researched as a blog entry. Cherry-picking one quote, like "Clojure just needs to start saying yes" from a discussion like that and treating it like a mantra doesn't improve discussion nor is it fair to any of the participants.

It's an interesting thread, I'm glad it was linked, but take it for what it really is. Hickey's response is a good counter-argument to "languages should always say yes" but basically ignores any of the other subtleties of the discussion. His comment is a starting point, but he hasn't engaged and no one has responded (yet).

I actually thought it was as fair and representative a quote as I could pull from the discussion; my apologies to Mr. Yegge if it's uncharitable from his perspective.

I understand, you've got a very limited space to work with. Just saying that anyone following the link should take your heading with the grain of salt it deserves.

(And the point is much less relevant now that Rich and Steve have both continued the discussion here)

Clojure is an aesthetically clean, wonderful language for parallel manipulation of lazy sequences. However, after using it as my main language for slightly over two years, I found myself all too often butting heads with its paternalistic functionalism (trust me, I'm a grown-up, I can manage a little bit of mutable state without shooting myself in the foot).

While I would definitely use Clojure as a first choice for any project which was primarily defined by a need for massive parallelism, I am now happily using common lisp as my main language, and I'm a noticeably happier and more productive programmer (while most tasks can be transformed into a purely functional lazy sequence manipulation, the process often takes time and results in code which is harder to read and maintain).

Clojure already says YES where other Lisp-based languages said NO. (Instead of wrapping all of Java, let's use it for its strengths. Don't just use lists; use vectors, maps, etc. No tail-call optimization? All right, let's do it anyway and find a workaround.)

Now, we need to differentiate users and what they're asking for. Is it a new Clojure user who is used to C++, tries to write C++ in Clojure, and suggests missing features? Or an experienced Clojure user offering useful patches to the community? In the first case, the right answer should probably be "Oh, but XYZ is already in Clojure; it's a little different from what you're used to in C++, but in fact it's even more powerful. Here's how you can do it [...]". In the second case, it's more complicated... but we should lean toward Yes if it adds real value to a day-to-day task. I mean, even though Clojure is great, you can't anticipate everything that will be needed... so you shouldn't be shy about adding missing stuff.

Still, Rich's answer is pretty great IMO.

One of the things I actually like best about Clojure is how it is currently keeping away new users.

It is probably the best thing for Clojure right now, looking at the rapid changes that have been happening from 1.0 to 1.2 and from 1.2 to 1.3. Hickey has a clear vision for what he wants the language to be, and it doesn't look like he has finished thinking about it yet. It is good that the community is still small and the tools are immature. When the language design reaches a stable state, then would be the time to start evangelizing.

I keep thinking about what happened with ruby .. it is such a beautiful language but the community grew too fast and it is now stuck with so many conventions that could have been better thought out.

See also the Haskell "avoid success at all costs" unofficial slogan, e.g. http://www.computerworld.com.au/article/261007/a-z_programmi...

See also the slow adoption of Python 3.

This whole "I used the language for 2 weeks so now I am qualified to change it radically" attitude is well-established in the Lisp community (yes I know Steve Yegge is a veteran lisper, but this still applies, especially to some of the other commenters). The authority on this is Brucio, the fictional author of "Lisp at Light Speed" http://replay.web.archive.org/20080722232746/http://brucio.b... Bruce's First Law of Lisp is "If it does not do exactly what you expect with zero hours consideration or experience, it is a bug in Lisp that should be fixed."

I don't understand Rich Hickey's point. How does his thought experiment address Steve Yegge's points?

Would someone care to explain?

Only one point is addressed, and that is the broad idea that "languages must say yes" (to be popular).

None of the details or more subtle points are addressed.

Reduction to absurdity: if (almost) everybody got to add something, you would have an undifferentiated mass of scar tissue. There needs to be some discrimination of what changes "fit" the language.

Of course, "fit" and beauty are in the eye of the beholder :-(

It doesn't quite work, because Steve says "you need to be more open to fixing annoyances" and Rich responds "Look what a mess it would be if we fixed everyone's annoyance!". There is a big continuum between the place Clojure is now and the hypothetical absurdity Rich describes.

My quest for languages that say "Yes" ended with assembly. I started in BASIC and found the straitjacket it imposed much too restrictive. Then Pascal was the big teaching language, but it has its own ways. It wasn't till I got good at x86 assembly that I felt totally in control and able to do things just the way I wanted. Having achieved that feeling 20-ish years ago, I've been running away from it ever since. C and C++ proved that at some size of code base even the most brilliant programmers can't do memory management correctly[1], so I love that modern languages said "No" to manual memory management. I've worked at large software companies where the programming style guides ran to 80+ pages. That's 80 pages of documentation of how people will format their code so that it's done consistently. I love that Python said "No" to letting people format their code however they wanted. It's likely that these aren't the kinds of things that Steve Yegge is talking about, but there are a lot of things it's worth saying "No" to, and if they bug you (like they used to bug me) there's always assembly language.

[1] http://research.microsoft.com/pubs/70226/tr-2005-139.pdf

For starters, I dislike the title of the post. The thread is about language design and community/ecosystem creation around the language.

I am not aware of how, if at all, Steve is qualified to make a well-judged statement on either language design or the communities that develop around languages, since AFAIK he has not done either.

I have been following the thread from the beginning since I am on the list. Steve Yegge comes off as delusional with his claims about how he influenced the Python community with one of his blog posts.

On the couple of occasions where Steve was asked what he had contributed in a tangible fashion, he mentions months of effort on JSwat command-line support; everyone interested should check out the project and his contributions on code.google.com.

On the second occasion he refuses to publish anything that the community can see, use, or enhance.

I hope he proves me wrong, but at this time he is blowing a lot of hot air, off the wrong end, and should stay away from the blog post he intends to write; his facade of being a real voice for a developer community is crumbling fast.

I'll just quote Antoine de Saint-Exupery here: "You know you've achieved perfection in design, not when you have nothing more to add, but when you have nothing more to take away."

McCarthy already wrote that perfectly minimalist language in 1960. It only had seven functions and two special forms. Most programmers use more than nine keywords today.


I see a lot of people bashing Yegge, bashing Tiobe, praising Clojure, and praising Rich Hickey. All of that is strawman bullshit though as far as I can tell.

Virtually no one seems to be addressing what I think is Yegge's core point: neglect.

Is it true that there is a considerable body of extant patches/libraries/fixes out there being neglected, such that a sizable portion of Clojure's user base is feeling neglected?

If so, then Yegge has a valid point. If not, then argue why he is wrong about neglect.

I wonder... have you ever heard Rich Hickey talk about many of those patches and why they haven't been accepted yet? Clojure has a great deal of conformity across its abstractions and interfaces. It is really easy to compose things together. A lot of thought and careful planning goes into that. Many patches run afoul of it because they don't meet that standard.

As to libraries, you don't need to be blessed for a library to go. How exactly does clojure core neglect a library? By not taking it into the core?

I didn't read this as being about neglect. I think his argument was that you need to give people what they want or they will leave. I don't find any validity in that. There are plenty of BDFL projects that do quite well.

I think the real message is... 'I want these things. Why won't you give them to me?' If this was anything other than a bit of a childish rant, there would have been much better channels to carry it out in.

If you want people to change, you don't attack them in public. If you want to make yourself feel good and vent, you attack people in public.

Thank you.

This is the kind of response I was hoping to see. I'm not qualified to comment at all on Clojure, but I was interested to see what people thought who were part of this community.

I wish more people would address these points. If Clojure really is fine the way it is then argue that point positively rather than chip away at Strawman arguments, which, at best, were only tangential.

Finally, I'm not so sure about the public-private point you are making. There's a reason we protect free (public) speech: being publicly criticized is much harder to ignore than private requests.

This rant has become front page material on Hacker News. That's pressure. If there really is dissatisfaction amongst the Clojure community, this won't be the last time we'll be reading about this particular exchange.

The public/private was...

If you want to influence people, attacking them in public isn't the best way to go about it. Do it privately, discuss your concerns etc etc. If you just go for the public 'you are doing it wrong' approach, you breed resentment & distrust. OTOH, a good rant in public sure can make you feel good even if it probably isn't going to sway your target to your side. Then again, that wasn't really the reason it was done, was it?

Or put another way... if you are going to rant, rant. Don't pretend like your public ranting at someone is an attempt to get them to agree with you. It is about making yourself feel better. If you need/want to convince someone to join your side of an argument, quiet diplomacy is going to be far more effective.

> Many patches etc run afoul of that because they don't meet that standard.

The problem is not that they aren't applied, it's that they are ignored without discussion of their faults.

I think more important than always saying Yes is providing the ability to make anything possible.

All Turing-complete languages make anything possible, right? The only question there is whether or not you detour through Greenspun's Tenth Rule to get where you're going.

I think Steve's point about Clojure having a limited time to gain acceptance before it's considered "over" is valid, even if he doesn't necessarily show us a way out.

    All Turing-complete languages make 
    anything possible, right?
But there must exist a point of diminishing returns. There is a difference between Turing Complete and Gosling Complete.

"Gosling Complete" is such an awesome phrase.

> Steve's point about Clojure having a limited time to gain acceptance before it's considered "over" is valid

Is there an hourglass somewhere?

In any case, nothing's even close to "over" or "dead at the starting gate". If that were the case, you wouldn't have messages like this: http://groups.google.com/group/clojure/msg/661747c952310b41

Programming language adoption happens one developer at a time. If the silly TIOBE index is one's preferred benchmark, it's worth noting that every single language in the top 50 has been around for decades (with the exception of Go and F#, which have generous and powerful corporate sponsors).


If your language is, ipso facto, saying Yes, then the core language can be left alone and features can be built by those who need them.

Sort of like assembly. :-)

Isn't this how we got C++?

According to Ken Thompson, yes:

"Stroustrup campaigned for years and years and years, way beyond any sort of technical contributions he made to the language, to get it adopted and used. And he sort of ran all the standards committees with a whip and a chair. And he said 'no' to no one. He put every feature in that language that ever existed. It wasn’t cleanly designed—it was just the union of everything that came along. And I think it suffered drastically from that." (from Coders at Work)

Rather, PHP.

To be fair, it is actually fairly popular; that's why.

We just got done hearing about how PHP's early years included "official" abstention from frameworky things (http://codefury.net/2011/04/why-php-was-a-ghetto/). PHP "won" because of distribution, just like Javascript is "winning" now.

And Perl.

While Perl is an eclectic, kitchen-sink language, the only thing it really bolted on without much forethought was its object system, which it borrowed from circa-1994 Python. Unfortunately for Perl, this wasn't just bolted on (bad in and of itself), it was also a bad object system, which is a nasty flaw. Python had a much smaller user base and was able to transition from its old-style classes to much better new-style classes without a horrible hubbub, but Perl's much larger userbase (at the time) made it very difficult to switch to something better, which the community mostly only got around to in the last two years. Incidentally, Perl's much better replacement for its old object system was by and large another borrowing, this time from Common Lisp.
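For what it's worth, the Python transition mentioned above looked like this. A minimal sketch (the class and names are mine, not from the thread); Python 2's old-style behavior is described only in comments, since modern interpreters no longer support it:

```python
# In Python 2, a bare `class Old:` created an "old-style" class with a
# quirky type model (type(instance) was `instance`, not the class);
# `class New(object):` opted into the unified "new-style" model, which
# Python 3 later made the only one.

class Point(object):          # explicit `object` base: new-style in Python 2
    def __init__(self, x, y):
        self.x, self.y = x, y

    @property                 # descriptors like `property` need new-style classes
    def magnitude_squared(self):
        return self.x ** 2 + self.y ** 2

p = Point(1, 2)
assert type(p) is Point       # new-style unification: type(obj) is its class
assert p.magnitude_squared == 5
```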

My understanding of what Steve Yegge is saying is that the community needs to be open to ideas and debate, and in his experience it is not. He pulled a few examples from the top of his head.

At least that's how I read it.

Mr. Yegge says, "it only takes a few people to poison a community"

Yes, and it is the kvetchers that are putting a black cloud over a language and community that has been evolving quite nicely otherwise.

I think the term you are looking for is 'wreckers'.

No. They are kvetching.

I was making a joke using communist terminology. To be more direct, I don't think Steve is casting a dark cloud over the community, in fact the opposite metaphor is better: he's bringing some important issues to light.

Yegge is full of it. Python said "no" over and over again, and it's gone like gangbusters.

Let's see how Steve's statements stand against evidence.

> to get users, languages have to say Yes

What languages in wide use today got popular because they said yes? Is it

C, Java, Python, Lua?

I don't think those qualify as "yes" languages. Oh, maybe it is

Perl.
Perhaps if you count the CPAN.
