
From the perspective of a (fairly large-scale at this point) app developer: I find it great that Clojure places such emphasis on backwards compatibility. In general, migration to newer Clojure versions is completely painless.

The language has been designed by experienced and mature people and doesn't go through "let's throw everything out and start again" phases like so many other languages do.




> The language has been designed by experienced and mature people and doesn't go through "let's throw everything out and start again" phases like so many other languages do.

Swift was designed by highly experienced and mature compiler devs and is notorious for making breaking changes. It's just a difference of opinion. I prefer to suffer the occasional upgrade pain in exchange for clean and consistent APIs. And I totally understand those who don't want to deal with that.

Although to be fair to them, Swift is much younger than Clojure and the number of breaking changes has steadily declined with each major release. In seven years maybe they will take the backwards compatible approach too.


> Swift was designed by highly experienced and mature compiler devs

"Compiler devs". I think that is exactly my point. I should have said "app developers" or "system designers". If you listen to a Rich Hickey talk, you'll see that he is all about Getting Things Done in the real world, and much less about theoretical concepts. Language design is subservient to app developer needs.


I would put it simply: Clojure was designed by a thinker who creates when he is away from the keyboard, not in front of it. When one releases and breaks the code in his head first, very few breaking changes are left for the public releases.


It takes one to know one. (Or so I would like to think.)

Rich's writing, presentations, and example of overall conceptual discipline and maturity have helped me focus on the essentials in ways that I could not overstate. I'm glad (but not surprised) to see so much appreciation for him around here, even among non-Clojurists (like myself).

At the risk of fanboyism, I am constantly referencing his ideas* to my team, and I give them my blessing to watch any of his talks as soon as they come out.

* That is, the old but sometimes obscure ideas whose importance he's brought to his audience.


I made the mistake of showing the 'hammock driven development' talk to an employer of mine. He got annoyed by it and said he thought that it was just providing programmers with an excuse for staying away from the computer when they should be writing code.


It is kind of heartbreaking.

But, although my current manager is excellent at supporting his team and would never think such a thing, let alone say it, the truth is that, unless you're a professional researcher, you really do have to do the hammock-driven thing on your own time, either by working it into your routine or by taking a long sabbatical (as Rich did). It's something I struggle with, even at a very good workplace.

"Nothing is more precious" than the chance to think through a problem over a long term, and as things stand, the best tool for thinking is still the mind, not the keyboard. I can see how that takes a leap of faith when your chief deliverable is code.

Anyway, maybe a better introduction for your typical manager is,

> The most expensive problems are problems of misconception.


> it was just providing programmers with an excuse for staying away from the computer when they should be writing code.

That would be the code-as-carpentry development model. Often encouraged by career managers.

There are plenty of those in the industry, people who have had rapid promotions before they could do non-trivial engineering work.


Has he ever written production code on a project he was in charge of?


He became a project manager when he found he wasn't good at writing code after becoming a coder when he found he wasn't a good physics researcher.


Rich Hickey is a great thinker. But he's also an application developer (Datomic). So breaking changes don't just break "other people's code" in the abstract. The pain is real and immediate. By wearing multiple hats he's finding the right balance in a way that language designers who don't code day-to-day simply can't.


The deification of Rich Hickey in this thread is amusing. No doubt he's a great developer and language designer, but dogfooding your own language is not exactly novel.


I think the original statement was implicitly about using the language to build large complex systems in industry, not just about "using" the language, which I agree would be kind of silly - who would work on a language they don't use in some way? But I always had the impression that most language designers were academics (e.g. Alan Kay, Stroustrup) or professional programmers building what were initially 'hobby' languages (Ruby, Python), rather than building something they would use in production in industry.

Standards must be low if that's now "deification" :)


I got the point of the original statement, but again, even building a large and complex system is not unique to Clojure. Go was used internally within Google, Swift within Apple, and Rust within Mozilla, to name a few.


I'm not really sure where you see this "deification" — but as for me, I simply have a lot of respect for him. Mostly because every year or so he comes back with another solution to one of my problems. I don't agree with all his choices (I find some of his naming compromises particularly bad), but I still respect (not "deify") him.

Listen to some of his talks and you will see why people respect Rich.


>The deification of Rich Hickey in this thread is amusing.

Hi Trevor,

funny that you should say that - I've met Rich Hickey and had some great conversations with him - and what I always tell everybody is how normal, humble, human, down-to-Earth he is. (Which is probably deifying him even more, from your point of view).


The "getting things done in the real world" theme is overrated and mostly a myth.

In static FP languages, the exotic features being added are almost always demanded by users in industry. As a fun fact about Haskell, developers in academia are more likely to stick with the Haskell 98 language for stability and compatibility.

As an example, Simon Peyton Jones, in a recent chat with Martin Odersky at Scala eXchange, described how he was approached by developers at GitHub, who are using Haskell for language analysis, saying that they can't wait for GHC 8.6 because they want "quantified constraints", a new and very exotic GHC extension.

The "real world" you're talking about wants expressive static type systems where they matter. And the problem with static typing is that the language acquires more features and thus becomes more prone to suffering from backwards incompatible changes. And note that more features doesn't mean more complexity, rather it can mean more tools to cope with complexity, otherwise we'd all be programming in RISC assembly.

When it comes to Clojure, sure it's stable and that's a good thing, however it has no static type system (therefore it's apples versus oranges) and by encouraging people to work with raw data (e.g. everything is a list, a vector or a map) in a sense it's actually a step backwards from classic OOP.
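
To make "working with raw data" concrete, here is the style in question, as a minimal sketch (the entity and its keys are invented for illustration):

    ;; An "entity" is just a map, manipulated with the generic
    ;; collection functions rather than methods on a class.
    (def user {:id 42, :name "Ada", :roles #{:admin}})

    (update user :roles conj :editor)
    ;;=> {:id 42, :name "Ada", :roles #{:admin :editor}}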

Rich Hickey was fond of saying that OOP complects data with the operations done on that data. However the need for OOP's encapsulation and polymorphism came from the experience people had with procedural languages like C, and that experience has been forgotten, but the fact remains that some of the biggest software projects around are built with static OOP languages.

Correlation doesn't imply causation of course, it might be that such projects are built in static OOP languages because these languages are popular (chicken and egg issue), but at the very least we have empirical evidence that they work, whereas the empirical evidence for what makes LISP great pretty much doesn't exist.

Basically, as an application grows, so do its data structures; that's inevitable, and to cope with it: (1) you need good encapsulation and (2) you need the ability to refactor cheaply. And I think standard Clojure, like other dynamic or LISP languages before it, fails hard at both.

My point being that ... it's really not impossible to keep compatibility in a language that doesn't care much about correctness.

And I also believe that the "real world programmer" meme is a symptom of anti-intellectualism.


Lisps are so flexible you aren't going to be able to make a solid feature-based argument why one is bad. They can do any logical feature expressively and easily if you want it for your own work. If another language does something better then it isn't hard to lift it either. Lisps fall down on writing performant code easily, on ease-of-learning and on community, but you're setting yourself an interesting challenge claiming that there are technical issues. Common Lisp for example is famous for having everything, usually before and usually better than other languages.

Static typing isn't a bad idea, but it wasn't adopted because it had interesting logical implications. Languages like C used static typing because it is critical to the entire design of C that you know how large an object is in memory, and that is where I suspect most of its popularity comes from - I assume that to write high-performance code, you need fine-grained control of RAM. For the benefits of a static type system, Clojure decouples that implementation detail of size from the interesting logical implications of dynamic typing, using the spec library (which has been present since 1.9.0, and is likely to be a pretty strict improvement on static typing for anything except raw performance). I doubt it is the first language to do that, but it is the first I've used.
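
To make the spec idea concrete, a minimal sketch (the ::age spec is invented for illustration; clojure.spec.alpha ships with Clojure 1.9+):

    (require '[clojure.spec.alpha :as s])

    ;; A spec is an ordinary predicate checked at runtime (or at
    ;; whatever boundaries you choose), not a compile-time type.
    (s/def ::age (s/and int? #(<= 0 % 150)))

    (s/valid? ::age 42)    ;;=> true
    (s/valid? ::age -1)    ;;=> false
    (s/explain ::age "42") ;; prints why "42" fails the spec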

As for encapsulation and refactoring, it would be interesting to hear where you think it falls down. I haven't used the language for anything large-scale, but for personal projects Clojure's approach to encapsulation with protocols is far more likely to capture what-I-actually-wanted than an inheritance system like the classic model in C++. Clojure makes it at least as easy to refactor as any other language.
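
For what it's worth, a minimal sketch of the protocol style (Shape, Circle and Rect are invented names):

    (defprotocol Shape
      (area [this]))

    (defrecord Circle [r]
      Shape
      (area [_] (* Math/PI r r)))

    (defrecord Rect [w h]
      Shape
      (area [_] (* w h)))

    (area (->Circle 2.0)) ;;=> 12.566370614359172

Unlike classic inheritance, a protocol can also be extended to types you don't own, after the fact, via extend-type.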

Basically, Clojure cares a huge amount about correctness, that has been a focus for 1.9.0 and now 1.10.0. They just think that static types are a bad way of solving for correctness, because static typing puts a bunch more constraints on an object than are required.


"Lisps fall down on writing performant code easily"

I disagree. I would say it is actually easier to write performant code in lisps than in other languages. In languages with traditional syntax (i.e. not homoiconic) there is usually a tradeoff between performance and readability. As soon as you try to maximise performance, the program starts becoming unreadable.

In lisps, on the other hand, due to how you can have full control over every aspect of translation between notation and generated code it is much easier to write programs that don't hurt performance just because you want nice notation or DSL.

As an example, take printf. This function makes it easier to format strings and is available in many programming languages.

printf("%d", 7)

If no special compiler optimization is used (i.e. the compiler having a hardcoded optimization just for printf), this results in code that has to parse "%d" on every call just to figure out that it needs to take the next argument and output it as an integer.

On the other hand, the typical Common Lisp format invocation looks very similar:

(format t "~d" 7)

but you, as the author of such a macro, have the option to recognize that the format string is a compile-time constant, so you can, at compile time, replace the call to format with an equivalent call to a simpler function that takes an integer as an argument and immediately prints it. This way you don't have to parse the format string every time, and you don't have to pay for an extra stack frame, as the expanded form of the macro takes the place of the macro invocation.
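
The same trick carries over to Clojure. Here is a hypothetical macro (fast-format is invented, handling only a ~d directive) that does all the parsing at macroexpansion time:

    (require '[clojure.string :as str])

    (defmacro fast-format
      "Parses the (compile-time constant) format string during
      macroexpansion, so the emitted code never re-parses it."
      [fmt & args]
      (assert (string? fmt) "fmt must be a string literal")
      (let [pieces   (str/split fmt #"~d" -1)
            emit-lit (fn [s] (when (seq s) `(print ~s)))
            forms    (concat (mapcat (fn [piece arg]
                                       [(emit-lit piece) `(print (long ~arg))])
                                     pieces args)
                             [(emit-lit (last pieces))])]
        `(do ~@(remove nil? forms))))

    ;; (fast-format "x = ~d!" 7) expands to:
    ;; (do (print "x = ") (print (long 7)) (print "!"))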

For another example take a look at Peter Seibel's Practical Common Lisp chapter on parsing binary files (http://www.gigamonkeys.com/book/practical-parsing-binary-fil...)

This chapter shows a system of macros that takes a very high-level description of binary data structures and generates efficient code to parse the data, access the parsed data, and serialize it back to the binary format.

In a typical language, the requirement to have a flexible description of the messages would likely compromise performance. This is why so many "high performance" solutions involve some variant of code generation, at the cost of additional complexity.


Have you heard that Idris 2 is implemented in Scheme, and runs faster than when it was built on GHC? Check out Edwin's tweet stream; he's loving Scheme and can't stop raving about it. The idea of Idris on Scheme is so sweet and surreal it's got to be the combo of the year.


AFAIK Idris 2 (Blodwen) as a compiler still runs on the default C backend used by Idris 1.


I'm in the same boat; I work for a company that's been around for a while, and as a result they've accrued a lot of Java code, much of it using deprecated APIs. It's quite frustrating to dig through, and I would have been happier if Oracle had just phased a lot of these things out of the compiler (though my understanding is that JDK 11 is making steps toward that).


Didn't Clojure change the + function (hardly obscure!) from automatically promoting ints to bigints (and therefore never overflowing) to throwing an exception if you added ints that were too big?

I've got a Clojure book with a spirited defense of why it's good that the simple symbol + just does what you'd expect instead of throwing exceptions based on an implementation detail, and it kind of falls flat now.


That was over 7 years ago, so not particularly recent. And you can get the auto-promoting behavior with +' instead.
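
At the REPL:

    (+ Long/MAX_VALUE 1)   ;; throws ArithmeticException: integer overflow
    (+' Long/MAX_VALUE 1)  ;;=> 9223372036854775808N (auto-promotes to BigInt)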


Yes, very early in Clojure's life and only after testing the change across several popular libraries.


I was a bit sad that the changelog was so small. Then I realized that maybe the core features are simply stable, and that's it: nothing big was needed besides spec and a few other things.


A lot of the year's effort went into things outside core - 2 releases of spec, 2 releases of core.specs, many releases of tools.deps.alpha / clj, tools.gitlibs, new release of REBL, etc. Lots of small fixes and enhancements in core too.


Blame it on Rich; I'm addicted to revolutionary talks about collections that bring speed and expressiveness enhancements for free :p


S-expressions are perhaps partly responsible as well. They don't suffer the same brittleness as regular programming languages (is there a word for non-s-expression languages like C/Python/etc.?).



that's a great link, cheers


I think so as well. When you have macros and can add syntax and features to a language at will, then the language and core library does not need to be very big. This probably helps immensely when trying to avoid breaking changes.

New macros can very well be prototyped and battle-hardened in external libraries before being added to core.
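
For instance, a new control form is just an ordinary library macro; a toy sketch (Clojure core already ships when-not for exactly this):

    (defmacro unless
      "Evaluates body only when test is falsey."
      [test & body]
      `(if ~test nil (do ~@body)))

    (unless false (println "runs")) ;; prints "runs"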


Which languages threw everything out and started again?


Python 3 and Perl 6, and it looks like Scala 3 is heading that way too.


Python 3 is quite similar to Python 2. They made some breaking changes, but nowhere near like Perl 6. The real stumbling block was str vs unicode, which they didn't realize would be such a pain when making the decision.

There's breaking changes and there's writing a new language. It's a spectrum.


My computer has at least three different Python versions installed. I'd say that's a pretty big backwards compatibility problem.

There is also Perl 6, and I've recently read about Go 2. Swift changes every couple of months in incompatible ways.


I also have several Pythons installed. For me that's because of incompatibility between community packages I've installed, rather than differences in Python versions. Unfortunately, the various smaller communities can't always stay in sync. Right now there's a conflict between XGBoost and NumPy using MKL on MacOS. That's not a Python problem, exactly, as the issue is conflicting use of OpenMP, written in C, C++, and Fortran.


It also seems like all of the changes in Python are changes that Perl5 has made over the years without breaking backwards compatibility. Unfortunately this does mean you often have to opt in to the new functionality.

    use v5.24;
    use feature 'unicode_strings';
    use feature 'fc';
    use feature 'postderef';

    …
Note that the `'postderef'` feature is now always enabled, so that line is pointless if you use a new enough version of Perl5. The `'unicode_strings'` feature also isn't always necessary.


Clojure's emphasis on backwards compatibility is commendable, but keep in mind Clojure is still a very young language (only about 10 years old), so there isn't much backwards compatibility to worry about.


Maybe, but this was true even when Clojure was younger still. The language changes are overwhelmingly additive; Hickey places great stock in not breaking existing code.


I'm going to bring up again how + used to autopromote ints if your result was out of the primitive int range, and now throws an error instead.


If you want to demand perfection, then Clojure isn't going to meet your standards (what would?).

The issue here is that, based on the way they (particularly Rich) talk, we can expect they will not do Bloody Stupid things like the old Python 2->3 changeover. Things like big, breaking changes that require architectural changes to keep current, or break all the available tutorials.

If the problem can be fixed with a macro or single function (like a change to plus behavior) it is an irritant, not a threat to productivity. That is not a concern.
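
For example, a minimal sketch of papering over the + change within one namespace (myapp.core is an invented name):

    (ns myapp.core
      (:refer-clojure :exclude [+]))

    ;; Alias the auto-promoting variant so old arithmetic keeps working.
    (def + clojure.core/+')

    (+ Long/MAX_VALUE 1) ;;=> 9223372036854775808N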


You don't think a breaking change to + is any more significant than a breaking change to some other part of the language?


As has been answered elsewhere in this discussion:

> That was over 7 years ago, so not particularly recent. And you can get the auto-promoting behavior with +' instead.


I don't see how a change could be much more minor, honestly.

It is a breaking change, and that sort of change could introduce bugs in existing codebases. If that matters, you can't upgrade your Clojure version without careful change management anyway.

It isn't like reduce is being deprecated or something. I don't need to change how I think about the language.


A breaking release over 7 years ago doesn't make having no (as far as I know) breaking changes for the last 6 years any less impressive.


Was your application affected? Did you have to perform an extensive rewrite?

Engineering is all about compromises.


I didn't say there were no breaking changes, just fewer than most languages.


There isn't much to break anyway. Remember it's a Lisp.



