Java's Comb-Over (raganwald.posterous.com)
154 points by ColinWright on Sept 9, 2011 | 60 comments



I think you are being very unfair to Java here. There are three very important issues to consider.

1) One reason people like Java is reliability. Therefore adding lambdas must not break any existing code at all. This includes no breaking changes to any part of the standard library.

2) As we can't make substantial changes to existing code, the lambdas we add have to fit well with existing code bases, without introducing too many nasty corner cases.

3) Once added, you can't ever change lambdas again. This is a one-time opportunity, so you'd better get it exactly right, because you can't go back.

You say it is "never urgent" to make existing Java users rewrite their code. It is not merely "never urgent"; it is a fundamental pillar of Java that users will never have to rewrite their code, and that is one of the major reasons Java is so popular with business. You can argue that Java shouldn't make that choice, and should instead accept something like the long, drawn-out transition to Python 3, or to Perl 6, but it was a conscious decision, not just something the developers can't be bothered to do.

Under those kinds of restrictions, I would argue adding lambdas to any system would be difficult. There were similar difficulties adding them to C++, and I would say in many ways Java is getting it better than C++ (but that is largely due to the much higher level of complexity of the C++ language).


Java is an artefact, not a person. It is not possible to be “unfair” to it, nor is it possible to be “fair” to it. What I describe is a process by which a series of good decisions over time leads to a result that is sub-optimal on one axis, namely velocity of progress.

If you want to argue that this is optimal overall, even if it is less than optimal for the velocity of progress, I do not stand in your way.

It is very similar to the “Innovator’s Dilemma,” where companies can make a series of perfectly logical decisions to maximize their profits and wind up missing a big new market that disrupts their existing market. I leave it to you to decide whether Java has missed a big new market for programmers by virtue of being trapped into serving its legacy customer base. The thing I see as similar is the cumulative effects of many small decisions over time.


Your article appears to be written in various ways which imply Java is in a bad state. Consider:

What is my client to think? That I have given them a marvellous, well-architected application? Or that years of bolting a feature on here and hacking a workaround there have created a nightmarish mish-mash where the velocity of progress is asymptotically approaching zero?

While not directly applying to Java, I feel the implication here is that Java is in the second category. However, the comparison with Java breaks down. I am sure the Java designers could have cleanly added lambdas in a very short period of time, and all other kinds of exciting features, if they abandoned backwards compatibility. There is a huge difference between modifying an app and a language in use by millions of programmers.

There is an interesting discussion to be had on apps becoming a "nightmarish mish-mash", and on languages becoming paralysed by backwards compatibility, but I am not sure they are as directly comparable as you suggest. Of course, each to their own opinion.


It's interesting to consider Java in light of the Innovator's Dilemma, since companies facing the Innovator's Dilemma would like to avoid it so they can continue making money.

But Java is a tool more than it is a company, so maybe it is actually good that it maintain its course, even if it means its gradual irrelevance for many of the tasks it was once suited for.


The separation of Java the language from Java the virtual machine might allow Sun/Oracle to avoid the Innovator's Dilemma. They can maintain Java as a conservative language, the "COBOL of the 21st Century", but innovate with new (but conveniently binary-compatible) languages like Scala and Clojure.


Right. For the JDK developers, the First Commandment is, "Thou shalt not break existing code, ever." Almost all of the things people dislike about post-1.1 Java stem from that, especially the non-reified generics.
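
To make "non-reified generics" concrete: the type parameters exist only at compile time and are erased from the bytecode. A minimal, self-contained sketch (the class name is just for this example):

    import java.util.ArrayList;
    import java.util.List;

    public class Erasure {
        public static void main(String[] args) {
            List<String> strings = new ArrayList<String>();
            List<Integer> ints = new ArrayList<Integer>();

            // Prints "true": the element types are erased at compile
            // time, so both lists share a single runtime class.
            System.out.println(strings.getClass() == ints.getClass());
        }
    }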

It's valid to say "I think they should value some things higher than backwards compatibility." It's not valid to say "they're idiots for doing it this way." (Which I don't see in this article, but I see in plenty others.)


> It is not "never urgent", it is a fundamental pillar of Java that users will never have to rewrite their code, and that is one of the major reasons that Java is so popular with business.

During my (admittedly short) time at a Java shop we were running all of our stuff on Java X, where X was at least 2 versions less than the currently released version. The newer versions had some nice features that would have been great to use, but we never upgraded for fear of breaking something.

I got the hell out of Java-land as soon as I could, so I don't really know whether people were justifiably afraid or just overly paranoid, but either way this seems to be at odds with the "fundamental pillar" you mentioned.

My experience could very well have been an uncommon outlier though.

EDIT: Fixing formatting. I wish HN used Markdown or something instead of its own unique snowflake of a markup format.


I share your experience. We had to do a huge amount of maintenance to upgrade from Java 5 to Java 6. It was definitely not just drop-in.

Does that mean the code was written badly? Possibly. It wasn't amazing. Still, that seems more like the rule than the exception.


I'd be interested to know what problems you had. We literally dropped Java 6 into our million-line project without problems. Despite what the OP implies, this was huge for us. If Java were maintained like, say, Rails, we'd still be on 1.3.


In my team's case, the only problem we had was having to modify a bunch of decorators that implemented the java.sql.Connection interface. They added a bunch of methods to the interface in the upgrade.
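
For the curious, this is the general shape of the problem. A simplified sketch using a made-up Store interface (the real java.sql.Connection has dozens of methods, but the mechanics are identical):

    // Pretend Store is java.sql.Connection before the upgrade.
    interface Store {
        void put(String key, String value);

        // Added in a later release: every existing implementor now
        // fails to compile until it provides this method by hand.
        String get(String key);
    }

    // A decorator of the kind described above.
    class LoggingStore implements Store {
        private final Store delegate;

        LoggingStore(Store delegate) { this.delegate = delegate; }

        public void put(String key, String value) {
            System.out.println("put " + key);
            delegate.put(key, value);
        }

        // The stub that had to be added after the interface grew:
        public String get(String key) {
            return delegate.get(key);
        }
    }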


I'd like to know as well -- my team (~1MLOC Java product) upgraded from Java 5 to Java 6, and then Java 6 to Java 7, by changing a line in our build script, and everything worked fine.


It shouldn't matter how well the code was written. Binary compatibility should be binary compatibility, period. If it compiles it should be compatible across JVMs. The fact that this is not the case is just another nail in the coffin.


It's not unfair to Java at all; C# has had all these features for years despite being pretty much the same language.

With C# you have two options, keep compiling your code with the old compiler, or fix your code and compile it with the new compiler. I bet there will be a standardized and working version of lambdas in C++ before Java, so I'm not sure what the big deal with backwards compat is.


C++ has lambdas now. C++11 was recently approved and published as the ISO/IEC 14882:2011(E) standard. The lambda part of that standard is implemented in icc, gcc, and msvc. But the difference is that C++ has a relatively minimalistic runtime compared to the JVM, so it is not nearly as fragile.

Also, lambdas for C++ have been in the works for a very long time. The first version of the proposal to add them to C++0x is from 2006 (as far as I can tell), and it refers to papers dating back to 1988 which discuss adding "lexical closures" to C++.

So based on that, the C++ committee took over 5 years to get lambdas from proposal to release. 2-3 years for Java doesn't sound that bad after all.


Backwards compatibility is great and all, but at what cost? One of the main reasons why Windows eventually started to collapse under its own weight was backwards compatibility.

If a piece of software is so tied to backwards compatibility that it cannot change it will, necessarily, die.

That said, statements like "Once added, you can't ever make any changes to lambdas" are obviously bullshit. Programming languages can and do successfully undergo significant changes. Ruby 1.9.2 is a completely different beast from 1.8.0 (including significant changes to lambda syntax), and I'd expect there are few (if any) applications that could be upgraded without significant changes. The same could be said for PHP 4 -> PHP 5, Perl 5 -> Perl 6, and Python 2 -> 3. Contrast those languages with Java, which has been largely the same since 2001.

Sure, there's a cost to everybody when you have to upgrade your app, but there's a much greater benefit when it is all said and done.


Ruby breaks these things. But Ruby is not generally even considered for a lot of what Java is used for.

Those two things aren't independent.

Don't get me wrong, I like Ruby far better than Java. But Ruby is simply not considered a stable enough language for your huge app that you plan to maintain for 20+ years... Nor should it be.

Java is that stable. On the flip side, Ruby is improving much more quickly than Java. I write small apps I plan to maintain frequently far more often than I write apps that need to sail onward for 20+ years.

Two different market niches, defined by two different groups of customers.


One of the main reasons why Windows eventually started to collapse under its own weight was backwards compatibility.

True. On the other hand, that backwards compatibility is an important reason why Windows was so utterly dominant for so long. It's (as always) a tradeoff.


I think one of the biggest issues with this whole thing is the time-to-market. Lambdas are planned as part of Java 8. At the moment, that's about a year from now.

Then consider when you'll actually upgrade your production code to use Java 8... I have a feeling it will be quite a while before you see stable lambda code in production (depending on how bleeding-edge you are).

Who knows by then how things will be. You might even be coding in Scala or Clojure...


Brian Goetz's talk on Virtual Extension Methods at the JVM Language Summit 2011 was pretty interesting in this regard:

http://medianetwork.oracle.com/media/show/16999

He explains why adding lambdas isn't that straightforward, due to backward compatibility, but also because it wouldn't make sense to add lambdas without updating the standard APIs to benefit from these lambdas. Some of these APIs, such as the collections framework, are interface-based. Adding methods to an interface would break backward compatibility, which led them to Virtual Extension Methods.

For example, we could add a forEach method to the Collection interface without breaking backward compatibility:

    interface Collection<T> {
        // existing methods

        void forEach(Block<T> block) default Collections.<T>forEach;
    }
When not overridden, this virtual extension method would default to calling a static method, Collections.forEach(Collection<T>, Block<T>), thus preserving backward compatibility.
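
For comparison, here is a compilable sketch in the syntax this proposal eventually became -- Java 8's default methods, where the default body is written inline in the interface. MyCollection and Block here are illustrative stand-ins, not JDK types:

    import java.util.Iterator;

    interface Block<T> {
        void apply(T element);
    }

    interface MyCollection<T> {
        Iterator<T> iterator();

        // The default body lives in the interface itself, so existing
        // implementors keep compiling (and running) unchanged.
        default void forEach(Block<T> block) {
            Iterator<T> it = iterator();
            while (it.hasNext()) {
                block.apply(it.next());
            }
        }
    }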


Is Java just a vehicle for a colorful metaphor about code rot?


I'm going to skip past the Java bit and get to the comb-over bit.

The fact of the matter is that too many people think that projects can be run through the interface of stories and feature lists without paying attention to the quality of the software underneath. And, when you don't pay attention to it, it suffers. This, really, is _Joel's Law of Leaky Abstractions_ applied to process. Business wants to see features, and if that abstraction is their only view of the project, they will be blindsided by creeping quality issues. It's nearly inevitable.


That’s the most compelling argument I’ve heard for regularly paying down technical debt, and/or for acquiring it with great care. The craftsmanship argument — made by Uncle Bob and others — holds guilty appeal for me as a hacker but strikes me as self-indulgent. What’s compelling about your perspective here is its rational basis in the interest of the product owner. I had not thought of user stories as an abstraction, and therefore probably a leaky abstraction.


I recently saw someone compare ongoing development of an application to working as a chef in a restaurant: there's always demand to keep preparing food, but the longer you put off cleaning your work area, getting rid of rubbish, etc., the harder and harder it becomes to work.

I maintain a moderately complex application that's had features added left, right and centre over the last 5 years or so. This analogy helped me to explain to the client why we really needed to factor in some time for refactoring/cleaning up the code. I've been doing that anyway, for quite a while now (code base was a nightmare) but it was all in my own time. This was a (successful) attempt at getting the client to understand and pay for that time.


i use the lumberjack and axe analogy. do you get on with the job (chopping trees) or stop to refactor / learn new skills (sharpening your axe)? if you don't sharpen you'll slow down. if you only sharpen you won't be chopping trees. one must compromise. sent from iphone.


I am confused by the point that the author is trying to make. Perhaps someone could clarify?

The argument seems to be:

1. It took 2--5 years to add lambdas to Java.

2. Therefore, Java is like a comb-over. It's old and trying unsuccessfully to cover up that it's long in the tooth.

3. (Possibly?) Java programmers should not program in Java.

I'm not sure that any of these things follow, or if they do, that they have anything to do with Java specifically.

At this stage, Java has existed for 15 years. Once a language has seen serious adoption by regular programmers for a decade, it obviously becomes very difficult to change (for training, IDE, etc. reasons). It's also unclear if programmers of the language necessarily want it to change.

Now, let's be clear. I absolutely despise Java for a variety of aesthetic reasons. I'm as happy as anyone else to have a chance to make fun of it.

However, if we look at C standards, there's C89, C99, C1X. C++ standards, there's C++98, C++03, C++11. These languages, used by millions of programmers (and with very similar syntax, feature set) change every 5--10 years. What exactly is special about Java? Why is the headline here anything other than: "Java: Used by millions, evolves similarly to similarly popular languages used for similar applications"?


No, you've completely misinterpreted the argument.

The fundamental problem is that this essay isn't really about Java. Java is just a handy example. The current title of the actual essay ("Software's Receding Hairline") is a much better one than the one here on HN. It looks (from the URL) like the essay has been retitled? That was a good choice, and that retitling should be reflected in the HN headline as well.

I don't believe that the essay ever suggests that Java programmers should not program in Java. Indeed, it explicitly suggests that you should avoid making harsh judgements on people with combovers!


The retitling is definitely useful information. I didn't notice that!

However, you seem to have forgotten to explain the actual argument.

Is the actual argument that sometimes some software, which may or may not include Java, asymptotically becomes old or bad or unchangeable, but that's OK?


Let me confess: I'm failing to rephrase the argument because I'm not sure it works. ;)

I see three threads in this essay, all of which are basically correct:

A) Some software is like a combover, in that it starts out great, but gets gradually uglier, but the ugliness grows so slowly that it never stands out as an urgent problem so you never really confront it.

B) If you postpone confronting a problem in software it just gets harder and harder to fix when you eventually do confront it.

C) Java has now spent, like, fifteen years without lambdas, so adding them now is extraordinarily painful.

... but I can't justify the implied connection between any of these threads.

A and B are disconnected threads because combovers don't really become more difficult to fix over time. Whether you notice your combover on the day you lose your first three hairs, or ten years too late, the solutions are the same and are pretty simple: Get a better haircut that doesn't try to hide your baldness, shave yourself bald and embrace the baldness, or buy a hairpiece. This is quite unlike the way software works -- most of real life is quite unlike the way software works; that's why programmers always feel misunderstood -- so the metaphor just breaks here. Better be sure to abandon your metaphor before this point. ;)

And A and C are disconnected threads because, while some projects have the combover nature, I don't really think Java is one of them. Combovers happen slowly, below the radar, but the "problem" of missing lambdas didn't slowly creep up on Java's designers: Java is missing lambdas because its original designers deliberately left them out. (What, you're going to claim with a straight face that a team with Guy Steele on it just accidentally forgot about closures?)

The world has changed around Java, such that it now seems like a good idea (to some people) to add closures after all, and the language is certainly well into middle age and therefore hard to change, but this isn't really a "combover" situation.


Great, another "Java analogy" post!

No real analysis, no quantification or evidence of anything. Just assertions and implications.

You could at least bother to apply your analogy to other languages. There aren't many that wouldn't meet the criteria. Maybe C# would get a pass. I dunno.

What is the new "reasoning by analogy" virus that has infected HN? "x is bad and I can draw some prosaic analogies between y and x therefore y is bad!"

Wait... I think it's infecting... Me!

"Who are Ruby developers fooling?"

"You know, when I see someone who obviously has hair plugs I just feel bad for them. They obviously have some gross genetic inadequacies that they feel compelled to hide. Hair plugs are a technology that's not good for anyone. And so what is monkey patching in Ruby? /code/ plugs. I feel bad for who uses them and they obviously must suffer from some gross genetic inadequacies."


I don’t disagree with your points, however if generalizing from anecdotal experience bothers you, I think you will find blogs extremely tiresome.


Well, I can understand the convenience of lambdas for the folks who are used to them.

But overall I think this can be viewed as "just another feature" that wasn't on their list of priorities when they designed Java - nor was it even widely popular in the communities the Java language was targeting. Because of that, it probably caused a lot of grief in implementation. Slap on the Design by Committee mentality that is at the root of Java innovation and I'm surprised it was implemented at all.

To me, it seems like a middle-aged guy getting his ear pierced and some ripped jeans to look cool again with the young crowd.


I am entirely unclear on why it is being added to Java. Is there really someone out there who will stick with Java if lambdas are added in the next twelve months, but switch to C# if they aren’t? Is there someone thinking about a new project who will choose Java if it has lambdas?

It strikes me as being one of those check-box features. Someone says “We’ll use Java,” and a C-level executive asks, “Shouldn’t we go with C# and Windows? The Microsoft Solutions Sales Guy tells me you people will be more productive using its advanced features?”

Now the team lead can reply, “Oh, Java now has all that fancy stuff like lambdas,” and go ahead writing the code he’s always written.


There's a powerpoint you can find at www.javac.info/bloch-closures-controversy.ppt.

In it is a slide titled "Why Are We Considering Better Support for Closures?", with the following explanation:

Fine-Grained Concurrency - Passing snippets of code to fork-join frameworks using anonymous classes is a pain

Resource Management - Using try-finally blocks for resource management is a pain, and causes resource leaks
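
To make the first point concrete, a self-contained sketch of the pre-lambda ceremony needed to hand a one-line snippet of code to a thread pool (plain java.util.concurrent here, but fork-join code has the same shape):

    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class SnippetPain {
        public static void main(String[] args) throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(2);

            // One line of actual logic wrapped in five lines of
            // anonymous-class boilerplate.
            Future<Integer> answer = pool.submit(new Callable<Integer>() {
                public Integer call() {
                    return 6 * 7;
                }
            });

            System.out.println(answer.get()); // prints 42
            pool.shutdown();
        }
    }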


Oh sure, there are good reasons for a language to have support for lambdas. Thanks for the link. My question is not whether they offer benefits, but whether those benefits are going to sway anyone’s decision one way or another.

This is like asking whether any product should have feature X. We go out and do a survey, and find some number, “C”, of our customers would use the feature if we implement it. But this is not the number we need. We need the number of existing customers we would lose to competitors if we don’t implement the feature in the near future (“L”), and the number of new customers we would win away from competitors if we implement the feature (“W”).

The salient number to consider isn’t C, but rather (L+W). That’s the number that tells us the return in market terms of implementing the feature. In my comment, I was asking if (L+W) is significant. The reasons given for lambdas speak to C and perhaps L.


It seems like you're thinking about it in terms of a programming language market place, and you're trying to optimize Java's placement in that market place. Which is a valid perspective, but probably not the one they're coming from.

I think their perspective is more, "Given that Java is the programming language some people will choose, how can we make their life easier?" I think that's a valid perspective as well, since many people are forced to use a particular programming language.


Perhaps they are being added just because current Java programmers believe it will make their lives better?


Cf. the Sapir-Whorf hypothesis: the set of Java programmers who know that simplified access to anonymous functions will make their lives better is by default restricted to the minority that has encountered them elsewhere. Many of those people, in my experience, would rather not be writing Java.


But that's not the same as saying they are not writing Java.


True enough. It just seems like it'd be disheartening to be plugging away at Java with enough foreknowledge to push through the JCP if you really wanted to be playing in a different playground.


I know and like lambda functionality and I also use Java. People pay me good money to write Java for them and I like the language and platform well enough so the chance to get closures in Java is great. At some point I'd love to use Scala or Groovy or Gosu at work but that's not the case now and I'm fine with it.


Hey, I'm all about the generalization.


There is a certain convenience to common things being commonly available in common languages. It's just nice to be able to jump between languages and have a lot of things work the same way, or simply be there at all.

I suspect that lambdas have gained enough modern mind share with developers that it's just a nice thing to have. A lot of tasks these days probably have the look of "put lambda here" to a lot of people.

Whether Sales Guy and Manager are playing chess over the matter, it's just nice to be able to write the code the way that you would naturally write it.


Is there really someone out there who will stick with Java if lambdas are added in the next twelve months, but switch to C# if they aren’t? Is there someone thinking about a new project who will choose Java if it has lambdas?

On the margin, sure. It's like asking "will we really sell more widgets if we reduce the price by 1%". If you want a specific case, look at Apache Wicket. It uses tons of anonymous inner classes to emulate lambdas and developing in it would be a much better experience with less cumbersome syntax.
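
The flavor of that cumbersome syntax, in a library-independent form (a plain Comparator rather than Wicket's component classes, but the same anonymous-inner-class pattern), with the lambda equivalent shown for contrast:

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    public class SortDemo {
        public static void main(String[] args) {
            List<String> words = Arrays.asList("pear", "fig", "apple");

            // What Java offers today: an anonymous inner class.
            Collections.sort(words, new Comparator<String>() {
                public int compare(String a, String b) {
                    return a.length() - b.length();
                }
            });
            System.out.println(words); // [fig, pear, apple]

            // The proposed lambda equivalent of the comparator above:
            // Collections.sort(words, (a, b) -> a.length() - b.length());
        }
    }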


Closures/lambdas in Java may affect how implementations of other JVM languages implement their corresponding features.

This may affect interoperability between languages positively. Currently JVM languages interoperate nicely with Java, but advanced non-Java-like features do not match up so well between other JVM-based languages (can you pass a Ruby block from JRuby to Scala?).

I think it was Neal Gafter who raised this as a good reason for closures in Java, but I could be mistaken.


The good bits (if any) were inspired by this HN discussion:

http://news.ycombinator.com/item?id=2976300

The bad bits are entirely my own invention.


Java is C for the JVM. I stopped caring about closures in Java when I realized that there are a bunch of great languages[1] on the JVM that have every feature I want, integrate with other Java libraries, and can even be a testbed for brand new language features.

This is exactly why people aren't clamoring for new features in C, because we've more-or-less moved on to building compact, high-level code in other languages.

[1] Clojure, Scala, JRuby, Jython, Rhino, etc.


"Mercifully, nobody of character is paying attention. Nobody who matters judges a man by his hair, and nobody who judges a man by his hair matters."

Actually, the evidence overwhelmingly shows that people who matter (including those with "character") do judge others by superficial characteristics such as their hair.


>What is my client to think?

That you're trying to screw them. They will then proceed to hire someone else who will happily do in a few months what you claimed would take years. It'll probably involve throwing out all your code and starting from scratch in the new programmer's favorite language/framework du jour.

To extend the concept to programming languages: when you need or really want a feature and the language is failing to give it to you, there's a decent chance you'll throw out the current one and switch to one that will.


And then they discover that transitioning undocumented business logic, retraining their user base, and adapting their business processes to use the new tools takes way longer than a few months, and indeed has a fairly high chance of entirely failing to dislodge the existing system.

Likewise, if you have half a million lines of code in a language, life will rarely be as simple as discovering that you need a feature and then switching to a language that has that feature.

For new project starts, switching languages is fairly easy. For individual developers, it may even make sense to switch employers to move to better tools. But if your business runs on a particular codebase and that codebase has grown large and powerful over a significant period of time, you throw it out and do a rewrite either after very careful consideration, or at your own peril.


Or they'll hire someone else who is willing to cut even more corners to hack the thing together in a few months, but leaves the codebase in an even worse state.


On a related note: I remember reading "The Design and Evolution of C++", where Bjarne Stroustrup explains a lot of the decisions and trade-offs he had to make as he evolved C++. It is a great book because it teaches you that changing the design of software that is widely used is very difficult.

Java had it easy at the beginning as it was able to fix many of the problems that Bjarne could not because he had to maintain compatibility. Now that Java is widely used it has become more difficult to change its design.


Wouldn't it be rather trivial to introduce a what-compiler-version-should-I-use annotation? That way, a .java file without it would compile as "traditional" Java, and newer versions of the language could be used without the risk of breaking existing code. As long as the bytecode produced from "classic" Java plays well with bytecode generated from "new" Java, everybody should be happy.

Since minds much larger than mine have been dedicating inordinate amounts of time to solve this problem, I must assume I am wrong. I would, however, appreciate some feedback on why I am wrong.
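
For what it's worth, the idea would presumably look something like the sketch below. To be clear, @LanguageVersion is invented for this sketch; nothing like it exists, and the closest real mechanism is javac's -source flag, which applies to a whole compiler invocation rather than a single file:

    import java.lang.annotation.ElementType;
    import java.lang.annotation.Target;

    // Hypothetical: invented for this sketch, not a real Java annotation.
    @Target(ElementType.TYPE)
    @interface LanguageVersion {
        String value();
    }

    // Under the imagined scheme, this file opts in to newer syntax,
    // while files without the annotation compile as "traditional" Java.
    @LanguageVersion("8")
    class NewStyleCode {
        Runnable greet = () -> System.out.println("hello");
    }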


For me, the issue with lambdas isn't adding them at all, it's why they've decided to never break backwards compatibility in any way. It's like deciding you'll never ever be bald and you're not willing to have a hair-piece. So by any means necessary you'll cover up the baldness with your own hair. Instead of embracing your new look of baldness you'll comb any hair over, whip it around in a circle, whatever it takes. They had basically a perfect opportunity to just say "Java 6 is what it is" and make Java 7/8 the new thing with reified generics, lambdas, etc. Lambdas got added through the regular Java process of never breaking anything so they'll be there and I'll use them, it's just a missed opportunity. But I don't think it's a case of multiple valid choices leading to a mess, it's more one big foundational choice leading to a mess.


I think PG wrote the original combover metaphor: http://paulgraham.com/essay.html


There's no reason – none – to continue developing the Java programming language.

It can be frozen to current release, with bugfixes, while the developers transition to another, modern, JVM language without the Java baggage. The old code is still accessible through the new languages.


raganwald, Posterous reportedly has an option to override their user-hostile decision to force body text to render without subpixel anti-aliasing. Would you be so kind as to throw that switch? Currently, the readability of your blog is negatively impacted.


This is a user-agent issue. Tell your user-agent to ignore Posterous' rendering hints.


It's a site issue, obviously. But I do have a bookmark which I use to undo the damage.

The style in question can have its places and uses, but not for body text.

I have written Posterous and they just don't want to hear it.


Age and complexity had absolutely nothing to do with why Java didn't get closures in Java 6 or 7. It's purely political.


Software is like this. Bad software doesn't really start with bad developers. It starts with good, decent people who make decisions that seem right on the day but in aggregate, considered over longer timeframes, are indefensible. This is a great paradox.

Thank you for this. It's a great insight. Programmers have a "95th-percentile complex" where all of them think they're in the top 5%, because there's so much crappy software (more specifically, bad software architecture) out there. Not to say that good programmers aren't substantially more capable than average ones, but the fact is that bad code can very easily come from talented developers.

A few things are missed in that discussion. First, architecture is much harder than it looks. Second, most of the mechanisms people use to judge programmer ability are superficial, and it's not clear even what a "95th-percentile developer" means. Third, most bad things that happen in software were either Done By No One (e.g. that refactoring that everyone needs but no one has the time to do) or Done By Too Many (e.g. legacy code that's been passed over by too many hands for its own good).

I think the "broken windows" effect also applies, and it's important to be aware of this. When code gets ugly, people stop taking the time to read and understand it. It may not be possible. So each modification, even if the developers are individually quite capable, makes it a little worse-- methods get a little longer, latency increases a little bit, architectural integrity gets lost, subtle interactions multiply.





