Java 9 with GPU processing, Java 10 will be all-OOP without primitives (javaworld.com)
194 points by Mitt 1760 days ago | 134 comments

Wow, how can you seriously plan releases of a programming language out to 2021? That is an eternity in this industry. Even 2015 for JDK 9 seems mighty far off, especially accounting for the usual slippage of release dates with big software projects.

> Wow, how can you seriously plan releases of a programming language out to 2021?

"A plan is useless, but planning is essential" [1]

More true than ever, if you ask me. Sure, you make plans, why wouldn't you? But you also retain the flexibility to adapt the plan as the environment changes. You don't chisel the damn plan into a stack of stone tablets and render it immutable for all time.

I don't see any conflict whatsoever in planning out through 2021. I would say that in doing so, there's an implicit assumption that the plan - especially the farther reaches of it - are subject to change and are based on current best guesses.

[1]: paraphrase of a quote attributed to Dwight Eisenhower, which was probably in turn a paraphrase of another quote. http://en.wikiquote.org/wiki/Dwight_D._Eisenhower

Exactly, there is nothing wrong with long range planning. Planning this far out allows you LOTS of time to iterate on your plans, making them better.

I think ultra-long-term planning is what you'd call "premature optimization".

Premature optimizations are actually carried out before their future impact is known. Planning is different.

Maybe I'm too cynical, but I don't think you can seriously plan releases so far in future. What you can do is delude yourself (which happens all the time at big bureaucracies) or make it look like you're planning seriously in order to appear responsible and visionary.

It pretty much depends on the industry. Look at mobile, 2007: J2ME, Symbian, Windows Mobile, BlackBerry -> 2012: iOS, Android, Windows Phone. Now look at the web: it was more of an evolution than a revolution.

What happened on the back end, with the server-side functionality of business apps? It didn't really change and probably won't for a long time. Even stuff like Hadoop for big data or Groovy for DSLs were built upon the existing JVM.

It's hard to say what devices we'll have in 5 years, or what the internet will look like, but I'm pretty sure servers will still run good old Java.

Even on the "evolutionary" web, I don't think anyone in 2007 would have predicted the rise (and now... fall?) of MVC frameworks or the explosion of jQuery, both of which existed and were relatively popular at the time. Likewise the NoSQL rebellion was a surprise. Five years from now, we'll all be using WebGL-based dynamic UIs. Or will we?

I think the point is more that 5+ year planning in the tech industry is simply pointless, not that all technologies will be replaced. Sure, there are some constants. We can all safely predict that in five years we'll still be writing drivers and middleware in C, still be using zlib and libjpeg, etc...

Yes, it is extremely difficult to predict which technologies will win over a 5-10 year period vs their competitors. Eg, predicting that jQuery would appear vs the evolution of some other system (ExtJS?).

On the other hand, predicting what changes will occur to a legacy enterprise system tend to be much easier.

In the overall scope of things, Java hasn't really changed that much over the past five years. I think that some long-range planning of the kinds of changes they are thinking about actually makes sense. The need for closures, reified generic types, and an improved type system is unlikely to change in the next 5-10 years.

> stuff like Hadoop for big data or Groovy for DSLs were built upon the existing JVM

Hadoop became popular for handling big data, but I'm not sure many are using Groovy for DSLs. The two big uses for Groovy in industry seem to be (1) scripting on Grails, and (2) quick standalone scripts for testing or booting Java code. For these it rocks, but the stuff added after Groovy 1.0, such as DSLs, isn't really being used much.

"Look at mobile, 2007: J2ME, Symbian, Windows Mobile, BlackBerry -> 2012: iOS, Android, Windows Phone."

BlackBerry remains much more relevant than Windows Phone. You also forgot to mention Bada, which also has more share than WP and is growing. I'm not trying to beat up on WP, but as it stands, reality conflicts significantly with the marketing.

Edit: why the down mod? If I'm in error, please point out where.

Their new platform (BB 10) is incompatible with old apps. And Bada was merged into Tizen, so its future looks uncertain to me.

Considering it's Java, and some people are still thinking about whether to upgrade from 1.5... yeah, it may make sense in their case.

And considering you basically own the industry and the language: YES WE CAN!

Why don't they start now and build a Java 2.0 from the ground up, instead of a Java 10 for 2017?

That would be way faster, and they've learned so much by now that they could make a great successor. Who wants a Java 1.x in 2040?

Because they learned from Netscape's mistake, that's why.


That article is an incredibly useful warning against the temptation to casually undertake code rewrites, but it's incomplete.

Sometimes full rewrites are necessary and good. The trick is doing it correctly. Rewrites are riskier, more difficult, and require more development resources than greenfield development. More so when you consider that you can't just abandon the old code until the new code is mature and has proven itself.

Most people do rewrites the wrong way, and they get into trouble, but that doesn't mean there isn't a right way. There are many examples of unsuccessful rewrites in history, but also many examples of highly successful ones.

"You are wasting an outlandish amount of money writing code that already exists."

On the contrary, there have been a number of times when I've completely rewritten some code to use a framework or library that was already battle-tested and more fully featured than the previous effort. I ended up saving myself from writing code that already exists.

Right, but you're one person and you're redoing the plumbing.

At this point Oracle can't rewrite Java. If they don't support current programs, nobody will use it; if they do, they really will be rewriting code that already exists.

The Netscape rewrite became Firefox.

Yeah no kidding. That's one thing people seem to forget about that Joel article ... the new code base he's mocking went on to become the only browser capable of taking on Microsoft's juggernaut for many years. It was faster & better than everything else out there. Eventually the only thing capable of overtaking it was WebKit ... another from-scratch code base!

That new codebase was iterated on for five years before a viable browser was released, and two more before Firefox 1.0 appeared.

The history of the WebKit codebase stretches back as far as Gecko. The earliest KDE HTML work I can definitively establish is in 1997. We got Konqueror in 2000, but Safari didn't emerge until 2003.

5-7 years before something other than IE could again be competitive... Do you really think that disproves Joel's point? I think it may be quite the opposite...

I didn't say they should rewrite everything, but that they should make a new Java version with backward incompatibilities. Dropping new 1.x versions of Java for decades can't be the solution.

To be fair, if you were researching computer languages and rewrites, you would be looking at Python 3.0 and Perl 6.0. I believe both will be successful in time, but for business planning, it looks like a lot of problems, and slow evolution will work better. Oracle is pretty comfortable with the slow evolution route.

There's no way I'd put Python 3 and Perl 6 in the same sentence.

Also, IMO there's no chance of Perl 6 being successful in time (which makes me sad, because 10 years ago or so I was really hopeful about it).

If you are some manager at a big company looking at the press, there is little distinction. In fact, more stories have been posted about Python 2.x to 3.x conversions (mostly because Perl 6 is not a story). Heck, I would imagine a couple of people on HN have run into the "can't use Python 3 because it isn't compatible with anything" meme. Carefully considered logical analysis with realistic risk mitigation gets trumped by headlines in ComputerWorld and InfoWorld quite often.

I still have quite a bit of hope for Perl 6. I use Perl 5 as my go-to short scripting language. I really want it to work.

You're not getting it: Perl 6 is not designed as a successor to Perl 5 but as an age-proof Perl.

Think of it like what pg called the hundred-year Lisp. Perl 6 might be the age-proof Perl. When you have such an ambitious goal, the time taken to fulfill it is worth it. Larry Wall figured out quite a while back that evolving Perl 5 might fix some warts but wouldn't solve the larger problem.

The larger problem today is doing language extensibility sanely. There are no C-based languages that are as extensible as the Lisp-based languages. Perl 6's larger aim is to solve that, while retaining the "Perl factor".

A rewrite is inevitable if you have to solve this problem; no matter what Joel says, you have to rewrite a few things to fix them. The incremental path is too slow, and you will lose out on time while somebody eats your lunch.

Perl 5 and the Python x.0 and Ruby x.0 series are all great languages, but in the very long run they will be plagued by the same technical problem every language runs into: providing sane ways of extensibility without bloating too much.

IMO Perl 6 will do well, for the same reasons Lisp has done well.

When you have such an ambitious goal, the time taken to fulfill it is worth it.

After almost twelve years of rewrite after rewrite after rewrite, it's no wonder people stop caring.

Well, you know better than anybody else why most of the failures happened.

But for others: I can understand the obvious disappointment. But Perl 6 is designed such that without many of those failures we couldn't have figured out what it would take to build Perl 6. Perl 6 has a mutable grammar, which means it has to be written in itself. And this created a huge problem, because you don't have ready tools at hand to build such a thing. Many of them had to be built from scratch, and people failed many times exploring strategies for doing that. Some people got ill, some people lost jobs; projects like this, which span a lot of time and require volunteer effort without much funding, take a toll on people.

In many ways there was a precedent: Lisp is so extensible because it's written in itself. We really should have understood this from history. But achieving that in a non-homoiconic language was difficult and required thinking in a direction totally new to C-based languages.

But great things have come out of it. Audrey's Pugs taught us so many things. And as she says, the "Perl 6 on CPAN" thing started long ago. Moose has become a very awesome tool for OO programming. Other things borrowed from Perl 6, like given/when, have shown Perl 5 a way to evolve. Devel::Declare showed a new way to do syntax experiments outside the core without using source filters. Many great things have come out of it. It's hard to predict how Perl 5 will evolve over time.

A few years back none of us could have seen Devel::Declare or even Moose coming. I can only imagine how Perl 5 is going to evolve over time.

Lastly, I would say Rome was not built in a day. Perl 6 will take time, but it will come out in the years to come.

Perl 6 is designed such that without many of those failures we couldn't have figured out what it would take to build Perl 6.

Sure, but those don't account for the past four years of failures. What I see is a pattern of overwhelming desire to throw away code just as it's in danger of becoming useful to actual users.

Phrased from a different angle, the reason we wanted monthly releases was not because monthly releases are interesting in and of themselves, but because they could deliver regular (if incremental) improvements to actual users on a predictable schedule.

Forking Rakudo into an all-but-abandoned master branch and doing monthly releases off of that branch hews to the letter of the idea of monthly releases while violating the spirit of those releases. I understand the reasons why it happened, but that sort of decision has happened often enough in the project that it's a habit, if not a culture.

In the context I was talking about, it really doesn't matter what I think, as I am not a decision maker at some big company. Calling something the N+1 version tends to make people think it is the next version and not a proof of concept. What does matter is what is said in the trade press these people read:

for example: http://www.infoworld.com/search/google?cx=014839440456418836...

I think Python 3.0 is really well executed. With such a big ecosystem this takes a while, but when you wait too long somebody else eats your lunch.

Because backwards compatibility matters more. There are plenty of other alternatives to Java anyway if you need something else but still want to use the JVM.

By the time the rebuilt Java-2.0-NG reached stability it would be 2017, or whatever.

Isn't that what Scala has already been doing quite successfully for a while now?

That's 9 years.

I've used Ruby for almost as long, and on MRI at least, I can't think of anything as significant. (JRuby FTW)

Think of C#'s evolution. Ruby's. JavaScript's. SQL's. Nine years isn't exactly blazingly aggressive, but in the context of a programming language it's not eyebrow-raising either.

I think generally that's a good thing.

Java 6 was released in 2006. Java 7 was released in 2011. Almost 5 years. Java 8's planned release for 2013 may be overly optimistic.

There was the small speedbump of Sun dying and being purchased by Oracle in the process.

I'm no Oracle fanboi, but I bet that without the legal and corporate circus, they'll iterate faster.

Not when you consider Java as a 'business' for Oracle.

And business planning is always done like this.

If Java no longer remains useful to Oracle, expect the same thing that happened to things like Flash.

It will be donated to Apache software foundation.

Apache and the open source community already have a Java implementation in OpenJDK.

One thing not touched on in the article, but definitely worth knowing, is whether Oracle will start inserting "pay-to-use" features into the language, or release a solid JIT compiler and runtime that companies must pay to use. I know it's been bandied about in the past that Oracle might start restricting access to some language features, and with such a long-term roadmap, I'm wondering if they're thinking about turning some of these selling-point features into business revenue features.

Java is already 30 years behind the state of the art, so planning 10 years ahead is very easy. Just do what other languages were doing 20 years ago.

(Fuck boxed primitive types. How about generics that weren't designed by drunk retarded monkeys?)

Yep, Gilad Bracha, Philip Wadler, Martin Odersky... all drunken, retarded monkeys.

The problem really isn't with a particular component of Java. The problem with Java is that it came as a quick relief to people who had problems managing memory themselves. Java solved what seemed to be the most pressing problem at the time. Times have changed; today I have completely forgotten something called "memory management by the programmer", and it's a given that a modern language should do that by default.

The problem with Java today is how big and bloated it has become while not solving the fundamental problems programmers face. It's becoming increasingly impossible to program in Java without an IDE; only Java ninjas can program Java with a text editor. It's an XML mess all over the place.

Now it takes several tens of lines of code to do trivial file operations and other trivial tasks. This problem was solved by languages like Perl and Python around 20 years back. It's been almost two decades and Java still hasn't caught up. There are still no practical lambdas or good functional programming capabilities.
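To illustrate the kind of ceremony being complained about, here is a sketch (class and method names invented for illustration) of reading all lines of a file the classic pre-Java-7 way, next to the one-call equivalent NIO.2 added in Java 7:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class ReadLines {
    // The classic way: explicit reader, read loop, and cleanup.
    static List<String> readOldStyle(String path) throws IOException {
        List<String> lines = new ArrayList<String>();
        BufferedReader reader = new BufferedReader(new FileReader(path));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line);
            }
        } finally {
            reader.close();
        }
        return lines;
    }

    // Java 7's NIO.2 equivalent: one library call.
    static List<String> readNewStyle(String path) throws IOException {
        return Files.readAllLines(Paths.get(path), StandardCharsets.UTF_8);
    }
}
```

Both methods return the same list of lines; the difference is purely how much plumbing the caller has to write.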

The issue is something like this: by forcing everything to become an API of some sort and writing so many method calls, both my unit tests and my exception-handling code bloat like crazy. My eyes cringe every time I open a Java class file and nearly 60% of the code in there is either try/catch statements or some form of get/set methods.

The boilerplate-to-code ratio is too high, and it just doesn't feel like a language that belongs in the 2010s. Java and its community also promote heavy use of XML, often as a bad replacement for an RDBMS, which leads to constantly building small, buggy, wrongly designed DSL equivalents of small parts of SQL.

The path forward is not to make the language bloat like crazy and then provide IDEs to handle that. It is to make the language syntax intelligent enough to take care of many problems you otherwise have to worry about. That is what the whole concept of Lisp, and of other extensible languages like Perl, is all about: providing forms of extensibility that solve the code-scalability problem.

And lastly, and unfortunately, the Java market is full of substandard programmers whose lives begin and end inside Eclipse.

A manager or a pointy-haired boss might prefer Java for hiring cheap programmers and for the "Oracle factor". But Java is dead for startups and other sexy, glorious projects.

I hear you re: code bloat with too much boilerplate.

I stopped using get/set methods years ago in favor of public instance variables. Yeah, I understand the potential problems with kicking encapsulation to the street and into the gutter (I have written several Java books and I have done many projects in Java, so I am not a noob).

I have also started to favor unchecked exceptions, which also makes code a lot shorter. This is also controversial.
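A minimal sketch of the two habits described here (class and method names are invented for illustration): a public field in place of a get/set pair, and a checked IOException rethrown as an unchecked RuntimeException so callers aren't forced into try/catch or "throws" clauses:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class Config {
    // Public field instead of a getFirstLine()/setFirstLine() pair.
    public String firstLine;

    // The checked IOException is wrapped in an unchecked RuntimeException,
    // keeping the method signature (and all callers) free of "throws".
    public static Config load(String path) {
        Config c = new Config();
        try {
            BufferedReader r = new BufferedReader(new FileReader(path));
            try {
                c.firstLine = r.readLine();
            } finally {
                r.close();
            }
        } catch (IOException e) {
            throw new RuntimeException("could not read " + path, e);
        }
        return c;
    }
}
```

The trade-off is exactly the one acknowledged above: less boilerplate, weaker encapsulation, and exceptions that the compiler no longer forces anyone to handle.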

Forgive my brevity, mate!

It's been a long day. I've been working non-stop since 6 am this morning (I'm in Bangalore, India) and it's 9 at night now. I've been working on some Java code.

The AbstractSomethingFactoryFactoryFactoryClasses.java have taken a toll on me. I'm yet to unwind from the depths of piles and piles of try/catch statements and Object.someMethod() calls buried deep in the abyss of com.something.somethingElse.somethingInWonderland.whereTheHellAreWe folders.

But I will take your advice seriously though.

EDIT: After reading your bio and looking at the work on your site, I am your new fan :)

What are your thoughts on C#?

So you're saying that type erasure makes programming easier and less error prone?

No, did I say anything like that? I was merely bemused that you called the designers of generics "drunk retarded monkeys", when they are clearly anything but, based on their work before and after.

Name calling isn't really helpful. I don't like Java either, but there are a lot of very smart people who worked on it, and many of them labored under constraints I do not envy. That's all.

What choice did they have? Either break compatibility, or use type erasure.

Oracle's primary concern isn't to rewrite Java or to spend resources to make Java awesome enough to compete with the latest stuff that is coming.

Their fundamental concern is to milk Java as much and as long as they can, to drive sales of Oracle DB and associated business products. And that makes perfect sense too; they are in business to make money, and anything else and they wouldn't be doing their job properly. Don't you see what became of Sun Microsystems? They were making awesome stuff by the day, for somebody else's profits. Oracle is not going to make the same mistake again.

Java is Oracle's strategic investment, used as leverage to drive sales elsewhere. And looking at their segment, it's meant mostly for large corporate programmers, not hackers, startups, and the like.

I bet IBM still makes sufficient contributions to COBOL just to keep their mainframe sales alive, not because they want COBOL to be awesome enough to rule the world.

Similarly, Oracle's contributions and investments in Java are going to be about driving their sales, not about making Java awesome.

I basically agree with almost everything you wrote here and in the other long one you posted replying to me up thread.

But I downvoted you because this has almost nothing to do with ootachi's comment about the realities that the designers of generics had to cope with. You're just regurgitating the same criticisms that have been leveled against Java for years.

It is an answer to ootachi's comment.

>>What choice did they have?

The only choice Oracle has is to invest in Java in areas that is going to help them sell their products.

That is the only choice Oracle has.

> It is an answer to ootachi's comment.

ootachi was referring to the designers of Java generics, who were working some time before Java 1.5 was released in 2004, long before Oracle bought Sun.

The presentation declares, "Java is not the new Cobol."

It shocks me that a major company like Oracle would be so foolish as to make a statement like this. Stating that something is not is one of the most effective ways to imply that it is and the speaker knows it.

Freud somewhere talks about a primitive tribe where the punishment for saying "The king is not an ass" is the same as for saying "The king is an ass".

Edit: I remember reading this years ago and have been unable to track it down. Would love it if somebody did.

Probably in Totem and Taboo? This concept is expressed in a number of his books, however.

Oh if you could find it I would be super grateful. I've done my usual rounds such as searching on Google Books and found nothing. Made me start to wonder if it had been Jung instead.

I haven't read that before, but I'm wondering whether it relates to that specific phenomenon, or rather whether it's disrespectful to refer to the king and say "ass" in the same sentence. Do you remember?

Yes, the example was to illustrate that the unconscious doesn't process negation the way the conscious mind does. The fact that you said the king is not an ass proves that you had the idea that he is. So I think your alternate interpretation of the principle is not so different from the first. The claim is that negation is a logical/abstract construct that exists at a higher cognitive level not recognized by the (allegedly) more primitive unconscious, which deals in concrete language and images.

Another way to put it is that psychologically, a statement and its negation are not opposites but rather go together. The same idea comes up in hypnosis, where it is said that the unconscious mind drops the "not" and simply receives the images that are given. For the same reason, it's better to say "Remember" than "Don't forget", which is a kind of subliminal invitation to forget. And so on.

It somewhat surprises me that they'd not want Java to become the new COBOL. COBOL is still alive, still runs large systems, and still sells hardware and software. Owning the new COBOL has the potential to be quite lucrative in the long term.

I don't think so, although I agree with you that COBOL is still alive and kicking. That is totally different from being used for awesome projects and by startups.

If Oracle's plan is to keep Java alive as a glue for their DB that strategy would work fine.

But it would be a long-term disaster in terms of overall innovation, and of long-term viability as a choice for everyday projects.

Choosing Java for everyday projects is a long term disaster?

What about Android apps?

What about solid back-end?

What about Google AppEngine-Java?

People are still using Java for everyday projects and I don't see a disaster coming anytime soon.

As I see it, the disaster has already occurred. The only section of programmers using Java day in and day out for their projects are low-priced, substandard developers who can't write a line of code without the IDE doing autocomplete at every keystroke.

As for Android and other Google initiatives, they happen only because of the tooling, the documentation support, and the large pool of programmers that already exist.

Java today is not used because it's awesome; it's used because a certain section of the industry needs a massive supply of low-priced programmers. And with Java, the people, the tooling support, and the rest are already there.

Hence no doubt Java is the next COBOL.

I do Java in my day job and I agree: I couldn't write much Java code without an IDE, mainly because Java is a horrible language to cut code in, and you do need to write a lot of boilerplate for many Java EE patterns.

The IDE removes that problem somewhat.

It's not that Java coders "can't write a line of code without an IDE"; it's that Java coders don't have the patience to write pure Java in a text editor.

You're also forgetting that a lot of enterprise applications consist of many, many .java source files, one for each class. Trying to deal with all of that manually becomes a real headache.

I don't know why anybody would create a headache and then try to cure it.

Intelligent people avoid headaches.

Prevention is better than cure, and an IDE is a cure. Avoid what forces you to resort to a cure; prevent the disease itself.

I use a few programming languages and still prefer Java in many cases, because the tooling is more advanced than on any other language platform.

Thank you for insulting the Hadoop, HBase, Cassandra, and GWT committers, and the developers at Google, Twitter, LinkedIn, and many more smart people who happened to choose Java.

Nice reply with no factual support.

I don't know if it's "chose" or "forced to choose" by pointy-haired managers, mate!

By the way, when given the choice, it's clear what the founders chose. Larry Page and Sergey Brin chose C++ and Python, and Mark Zuckerberg chose PHP. Twitter started out on Rails.

So it's clear the Java mess seeps in only when the hackers are cleared from the scene and the layers of management begin to take control.

Yup, Paul Buchheit was somehow forced to use Java to write GMail, with a gun to his head or something. And somehow Twitter engineers were forced to use Java because management said so...

I don't even want to know how you came to these conclusions. They don't make any sense at all.

Java was 3 years old when Google started. Did you see the link where one of the founders was asking a question about Java back in '98 regarding web development? Did you know that Java has been used for many, many years inside Google, while these so-called hackers were still working there?

Did you know that Twitter actually got more stable when they migrated to Scala and Java? Did you check the presentation by a Twitter engineer on why they moved away? Those are technical presentations, not management business-type presentations. Are you suggesting that Rails hackers were the culprits behind all of Twitter's scalability issues?

I am unsure if we live in the same world mate.

I don't get the impression that Oracle really cares about startups. It costs a shitload to license Oracle, and the companies that fork out that kind of money do so because they're running SAP or Siebel, PeopleSoft, data warehousing, legacy LOB apps, etc.

There's lots of old, conservative, slowly changing companies out there that have craploads of money and need (for certain meanings of the term) Oracle. Having control over Java gives them a lot more leverage in those markets, too.

It's never occurred to me to think of Oracle as a company concerned with any of those things. It seems totally outside their character; everything they do screams "huge, established customers only".

The current crop of smaller companies and bootstrappers will evolve into the next generation's "established" companies. No one is starting a company today with COBOL. How many are starting today with Java? Not rhetorical (I know some are); just wondering how this will play out over time...

There are only three reasons Java is getting used today.

    1. 'Hiring the cheap programmer' factor.
    2. 'The Oracle' factor.
    3. 'The pointy haired boss' factor.
And those are not awesome reasons to stay alive.

Interesting. I'm actually using Java on projects, but via Groovy. Others I know are coming from Ruby into Clojure. So "Java" as such isn't the direct factor, but the JVM is a common one. Will it be enough to keep it from COBOL status?

JVM-based languages are a totally different thing.

I remember Bjarne Stroustrup saying that Java isn't platform-independent; Java is the platform. Meaning compiled C++ runs on processors directly, but compiled Java runs on the JVM, so the JVM becomes analogous to a processor.

So JVM-based languages are to the JVM what natively compiled languages are to a processor like ARM. The future of the JVM is different from the future of Java.

Just like the future of C++ is different from the future of the Pentium.

On the other hand, not acknowledging what everyone is saying can make you look out of touch.

The only word that's wrong in that sentence is "new". Java's been COBOL for as long as it's been around.

That's like shouting "Punk's not dead!" as you preen your greying mohawk.

For years, some in the Java community (and programming community in general) have been claiming that Java is the new COBOL, so this is a refutation of that assertion. They are not fools, they just expect their audience is aware of the context.

It seems to me that eliminating primitives could be a really bad idea. If you have a class:

    public class CompoundObject {
        int i;
        int j;
        double k;
    }

This class's data members are all a few bytes away from each other, so once the object is loaded into the CPU cache, all operations should be fast. With boxed types, you might get three cache misses when accessing i, j, and k.
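To make the contrast concrete, here is a sketch (class names invented for illustration) of the same fields stored inline as primitives versus as boxed references. The layout claim itself can't be asserted from pure Java, but one observable consequence of the reference representation is that boxed fields default to null while primitives default to zero:

```java
public class Layouts {
    // Primitive fields: the int and double bits live inline in the
    // object, right next to each other.
    static class Inline {
        int i;
        int j;
        double k;
    }

    // Boxed fields: each field is a reference to a separately
    // heap-allocated object, so reading i, j, and k can touch three
    // unrelated cache lines.
    static class Boxed {
        Integer i;
        Integer j;
        Double k;
    }
}
```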

On the other hand, referential transparency can help here. Given referential transparency, the compiler can transparently unbox Integers and translate method calls. So if Oracle does it right, rather than slowing down unboxed code, this might just speed up boxed code.

They don't say how they're implementing it. Given that the compiler always knows the actual type of a primitive at compile time, it can probably implement most of the Object functionality by plugging in the class meta-info at compile time. Where a primitive would need to be boxed, Java programmers are already used to the performance implications. The only Object functionality I can think of that would be hard to implement on primitives is wait/notify/synchronization, but I don't actually know how those are done in the JVM either.

Even if these things turn out to be impossible to optimize at compile time, they're probably then betting on JIT functionality to detect primitive objects that are only used as primitives.

The "boxed primitive" types (Integer and so on) should be `final`. This means no inheriting from them, which means no polymorphism, which means their sizes and all operations on them are known statically. This means you can store them as values in the object with no fuss, and you can inline all of their access functions.

Java objects have functionality beyond dispatch tables. Synchronization is one. You can also have x != y where x and y are different objects with the same value.

For integers, implementations are allowed to have unique instances for every value (and must have unique ones for some small values). A conforming implementation can extend that to 'all integers' (It might break existing programs, but such programs would not be standard conforming).

I do not know whether the standard currently allows for code that is guaranteed to produce Doubles with equal values that are not reference-equal to each other, but I doubt that making that impossible would introduce many problems.

The only issue that comes up is synchronization, because it requires maintaining identity. Localized escape analysis alleviates a lot of the burden here, but the fact remains that any time an Integer escapes you'll need to allocate it, because someone might want to synchronize on it. Without the synchronization issue, you could come up with schemes to avoid allocating even when the object escapes. Every other issue can be handled by a compiler optimization (either AOT or JIT).
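To make the identity point concrete, here's a minimal sketch: Java lets any object serve as a monitor, including a boxed Integer, so once the box escapes, an optimizer can't silently re-box the value per use without breaking mutual exclusion (the class and field names below are just for illustration):

```java
public class BoxedLock {
    // One stable identity: any object can be a monitor, including a boxed
    // Integer. An optimizer that unboxed `lock` and re-boxed it on each use
    // would give each use a different monitor and silently break locking.
    static final Integer lock = Integer.valueOf(42);
    static int counter = 0;

    public static void main(String[] args) {
        synchronized (lock) {
            // critical section; correctness depends on `lock` being one
            // object, not a value that is re-boxed per use
            counter++;
        }
        System.out.println(counter);
    }
}
```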

>You can also have x != y where x and y are different objects with the same value.

May be true, but it's not guaranteed one way or the other. In the Oracle JVM the smaller integers are cached such that

  new Integer(10) == new Integer(10)

  new Integer(1000) != new Integer(1000)

Actually, "new Integer()" is guaranteed to always return a new, distinct object. The caching only comes into play during autoboxing conversions, or when you call Integer.valueOf().
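The distinction is easy to check. Note the valueOf() cache is only guaranteed by the spec for values in -128..127; outside that range the result may or may not be cached:

```java
public class BoxingIdentity {
    public static void main(String[] args) {
        // `new` always allocates a distinct object:
        System.out.println(new Integer(10) == new Integer(10));         // false
        // valueOf() may return a cached instance; guaranteed for -128..127:
        System.out.println(Integer.valueOf(10) == Integer.valueOf(10)); // true
        // Outside the cached range, typically (but not guaranteed) distinct:
        System.out.println(Integer.valueOf(1000) == Integer.valueOf(1000));
    }
}
```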

Ah, you're right. I was thinking of autoboxing, i.e.

  Integer a, b;

  a = 10;
  b = 10;
  System.out.println(a == b); // true  (within the cached range)

  a = 1000;
  b = 1000;
  System.out.println(a == b); // false (outside the cache)

Was absolutely concerned about this as well, but then saw the timeline they are talking about (almost a decade) and decided to keep my concern in check, at least for another six years.

I imagine either hardware progressing so far in the next decade that this is absolutely moot, or the language changing direction and some surprise coming out of left field to change this game plan.

You don't need to use a uniform field representation, and the representation in the object doesn't need to match the representation on the stack/in registers.

For example Dylan language doesn't have primitives as such. The d2c compiler passes objects around as a two-element struct containing a type tag and a value (in practice values get passed around in a pair of registers). Now, if you have a field with the declaration <object>, and you store a <double> in there, then the field will consist of two words: a type tag and the double itself. However, if the field is declared as a <double>, then the field will be stored as just the double, and the field accessor code will add the (constant) type tag as the field is read.

C# accomplishes this with structs: http://weblogs.asp.net/dixin/archive/2007/12/20/understandin...

Such a thing could be done in Java if Oracle added structs. They're basically just "unboxed objects". I believe there's more to it than that, but that's the basic idea.

I'm curious how they're going to handle the GPU thing, since there really isn't a unified GPU architecture yet (by 2015, who knows, but still). Will they have vendor-specific translations, or try to rally around something like OpenCL? I'd love it if GPU coding became more available and vendor-agnostic, so I'm really curious about this.

I know the article didn't mention it, but I also wonder what the memory bloat will look like when they turn all primitives into Objects and introduce object overhead just to track an int i in a for loop.

Marta Jasinska posted her notes: http://kreskasnotes.blogspot.com/2012/03/qcon-2012-future-of...

These slides were posted last November, so they're probably a bit stale, but might give some insights: http://www.slideshare.net/JAX_London/keynote-to-java-se-8-an...

Video and slides of the same presentation by the same presenter in October 2011: http://lanyrd.com/2011/jax-london-autumn/sgzyx/

It's mind-blowing that they plan on having GPU and FPGA support before they offer arrays of structs and multidimensional arrays. Frustrating, too. I guess I can't complain, since clearly somebody is paying for all this, and it isn't me.

All-OOP? So functions will be objects too? And classes? And contexts?

Not meaning to start a flame, just saying the title's a bit inaccurate IMHO...

I think they want something similar to C#, where the primitives are objects; terribly misleading title, I agree. However, I cringe when I think of what this does to performance. The memory overhead of objects in Java is pretty high (relatively speaking), while primitives are very memory efficient. If they can make the primitives just as efficient, it would be awesome, but I guess it's a wait and see. I don't currently mind using the static methods on the object representations of the primitives to do things.

In C# primitives are not objects, but they can be boxed into objects.

So if you have a class:

  public class X {
    public Int32 i;
    public Int32 j;
    public Int32 k;
  }
Then it takes 12 bytes + single object overhead (which in MS .NET I believe is two pointers).

But if you assign a primitive (e.g. an Int32) to a variable of type 'object', then it will be boxed and will then require the object overhead.

A neat thing about C#/.NET is that generics are baked deeply into the platform, so using primitive (actually, any value-type struct) types as generic arguments will not require object memory or casting overheads.
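For contrast, Java's generics are erased, so the same trick isn't available there: a List&lt;Integer&gt; is a list of boxed objects at runtime. A small sketch of the Java side:

```java
import java.util.ArrayList;
import java.util.List;

public class ErasedGenerics {
    public static void main(String[] args) {
        // Java generics are erased: List<Integer> stores boxed Integer
        // objects at runtime, so add() below boxes (or reuses a cached
        // box for) the value -- unlike .NET's List<int>, which stores
        // raw ints with no per-element object overhead.
        List<Integer> xs = new ArrayList<>();
        xs.add(1);              // autoboxed to Integer
        int first = xs.get(0);  // unboxed back to int
        System.out.println(first);
    }
}
```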

Well, actually, in C# struct instances ARE objects allocated on the stack, instead of the heap.

A parallel can be drawn with C++ in which the user of a class decides on declaration/initialization where to store the object (either on the stack, or on the heap). The difference with C++ is that in C# the author of the class decides where instances of it should be stored, so this decision happens when the class is declared, not when it's used.

Otherwise there are few differences between structs and classes and all differences stem from the differences in storage. For example struct instances must be passed by value and not reference, because by definition stack-allocated values are short-lived and playing with references to stack-allocated values is dangerous. Structs must also have an implicit constructor because stack-allocated values cannot be NULL (logically, you need a reference to represent NULL).

In my experience, all discussions about what is or isn't an object are counter-productive.

What really bugs me about Java is that you cannot build your own types that behave just like the built-in types. For instance you cannot override operators like "+" (which works for primitives or Strings), you cannot override [] (which works for arrays), you cannot declare other stack-allocated structures, arrays are reified and yet you cannot declare your own reified data-structures and so on. The presence of primitives doesn't bother me as much as lacking the means to build my own primitives.
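For example, arithmetic on BigDecimal (a plain library type) has to go through method calls, while int and String get operator syntax as built-in special cases:

```java
import java.math.BigDecimal;

public class NoOperatorOverloading {
    public static void main(String[] args) {
        int i = 1 + 2;         // primitives get "+"
        String s = "a" + "b";  // String gets "+" as a language special case
        // User-visible library types don't -- there is no way to define
        // "+" for BigDecimal, so you write method calls instead:
        BigDecimal total = new BigDecimal("1.10").add(new BigDecimal("2.20"));
        System.out.println(i + " " + s + " " + total);
    }
}
```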

> Well, actually, in C# struct instances ARE objects allocated on the stack, instead of the heap.

You have it backwards, they're not value types because they're allocated on the stack, structs are allocated on the stack because they're value types.

See Eric Lippert's article "The Stack is an Implementation Detail": http://blogs.msdn.com/b/ericlippert/archive/2009/04/27/the-s...

So no, I don't have it backwards.

Also Eric Lippert nails it when he answers the question of why reference types are not stack allocated and value types are ... “because they can”.

I do agree with the article, but with all due respect to Eric Lippert, C#/.NET was meant to be reasonably fast for all kinds of user-land applications and if structs weren't added, then they had to add special cases (primitives) for dealing with integer and floating point arithmetic, just as Java did.

I do agree that structs have semantic value, but the implementation itself allows the available primitives to be described in terms of structs, which is a really elegant and cost-effective solution to a problem that the JVM engineers are trying to solve with complicated tricks like escape analysis. If you remove the performance/efficiency benefit, there isn't a lot of value left in structs - at least nothing that can't be solved by immutable data-structures and/or a better type system.

Nice article btw, thanks.

>In C# primitives are not objects


>Data types are separated into value types and reference types. Value types are either stack-allocated or allocated inline in a structure. Reference types are heap-allocated. Both reference and value types are derived from the ultimate base class Object.

I assumed they were talking about tagged fixnums, which have primitive-like performance but can still be treated like objects. https://blogs.oracle.com/jrose/entry/fixnums_in_the_vm
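Roughly, tagged fixnums steal a low bit of the machine word to distinguish small integers from (aligned) object pointers, so small ints never need heap allocation. A hypothetical encoding, just to illustrate the idea:

```java
public class Fixnum {
    // Hypothetical tag scheme: low bit 1 = small integer, low bit 0 =
    // object pointer (aligned object addresses have their low bit free).
    static long tag(long n)         { return (n << 1) | 1; }
    static long untag(long w)       { return w >> 1; }  // arithmetic shift keeps the sign
    static boolean isFixnum(long w) { return (w & 1) != 0; }

    public static void main(String[] args) {
        long w = tag(-7);
        System.out.println(isFixnum(w) + " " + untag(w));
    }
}
```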

Classes are already objects. I don't know whether Java 8 will finally introduce functions as a data type, but if it does, I'm pretty sure they won't do it the way Ruby blocks work (i.e. a block is not an object).

I'd rather they didn't. It would be redundant with inner classes that are already there, I don't like this piling up of features. It would be more interesting if they allowed inner classes to close over non-final variables.

An inner class is the ugly half-brother of a closure. 99% of the time they're used they define just one method.

When they are used for more than one method implementation, well ... they shouldn't be inner classes.

They just exist to fill the gap of having no closures. I say this as a long-time and affectionate Java user, but Project Lambda in JDK 8 is long overdue.

When they are used for more than one method implementation, well ... they shouldn't be inner classes.

Why not? Very often, a class will have some data members which are pretty much meaningless elsewhere. Why should the class used to represent that data be exposed elsewhere?

They are there, they have been there for over ten years, much code exists that needs them to be there. You can't take them out of the language.

More syntax sugar for them would be good, adding first class functions IMHO isn't good.

Aren't methods already an object of class java.lang.reflect.Method? Do you mean you should be able to do something like foo(PrintStream.println.method) like you can do foo(PrintStream.class)?

No, a class's methods are not instances of java.lang.reflect.Method. The latter is simply a 'reflection', per the package name, of a first-class JVM construct, i.e. a reflection on a method.

Would you care to explain why this is not enough? You can pass them around, invoke them, get their properties.

The interface may be bad but I'm not sure what is missing.
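For reference, here's what java.lang.reflect.Method already lets you do: look up a method, pass the Method object around, and invoke it, albeit with no static type checking:

```java
import java.lang.reflect.Method;

public class ReflectDemo {
    public static void main(String[] args) throws Exception {
        // A Method object can be passed around and invoked like a
        // (clunky, unchecked) first-class function:
        Method m = String.class.getMethod("toUpperCase");
        Object result = m.invoke("hello");
        System.out.println(result);  // HELLO
    }
}
```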

In the original comment, I was merely addressing the misunderstanding of the GP regarding the reflection package's constructs.

The interface is your basic (irreducible) disconnected/unbound procedure invocation API, with all the positive/warts associated. I agree that in principle, we have enough information in the reflection data structure to allow a (specific) JVM implementation to provide a non-standard 'method object' feature. Consensus, possibly? (Good question, really. C. Nutter is one to hit with that one.)

This is a very good point. If functions become objects, interfaces would become literally useless. Somehow it seems that would not be the case, since this would be a drastic change (even though getting rid of primitives is as well).

Literally useless? Would you like to elaborate on how specifying a contract that has to be implemented is suddenly useless because you have function types?

I think the experience of Haskell is relevant here. Once Haskell introduced first class functions, everyone stopped using typeclasses.

[edit: apparently my sarcasm was too subtle. HOF and typeclasses are both vitally important pieces of Haskell, which are orthogonal to each other. I'm pretty sure both were in Haskell from day 1. Some code for which both are essential:

    class Monad m where
      (>>=) :: m a -> (a -> m b) -> m b
      (>>) :: m a -> m b -> m b
      return :: a -> m a
      fail :: String -> m a

]


I'm genuinely curious: is this because Haskell is generally hard to learn[*] or because of some inherent property of Haskell that makes type classes unnecessary when you have first class functions?

[*] This statement is based purely on my own experience. I find the Haskell documentation a bit unfriendly with its "academic" style. The content is great, but the form makes it a bit hard to assimilate. Monads were especially painful.

I don't think typeclasses were in from day one, they were added quite late in the development of the language (but some years before H98). I don't think they're in Miranda, for example.

Here's a paper introducing them.


I stand corrected. I guess Haskell before H98 was a very different language than it is today.

Once Haskell introduced first class functions, everyone stopped using typeclasses.

What do you mean? From what I've seen, everyone uses typeclasses in Haskell.

Not sure I agree with the GP's point, but I believe he's arguing that when fulfilling the contract of a fictional MouseClickListener:

  interface MouseClickListener {
    void mouseClicked(MouseEvent e);
  }
one often uses an anonymous class to do so:

  void init() {
    mouse.setClickListener(new MouseClickListener() {
      void mouseClicked(MouseEvent e) { println(e); }
    });
  }
the same effect could be achieved with a reference to a named function (if Java supported them, which presumably it will once "everything is an object"):

  void init() {
    void mouseClick(MouseEvent e) { println(e); }
    mouse.setClickListener(mouseClick);
  }
given that interfaces are simply syntax for function routing, when you have the ability to reference functions by first-class types (i.e. you have the ability to determine function routing yourself), the set of things that you can only sensibly do with interfaces is a lot smaller.

This is how C# delegates work, right? Any Java -> C# programmer want to comment on whether they rely less on interfaces now and what they use them for?

First class functions pretty much have no bearing on whether or not interfaces are useful. Sure, one use case for inner classes is to implement interfaces in-place in situations where what you really want to do is just pass a function, and in those cases it's probably better to use a function instead, but the equivalence doesn't go the other way.

There are plenty of cases where you really need to pass an object to a function and know that the object supports multiple operations, for instance take a look at Map<K,V>: http://docs.oracle.com/javase/6/docs/api/java/util/Map.html

There are 14 methods there, and when I write code that takes a Map object, I really mean it - I'm not just using the interface as a hack because I want a function pointer, I need an object that supports all of those methods, and I'm probably going to be using several of them.
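A hypothetical helper illustrating the point: this genuinely needs several Map methods, so no single function type could replace the Map parameter (getOrPut is my own made-up name, not a real Map method):

```java
import java.util.HashMap;
import java.util.Map;

public class NeedsTheWholeInterface {
    // Uses containsKey, put, AND get -- a single function type couldn't
    // stand in for the Map parameter here.
    static <K, V> V getOrPut(Map<K, V> m, K key, V fallback) {
        if (!m.containsKey(key)) {
            m.put(key, fallback);
        }
        return m.get(key);
    }

    public static void main(String[] args) {
        Map<String, Integer> m = new HashMap<>();
        System.out.println(getOrPut(m, "a", 1));  // inserts 1
        System.out.println(getOrPut(m, "a", 2));  // still 1; key existed
    }
}
```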

I bounce between Java and C#. I'm not sure at all what he's talking about. I use interfaces all the time in C#. I even specify delegate properties (as distinct from events, when I don't want them to be multicast) in interfaces.

I think both concepts are probably independent of one another.

Clojure has both function objects and interfaces (called 'protocols'). In fact, it encourages the use of interfaces while strongly discouraging the traditional object inheritance model. Most people who've tried it, AFAIK, consider Clojure a well-designed language.

Clojure, Golang, Scala and Rust all implement a similar interface concept.

While I agree that I hate interfaces with one method, what about interfaces that actually define several methods (e.g. - perhaps a driver for something that needs to do operations like "open", "close", "get", "put")?

Interfaces still serve a purpose bundling together related operations with replaceable implementations. They're just a nuisance for single-purpose functions/functors (e.g. - "run").

They can plan like this because they are the new COBOL - they have an assured install base. If they make it 100% backward-compatible, and definitely better, everyone will upgrade.

As for the very vague improvements mentioned (GPU, no primitives), these can be seen as responses to clear technology trends. Even the embedded GPUs in x86 chips are getting pretty powerful, though not every business has them - but by 2015 most will. Java has primitives primarily for performance and memory efficiency. They were absolutely crucial at Java's inception, and though they remain important today, many applications can get away without them (e.g. Ruby apps). By 2017, even more apps will be in that category.

Most of the others are similarly responses to clear trends, such as hypervisor awareness and large-data support. However, true generics seem problematic because they would break backward compatibility.

My prediction: if they plan to break back-compatibility they will change their mind.

I just want passable functions, or even better, first class functions in module namespace.

I still find it hilarious how even Swing shows how crippled the language is without that feature, where adding an event listener requires something like new EventListener() { @Override public void onEvent(Event e) { ... } }, when just passing some declared event-handler method would be so much clearer.
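The kind of boilerplate being complained about looks roughly like this (ActionListener is the real AWT/Swing interface; firing the event by hand here just stands in for an actual button click):

```java
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;

public class ListenerBoilerplate {
    static String last = null;

    public static void main(String[] args) {
        // An entire anonymous class just to register one function:
        ActionListener listener = new ActionListener() {
            @Override
            public void actionPerformed(ActionEvent e) {
                last = e.getActionCommand();
                System.out.println("handled: " + last);
            }
        };
        // Fire the event by hand so the sketch runs without a GUI:
        listener.actionPerformed(
            new ActionEvent("button", ActionEvent.ACTION_PERFORMED, "click"));
    }
}
```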

At first I thought this might be an early April Fool's Day joke. But I don't think it is.

I'd really like to see some of the deprecated parts of the JDK removed.

April 1st in two days, but I guess some people can't wait ;)

Anyone have a link to the actual slides from Oracle?

Not sure, but maybe this:


I hate articles that don't mention their sources.

I think I will switch to Google Go
