Java is still available at zero-cost (joda.org)
175 points by lemming on Aug 31, 2018 | 162 comments

Honest question: How often do people upgrade their production language?

Me: once in a decade, maybe. I'm still perfectly happy (and productive) with C# 3.5, Java 7 and Python 2. I'm grudgingly moving forward because the old toolchains are being deprecated now, but my code isn't significantly improved by a few fancy idioms available in a new language, and it barely affects the design decisions. Upgrading your library or framework can happen more often, but I tend to still be fairly conservative. Upgrading your language for production twice a year sounds crazy to me.

In the JavaScript world people use those holiday chocolate calendar things to count the days until a new version of their favorite compiler will come out and can't sleep the night before because they want to be sure they can update at the very instant the release is dropped (and then proceed to complain that their favorite framework didn't update for the quirks of the new compiler in a timely manner).

I wish I was completely joking.

Yep, and that's why JS is moving and everyone else is biting the dust lol.

Progress needs to be made and those who can't follow don't have the right to complain, really. Old tools are always available because language compatibility is preserved (which, imo, should also be dropped; we would have a much better language as a result).

We also stick to LTS (in my case node LTS)

Not to pick a fight, but even with all the recent improvements I am of the opinion that it has a long way to go. There are no explicit abstract base classes, there are base objects you can't transparently inherit from (thinking of promises here in particular. Why can't I transparently override built-in promises for customization? I know you can inherit from one, but you can't modify a parent's constructor), and it lacks lazy class invocation (looking at you, `new` keyword).

Honestly, I wish the ECMAScript board had just gone the other way and doubled down on making JS a functional language instead of trying to push more OOP approaches. It's always been a little wonky as a prototype-based OO language anyway.

That's a bit dangerous. There's good in both the functional approach and type systems, but I'd rather see them where they are, in TypeScript and lodash, and have vanilla be as vanilla as possible. Time will tell; there's already been a case where things pioneered elsewhere were pulled into ES (CoffeeScript).

I’m curious what you mean by dangerous? I’m having trouble discerning what you mean.

If you mean going into a functional paradigm, I'd say it's definitely a matter of taste.

However my biggest gripe is that the ECMAScript board seems intent on being all things to all people. So they half-implemented classes. And they added some neat data types. But God forbid you want to override their constructors, yet if I make a class I can override my constructors as one would expect in a class-based language. Though forcing the new keyword onto classes wasn't a great paradigm to keep, imo. Nor do I like how static methods and properties are treated.

Also decorators are only for classes? Why does that make any sense?

It’s really screwy.

At least going the opposite direction would’ve forced correctness on a baseline.

If they want to be an OOP language with classes, the standard needs to embrace it more fully, and I see little evidence of that.

ECMAScript has no strong opinion on what paradigm it wants to be in. Originally, it could have looked like a more Algol-y Self. But so many kludges and 'pragmatic' additions were made early on to make working with the DOM 'easier' that the object-orientation aspect never really took shape in any of the core APIs.

The reason that's critically concerning for the language and its ecosystem is that since 'vanilla JS' has no opinion, all the libraries created for it tend to have to form their own opinion. This then means that when binding to many ES libraries, you're torn between a hundred flavors of functional, half-baked OO, procedural, or just bit buckets of inconsistency.

If the language can't make up its mind, you're guaranteed to see fragmentation by its users.

> Also decorators are only for classes? Why does that make any sense?

The general thought is that you technically only need them for classes (and really, only for class members and properties). For anything else you can just use a higher-order function. Not as pretty, but JS decorators aren't about syntax sugar; they're about filling one of the bazillion gaps in the language that open when you introduce a class keyword.

Of course, the spec for decorators has been thrown away and started over once already (so basically no one using TypeScript/Angular/MobX/whatever with decorators is actually spec-compliant these days), and the new spec has been in limbo for years. Decorators aren't part of JS at all right now.

> language compatibility is preserved

I think that Go and ECMAScript have shown that language compatibility is critical. The language changes that really improve quality of life change the way that you write code in that language, which means you have to rewrite your existing code to opt in to them. Small language changes may not actually be worth the cost of adoption.

> which, imo should also be dropped, we would have much better language as a result

This just creates two different languages. The older language only dies when everyone stops using it. The Ruby community managed to do that with 1.9 but Python 2 still exists alongside Python 3.

JavaScript is slow as hell

Multiple times a year. If your system is well designed it should be trivial to move between different versions. As languages evolve they tend to get faster and fix more bugs. Your tests should catch it if the new version breaks anything.

If you have performance concerns you can update a single server and see how it performs relative to the others. Once you're comfortable that everything is working well you can gradually migrate more servers until everything is running the new version.

Each service should have its language version pinned instead of relying on a single global version. That way you can upgrade projects incrementally.
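For a Java/Maven service, for example, the version can be pinned in the project's pom rather than inherited from whatever JDK is on the build host (a sketch; the property assumes a reasonably recent maven-compiler-plugin):

```xml
<!-- Pin this service's Java release in the pom so upgrades happen
     per project, not via a single global JDK on the build machine. -->
<properties>
    <maven.compiler.release>11</maven.compiler.release>
</properties>
```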

If you're really worried you can keep both versions of the language installed at the same time so you can quickly revert in case something goes wrong.

> If your system is well designed it should be trivial to move between different versions.

Not many systems are well designed.

As a Java developer I've always been happy moving up as soon as possible, if I see a benefit. I was quite happy to move up to 1.5 for generics. Java 8 was fantastic because you got streams and lambdas. Some of the updates in 10 and 11 will take bigger changes to the codebase to take advantage of.
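For instance, the Java 8 jump changed how everyday collection code reads; a minimal sketch of a streams-and-lambdas pipeline that would have been an explicit loop before:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class StreamDemo {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("ada", "grace", "alan");

        // Before Java 8 this would be a loop with an accumulator list;
        // streams and lambdas express the same pipeline declaratively.
        List<String> upper = names.stream()
                .filter(n -> n.startsWith("a"))   // keep names starting with "a"
                .map(String::toUpperCase)         // transform each element
                .collect(Collectors.toList());

        System.out.println(upper); // [ADA, ALAN]
    }
}
```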

But even if you weren't using the new features, there was basically no downside to upgrading the runtime. The newer version usually just ran much better thanks to improvements in threading or garbage collection. And you knew you'd get security patches, even if you ended up getting stuck on that version for a while.

As you said, libraries and frameworks can be a big issue to upgrade.

But the backwards compatibility promises of Java have been so incredibly strong it hasn’t been an issue.

The one time breakage in Java 9 is basically the first I’ve experienced in my career.

More breakages are expected going forward, though. That's the whole idea of releasing early and often (and having customers pay for backward compat). Note I'm not saying Oracle has no right to monetize Java, though it could be said that what makes Java valuable is the (mostly OSS) ecosystem around it, rather than Java the language itself or the JVM.

> As a Java developer, ...

I've been developing Java for a living since almost the beginning (since 1997), but I'd seriously recommend not tying yourself to a particular language as in "Java developer". Java is but a tool among others, and I generally can work better with devs that know more than just Java, for the pragmatism and utilitarianism they bring, rather than Java devs dogmatizing and bragging about the latest fad in Java idioms and libs.

Actually I don't think there will be more breaking changes than there were in the past due to the new release cadence. Mark Reinhold gave a great talk about the values they try to follow when moving the platform forward: https://m.youtube.com/watch?v=HpbchS5kmio

This applies to the new release cadence as well. But time will tell if they will achieve their goals.

With Go: every release. Free performance improvements and backwards compatibility. It also doesn't require anything installed on the server, so it's just updating a ci script.

Go is kind of special, as it is relatively new and the initial implementation of every feature is usually the most basic possible. Thus, with every new version there is room for a lot of improvements. What I mean: the garbage collector is/was for all intents and purposes Dijkstra's late-70s ('78) tricolor algorithm.

They eliminated re-scanning of the stack in Go 1.7 by adding a hybrid write barrier, which is '90s work (found the proposal: https://github.com/golang/proposal/blob/master/design/17503-...). So Go is by no means state-of-the-art performance-wise. That is not bad and has its benefits, one of them being that you can improve with every release.

> So Go is by no means state-of-the-art performance wise.

Yet Go is blazingly fast compared to most other languages that are widely used for web server development (except, maybe, Java).

Well, that's a heavily caveated statement.

It seems, given the end of Moore's "Law", as well as environmental concerns, that language efficiency will get more attention. It'll be interesting to see if languages like Swift, Rust, and Julia can gain traction for web development.

I wonder how uptake has been for IBM Kitura so far...

Same for us for Go. The releases have never caused any trouble and the performance improvements are nice to have. For us it’s just changing one base containers version.

That being said we still use mostly Python 2 for our Python services.

If those Python services are going to last beyond 2020 and there's much risk of attackers probing for security vulnerabilities, I suggest you plan a transition to Python 3 during the remaining ~2-year support lifespan of Python 2.

It's a lot easier of a transition than it used to be. Any other supported replacement language would of course also work to keep you secure, but honestly might be more transition work.

If you're a low-value target running unimportant stuff on an airgapped machine, this might not matter.

Not that I rush into using the newest C# language features... But C# > 3.5 has some great changes that are very easy to learn and worthwhile. The new string interpolation feature is so good I don't even allow old-style string.Formats in code reviews.

> The new string interpolation feature is so good I don't even allow old-style string.Formats in code reviews.

String interpolation doesn't fully replace String.Format, because it does not provide a way to re-arrange where the placeholders go in the format string, which is important for localization.
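The same limitation holds in Java: a format string loaded from a translation resource can reorder positional placeholders, while a string interpolated inline fixes the order in code. A sketch using Java's `String.format` positional indices (the templates are illustrative):

```java
public class LocalizedFormat {
    public static void main(String[] args) {
        int count = 3;
        String dir = "/tmp";

        // English template: placeholders appear in argument order.
        String en = String.format("Deleted %1$d files from %2$s", count, dir);

        // A translated template can reorder the positional placeholders
        // (%2$s before %1$d) with no code change, which an interpolated
        // string baked into the source cannot do.
        String de = String.format("Aus %2$s wurden %1$d Dateien gelöscht", count, dir);

        System.out.println(en); // Deleted 3 files from /tmp
        System.out.println(de); // Aus /tmp wurden 3 Dateien gelöscht
    }
}
```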

I was burned, badly, when trying to port a huge ERP from .NET 1.1 to 2.? (if I remember the numbers correctly). Other projects just got the slowdown, but that one was truly a nightmare. The brain trauma made me forget much of it.

After that, I move very, very fast. I prefer to hit "small" pains in the short term rather than try to do a huge jump later.

This means that as long as I'm coding on a project (I'm a contractor and do freelancing) I upgrade to whatever stable version my tools throw at me ASAP. And even minor versions or sometimes betas.

Not only language(s): I keep different codebases in different languages going at the same time (one main active plus maintenance), and I try to move everything: OS, RDBMS, IDEs, editors, frameworks, libraries, etc. I probably hit "upgrade" several times per week!

ONLY after the project stabilizes and I'm down to a few changes per year do I stop doing that. And if the project somehow accelerates again I try to port ASAP.

How crazy is this? If the ecosystem I'm working in is nuts (i.e. JS), this reckless behavior shows it very clearly.. QUICKLY.

...and then I need to pay more attention to my dependencies (keep them small) and choose only stable stuff.

I don't hit breaking changes all that often. And when I see something painful, I pin versions and keep them for a while.

This means that I move ASAP and stay current as much as possible. I slow down when entering production, and slow down/stop again when the customer(s) stop requesting work.

This demands using logging and having some testing, fast deployment/rollback, automated backups and source control very early in the project, but these are good things anyway. (And maybe a few VMs or dockerized stuff, but not that many, despite more than 20 years of doing this crazy thing.)

So far, this has worked great for me, and I never get stuck for months in rewrites like in the past. I got used to rewriting fast, which compensates for the fact that I'm a slow developer.

That is bonkers that someone signed off on a huge project with .NET 1.1. Early .NET is so off. It feels like two teams of programmers at Microsoft weren't talking to each other but working on features for the same language.

Back in the day, MS was the gospel in my country (Colombia). We took everything they said, except programming in Visual C++.

It's insane, a complete waste of time, and a sign that the software is turning to garbage: releasing major API-breaking changes just for the developers' own goals, not to meet the goals of the user. Same goes for frameworks, to a lesser extent. If one can't point out any real advantage of upgrading to a major version other than that the developers want you to, that's the sign of broken software that's about to become insecure, because the developers can't stop or even slow down enough for upgrading to be practical.

At my current job, I've been working on a few apps for almost five years and they are all version one. It's almost impossible to keep up with all the upgrades and deprecated packages. The only way to justify it to the business is to point out that the only reason we need to upgrade is because the original developers have abandoned the software and therefore we cannot get security patches. The new versions, for the most part, offer nothing new but demand days of research, upgrading, and testing.

Ironically the language was the easiest part to upgrade in our case (PHP), because it's one of the few that values backwards compatibility and has sane major version upgrades. The frameworks... Angular 1 is all I need to say, and it just goes on from there. I wish the original developers had thought about this. Finding decent open source software to write apps that will live for more than a couple of years is extremely difficult. If only we had some reputation system that rated developers and companies on long-term support; that would help.

Part of the issue here is likely the pressure that dev teams feel to be outputting <italics>something</italics> rather than nothing at all. There definitely seems to be a huge overemphasis on quantity over quality.

Yup. I'm convinced this is the reason 95% of updates exist today. If it ain't broke don't fix it has been totally forgotten and it shows in today's broken ass software.

Lol, this cracked me up

Not sure if this is a stylistic choice, but if you want, you can do actual italics like this:

  You can do actual *italics* like this.

Thanks, I'll keep that in mind for the future. What is the markup flavor being used by HN?

My company runs its Scala apps on Java 8. That's the version officially recommended by the Scala project:

> For most users, we still recommend using Java 8 for compiling (and running) Scala code.


I don't know about "people", but I know that the companies and organizations that I've worked for and with stay reasonably close to the leading edge for security and feature reasons.

And if you have a reasonable testing environment, it's not rocket surgery to figure out how much of a problem a new version might be.

Our stack is quite "boring" in that regard. Common Lisp (SBCL), Free Pascal, Perl and Go. They all run the latest versions and we never had an issue upgrading these.

Not upgrading your language every time a stable version is released seems crazy to me.

Eventually you are going to have to upgrade, and if you are 10 years behind it's going to be a huge, painful migration. If you upgrade every six months, each upgrade is easy and barely noticeable.

What you are doing seems to me like deciding not to change the oil in your car. It might save you a bit in the short term, but it could come back to hurt you in a big way.

Successive versions of Python have some really great features that keep pulling me in. Pathlib for scripting, asyncio for concurrency, f-strings for everyone. Similar in Django.

I'm sure if I had to deploy anywhere else than a server I ultimately control, I'd be a lot more conservative, but until then, this stuff makes things fun.

I’m always an early adopter for new language features on personal projects. I love playing with new things. But I use boring, reliable technology when somebody is paying me for it. As much as I like playing with shiny new things, I’m not going to bet my professional reputation on them.

Imagine you are a software developer that sells a hospital management system built on Java. You need to prove your software is built on secure technology. So if you are using Java 11, and suppose it is known that there are no more security updates, you cannot claim security.

What I understand from the article: Oracle is not supporting Java 11 after 6 months, but others still will, so nothing to worry about.

> Imagine you are a software developer that sells a hospital management system built on Java

Then you're probably earning enough to pay Oracle for continued support. 6mo is just the no-cost timeframe.

Fast: from C++11 to C++14 to C++17 soon..

The problem is that if you stay on old language versions, you're doomed sooner or later (see the world moving to Python 3.7, Java 11, C# 7, Go 2, ..).

If you consider then JavaScript, things are moving at speed of light. Today's framework might be ancient in a couple of months (React version??)

> if you stay on old language versions, you're doomed soon or later (see the world moving to [...] Go 2)

Go 2 is notable on that list because the Go developers have already promised that all Go 1 code will continue to compile and work with Go 2. It's not a 2.0, it's more like 1.15 or 1.20.

I guess they learned from the Python 2->3 transition.

It's not a "learned" thing. Go can pull it off, because it's much younger, and doesn't have so much accumulated cruft. Eventually they will - and then they would either have to make the decision to break compatibility in some future major release - or become the next Java in a sense that there are parts of the language and the library that are warts upon warts, that no new code will ever use, and that confuse new developers, but which are needed for old code to run.

I work in kotlin, I update the language several times a year, with each new stable release.

We upgrade rolling release style all the time.

New features can make us very productive and honestly I don’t regret the aggressive approach

3.5 does not support TLS 1.2, hence our move.

Swift every year or so. C++ every 3 years.

I have a friend working at a company that is having “very important” discussions about this.

They need to keep up with security patches (for obvious reasons) but the six month limit has them scared. The consensus seems to be that they don’t believe they can keep up with an upgrade schedule that fast and are thus forced to pay for support from someone or hope that the community backports security fixes far enough for the rate at which they feel comfortable upgrading.

It doesn’t sound like they actually know what their upgrade rate would be, but they’re sure that it’s much slower than six months.

I am not sure I completely understand the implications of this. But if OpenJDK is the reference implementation, and as long as someone is going to maintain Java 11, there really isn't much to worry about. I worked for a company that completely relied on Java. They also relied on Apache, so they had on their payroll two developers that wrote code for Apache and Tomcat. Making wild guesses here, but if the OpenJDK project gets 300 companies with one contributor each to write code for it, they will continue supporting Java 11. Maintaining Java is in some way a responsibility of every company that benefits from it. Or maybe I am so out of touch with technology that I am absolutely wrong.

The question is, and we won't know the answer until this has been going on for a while, how many releases back will continue to get security updates from the community.

Will they decide on arbitrary long-term releases? Maybe 11 and 15 and then 19 get three years of support?

Or will the community extend the length of security patch releases by some sort of fixed margin, say 6 to 12 months, and leave it at that.

Right now no one truly knows how it’s going to work out.

I find it interesting that a company that pushes out updates much more frequently than every six months is uncomfortable updating the software THEY use every six months.

It’s not like there should be big breaking changes problems, you can rely on the TCK to know that your code will continue to work.

We know that Red Hat is aiming to support Java 11 LTS for a minimum of 6 years, and OpenJDK 8 till the end of life of Red Hat 7.x.

Source https://access.redhat.com/articles/3409141

Hiring developers to maintain a core OSS project that your company depends on is actually a really good idea! Besides securing the future of that OSS, you now have an "insider" on your team who knows the ins and outs of that software. Having some performance problems? Sue compiles the code every day and can take a look. Need a special filesystem hook? Jane knows the code inside and out and can make it happen. Genius!

What's being missed here is that there is a whole slew of vendors that support OpenJDK. Those patches will come from Red Hat, IBM, Azul, etc. as well.

The question is for how long. Three months? Six months? A year?

The community will provide updates, but at this point no one knows what the eventual pattern will be, which makes planning based on it difficult.

Not sure that's relevant either. Within the target environments you're still looking at 5 to 10 years just for Red Hat: https://access.redhat.com/articles/1299013

Azul publishes this as well: https://www.azul.com/products/azul_support_roadmap/

Cursory google searching is all it takes to find these things.

Waiting 6 months also does not make much sense when it comes to security patches. It seems like spending time to engineer processes to allow faster upgrades (CI/CD deployments is one popular route), would be a worthwhile investment.

It's hard to overestimate how fragile some enterprise projects are: no unit tests, no CI, dependencies on an unknown number of third-party components, the original developers left years ago, and so on.

Up until around 2008, clock speed was increasing rapidly and the bottleneck to systems performance kept changing. Software was being obsoleted in a matter of years. Then things started changing; evidenced perhaps by Windows upgrades suddenly becoming a lot less relevant and Apple starting to steamroll more modular computer builders like Dell.

Mark my words, there will come a day where continuously upgrading software will start to have a clearly higher cost than old, boring and stable. 2018 to me is the first year that a software platform can have had a true decade to stabilise in an environment that isn't constantly shifting and changing at a hardware level.

Maybe trends like in graphics cards will shake things up, but I believe the advantage is going to move to stable, slow-upgrading, well-supported platforms with large bases of well-trained developers. Reducing LTS windows will be a disadvantage for the Java ecosystem.

The drive to move quickly with software wasn't to keep up with ever changing hardware, it was to respond quickly to the market. Treat every release like an experiment. The more experiments you can run the more likely you are to understand and solve the needs of your users.

Why would someone pick software that gets feature releases and bug fixes on a yearly cycle when similar software is available that has weekly or daily releases?

>The more experiments you can run the more likely you are to understand and solve the needs of your users.

There is an interesting chapter on this in Donella Meadows' Thinking in Systems:

Oscillations! A single step up in sales causes inventory to drop. The car dealer watches long enough to be sure the higher sales rate is going to last. Then she begins to order more cars to both cover the new rate of sales and bring the inventory up. But it takes time for the orders to come in. During that time inventory drops further, so orders have to go up a little more, to bring inventory back up to ten days’ coverage. Eventually, the larger volume of orders starts arriving, and inventory recovers—and more than recovers, because during the time of uncertainty about the actual trend, the owner has ordered too much. She now sees her mistake, and cuts back, but there are still high past orders coming in, so she orders even less. In fact, almost inevitably, since she still can’t be sure of what is going to happen next, she orders too little. Inventory gets too low again. And so forth, through a series of oscillations around the new desired inventory level. As Figure 33 illustrates, what a difference a few delays make!

Faster is not always better, especially in systems with delays. By reacting to initial cues prematurely you can easily end up in permanent disarray that simply turns into a chaotic feedback loop. It is often worthwhile to upgrade patiently, so one can actually gauge long-term effects instead of reacting to noise. Changing the consumer experience is a conversation.

We're talking about two completely different things here. I'm talking about responding to direct customer requests differently which strikes me (potentially naively) as a different thing than managing a supply chain.

Eh, I think we got some time for consumer performance improvements. Graphics cards and HDD/SSDs seem to have been steadily increasing (noticeably) but CPU's and RAM have not for consumers which are major bottlenecks.

My top of the line late 2013 rMBP has got 4 cores, 16GB of ram, and 512GB NVMe SSD. rMBPs just recently upgraded to 6 cores and 32GB of ram (more storage too but that isn't a big deal nowadays since speed was the biggest optimization for storage and it is still NVMe IMO). However, server CPU and RAM has increased. You can now rent an AWS x1e.32xlarge with 64 cores and 4TB of RAM and thanks to AMD, that competition seems to finally be flowing back to consumer computers (At least for CPUs).

Just look at people being amazed at threadrippers 32 freaking cores on a desktop (matches AWS x1 when you consider the x1 has dual sockets) and we still haven't passed down the gains in RAM (2TB of RAM if it matches the x1).

When those performance gains pass down to consumer computers, users will have up to 6.33 times the amount of CPU and 64 freaking times the amount of RAM!

> When those performance gains pass down to consumer computers, users will have up to 6.33 times the amount of CPU and 64 freaking times the amount of RAM!

This will only happen once there is a usecase for that much RAM in a consumer machine that justifies the massive increase in power consumption. I don't see such a usecase yet. (The only thing that I can think of that pushes hardware requirements on consumer machines is VR/AR, but that's mostly a GPU problem.)

Many notebook models (also the previous Macbooks, I think) were stuck at 16 GB RAM because that's the maximum for low-power DDR3.

Java moving to 6 month release cycles is good. OpenJDK is good. RedHat providing long term support is good. All of this is good.

Where I'm worried this fails is with dependencies. Also in the context of developing plugins for other Java projects. For example building a plugin for IntelliJ and the different versions of IntelliJ users may have.

It's rare that Java the language or the JDK breaks backward-compatibility. Code without generics, for example, still compiles with JDK 10, albeit with a bunch of warnings, so long as you specify an appropriate language level.
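For example, raw (pre-generics) collection code still compiles on a modern JDK, just with an "unchecked" warning rather than an error (a minimal sketch):

```java
import java.util.ArrayList;
import java.util.List;

public class RawTypes {
    public static void main(String[] args) {
        // Pre-Java-5 style: a raw List with no type parameter. Modern
        // javac accepts this with an unchecked warning, not an error.
        List names = new ArrayList();
        names.add("legacy");

        // Callers cast manually, exactly as they did before generics.
        String first = (String) names.get(0);
        System.out.println(first); // legacy
    }
}
```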

The ABI, however, can and does break, usually when one of your dependencies updates to a language level not supported by your toolchain or runtime environment. The common solutions are to update your toolchain and language level, "desugar" the artifact if possible, or to not update the dependency. This is something the Android world has been dealing with for a long time (since Java 8 was introduced) and it's not a completely terrible situation.

The problem you mention regarding IntelliJ and its (lack of) stable APIs has more to do with IntelliJ than the languages used or the tools with which it is built.

The biggest issue I've met with the ABI issue is frameworks (Spring etc.) that emit bytecode at runtime.

I tried upgrading our project to Java 10 and Spring 4 and Camel (I can't remember the version) were the main issues because of their runtime generated proxies. The fix is to upgrade Spring to Spring 5, but that's a whole bunch of other work, and I'm unsure of the scope of work to upgrade Camel.

All that said, Java ABI is far more forgiving than Scala ABI. Every Scala artifact is appended with an "underscore Scala version" and SBT uses the '%%' operator to handle this implicitly, and it causes me no end of grief.
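The `%%` convention looks like this in a build.sbt (a sketch; the artifact and versions are illustrative, 2018-era coordinates):

```scala
// build.sbt sketch: %% appends the Scala binary version to the artifact
// name, so this dependency resolves to spark-core_2.11 when
// scalaVersion is 2.11.x, and to spark-core_2.12 under 2.12.x.
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"
```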

Last time I looked at upgrading our Spark code to use JRE 10, Scala broke because JRE 10 support only came in at Scala 2.12 (but was backported to Scala 2.11 recently) and Spark only supports Scala 2.10 and 2.11 at the moment, apparently the work to support 2.12 is still ongoing.

I remember running into an issue sort of like that with Struts (IIRC). I could use Java 8 anywhere in the codebase except on JSP files because Struts would choke on the new bytecodes it didn’t know were now valid.

Actually it works if you use a recent Eclipse ecj for compiling the JSPs, i.e. Tomcat 8+ will be OK with such feature use, like default methods.

Java has legendary backwards compatibility and has had that since the first public release. That’s not going anywhere.

This changed somewhat starting with Java 9. Quite a few libraries broke down due to deprecated classes finally being removed and the version format change.

Java also has a considerable class of libraries and tools that tends to break on every major release: mocking frameworks, dependency injection libraries, annotation processors and everything heavily relying on bytecode generation. I'd say most Java projects contain at least one of these, so migrating to a newer Java is not just a matter of installing a new JDK and changing your JAVA_HOME.

Java 9 is sort of a special situation because they finally turned off something that people were never supposed to do in the first place, and that they'd been warning people about for a very long time.

My entire career has been as a Java developer, I’ve never had an issue updating Java to a newer version causing breakage.

Upgrading third-party tools, like maven or various other things, can certainly do that. But I haven’t had it happen with Java.

I do wonder if that will be slowly changing under new ownership. Java 9 and 10 had some pretty risky changes, compared to the prior releases. Take a look at the migration guide.


Usually you don't see sentences like 'For every tool and third-party library that you use, you may need to have an updated version that supports at least JDK 9.', nor 'Check the websites for your third-party libraries and your tool vendors for a version of each library or tool that’s designed to work on JDK 9 or 10.' as things to watch for in a Java release.

“Compared to previous releases“ is the issue there.

They removed classes that they had been telling people not to use for... well... since they were put in.

It had to happen someday. The fact that they let it go on for, what, most of two decades? That’s commitment to backwards compatibility.

With the new module system they can make sure that this doesn’t happen again.

My main issue with the Java 9 upgrade was the separation of the javax.* apis into a separate module, but it was easily fixed with a Maven artifact.
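The commenter doesn't say which artifact; for the `javax.xml.bind` case, the usual fix looks something like this (the version shown is just an example):

```xml
<!-- JAXB was dropped from the JDK's default module graph in Java 9;
     adding it back as an ordinary dependency restores javax.xml.bind.* -->
<dependency>
    <groupId>javax.xml.bind</groupId>
    <artifactId>jaxb-api</artifactId>
    <version>2.3.0</version>
</dependency>
```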

Is this a major concern? Isn't this part of why Java works to maintain backwards compatibility?

Java has full backward compatibility.

Just for comparison....

Microsoft supports .Net Core LTS releases for three years....


Did they learn the definition of "long term support" from phone manufacturers?

.Net 1.1 (I looked this up when I had to install it earlier this week...) was supported for 10 years, until 2013, and as part of Windows Server 2003 (32-bit) it was supported until 2015, so it looks like they're going backwards too.

.Net Core (open source) and .Net Framework (Windows Only) have different support policies.

.Net Framework has a much longer support lifecycle.


How frequently do they release LTS releases?

The six-month security patch limit would be a DISASTER in the Java world if releases still happened every 2 to 3 years like they used to. Everyone would be stuck with large periods where no security patches were available without paying.

But that’s not happening here. Yes, you’re limited to six months of free security patches, but a new version comes out every six months. So you can stick with Oracle and never pay a dime as long as you stay reasonably up to date.

How frequently do they release LTS releases?

It’s in the link.

One could argue that LTS software is just a trap (and a particularly lucrative one). I know there are likely a couple of factors that allow me to say this, namely:

1. I can rule over our dependency and technology choices with an iron fist if need be

2. I live on post-3.6 Python as our main language, and for anyone with good CI/CD you can always run your tests against the master branch and the latest dev builds, at least one or two versions ahead

Okay, so with that said, I feel like LTS versions of software are a trap for this reason. You end up in situations where there is so much pain with upgrading that you spend ever more time slogging through incompatibilities in the upgrade pipeline (the longer you put it off, the worse this gets), or you end up having to maintain a forked version of the code base while you upgrade things, or you have to do a lot of monkey patching and workarounds just to get things running (famously, this is the approach GitHub used to upgrade across different versions of Rails)

The alternative is to pay money to companies that will maintain the version of what you are using, but even then you inevitably have to move forward, and that leads to nasty vendor lock-in

I personally believe that having a robust CI/CD platform and a well-tested code base allows you to bypass this whole charade. We always have our code tested against the next incremental release in testing, the next version's dev branch, and a rolling git pull-and-build that exercises our code base. We really do expect things to break, but it lets us plan against our working code quite accurately to avoid this problem altogether, and now we largely update pretty close to release cadence.

I don’t know if this helps anyone, but I suggest doing software development in this fashion will save you a ton of headache. Being up to date with rolling release strategies has helped our productivity immensely, because we can think about our dependencies, and we have really whittled them down to just a few core ones. At a certain point we also forked some of the smaller ones, and maintaining those ourselves has been a flawless experience

Not upgrading your dependencies in-line with upstream is a form of perverse technical debt -- perverse because most technical debt is incurred to get new features/releases quicker, whereas the technical debt from not upgrading is incurred to simply stay where you are.

People really don't understand the cost of bit rot -- especially business people. If it worked 10 years ago, and it hasn't changed, why can't we use it now? I happen to have an old Rails project that my team has to maintain. Rails 2, Ruby 1.8.7. When we stopped being able to even compile Ruby 1.8.7 with the latest versions of GCC, I finally got approval to upgrade.

Things that business people can't understand is that nobody supports the language version any more and you can't build it with modern tools. You can't run the old OS any more, which has the old tools to build your code because the OS doesn't run on new hardware. I've been in the situation where I've had a server that was practically unique. It was so old that the only hardware that could run it, was the machine that was already running it. If that died, it was game over. (I should probably mention that I tend to specialise in difficult legacy situations in my career ;-) ).

The idea that you need to do maintenance work on your software, even without fixing bugs and without adding features, is incredibly difficult for business people to swallow. So they naively allow the situation to get very, very bad (and then they have to hire someone like me).

Right. If you stick with the herd, it's easy to find other people who are in your situation and share the burden of upkeep. Sticking on a platform past its prime means taking on a far larger share of the support burden yourself.

Nice idea on clarifying what the word "free" means in this context (gratis, not libre) by prefixing it with a dollar sign. This is the first time I saw this, is this usage widespread?

I've never seen it either and found it to be immediately clear. I wonder what similar prefixes could be used for "libre not gratis" and "libre and gratis". I know that "Free" could stand in for the second, but it's more ambiguous than "$free" was in this article.

After I knew what they meant it made sense; initially I wondered if it was a version of that old programmer joke of a shell variable. Like $DEITY.

On a scale from 1 to 10, it's gonna suck 15 when the tech you rely on is bought by Oracle.

Actually Java got saved by Oracle, and Java 8 would never have happened under Sun management. Let's say Java got much, much better under Oracle. I know how hard it is to tell someone Oracle did something good, but it actually did.

I didn't keep up when it happened. What happened with Sun?

Happy to hear that it is not all bad

The TL;DR is that Red Hat and IBM are going to take over from Oracle as primary maintainers of java 8 (in the form of openjdk) after January 2019. That is likely to last through at least 2022.

Going forward Oracle will be primary maintainers of openjdk for each new version of java for six months. Every third version (starting with 11) the hope and expectation is that RH, IBM, et al will take over after those six months are up and continue to support those versions for at least several years.

That's such a weird arrangement from a people perspective, but if it works out, it bodes pretty well for the longevity of Java.

That sounds like a disaster, the developers at oracle no longer have any personal incentive when it comes to producing quality, maintainable software because it is no longer a problem they have to deal with. Because they're not feeling any personal pain from bad decisions they will no longer learn from them.

This is close to the consultancy business model that's been at the root of so much bad software.

What makes you say that? The Oracle JDK IS the Open JDK, only with a few extra features. At least that’s my understanding. A shared code base.

If they do a terrible job of sticking a feature in they’re going to have to keep handling it until it’s cleaned up or removed.

The author exposes his agenda by downplaying Oracle's decisions that impact Java availability to businesses and FUDing other options, e.g. he says that RedHat will push security fixes to OpenJDK first and repackage afterward, as if it was something bad.

What does this mean for me as an engineer who writes Java every day?

Use openjdk instead. By default on linux, that's what will continue to happen. You won't really be impacted by this.

There’s no need to use Open JDK.

If that’s your preference or you have some other reason (like the fact it’s probably pre-installed in a Linux distribution) there’s nothing wrong with it.

But there’s absolutely no need for a developer to switch off of Oracle unless you absolutely cannot move to Java 9 or newer.

Java 9 is already no longer supported. If you haven't moved yet go directly to 11.

As there are, after 11, no features in the Oracle JDK that are not in OpenJDK, I would think moving to the vendor-supported (Open)JDK on your Linux system is the smart default thing to do.

how's the performance compared to Oracle's (in Linux)?

They are the exact same. Oracle moved the core bits to openjdk long ago. See: https://jaxenter.com/green-interview-jdk-binaries-137078.htm...

Basically nothing. If you can run Java 9 or newer (because you don’t have dependency issues) then you can continue to run the Oracle JDK and just upgrade for security patches.

> Build OpenJDK yourself. [...] I suspect this not a very realistic choice for most companies.

Are they saying that most firms don't have the ability to build software themselves, or is this a situation particular to Java where they don't backport security patches and most firms aren't on the latest major?

Right now I would expect that basically every company using Java is using the official Oracle version. They’ve never had to worry that they can’t get security patches for the version released 14 months ago.

Practically everyone can use OpenJDK as a drop in replacement. But right now we don’t know how long security patches will continue for each release.

I would hope most firms are on Java 8 now. It wouldn’t surprise me to see people who are still on six or seven.

Nine/ten had a compatibility break (by removing access to internal packages that no one was supposed to be using). Many companies haven’t upgraded because of that. Lots of very popular Java libraries weren’t ready or depended on other libraries that weren’t ready, so upgrading may have been difficult even if you wanted to.

That seems to be a one time thing though. At this point I don’t see why companies shouldn’t be able to upgrade relatively frequently to the latest major release.

I am not well-versed in the Java world and am curious about one thing.

The valuable part of the Oracle JDK is the GC. For example, G1 is only going to be available on OpenJDK 9. If there's a new GC being developed, will OpenJDK have the new GC?

There is a new GC being developed (ZGC), and it's available as part of OpenJDK under GPL license.

See this thread from earlier today: https://news.ycombinator.com/item?id=17875944

My understanding is that Open JDK is where all Java language development takes place now.

It’s only some special enterprise features that aren’t part of the language itself that are restricted to the paid version of the Oracle JDK.

Were (past tense); there is nothing in Oracle JDK 11 that is not in OpenJDK 11.

Oh, I thought there might still be some high end EE stuff.

Thanks for the clarification.

G1 works fine on OpenJDK 8, you just have to pass the command-line parameters to enable it. It was made the default in OpenJDK 9.
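Concretely, something like this (the jar name is just a placeholder):

```shell
# Java 8: G1 must be requested explicitly
java -XX:+UseG1GC -jar myapp.jar

# Java 9+: G1 is already the default collector, no flag needed
java -jar myapp.jar
```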

Kudos to RH. Here's hoping that we don't have to keep adding JIRA issues to update Java every 6 months. I have enough of this on the Javascript side as it is..

If Java is open and free, then why did Android decide to rewrite Java? I notice a lot of companies seem to avoid Java so there must be some sort of a catch.

Android is quickly moving away from Java to make Kotlin the default language nowadays, it's probably in part due to the Java lawsuit (and in part to add language features quicker without changing the VM).

They aren't rewriting the Java parts.

Kotlin just gets an additional library to make some SDK usage a bit easier.

No indeed but they are clearly phasing out Java as the default language in the future, I don't see much plan to fully adopt Java 8 (only partial support for now) and Java 9 (no support) for Android. Since they are using their custom ART Runtime, it's not even impossible they would add VM instructions specific to Kotlin in the future (I don't have any knowledge of that yet but that would not surprise me).

That will only happen when the Android frameworks and lower-level stack get rewritten in Kotlin.

No mention of Kotlin in Treble documentation, only Java and C++.

It is like hoping UNIX derivatives will use anything other than C on their kernels.

AOSP commit messages show initial support for Java 9 and their clamp down in reflection for private APIs done in Android P seems to be a step towards Java modules.

I see an easier path just rebooting everything with Fuchsia than having them rewriting Android in Kotlin.

You're a long time out of date at this point. Google merged openjdk a long time ago for their base JVM.


They didn't fully merge OpenJDK, just selected parts of it, across multiple Android versions.

Compatibility is still kind of broken.

It isn't possible to guarantee that any random Java library is usable on Android.

Yes not all of it - there are still android specific bits yet. Enough of it is openjdk though.

And given how updates work....

Because when Sun was still around that wasn't the case and Google didn't want to pay for licenses.

Here is what Gosling has to say on it.


Where is this going to leave GraalVM and the JIT for ARM? I believe both of those are exclusive to Oracle. And I was kind of excited to use GraalVM.

GraalVM has two versions, CE and EE. EE is closed but CE is open.

Boy, that Ask.com toolbar must be super lucrative.

Java: malware the moment you click install!

The irony is that you probably use Windows, the world's most insecure, malware-, virus- and ransomware-ridden OS.

Java is still available at zero-price

Adoptopenjdk! Azul! OpenJRE! J9! choice is a great thing!

What worries me most is the muscle memory of developers downloading the Oracle JDK from Oracle. If Oracle keeps its dark patterns, thousands of developers will be using the commercial tool and on the hook for license fees.

That is not how it works.

You can download the Oracle JDK and use it forever, commercially, for free. You just won't get updates backported to your version unless you pay for long term support.

Find that in writing. Everything I read says jdk 8 is the last free Oracle jdk.


"This is as in many other projects, except it seems that releases of the LTS version, dubbed to be Oracle JDK, will no longer be free. This means that companies looking to stay on a specific version for more than six months would need to get updates from a commercial operator or apply patches from later free OpenJDK versions manually."

Then read the article you're commenting on?

I did. Here is the money quote:

Firstly, it is not clear that there will be an Oracle JDK that is $free to download. Despite my best attempts, I could not get 100% clarity on this point, but see also this tweet.

Oracle has never announced that they’re charging for their JDK. In all the talk over this stuff over the last few years they’ve never said that. They’ve been quite clear that it will still be free to download.

The only change is that free security patches will now be limited to six months after a release.

If you are a professional writing Java code, nine times out of ten someone in your org will be responsible for volume licensing.

Exactly my point. Oracle is going to come knocking and shaking down enterprises for millions because they still pull the JDK from Oracle instead of OpenJDK.

But as cwyers said, there's a person in charge of volume licensing. That person should have a way of auditing everyone's system to determine if they are using properly licensed products.

They can just check for whether Ask! toolbars are installed in the browser.

Maybe it's time to start moving away from Java?

I'm curious what your preferred alternative to Java is?

.Net Core. It’s maintained by a company that actually knows how to support a platform.

The alternative is, and always has been, a language and runtime lib that is spec'd in the open and has multiple implementations to choose from. Though choices have become limited. Only C, C++, and JavaScript seem to fit the bill AFAICS.

And Java is moving into this world as well. It always had multiple implementations but mostly people used the Sun and then Oracle binaries. Now they are saying get your implementation from your linux vendor (e.g. gcc like) or pay us (intel C compiler like) if you want long term support.


Common Lisp?

Python as well

Doesn't .Net Core have 3 years of LTS support, while Java LTS releases get 4 years? And are you comparing the huge ecosystem around Java to .Net Core's?

You aren’t paying Microsoft for long term support for those three years.

Are you saying that .Net doesn’t have a robust ecosystem?

I have a couple of WP devs that want to talk with you.

Because Java has had such a terrible support record. And it’s not like Oracle has any products they support.

Microsoft has quite a history of creating and then abandoning different APIs and technologies.

This is all pointless hyperbole. This isn't that big a change in the Java ecosystem. It's not like it's going to cost $20 to run Java at all.

The price is for running Oracle’s Java... that’s older than six months... that you want security updates for.

You can run a different JVM, you can get the security patches from someone else, you can go without them. You can pay someone other than Oracle. Free options still exist.

Microsoft is known for a lot of things, but breaking backwards compatibility and short-term support aren't two of them.

Microsoft broke backwards compatibility when they introduced generics in .NET 2.0. Java, on the other hand, is still getting flak for refusing to break backwards compatibility to introduce reified generics.

Which to me is a good thing, Microsoft ripped the band-aid early and got it over with instead of delaying the inevitable.

I think generics in Java works pretty well.

Should it have been there from the beginning? Yes, but it would have delayed the release of JDK 1.0.

People often complain about type erasure, but I have seen very few examples of everyday programming where it would have helped. I think type erasure was a reasonable compromise.

The only real use case I can see is value types.

You want List<int> to be backed by array of ints, not Integer objects. But there are already plans to add this to JVM, once value types are added.

How did generics break backwards compatibility?

> Bytecode or ‘Intermediate Language’ (IL) changes. The main place that the implementation of generics in the CLR differs from the JVM is that they are ‘fully reified’ instead of using ‘type erasure’; this was possible because the CLR designers were willing to break backwards compatibility, whereas the JVM had been around longer, so I assume that this was a much less appealing option. For more discussion on this issue see Erasure vs reification and Reified Generics for Java. Update: this HackerNews discussion is also worth a read.
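To make the erasure point concrete, a minimal Java sketch (the class name is just for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();

        // On the JVM, the type parameter is erased at compile time:
        // both lists share the exact same runtime class.
        System.out.println(strings.getClass() == ints.getClass()); // prints "true"

        // By contrast, with the CLR's reified generics, List<string> and
        // List<int> are distinct runtime types, and List<int> can be backed
        // by unboxed ints rather than boxed objects.
    }
}
```

That erasure is why old pre-generics bytecode kept working on Java 5, and why the CLR had to break compatibility to get reification.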

