Me: once in a decade, maybe. I'm still perfectly happy (and productive) with C# 3.5, Java 7 and Python 2. I'm grudgingly moving forward because the old toolchains are being deprecated now, but my code isn't significantly improved by a few fancy idioms available in a new language, and it barely affects the design decisions. Upgrading your library or framework can happen more often, but I tend to still be fairly conservative. Upgrading your language for production twice a year sounds crazy to me.
I wish I was completely joking.
Progress needs to be made, and those who can't follow really don't have the right to complain. Old tools are always available because language compatibility is preserved (which, imo, should also be dropped; we would have a much better language as a result).
We also stick to LTS (in my case node LTS)
Honestly I wish the ECMAScript board had just gone the other way and doubled down on making JS a functional language instead of trying to push more OOP approaches. It’s always been a little wonky as a prototype-based OO language anyway.
If you mean going into a functional paradigm, I’d say it’s definitely a matter of taste.
However, my biggest gripe is that the ECMAScript board seems intent on being all things to all people. So they half-implemented classes. And they added some neat data types. But god forbid you want to override their constructors, yet if I make a class I can override my constructors, as one would expect in a class-based language. Though forcing the new keyword onto classes wasn’t a great paradigm to keep, imo. Nor do I like how static methods and properties are treated.
Also decorators are only for classes? Why does that make any sense?
It’s really screwy.
At least going the opposite direction would’ve forced correctness on a baseline.
If they want to be an OOP language with classes, the standards need to embrace it more fully, and I see little evidence of that.
The reason that's critically concerning for the language and its ecosystem is that since 'vanilla JS' has no opinion, all the libraries created for it tend to have to form their own opinion. This then means that when binding to many ES libraries, you're torn between a hundred flavors of functional, half-baked OO, procedural, or just bit buckets of inconsistency.
If the language can't make up its mind, you're guaranteed to see fragmentation by its users.
The general thought is that you technically only need them for classes (and really, only for class members and properties). For anything else you can just use a higher-order function. Not as pretty, but JS decorators aren't about syntactic sugar; they're about filling one of the bazillion gaps in the language that open up when you introduce a class keyword.
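That comment is about JS, but the higher-order-function alternative it describes is language-agnostic. A minimal sketch of the same idea in Java (all names here are made up for illustration): a function that takes a function and returns a wrapped version, with no decorator syntax involved at all.

```java
import java.util.function.Function;

public class HofDecorator {
    // Higher-order function: takes a function, returns a wrapped version
    // of it that logs its input before delegating.
    static <T, R> Function<T, R> logged(String name, Function<T, R> f) {
        return input -> {
            System.out.println("calling " + name + " with " + input);
            return f.apply(input);
        };
    }

    public static void main(String[] args) {
        Function<Integer, Integer> doubled = logged("double", x -> x * 2);
        System.out.println("result: " + doubled.apply(21));
    }
}
```

The wrapping happens at the call site instead of being attached declaratively to a class member, which is the ergonomic gap decorators are meant to close.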
Of course, the spec for decorators has been thrown away and started over once already (so basically no one using typescript/angular/mobx/whatever with decorators is actually spec compliant these days), and the new spec has been in limbo for years. Decorators aren't part of JS at all right now.
I think that Go and ECMAScript have shown that language compatibility is critical. The language changes that really improve quality of life change the way that you write code in that language, which means you have to rewrite your existing code to opt in to them. Small language changes may not actually be worth the cost of adoption.
This just creates two different languages. The older language only dies when everyone stops using it. The Ruby community managed to do that with 1.9 but Python 2 still exists alongside Python 3.
If you have performance concerns you can update a single server and see how it performs relative to the others. Once you're comfortable that everything is working well you can gradually migrate more servers until everything is running the new version.
Each service should have its language version pinned instead of relying on a single global version. That way you can upgrade projects incrementally.
If you're really worried you can keep both versions of the language installed at the same time so you can quickly revert in case something goes wrong.
Not many systems are well designed.
But even if you weren’t using the new features, there was basically no downside to upgrading the runtime. The newer version usually ran much better thanks to improvements in threading or garbage collection. And you knew you’d get security patches, even if you ended up getting stuck on that version for a while.
As you said, libraries and frameworks can be a big issue to upgrade.
But the backwards compatibility promises of Java have been so incredibly strong it hasn’t been an issue.
The one time breakage in Java 9 is basically the first I’ve experienced in my career.
> As a Java developer, ...
I've been developing Java for a living since almost the beginning (since 1997), but I'd seriously recommend not tying yourself to a particular language as in "Java developer". Java is but a tool among others, and I generally work better with devs who know more than just Java, for the pragmatism and utilitarianism they bring, than with Java devs dogmatizing and bragging about the latest fad in Java idioms and libs.
This applies to the new release cadence as well. But time will tell if they will achieve their goals.
They eliminated re-scanning of the stack in Go 1.7 by adding a hybrid write barrier, which is '90s-era work (found the proposal: https://github.com/golang/proposal/blob/master/design/17503-...). So Go is by no means state of the art performance-wise. That's not bad, and it has its benefits, one of them being that you can improve with every release.
Yet Go is blazingly fast compared to most other languages that are widely used for web server development (except, maybe, Java).
It seems, given the end of Moore's "Law", as well as environmental concerns, that language efficiency will get more attention. It'll be interesting to see if languages like Swift, Rust, and Julia can gain traction for web development.
I wonder how uptake has been for IBM Kitura so far...
That being said we still use mostly Python 2 for our Python services.
It's a much easier transition than it used to be. Any other supported replacement language would of course also work to keep you secure, but honestly might be more transition work.
If you're a low-value target running unimportant stuff on an airgapped machine, this might not matter.
String interpolation doesn't fully replace String.Format, because it does not provide a way to re-arrange where the placeholders go in the format string, which is important for localization.
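The comment is about C#'s String.Format, but Java's String.format illustrates the same point: explicit argument indexes let a translated format string reorder the placeholders, which interpolation-style concatenation can't express. The name patterns below are invented examples of an English order versus a locale that puts the family name first.

```java
public class FormatOrder {
    public static void main(String[] args) {
        String given = "Ada", family = "Lovelace";
        // %1$s and %2$s refer to arguments by position, so a localized
        // pattern can consume the same arguments in a different order.
        String english = String.format("%1$s %2$s", given, family);
        String reordered = String.format("%2$s, %1$s", given, family);
        System.out.println(english + " | " + reordered);
    }
}
```

Because only the pattern string changes, the reordering can live in a resource bundle per locale while the call site stays identical.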
After that, I move very, very fast. I prefer to hit "small" pains in the short term than to try a huge jump later.
This means that as long as I'm coding in a project (I'm a contractor and do freelancing) I upgrade to whatever stable version my tools throw at me ASAP. Even minor versions, or sometimes betas.
Not only the language(s): since I keep different codebases in different languages at the same time (one main active, plus maintenance), I try to move everything: OS, RDBMS, IDEs, editors, frameworks, libraries, etc. I probably hit "upgrade" several times per week!
ONLY after the project stabilizes and I'm down to a few changes per year do I stop doing that. And if the project somehow accelerates again I try to port ASAP.
How crazy is this? If the ecosystem I'm working in is nuts (e.g. JS), this reckless behavior very clearly shows it... QUICKLY.
...and then I need to pay more attention to my dependencies (keep them small) and choose only stable stuff.
I don't hit breaking changes that often. And when I see something painful, I pin versions and keep them for a while.
This means that I move ASAP and stay current as much as possible. I slow down when I enter production, and slow down/stop again when the customer(s) stop requesting work.
This demands using logging, having some testing, fast deployment/rollback, automated backups, and source control very early in the project, but these are good things anyway. (And maybe a few VMs or dockerized stuff, but not that many, despite more than 20 years of doing this crazy thing.)
So far this has worked great for me, and I never get stuck for months in rewrites like in the past. I've gotten used to rewriting fast, which compensates for the fact that I'm a slow developer.
You can do actual *italics* like this.
> For most users, we still recommend using Java 8 for compiling (and running) Scala code.
And if you have a reasonable testing environment, it's not rocket surgery to figure out how much of a problem a new version might be.
Eventually you are going to have to upgrade, and if you are 10 years behind it's going to be a huge, painful migration. If you upgrade every six months, each upgrade is easy and barely noticeable.
What you are doing seems to me like deciding not to change the oil in your car. It might save you a bit in the short term, but it could come back to hurt you in a big way.
I'm sure if I had to deploy anywhere else than a server I ultimately control, I'd be a lot more conservative, but until then, this stuff makes things fun.
What I understand from the article: Oracle is not supporting Java 11 after 6 months, but others still will, so nothing to worry about.
Then you're probably earning enough to pay Oracle for continued support. 6mo is just the no-cost timeframe.
The problem is that if you stay on old language versions, you're doomed sooner or later (see the world moving to Python 3.7, Java 11, C# 7, Go 2, ...).
Go 2 is notable on that list because the Go developers have already promised that all Go 1 code will continue to compile and work with Go 2. It's not a 2.0, it's more like 1.15 or 1.20.
I guess they learned from the Python 2->3 transition.
New features can make us very productive and honestly I don’t regret the aggressive approach
They need to keep up with security patches (for obvious reasons) but the six month limit has them scared. The consensus seems to be that they don’t believe they can keep up with an upgrade schedule that fast and are thus forced to pay for support from someone or hope that the community backports security fixes far enough for the rate at which they feel comfortable upgrading.
It doesn’t sound like they actually know what their upgrade rate would be, but they’re sure that it’s much slower than six months.
Will they decide on arbitrary long-term releases? Maybe 11 and 15 and then 19 get three years of support?
Or will the community extend the length of security patch releases by some sort of fixed margin, say 6 to 12 months, and leave it at that.
Right now no one truly knows how it’s going to work out.
I find it interesting that a company that pushes out updates much more frequently than every six months is uncomfortable updating the software THEY use every six months.
It’s not like there should be big breaking-change problems; you can rely on the TCK to know that your code will continue to work.
The community will provide updates, but at this point no one knows what the eventual pattern will be, which makes planning based on it difficult.
Azul publishes this as well: https://www.azul.com/products/azul_support_roadmap/
Cursory google searching is all it takes to find these things.
Mark my words, there will come a day where continuously upgrading software will start to have a clearly higher cost than old, boring and stable. 2018 to me is the first year that a software platform can have had a true decade to stabilise in an environment that isn't constantly shifting and changing at a hardware level.
Maybe trends like in graphics cards will shake things up, but I believe the advantage is going to move to stable, slowly upgrading, well-supported platforms with large bases of well-trained developers. Reducing LTS windows will be a disadvantage for the Java ecosystem.
Why would someone pick software that gets feature releases and bug fixes on a yearly cycle when similar software is available that has weekly or daily releases?
There is an interesting chapter on this in Donella Meadows' Thinking in Systems:
> Oscillations! A single step up in sales causes inventory to drop. The car dealer watches long enough to be sure the higher sales rate is going to last. Then she begins to order more cars to both cover the new rate of sales and bring the inventory up. But it takes time for the orders to come in. During that time inventory drops further, so orders have to go up a little more, to bring inventory back up to ten days’ coverage.
>
> Eventually, the larger volume of orders starts arriving, and inventory recovers—and more than recovers, because during the time of uncertainty about the actual trend, the owner has ordered too much. She now sees her mistake, and cuts back, but there are still high past orders coming in, so she orders even less. In fact, almost inevitably, since she still can’t be sure of what is going to happen next, she orders too little. Inventory gets too low again. And so forth, through a series of oscillations around the new desired inventory level. As Figure 33 illustrates, what a difference a few …
Faster is not always better, especially in systems with delays. By reacting to initial cues prematurely you can easily end up in permanent disarray that simply turns into a chaotic feedback loop. It is often worthwhile to upgrade patiently, so you can actually gauge long-term effects instead of reacting to noise.
My top of the line late 2013 rMBP has got 4 cores, 16GB of ram, and 512GB NVMe SSD. rMBPs just recently upgraded to 6 cores and 32GB of ram (more storage too but that isn't a big deal nowadays since speed was the biggest optimization for storage and it is still NVMe IMO). However, server CPU and RAM has increased. You can now rent an AWS x1e.32xlarge with 64 cores and 4TB of RAM and thanks to AMD, that competition seems to finally be flowing back to consumer computers (At least for CPUs).
Just look at people being amazed at threadrippers 32 freaking cores on a desktop (matches AWS x1 when you consider the x1 has dual sockets) and we still haven't passed down the gains in RAM (2TB of RAM if it matches the x1).
When those performance gains pass down to consumer computers, users will have up to 6.33 times the amount of CPU and 64 freaking times the amount of RAM!
This will only happen once there is a usecase for that much RAM in a consumer machine that justifies the massive increase in power consumption. I don't see such a usecase yet. (The only thing that I can think of that pushes hardware requirements on consumer machines is VR/AR, but that's mostly a GPU problem.)
Many notebook models (also the previous Macbooks, I think) were stuck at 16 GB RAM because that's the maximum for low-power DDR3.
Where I'm worried this fails is with dependencies. Also in the context of developing plugins for other Java projects. For example building a plugin for IntelliJ and the different versions of IntelliJ users may have.
The ABI, however, can and does break, usually when one of your dependencies updates to a language level not supported by your toolchain or runtime environment. The common solutions are to update your toolchain and language level, "desugar" the artifact if possible, or to not update the dependency. This is something the Android world has been dealing with for a long time (since Java 8 was introduced) and it's not a completely terrible situation.
The problem you mention regarding IntelliJ and its (lack of) stable APIs has more to do with IntelliJ than the languages used or the tools with which it is built.
I tried upgrading our project to Java 10 and Spring 4 and Camel (I can't remember the version) were the main issues because of their runtime generated proxies. The fix is to upgrade Spring to Spring 5, but that's a whole bunch of other work, and I'm unsure of the scope of work to upgrade Camel.
All that said, Java ABI is far more forgiving than Scala ABI. Every Scala artifact is appended with an "underscore Scala version" and SBT uses the '%%' operator to handle this implicitly, and it causes me no end of grief.
Last time I looked at upgrading our Spark code to use JRE 10, Scala broke because JRE 10 support only came in at Scala 2.12 (but was backported to Scala 2.11 recently) and Spark only supports Scala 2.10 and 2.11 at the moment, apparently the work to support 2.12 is still ongoing.
Java also has a considerable class of libraries and tools that tends to break on every major release: mocking frameworks, dependency injection libraries, annotation processors and everything heavily relying on bytecode generation. I'd say most Java projects contain at least one of these, so migrating to a newer Java is not just a matter of installing a new JDK and changing your JAVA_HOME.
My entire career has been as a Java developer, I’ve never had an issue updating Java to a newer version causing breakage.
Upgrading third-party tools, like maven or various other things, can certainly do that. But I haven’t had it happen with Java.
Usually you don't see sentences like 'For every tool and third-party library that you use, you may need to have an updated version that supports at least JDK 9.', nor 'Check the websites for your third-party libraries and your tool vendors for a version of each library or tool that’s designed to work on JDK 9 or 10.' as things to watch for in a Java release.
They removed classes that they had been telling people not to use for... well... since they were put in.
It had to happen someday. The fact that they let it go on for, what, most of two decades? That’s commitment to backwards compatibility.
With the new module system they can make sure that this doesn’t happen again.
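A sketch of the mechanism, with hypothetical module and package names: a module declaration only exposes the packages it explicitly exports, so internal packages can no longer silently become de facto public API (and they aren't reachable via plain reflection from outside the module unless the module opens them).

```java
// module-info.java for a hypothetical library
module com.example.lib {
    // Only this package is visible to consumers...
    exports com.example.lib.api;
    // ...while com.example.lib.internal stays encapsulated,
    // so nobody can come to depend on it by accident.
}
```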
Microsoft supports .Net Core LTS releases for three years....
.net 1.1 (I looked this up when I had to install it earlier this week...) was supported for 10 years until 2013 and as part of Windows Server 2003 (32 bit) it was supported until 2015 so it looks like they're going backwards too.
.Net Framework has a much longer support lifecycle.
The six-month security patch limit would be a DISASTER in the Java world if releases still happened every 2 to 3 years like they used to. Everyone would be stuck with long periods where no security patches were available without paying.
But that’s not what's happening here. Yes, you’re limited to six months of free security patches, but a new version comes out every six months. So you can stick with Oracle and never pay a dime as long as you stay reasonably up to date.
It’s in the link.
1. I can rule over our dependency and technology choices with an iron fist if need be
2. I live on post-3.6 Python as our main language, and for anyone with good CI/CD you can always run your tests against the master branch and the latest dev builds at least one or two versions ahead
Okay, so with that said, I feel like LTS versions of software are a trap for this reason. You end up in situations where there is so much pain with upgrading that you spend ever more time slogging through incompatibilities in the upgrade pipeline (the longer you put it off, the worse this gets), or you end up having to maintain a forked version of the codebase while you upgrade things, or you have to do a lot of monkey patching and workarounds just to get things running (famously, this is the approach GitHub used to upgrade across different versions of Rails)
The alternative is to pay money to companies that will maintain the version of what you are using, but even then you inevitably have to move forward, and that leads to nasty vendor lock-in
I personally believe that having a robust CI/CD platform and a well-tested code base allows you to bypass this whole charade. We always have our code tested against the next incremental release in testing, the next version's dev branch, and a rolling git pull-and-build. We really do expect things to break, but it lets us plan quite accurately around our working code and avoid this problem altogether, and now we largely update pretty close to the release cadence.
I don’t know if this helps anyone, but I suggest that doing software development in this fashion will save you a ton of headache. Being up to date with rolling release strategies has helped our productivity immensely because we can think about our dependencies, and we have really whittled them down to just a few core ones. At a certain point we also forked some of the smaller ones, and maintaining those ourselves has been a flawless experience
Things that business people can't understand is that nobody supports the language version any more and you can't build it with modern tools. You can't run the old OS any more, which has the old tools to build your code because the OS doesn't run on new hardware. I've been in the situation where I've had a server that was practically unique. It was so old that the only hardware that could run it, was the machine that was already running it. If that died, it was game over. (I should probably mention that I tend to specialise in difficult legacy situations in my career ;-) ).
The idea that you need to do maintenance work on your software, even without fixing bugs and without adding features, is incredibly difficult for business people to swallow. So they naively allow the situation to get very, very bad (and then they have to hire someone like me).
Going forward, Oracle will be the primary maintainer of OpenJDK for each new version of Java for six months. For every third version (starting with 11), the hope and expectation is that RH, IBM, et al. will take over after those six months are up and continue to support those versions for at least several years.
This is close to the consultancy business model that's been at the root of so much bad software.
If they do a terrible job of sticking a feature in they’re going to have to keep handling it until it’s cleaned up or removed.
If that’s your preference or you have some other reason (like the fact it’s probably pre-installed in a Linux distribution) there’s nothing wrong with it.
But there’s absolutely no need for a developer to switch off of Oracle unless you absolutely cannot move to Java 9 or newer.
Since, as of 11, there are no features in the Oracle JDK that are not in OpenJDK, I would think moving to the vendor-supported (Open)JDK on your Linux system is the smart default thing to do.
Are they saying that most firms don't have the ability to build software themselves, or is this a situation particular to Java where they don't backport security patches and most firms aren't on the latest major?
Practically everyone can use OpenJDK as a drop in replacement. But right now we don’t know how long security patches will continue for each release.
I would hope most firms are on Java 8 now. It wouldn’t surprise me to see people who are still on six or seven.
Nine/ten had a compatibility break (by removing access to internal packages that no one was supposed to be using). Many companies haven’t upgraded because of that. Lots of very popular Java libraries weren’t ready or depended on other libraries that weren’t ready, so upgrading may have been difficult even if you wanted to.
That seems to be a one time thing though. At this point I don’t see why companies shouldn’t be able to upgrade relatively frequently to the latest major release.
The valuable part of Oracle JDK is the GC. For example G1 is only going to be available on OpenJDK 9. What if there's a new GC being developed, will OpenJDK have the new GC?
See this thread from earlier today: https://news.ycombinator.com/item?id=17875944
It’s only some special enterprise features that aren’t part of the language itself that are restricted to the paid version of the Oracle JDK.
Thanks for the clarification.
Kotlin just gets an additional library to make some SDK usage a bit easier.
No mention of Kotlin in Treble documentation, only Java and C++.
It is like hoping UNIX derivatives will use anything other than C on their kernels.
AOSP commit messages show initial support for Java 9 and their clamp down in reflection for private APIs done in Android P seems to be a step towards Java modules.
I see an easier path just rebooting everything with Fuchsia than having them rewriting Android in Kotlin.
Compatibility is still kind of broken.
It isn't possible to guarantee that any random Java library is usable on Android.
Here is what Gosling has to say on it.
You can download the Oracle JDK and use it forever, commercially, for free. You just won't get updates backported to your version unless you pay for long term support.
"This is as in many other projects, except it seems that releases of the LTS version, dubbed to be Oracle JDK, will no longer be free. This means that companies looking to stay on a specific version for more than six months would need to get updates from a commercial operator or apply patches from later free OpenJDK versions manually."
Firstly, it is not clear that there will be an Oracle JDK that is $free to download. Despite my best attempts, I could not get 100% clarity on this point, but see also this tweet.
The only change is that free security patches will now be limited to six months after a release.
Are you saying that .Net doesn’t have a robust ecosystem?
Microsoft has quite a history of creating and then abandoning different APIs and technologies.
This is all pointless hyperbole. This isn’t that big change in the Java ecosystem. It’s not like it’s going to cost $20 to run Java at all.
The price is for running Oracle’s Java... that’s older than six months... that you want security updates for.
You can run a different JVM, you can get the security patches from someone else, you can go without them. You can pay someone other than Oracle. Free options still exist.
Should it have been there from the beginning? Yes, but it would have delayed the release of JDK 1.0.
People often complain about type erasure, but I have seen very few examples of everyday programming where it would have helped. I think type erasure was a reasonable compromise.
The only real use case I can see is value types.
You want List<int> to be backed by array of ints, not Integer objects. But there are already plans to add this to JVM, once value types are added.
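Both halves of that tradeoff can be seen directly; a small sketch (class name is made up):

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        // Erasure: the type arguments are gone at runtime, so both
        // lists share the exact same runtime class.
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        System.out.println(strings.getClass() == ints.getClass()); // true

        // There is no List<int>: 42 is autoboxed to an Integer object,
        // so the list holds references to objects, not a flat int[].
        ints.add(42);
        System.out.println(ints.get(0).getClass().getSimpleName()); // Integer
    }
}
```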