Hacker News
What Java Modules Are About (inside.java)
124 points by pjmlp 10 days ago | 91 comments





IMO the Java modules feature is a failure. I'm using Java and I've yet to find a single library using modules. I know that they allowed the JDK itself to be modularized, yet I've yet to find any advantage of that fact either. Everyone still downloads and runs an entire JVM, and it's not like the JVM became slim and tiny; it's still hundreds of megabytes. Those who need slim apps use GraalVM, which does some kind of tree shaking, AFAIK, rather than relying on modules.

All that work, which broke plenty of Java applications, violating the main Java mantra of backward compatibility, all for nothing.


1. The breaking of some libraries in JDK 9 had very little to do with modules and was mostly due to weak encapsulation. Almost all issues stemmed from non-portable libraries hacking into JDK internals and becoming entangled with the particulars of JDK 8. When those internals changed, they broke. With modules' strong encapsulation now finally enabled, this won't happen again. In other words, modules weren't the cause of the breakages; they were added to prevent them in the future. Unfortunately, they were blamed for the very problems they were created to prevent, because they were the most famous feature in JDK 9.

2. Modules are becoming the core of Java's security strategy. Without them, your code only does what you think it does until some library decides it should do something else.

3. The reduction of security and maintenance concerns has already yielded tremendous dividends, as we the JDK developers have been freed to work on other stuff. Your complaint is like someone finally getting a fast internet connection and saying that all the noise and dirt of tearing up the street to put in the cable ducts was in vain; they should have just given us fast internet, because that's all we wanted.

So, we've got better security and a drastic reduction in future breakages, all for negligible backward-incompatible changes to the spec.


I agree it hasn’t gotten adoption, but you get a pretty slim JVM if you properly modularize. Another poster in this thread said he got his image down to 30mb.

Helidon is probably the only library I know that is modularized [0].

I think once the community finally moves past jdk 8 (I’m hoping spring making 17 the base next year will help) we’ll see more adoption as more libraries modularize.

[0] https://github.com/oracle/helidon


Half agreed, but, there are some mitigating things to consider.

One is that native code is really verbose compared to bytecode, and jlink optimizes bytecode to shrink it when it's modularized (collapsing the string tables, among other things). For tiny hello-world apps, native-image can produce smaller downloads, but it doesn't take long before a jlinked JVM is smaller, because the JIT compiler is, in effect, "decompressing" native code on the fly. A minimal jlinked JVM can be small-ish, like 20 MB or less, and if you compile a custom JVM with configure flags tuned for size it can be smaller still. The smallest jlinked hello world I managed to get when I tried this was a 7 MB download. Not small for a program that only prints something, but, obviously, most of that is a one-time cost.

Another is that jlink+modules is way easier to use and configure than native-image. Tree shaking breaks stuff all over the place. You end up needing complex config files, agent-based config auto generation and/or lots of runtime testing to ensure you got everything. The point of "reliable configuration" is to avoid all this by building a less aggressive form of tree shaking into the language itself.

Quite a few libraries support modules at this point, but you may not actually have noticed. It depends on your build tool and things.

One thing that's slowed it down is that a lot of widely used libraries that sit at the bottom of the dependency stack aren't actually well maintained. And modules can only depend on other modules, so SLF4J, Guava, etc. have been slowing things down a lot.

My feeling is that the module system is kind of underrated and waiting for better tooling to be fully exploited. Quite a few parts of Java are like this. Like, it took quite some years for the ecosystem to start using annotation processors to full advantage, and some popular libraries still have it as a pending feature (like PicoCLI).


Java modules haven't been adopted because people are still getting off of JDK 8 as it becomes deprecated.

It will only make sense for library and framework maintainers to invest in modules once there is a large user base on JDK 11.

Either way, modules are still useful for application development. I used them years ago to close off internal APIs in one of my applications, which helped me to not cross levels of abstraction accidentally.
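A minimal sketch of what closing off internals can look like (the module and package names here are hypothetical):

```java
// module-info.java: only the api package is readable by other modules;
// com.example.app.internal stays inaccessible at compile time and at runtime,
// even though the classes inside it are declared public
module com.example.app {
    exports com.example.app.api;
}
```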


Here's the list of all modules on Maven Central: https://github.com/sormuras/modules/blob/main/com.github.sor...

> Everyone still downloads and runs an entire JVM

Nope, we (I know, anecdote) use jlink in Docker containers, mostly to avoid useless dependencies. The problem with GraalVM native image is that it comes with a subpar GC.

> which broke plenty of Java applications

Yes, I agree, modules were not worth the trouble.


> All that work, which broke plenty of Java applications, violating main Java mantra of backward compatibility, all for nothing.

How did modules break Java applications? Having recently upgraded a number of applications from 8 to 11, none of the things we had to fix were due to modules. They were largely due to libraries relying on internal implementation details of Java 8 which needed dealing with, the exact sort of thing that modules are looking to prevent going forwards (see "strong encapsulation" in TFA).

We've ignored modules so far with no issues.


He's probably referring to the removal of some EE classes that were in the JDK (but shouldn't have been).

Or the better protection of Java internals (warnings about using internal classes in JDK 11; in JDK 17 there are no warnings, you need --add-opens or have to drop the broken lib).


Yeah, the removal of EE classes was probably the biggest burden for my team (completely unrelated to modules).

IMO the main issue with modules is that strong encapsulation is actually a bad goal. If the focus was solely on reliable configuration, I think that the transition would have gone better.

WRT why I think strong encapsulation is a bad goal: I've worked extensively in .NET, where assembly-private members are the norm for core libraries. It has issues. Classes and methods that are theoretically extensible aren't so in practice, because core logic is inaccessible.

In the end, encapsulation, no matter how strong, doesn't prevent people from relying on internal implementation details. If the concern is people relying on "internal" APIs, either suck it up and make those APIs public (with all that entails) or come up with a deprecation strategy (or just freeze them and refactor around them).


> Everyone still downloads and runs an entire JVM.

They do? I work in the Scandinavian public sector, and we abandoned Java two or three years ago because of the continued security issues it posed. Software that requires an installed JVM to operate doesn't get past our IT department.

I'm not sure modules would help, though, as I don't know much about them. But not everyone still installs a JVM.


Java in the browser (applets) is insecure, because it can download and run untrusted code on the user's machine.

Java on the server, on the other hand, is as secure as any other runtime, i.e. only as good as the application written on it.


Alternatives like TeaVM let you write Java for the browser, but it compiles quickly to compact JavaScript. This way you can get the best of both worlds: a typed language, full-stack refactorings, and the security of the browser's JS VM.

TeaVM Website: https://teavm.org/

Tutorial in Java Magazine: https://blogs.oracle.com/javamagazine/java-in-the-browser-wi...


TypeScript has already filled the niche of a typed, OOP language in the browser for me. And it's surely much better supported than TeaVM.

It’s not Java in the browser, it’s the Java virtual environment required to run Java apps.

If that’s not the JVM then my mistake.


I had in my mind people who use Java for their projects, not in general sense.

What kind of security issues? Is a random native app more secure?

Let me guess, you've standardized on Electron apps?

Not exactly. Most of our apps are some form of client/server architecture which means most of them are moving to JavaScript clients, but typically it’s react, angular or vue.

Which Scandinavian public sector? Sounds like you are talking about client side java - applets maybe?

I'm not a Java developer, but when I've had to use Java, I've hated the classpath mechanism. (In part because I don't feel like I can use build-tool magic without understanding what is going on.)

This article presents modules as a solution to some classpath problems, but doesn't quite give me the detail on how.

Can anyone elaborate further on how modules will augment/replace/obviate/worsen my need to care about the classpath?


The Java classpath is "flat" by default: you can only have one version of any given class (and therefore de facto only one version of any given library, unless they move the library to a different package for each major version, which some libraries do) on the classpath. This has advantages: you avoid the Rust problem of "Foo is incompatible with Foo" because you had two different versions of Foo in your (transitive) dependencies. But it means that if some library you're using uses a utility library internally, that dependency "leaks" and "pollutes" your classpath: you can't use a different version of that utility library yourself.

Modules try to solve this by distinguishing between exported and internal dependencies. The idea is that if your library depends on some other library and uses that as part of its API, that should be an exported dependency (and a program that uses your library has to use the same version of that). But if your library depends on another library just for internal implementation, that gets hidden away from the application that uses your library.

Personally I'm extremely doubtful of this working, because for it to work, library authors would have to specify their metadata correctly. So I suspect it's only going to lead to bigger problems than before, as the classpath gets more complicated and there are more ways to go wrong. But we'll see.


Your analysis focuses on the wrong aspects of the Java module system.

It's not meant to solve the problem of having different library versions on the same classpath, and it doesn't solve it at all: you still must ensure that only one version of each class exists, even if no modules expose those packages (see the split-package problem).

The module system is for one thing above all: one more level of visibility, allowing Java developers to specify which packages should be visible to other modules.

If you expose a package, and that package has public types that expose another library's public types, then you must re-export those packages to make it clear that you have compile-time dependencies on the other library. That's checked, so it's not at all hard to specify the appropriate re-exports. The only thing you need to worry about when writing a library is to expose only those packages you want, allowing you to have public classes that are not visible to the outside world unless they are in an exported package.
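As a sketch (all module and package names hypothetical), the distinction looks like this in a module descriptor:

```java
// module-info.java for a hypothetical library
module com.example.lib {
    exports com.example.lib.api;

    // re-exported, because types from com.example.model appear in our
    // public API; consumers get to read it automatically
    requires transitive com.example.model;

    // implementation-only dependency, invisible to our consumers
    requires com.example.util;
}
```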

Another clarification: Java doesn't really have a single, flat classpath: you can create your own "nested" classpath if you want by instantiating your own class loaders. OSGi works by doing that for each "bundle" (library), which then does allow you to have multiple versions of the same library co-existing peacefully, as long as no two bundles which use different versions of that library attempt to re-export the library's packages to each other (which would fail when you tried to load those bundles into the runtime). You can absolutely get the Rust problem you mentioned in Java if you use your own class loaders, and I've seen that happen several times, even when both libraries are of the same version (a class loaded by one classloader is NOT the same as the class of the same exact name loaded by another, which is why classloaders are hierarchical and are not supposed to load classes already loaded by their parents).


"It's not meant to solve the problem of having different library versions on the same classpath"

It kind of is: there's a single API call that lets you instantiate a module graph with every module mapped to its own classloader. At that point, modules work as you'd expect, and package overlaps only create errors if they're both being imported into a single module (where it'd be ambiguous at the language level).
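The call in question is, I believe, ModuleLayer.defineModulesWithManyLoaders. A runnable sketch that resolves a platform module into a fresh layer (for third-party code you'd point ModuleFinder.of(...) at a directory of modular jars instead):

```java
import java.lang.module.Configuration;
import java.lang.module.ModuleFinder;
import java.util.Set;

public class LayerDemo {
    public static void main(String[] args) {
        ModuleLayer boot = ModuleLayer.boot();
        // resolve the root module against the system image
        Configuration cf = boot.configuration().resolve(
                ModuleFinder.of(), ModuleFinder.ofSystem(), Set.of("java.logging"));
        // every module resolved into the new layer gets its own class loader
        ModuleLayer layer = boot.defineModulesWithManyLoaders(
                cf, ClassLoader.getSystemClassLoader());
        // findModule searches this layer and its parents
        System.out.println(layer.findModule("java.logging").isPresent());
    }
}
```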

As the article explains, this isn't on by default only because (supposedly) some popular frameworks would break due to assumptions about how classloaders are used in practice.

And here we hit a sort of weakness in how Java is developed. I've seen this justification given several times over the years for why module isolation isn't on by default, but at no point are the frameworks in question named. Which frameworks break when this is on? And why can't I just flip a command-line flag to enable or disable it? The compatibility argument is left hanging with no details given, which is unfortunate; if the frameworks in question were named, maybe they'd fix their assumptions (or other people would), and classpath conflicts could be fixed.


> It kind of is - there's a single API call that lets you instantiate a module graph with every module mapped to its own classloader....

This is what I would expect -- and hence would likely solve most of my issues managing the classpath.


> The Java classpath is "flat" by default: you can only have one version of any given class [...] This has advantages: you avoid the Rust problem of "Foo is incompatible with Foo" because you had two different versions of Foo in your (transitive) dependencies.

The Java classpath is "flat" only if you use a single classloader. If you use multiple classloaders (for instance, with a plugin system where each plugin is loaded on its own classloader, to allow it to be dynamically loaded and unloaded), you can have multiple versions of any given class. At work, we have more than once had that "Foo is incompatible with Foo" problem (made more maddening because the "Foo is incompatible with Foo" error messages don't mention which classloader each "Foo" came from).


That's correct. This is especially noticeable with application servers like JBoss that deliver their own libraries on the classpath. Needing a slightly higher version of Hibernate is... interesting.

Can you elaborate on the Rust problem or give me a pointer to something describing it?

I always thought that Rust is perfectly happy to link in conflicting versions of a module into the same binary.


I believe the problem appears when module A returns a value of type B@0.1, and that gets passed on to module C, which actually expected a value of type B@0.2. This is a fundamental problem with having multiple versions of the same library in the same program. The problem is much worse with any kind of dynamic boxing support, since the visibility of the type can't be guaranteed statically.

Of course, for many programs, those values are not returned outside the original scope and all works well.


Cargo is closer to NPM. It does not require SAT resolution. You can have conflicting major versions of the same dependency.

This was my point as well? You can have different versions of the same library, as long as you do not try to pass around data from one version to the other - which would not work in any language anyway.

Only if there is no interaction between them in your code

My first experience with the JVM classpath was when I started using Clojure with zero Java experience. I quickly understood that it more or less works the same as the UNIX $PATH (although the devil is in the details; they are not identical), and since then I haven't really hit any issues. nodejs/npm does something similar: instead of being user-set by default, it defaults to "./node_modules && ./node_modules/*/node_modules" (recursively).

Most languages deal with paths in mostly similar ways (except Go, until recently). What precisely is your issue with the Java classpath that someone who has been writing Clojure code for years hasn't hit yet?


One issue is that you can't have a colon in your path names.

Is that something you come across frequently enough to hate the Java classpath? AFAIK, $PATH might be tripped up by that as well, as `:` is part of the delimiting syntax.

Just because I was curious, I tried searching for all directories containing ":" and besides /sys and /run, I got three hits. One because of "https://", one because of "localhost:8080" (both mirrors made from wget) and one where it actually seems to be on purpose.

It seems, to me at least, that having colon in path names is unpopular enough for this to not be a real problem.

Edit: the desktop I tried the command (`# find / -name ':' -type d`) on has been running daily since the beginning of 2018 (2196 packages installed atm), with daily pacman updates, so it should have enough data for an average workstation.


Colon is an invalid character in file names on Windows systems. I've used it when making log files that have an intermediate suffix with an ISO 8601 timestamp, but indeed I don't see a reason to have it in a path with class or jar files.

How does this work out on Windows?

Windows uses ';' as the path delimiter.

Why is that an issue?

Can't you just use an escape character?

Well, if you use Maven or Gradle, it will manage the classpath for you. If you are doing this manually, you indeed need to read up on how it works. It isn't particularly hard: it's like a file path, but for class and jar files, and it pretty much works the same way. Where the class-loading mechanism gets complicated is the notion that it's a runtime thing: Java allows loading and unloading things at runtime, as well as loading jars and classes into separate class loaders that can each have their own copies of the same, or even different, versions of class files. Not a lot of applications actually do that these days, but it used to be quite common.

I actually used OSGi back in the day (a more powerful module system) that relied on this for managing different components that each had their own dependencies and dependency versions. Neat, but no longer that common. OSGi actually emerged out of the embedded-software world. Later, Eclipse used it to componentize the Eclipse platform, and I know some companies that use OSGi for different things. These days you'd use microservices in some kind of Kubernetes container; or not (I'd consider that overkill for most of the stuff I work on).

The Java module system is much simpler and not really intended for this (at least not primarily). Instead, the main use case is ahead-of-time compilation. Modules allow parking things that can't be compiled ahead of time (e.g. because of reflection) in separate modules, which you can then exclude. Breaking up the JVM platform into modules was a big project. As far as I know, they are not that commonly used for anything else; at least, I've not encountered many projects that rely on this. It's a pretty low-level mechanism.

These days, for servers, you package up applications as Docker containers, which contain your application and the full JDK. Spring Boot, Ktor, and other modern frameworks are commonly packaged up as single-jar applications. Nothing magical about that; they are literally zip files with a manifest listing their content (classes, resources, more jar files). Of course, modularizing a single-jar application is kind of pointless. Likewise, there is no real need to tweak the Java runtime you run it with either; just not a thing. The notion of stuffing zip files in zip files is also a bit weird, but it kind of works as a packaging mechanism.


OSGi is quite common in Java-based CMS platforms, like Liferay, AEM and Hybris.

We still use OSGi today for hot deployment of customer-created content onto existing containers.

I've had the opposite experience. The classpath system in java is, in my experience and opinion, one of the better packaging/module discovery systems among all of the languages I've ever tried.

care to explain ?

I have had similar levels of hatred as OP toward Java packages for years; then with age I started to appreciate the abstraction (independence from the file-storage namespace, etc.), but I still find clean Python so much nicer 80% of the time.

I'm really curious about your point of view.


As someone moving from Java to Python recently, I'm more flabbergasted by Python's dependency management than I've ever been with Java's. "Classpath hell" hasn't been an issue for almost the last decade in my experience, and things generally just work. Maybe because the big dependencies everyone uses move slowly and are great at backwards compatibility, but upgrading something rarely breaks, so there's seldom an issue with two different versions of something.

And modules have solved the issue of "you're actually using something internal from our package that might break in the future".

But in Python, upgrading something minor quickly wreaks havoc. Everything depends on different versions of something, which are quickly incompatible. So you have to pin and do lots of manual testing and fight with pip for a while. It's very easy to import and use stuff not supposed to be part of a package's public API, etc. And I find it even harder to know what's actually in use between the global packages, the virtual environment, etc., than just looking inside my jar.


I feel your pain. Check out poetry if you haven’t seen it already as a modern alternative to pip.

Given a class and its package, I know exactly how to find the original code or class file.

Using the classpath, I can specify exactly the search order of classes, including overrides and whatnot, when the VM searches for them.

Need to debug a Jar? Unzip it (it's just a Zip file).

Jars can contain non-code resources, essentially allowing you to run an entire app using a single file, without the need to first decompress to the filesystem. Java gives you great utilities to find those resources.
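As a minimal runnable sketch of those utilities: resources are looked up with the same machinery as classes, and .class files themselves can be read this way:

```java
import java.io.InputStream;

public class ResourceDemo {
    public static void main(String[] args) throws Exception {
        // the relative name resolves against Object's package, java.lang;
        // .class resources are never encapsulated, so this works under modules too
        try (InputStream in = Object.class.getResourceAsStream("Object.class")) {
            byte[] bytes = in.readAllBytes();
            // every class file begins with the 0xCAFEBABE magic number
            System.out.println(Integer.toHexString(bytes[0] & 0xFF));
        }
    }
}
```

The same pattern works for config files, images, etc. bundled in your own jar.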

Classloaders can be customized and you have complete control over how the JVM sees code.

It's boring, to the point, and simple to understand. I don't get how people have so many issues with them...


But if it is a class that exists in multiple jars in different versions, due to dependencies of dependencies, then it is not so easy when the wrong class is loaded.

That is why application servers use multiple classloaders isolated from each other, and in stuff like WebSphere you can even specify which overrides which.

I am not familiar with WebSphere, and it is a long time since I have used WebLogic. Maybe it has the same feature? But it sounds like a tedious process to go through when trying to resolve such a problem. Though it also shows how serious the problem is, that the application server vendor builds such a feature.

Tomcat provides isolation between multiple webapps on the same Tomcat instance using a classloader hierarchy [0]

Barring the system classloader - where one ought not to place anything unless in dire circumstances - one gets very nice class separation.

In fact, every JSP is compiled to a Java servlet and then loaded in its own classloader. To see the speed, edit a JSP page and see how fast the change is reflected: when devs edit JSP pages, a fresh classloader picks up the new class file and the earlier classloader is discarded. Due to classloader inheritance, the new servlet is able to access Java classes from the parent classloader (the webapp).

For production deployments, and ideally for continuous integration setups where one should compile once and test and use the artifacts everywhere, one can use the pre-compilation feature [1] where the JSP pages are compiled as servlets during the compilation process itself (and not at runtime on production environments).

[0] https://tomcat.apache.org/tomcat-8.0-doc/class-loader-howto....

[1] https://tomcat.apache.org/tomcat-8.0-doc/jasper-howto.html#W...


All languages suffer from some form of "DLL hell" when everyone pulls in endless third-party dependencies; the major difference is how those conflicts get sorted out.

The solution in the Java world allows you to kind of ship both conflicting libraries and still have a go at it.

Other ecosystems might make you rethink the whole dependency tree instead.

I guess it was similar, I never used WebLogic.

https://www.ibm.com/support/pages/system/files/support/swg/s...


Classpath isolation is part of the JEE (and now Jakarta EE) specification for application servers. This guarantees that the application servers can host multiple applications without the classes in one application being used in another.

More concretely, if one application packages log4j 1.2 and another packages log4j 2, the application server guarantees that the two log4j libraries are isolated and do not interfere with each other.


I guess the span of my coding is really really small, I never really needed any of these.

To me, the issue is that it creates parallel graphs on top of the FS, which people, unless they spend some time reading the spec (the basic Oracle doc website was utterly useless for me), will have barely any visibility into.

What I appreciate is that it's fully standardized in the spec, so you don't have to deal with competing packaging systems/hygiene like in other languages.


> which people, unless they spend some time reading the spec (the basic Oracle doc website was utterly useless for me), will have barely any visibility into

It's literally modeled after PATH, which behaves exactly the same and has been used just fine for decades. I don't really see the argument here.


A classpath is more complex than an fs path, right? There are more nesting levels. $maven/.../foo.jar contains stuff I don't know, and the graph is also not necessarily obvious. Or maybe I'm just too limited and confused (plausible).

It's not really more complex. It's just a search path; there's no graph. It goes entry by entry: "can I load the requested class from this location? Nope, okay, next", and so on, finally ending up with a ClassNotFoundException or similar if the class can't be found in any of the classpath entries.

Not really any different than PATH.


PATH lists folders with files (or links); it's pretty much flat.

classpath lists folders with jars, which embed their own subtree of classes; that's one difference, in my (poor) understanding.

Then, as far as I can recall, each class file can define a package namespace, so looking at the classpath I have no clue what is there and what isn't, which made classpath errors feel more daunting to me.

Now that I know the JVM searches over all of this in an obvious linear manner, I can understand it, but as a user I still find the classpath and armies of jars very hard to map in my head (added to that, the verbosity and large number of classes in many Java APIs).

If I'm still off, then there's nothing saving me :) except maybe if you have an article about classloading and all that, that may remap my brain.


> classpath lists folders with jars

No, the classpath lists folders of classes, or actual paths to jars. Not paths to folders of jars.

> then, as far as i can recall, each class file can define package namespace so looking at the class path I have no clue what is there or what isn't so classpath errors felt more daunting to me

Classes are resolved from the root of the jar. Just inspect the jar to see what's in it. It's not common for third-party libraries to have conflicting package names.

I don't know why you're talking about tree searches.


The JVM doesn't look for classes in jars unless you explicitly specify the jar on the classpath. It really is just like PATH if you imagine that a jar is simply a compressed folder (which it is).

A huge disadvantage is that you cannot have different versions of the same dependency used across the project - if you do, it will, AT RUNTIME, fuck up your application (since Maven's dependency resolution is very primitive).

But then you have the ability to exclude dependency X's dependency 1.4.0 because you're already using 1.4.8 of Y, and 99% of the time, it works.

And I've had so many times when I've wished I could do this for Python dependency trees.


Sure you can, you just need different classloaders.

Using Gradle for local stuff takes pretty much all the pain out of it, and when running in production I just have a single .jar file with all the dependencies baked in. I used to get annoyed by classpath issues but since moving to Gradle I haven't had any.

Yep, this is also what I've experienced. Gradle isn't a "good" build system, by any measure. But it gets the job done and can handle the Java story quite well.

I mean, python is a lot cleaner than java in a lot of ways, but the PYTHONPATH environment variable does a very similar thing in a very similar way.

Yeah, I get lost in a sea of node modules. It almost seems like all of the files are named the same. node_modules, package, and index.js everywhere.

This article explains it well, code examples included: https://www.baeldung.com/java-9-modularity

Modules fix two issues that have plagued Java for a long time: classpath resolution and monkey patching using reflection.

For the former: the classpath is a colon-separated list of jars (a jar being a zip containing the classes). At runtime, the VM tries to find a class by scanning all the jars linearly when needed, or panics with a NoClassDefFoundError. The modulepath is a colon-separated list of directories that contain jars. At runtime, the VM resolves the dependencies (stored in the module descriptor of each jar) before starting the application.

For the latter: when you define a module, you have to explicitly export packages and open them to reflection.
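Sketched in a module descriptor (all names hypothetical):

```java
// module-info.java illustrating the two mechanisms
module com.example.service {
    // other modules can compile against and use this package
    exports com.example.service.api;

    // deep reflection (setAccessible), e.g. by a JSON mapper, is only
    // permitted on packages that are explicitly opened
    opens com.example.service.dto;
}
```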


That latter trick won't work after Java 17, which I agree with. Java is not the language for doing monkey patching with internals.

https://openjdk.java.net/jeps/403

Notice that on Android, this kind of trick earns you the prize of the system killing the process and being done with it.


Modules are almost completely orthogonal to the classpath; nothing changes for the latter. They are meant more as a way to enforce encapsulation. For example, runtime reflection is disallowed across module boundaries.

From the JDK's point of view, it was really important, because some libs were depending on the internals of OpenJDK, which are prone to change. With modules enforced, the internals can change much more freely while still keeping perfect backwards compatibility.


You did not elaborate on what frustrated you about the classpath, so it's hard to answer.

Not a fundamental flaw, but it took some time to figure out once I was hit by it: jar files are iterated in a non-defined (filesystem) order, meaning that if the same class is found in more than one jar file, different versions of the class can be loaded on two different systems.

This one will be somewhat better, because if those two are in different modules, they will be independent. If it happens within one module, then it stays the same.

I agree that this one is annoying. It happens rarely, but every time it happens it is frustrating for whoever is dealing with it.

I think there are ways to automatically check for this, but yeah.


One thing I did not understand is why looking for class name collisions depends on modules. If I have two classes named com.foo.Bar in two different JARs, and I use both JARs (without any class loader trickery), then what part of the module system is needed to recognize this fact and throw an exception in my face?

This seems interesting to me because most of the time, dependency management happens to include two versions of the same library by accident and a simple exclusion rule fixes the problem, but obviously only if you know about it in the first place.


This is an implementation choice. For performance reasons, the JVM doesn't scan the entire classpath: when a Class object has to be created (to access one of its members for the first time, such as a constructor or static method), a class loader searches the classpath until it finds that class (a .class file, inside a .jar or directly on disk). The search stops after the first successful find.

This is also a reason why you can't use regular Java reflection to do something like finding all references to a method, or even finding all classes - classes are loaded on the fly, only as needed (there are libraries that modify this behavior, though).
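If you do want to catch duplicates at runtime, one sketch (the helper and the class name are hypothetical) is to ask the class loader for every location that provides a given class file, rather than stopping at the first match like loadClass does:

```java
import java.io.IOException;
import java.net.URL;
import java.util.Collections;
import java.util.List;

public class DuplicateClassCheck {

    // List every classpath location that contains the given class,
    // instead of stopping at the first match like class loading does.
    static List<URL> locations(String binaryName) throws IOException {
        String resource = binaryName.replace('.', '/') + ".class";
        return Collections.list(
                Thread.currentThread().getContextClassLoader().getResources(resource));
    }

    public static void main(String[] args) throws IOException {
        // "com.foo.Bar" is a made-up class; with two jars providing it,
        // this list would have two entries and you could fail fast at startup.
        List<URL> hits = locations("com.foo.Bar");
        System.out.println("locations found: " + hits.size());
    }
}
```

A build-time check (like the plugin mentioned below) is still cheaper, since this only sees classes you explicitly probe for.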


It doesn't depend on modules. And in fact, you don't even have to wait until runtime and throw an exception: plugins like https://github.com/basepom/duplicate-finder-maven-plugin/wik... can enforce this at build time, with neat features like allowing two classes with the same name as long as their contents are 100% identical, or allowing specific exceptions where you can confirm they aren't going to cause problems.

We run this plugin where I work. We have no plans to modularize our build, since the duplicate-class protection would just be worse than this plugin's.


If these JARs are on the module path, this problem would be spotted at compile time, resulting in a failure due to the "split package" situation which has been created by having that class in more than one JAR.
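For illustration (module names invented), two module descriptors whose jars both contain the package com.foo cannot be resolved together on the module path; the split package is rejected:

```java
// lib-a.jar: module-info.java
module lib.a {
    exports com.foo;   // this jar contains com.foo.Bar
}

// lib-b.jar: module-info.java
module lib.b {
    exports com.foo;   // this jar also contains com.foo.Bar -> "split package" error
}
```

The check applies even without the exports: a package may live in at most one module, so the compiler or the runtime's module resolution fails fast instead of silently picking one.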

Big fan here of the shadowjar Gradle plugin, which produces a single self-contained JAR file with your app and all dependencies, no classpath to worry about, no versioning headaches.

Another major benefit of modules: when distributing native executables, the included JDK can be limited to just the modules you need.

This can potentially reduce the size of java binaries by many megs.
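As a sketch (jar and class names assumed), jdeps can list the platform modules an application actually uses, and jlink can then assemble a trimmed runtime image containing only those:

```shell
# Find which platform modules the app really needs
jdeps --print-module-deps --ignore-missing-deps app.jar

# Build a runtime image with only those modules (example: java.base,java.sql)
jlink --add-modules java.base,java.sql \
      --strip-debug --no-header-files --no-man-pages \
      --output my-runtime

# Run the app on the trimmed image instead of a full JDK
my-runtime/bin/java -cp app.jar com.example.Main
```

The resulting image is typically a few tens of megabytes, versus hundreds for a full JDK.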


Yep, I have a JavaFX GUI which is fairly complex, and the jlink image it creates is less than 30MB. Notice that this includes all my code, JavaFX, the JVM runtime and even Swing (even though I don't use Swing, it needs to be included for stuff like splash screen and some platform features, unfortunately).

How do you handle libraries that aren’t modularized?

I haven't used any yet. But if I do, I will try to either submit a merge request modularizing it (it's pretty easy for libraries, and it changes nothing for non-modularized consumers) or just patch it with an "Automatic-Module-Name" manifest entry and then use CLI commands to set up the module correctly.

This post explains in detail all the options: https://developer.ibm.com/tutorials/java-modularity-5/
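For reference, the manifest approach is a one-line addition to the library jar's META-INF/MANIFEST.MF (module name invented here), which gives the jar a stable name when it is placed on the module path as an automatic module:

```
Automatic-Module-Name: com.example.somelib
```

Without it, the module name is derived from the jar's filename, which is fragile across versions and repackaging.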


Thanks, will read.

Is it just me or are Java Modules hard to create and use?

I found it pretty confusing to get them to 'work', especially with libraries that are not module enabled.

Also, there is no support for Java Modules in Scala or Kotlin. Maybe we need more tooling and IDE support.

Use case: JavaFX


How do you address the security of classpath jar signing? Will this be addressed with modules? How can you prevent shims and tampering?

jar signing has been around at least since Java 6 if I recall correctly: https://docs.oracle.com/javase/tutorial/deployment/jar/signi...

Usually one way is to have your own classloader for example.
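Whatever mechanism you use, reflection can at least tell you whether a class actually came from a signed jar. A small probe (setting up a signed jar is omitted):

```java
public class SignerProbe {
    public static void main(String[] args) {
        // Classes loaded from unsigned jars (and from the JDK's own runtime
        // image) report no code signers at all.
        Object[] signers = String.class.getSigners();
        System.out.println("java.lang.String signed: " + (signers != null));

        // For a class loaded from a signed jar, getSigners() would return the
        // certificates the jar verifier accepted while defining the class.
    }
}
```

A verifying class loader can use the same information to refuse classes whose signers don't match an expected certificate.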

Super off topic but I had no idea .java was a TLD

about 10 years ago, there was a major push by some vendors (probably mainly IBM) to use OSGI.

some execs at my company caught the bug and had to have OSGI because it promised a "service oriented architecture"

except that was for services inside a giant monolith VM.

a more modern approach is to separate anything that constitutes a "service" and requires encapsulation into its own process/server/pod

OSGI and the like really only benefit systems that are run in mainframe like environments (see the IBM connection?). Otherwise they're needless complexity.



