Hacker News

This diff is more explicit about what's going on:

https://android.googlesource.com/platform/libcore.git/+/aab9...

    Change dependency from libart -> libopenjdkjvm.
There are also diffs adding lambda support, tweaking various classes for compatibility with applications that use reflection to access internal capabilities, and fixing lots of OpenJDK compatibility bugs.

Android still needs to run dex bytecode somehow, so there are two possibilities for how N will work.

Option one is that Android sticks with ART and replaces Harmony with OpenJDK: from a technical perspective, that wouldn't be the end of the world, especially since the Harmony implementation is rather inefficient.

Option two is that Google ports Hotspot to run on Android and then has PackageManager convert dex bytecode back to Java bytecode on device. That would be awful, since ART is built for low-end devices and, well, Hotspot isn't.

In favor of option one is that Google is still developing ART. In favor of option two is Oracle being Satan incarnate.

I also wouldn't be surprised if Oracle has compelled Google to simply ship a copy of Hotspot, allowing developers to ship "authentic Java" APKs instead of dex-bytecode ones, with the two environments running in parallel using two different zygotes.




> That would be awful, since ART is built for low-end devices and, well, Hotspot isn't.

That is plain Google FUD to justify not following the Java standards.

There are lots of embedded devices, more constrained than Android phones, running commercial compliant JVMs like Atego, Jamaica, J9 among many others.

Also, Sun/Oracle Hotspot implementations have existed since the early J2ME and Embedded Java days, for devices with just a few hundred KB, not a few hundred MB like Android.


I'm pretty sure those constrained devices are running bytecode interpreters that aren't nearly competitive with native code in terms of speed, with UIs that wouldn't cut it on even a low-end smartphone these days.


Then you should educate yourself, as they are running JIT and AOT code.

I can provide links for product documentation and real products if you wish.


You are right.

The code for these embedded devices is JITted, and I know because I worked on them. It also included some AOT, but not much. In the past, one of our guys benchmarked against an older version of Dalvik and had us at about 5x with those VMs.

To be fair, they are not Hotspot, and it does have startup-time issues because, unlike the embedded versions of our JITs, it doesn't feature MVM. However, a lot of this is configurable and easily fixable. Hotspot is remarkably tunable and, if MVM is added (which is possible), could probably beat ART in startup time as well...

ART is pretty fast by now, though, and pretty well understood. I don't think Google would switch to Hotspot, and I don't think it will need that for compliance either. They might just reuse some libraries that can be common, and that's it.


Every time an Android device installs its monthly security update, it spends an inordinate amount of time, with the screen on, "optimizing" every single app.


pjmlp: original poster above referred to Hotspot, not arbitrary JVMs. Clearly Android devices can run VMs with both JIT and AOT (they do, after all), what OP questioned was whether Hotspot in particular (not some arbitrary VM) is a good VM on a mobile device. Given Hotspot's somewhat underwhelming startup speed even on desktop class computers, I think that's a very reasonable technical question to ask, and not FUD in any way.


Hotspot requires as little as 128 KB RAM and 1 MB ROM

http://www.oracle.com/technetwork/java/embedded/javame/embed...

If one adds the GUI client APIs, then the whole SDK grows to 5 MB of ROM.

http://www.oracle.com/technetwork/java/embedded/javame/embed...

Some commercial products using the above runtimes are Gemalto's M2M modules or evaluation boards like the ARM Keil F200.

But I expect the Google fanclub to downvote me.


The document you link to is not about Hotspot; it's about Oracle's Java ME VM, which, from all I can tell, is a different product. It doesn't mention the term "Hotspot" at all.


Hotspot is Oracle's Java VM JIT compiler, not a separate product.

They don't produce different JITs from scratch for each VM implementation.


Thanks for the links and info.

I think compatibility between Java (SE or ME) and Android would be a great win for both communities :-)


As a sidenote, giant commits including giant dependencies in the tree are the perfect time to include new backdoors. Nobody will know who really introduced them.

Wish people didn't do that. Separate repos are not hard to create...


dang, want to change the headline from "mysterious commit" to "Google switches Android to OpenJDK"?


> Harmony implementation is rather inefficient.

A lot of stuff has been fixed or improved from the original Harmony code over the years; performance-wise, the Android implementation of the core library and the OpenJDK implementation are now similar (OpenJDK has a few more intrinsics and makes more liberal use of native methods).


[deleted]


You're confused on several points. First, Java VMs don't "generate bytecode": if they have JITs, they generate machine code. Otherwise, they just interpret bytecode. Second, you're confusing OpenJDK-the-set-of-libraries with Hotspot-the-VM. OpenJDK probably isn't a disaster for performance: not because it's good, but because Harmony was abysmal. (Count the allocations inside String.format.)
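To make the String.format point concrete, here's an illustrative sketch (not the Harmony or OpenJDK source, just an example): every call to String.format boxes its primitive arguments, allocates a varargs array, and builds a new Formatter internally, whereas a hand-rolled StringBuilder path allocates far less:

```java
// Illustrative example: why String.format is allocation-heavy.
public class FormatCost {
    static String viaFormat(int n) {
        // Allocates: boxed Integer for n, an Object[1] varargs array,
        // an internal Formatter with its own StringBuilder, and the result.
        return String.format("value=%d", n);
    }

    static String viaBuilder(int n) {
        // Allocates only the StringBuilder and the result String;
        // append(int) avoids boxing entirely.
        return new StringBuilder("value=").append(n).toString();
    }

    public static void main(String[] args) {
        System.out.println(viaFormat(42));   // value=42
        System.out.println(viaBuilder(42));  // value=42
    }
}
```

Both produce the same string; the difference is purely in how many short-lived objects each call leaves for the GC.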

Switching from ART to Hotspot is not a clear win. ART does AOT compilation (at least some of the time), is integrated with the system runtime (doing a compacting GC pass on app switch, for example), and interacts properly with Android's zygote-based start scheme. (It doesn't, for example, COW away all the memory benefits as a naive fork would.)

Both ART and Hotspot have pretty good code generators and allocators. There's no reason to suppose that the latter would be a better choice, technically, than the former. Google seems to agree, since ART development is ongoing.

Please read more about VM implementation schemes instead of continuing to make unfounded assertions (like "Java is based on the idea of a JIT" and "virtual method calls are slow without a JIT").


Option two wasn't even worth mentioning. There is absolutely no way Google is going to abandon ART and switch to a significantly less performant VM. AOT is here to stay.


A JIT makes many tradeoffs but it is always capable of producing code at least as good as an AOT. How much better that code is depends on many factors, such as the language, the application, and how much time/energy you're willing to spend on optimization (the latter might lead to choosing to generate code that's less optimized than an AOT).

A slightly bigger difference is not between JIT and AOT, but whether you can dynamically load code at runtime or not. If not, that opens the door to some whole-program optimizations, or, at least, removes the need for guards, generated by the JIT, that are triggered when new code is loaded. In any case, mobile applications don't load code dynamically.


> the latter might lead to choosing to generate code that's less optimized than an AOT

With ART, compilation happens on the device (during installation instead of at runtime, but still), so it suffers from the same trade-off of compilation performance vs. performance of the generated code.

> In any case, mobile applications don't load code dynamically.

Android has a DexClassLoader. I'm sure some people use it.
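DexClassLoader itself is Android-only, but a plain-JVM sketch shows why dynamic loading matters for AOT: the class below is named only at runtime, so no ahead-of-time pass could have resolved or inlined the call (java.util.ArrayList stands in here for dynamically delivered code):

```java
import java.lang.reflect.Method;

// Plain-JVM stand-in for Android's DexClassLoader: the class name is
// only known at runtime, so an AOT compiler can't see through this call.
public class DynamicLoad {
    public static Object invokeByName(String className, String methodName)
            throws Exception {
        Class<?> cls = Class.forName(className);            // resolved at runtime
        Object instance = cls.getDeclaredConstructor().newInstance();
        Method m = cls.getMethod(methodName);
        return m.invoke(instance);
    }

    public static void main(String[] args) throws Exception {
        // A freshly created ArrayList has size 0.
        Object size = invokeByName("java.util.ArrayList", "size");
        System.out.println(size);
    }
}
```

A JIT, by contrast, simply compiles the newly loaded class when it gets hot, which is why dynamic loading is the cleaner dividing line than JIT vs. AOT.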


> it suffers from the same trade-off of compilation performance vs. performance of the generated code

Sure, which is why HotSpot may be better.

> Android has a DexClassLoader. I'm sure some people use it.

I didn't know that (not an Android dev), but that only means that some form of JITting may be beneficial anyway (depending on how popular this feature is).


Of course a JIT is capable of producing code as good as an AOT, and perhaps even better, since it can capture more profiling data. The problem with a JIT is startup time, and this has not gone unnoticed by Oracle, as even they've started working on AOT.


Not startup but warmup (i.e. the time until the application is fully optimized). There are other problems with a JIT on small devices, such as increased memory and energy consumption (each may be significant or not, depending on how the JIT works). I am a big fan of JITs, but as with everything in software, it is a tradeoff.


Increased memory consumption and reduced page sharing between processes using the same code means lower performance. Memory bandwidth and cache capacity are a much bigger bottleneck than CPU throughput for most use cases. A JIT compiler is not competitive in performance with an AOT compiler if both are using the same code generation implementation. And profiling is limited in what it can do, especially when AOT compilation with PGO as used by projects like Firefox is considered.


> A JIT compiler is not competitive in performance with an AOT compiler if both are using the same code generation implementation.

If a JIT uses the same code generation as an AOT, it will work as an AOT, namely, only when the application starts, without additional burden during its runtime. Most good JITs use profiling information, which is collected at runtime -- this has some memory overhead, but results in better code.

> And profiling is limited in what it can do, especially when AOT compilation with PGO as used by projects like Firefox is considered.

Modern JITs are like _very_ sophisticated PGO AOTs, but they have true profiles exposed to them. Besides, the overhead for even a good JIT is not that large, it's just that on mobile devices even a bit more RAM is more than you're willing to afford.
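As a toy illustration of that point (thresholds and tier names are made up, not HotSpot's actual values): a tiered VM promotes a method through compilation levels based on real invocation counts, so the top tier optimizes against an exact runtime profile rather than an offline PGO run:

```java
// Toy sketch of a tiered-compilation policy: a per-method invocation
// counter drives promotion from interpreter to profiled JIT code, and
// then to fully optimized code. Thresholds here are invented.
public class TierPolicy {
    static final int TIER1_THRESHOLD = 200;     // interpreter -> baseline JIT
    static final int TIER2_THRESHOLD = 10_000;  // baseline JIT -> optimizing JIT

    static String tierFor(int invocationCount) {
        if (invocationCount < TIER1_THRESHOLD) return "interpreter";
        if (invocationCount < TIER2_THRESHOLD) return "baseline-jit";  // collects profiles
        return "optimized-jit";  // consumes those profiles, like a live PGO pass
    }

    public static void main(String[] args) {
        System.out.println(tierFor(10));      // interpreter
        System.out.println(tierFor(5_000));   // baseline-jit
        System.out.println(tierFor(50_000));  // optimized-jit
    }
}
```

The counters themselves are the memory overhead mentioned above; the payoff is that the final tier sees true branch frequencies and receiver types instead of estimates from a training run.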


The main benefit from ART on Android is having all of the code generated in advance and mapped from storage. There's no waste from code being generated at runtime in-memory. Wasting memory by generating code dynamically certainly hurts performance so claiming that a JIT compiler is always just as good is not true.

There's room for a JIT compiler but the sanest baseline is mapping AOT compiled code from storage. It's way faster than a baseline interpreter and way lighter than a baseline method JIT.


> AOT is here to stay.

No it isn't. Read the source. ART has a JIT now.


As a guy who used to work for Sun on JITted code and is now doing AOT for https://www.codenameone.com/ I've got to say that JIT always beats AOT in runtime. It can also beat it in startup when properly designed (MVM, caching etc.).


Generalizations about which technique beats the other technique reflect a lack of technical maturity. Both have advantages, and you're doing a disservice by advocating the use of one or the other exclusively ignoring differences in environment, circumstances, and workload.


Fair. But having worked on both and currently working on AOT I'd say I should be biased for AOT not against it...

Startup time/warmup is obviously a well-known JIT weakness, but if it's well written it's just really hard to beat. E.g.

if (x) { invokeMethodX(); } else { invokeMethodY(); }

Say x is related to user details/preferences and for a specific install will never change... AOT can't do anything about that...

A JIT can inline invokeMethodX(), then optimize across methods and eliminate the if altogether. This isn't some theoretical exercise... It's stuff that exists today in JITs. AOT provides consistency, which is a good thing, and you can write low-level code (e.g. C can be hand-coded to a level that's pretty great), so I'm mostly talking about higher-level languages (Java etc.).
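A runnable sketch of the pattern above (method names invented for illustration): once profiling shows the branch always goes one way at this call site, a JIT like Hotspot can inline invokeMethodX(), drop the test, and leave only a deoptimization guard, while an AOT compiler has to keep both paths:

```java
// Hypothetical example of a branch a JIT can speculatively eliminate.
public class SpeculativeInline {
    // In a real app this might come from user preferences and never
    // change for a given install.
    static boolean x = true;

    static int invokeMethodX() { return 1; }
    static int invokeMethodY() { return 2; }

    static int dispatch() {
        // After enough warm-up calls, the VM's profile records this branch
        // as always-taken; the JIT can then inline invokeMethodX() and emit
        // straight-line code guarded by an uncommon trap, instead of a real
        // branch. An AOT compiler must compile both arms.
        if (x) { return invokeMethodX(); } else { return invokeMethodY(); }
    }

    public static void main(String[] args) {
        int sum = 0;
        for (int i = 0; i < 100_000; i++) sum += dispatch();  // warm up the JIT
        System.out.println(sum);  // 100000: every call took the x branch
    }
}
```

The observable result is identical either way; the difference is the machine code the hot loop ends up running.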


Profile-guided optimization exists and is used by projects like Firefox in the real world. The advantage of JIT compilers is getting the profiling data dynamically, but that's usually going to be worse than a well-tuned profiling suite, especially if there are manual annotations.


AOT can do most of that as well - see likely() annotations.

And it's highly unlikely the JIT can get rid of the if entirely - it needs to check that its assumptions hold true, otherwise it needs to drop out into the non-JITted code.


The very fact that Oracle has started to address the slow startup times of JVM applications by finally working on AOT compilation is an admission of the technical immaturity of the JVM. Also, the only relevant environment here is mobile and startup times are paramount in this environment.


The issue they're addressing with AOT is not startup time, it's warmup time and it's intended to help out, specifically, high frequency trading companies. Currently such companies have been known to engage in risky behaviour like, um, submitting bogus trades to the market and then immediately cancelling them in order to force JIT compilation of their codebase.

The HotSpot AOT work is actually a hybrid. The AOT compiler (Graal) has several modes and to get the best performance the AOT compiled code actually is compiled with profiling code. Thus it runs slower than it could do, but the profiling data is then used to trigger further JIT compilation in order to reach the best possible peak performance. So it ends up being a hybrid approach in which JITC still features heavily, and the AOT work is done to reduce the length of time taken to reach peak throughput.


> The very fact that Oracle has started to address the slow startup times of JVM applications by finally working on AOT compilation is an admission of the technical immaturity of the JVM.

Not at all. If you listen to the talk introducing that work, you'll see that it is designed to address a very particular (and relatively unusual) use-case, which is important to some specific (yet lucrative) Oracle customers.


As far as Android is concerned, AOT is here to stay. And yes, I'm aware ART has a JIT, and it's probably used for devices that cannot handle the overhead of AOT, but I imagine it's seldom used considering the specs of today's phones and the requirements of Google's CTS.



