I remember being very excited when this came out. What impressed me right off was the java.net package -- back then, few language environments (outside Unix-land) had built-in networking support; it was usually a (kludgy) add-on package. That, along with threads, made Java really stand out as the language for the era of distributed computing. Applets were a great idea that unfortunately didn't pan out.
It still has a lot of good qualities, but is one of my least favorite languages to use (much more into Go now).
I was expecting the actual VM source code, which I believe is written in C and is an engineering marvel. This looks like just the end-user install package.
What do you mean by "does absolutely nothing"? It does something, namely load a dynamic library that presumably implements its methods. Now maybe that library has never done anything, but that's impossible to tell from the code you linked.
I mean that the native methods it ultimately calls do nothing, and never have.
The early versions of the JVM didn't include a compiler; they were pure interpreters, so those calls had to be no-ops. Later versions included a JIT, but the decision of which classes and methods to compile was made by the JVM, and again, those calls were no-ops. As far as I know, there was never a JVM from Sun / Oracle where those methods did anything.
It's possible that IBM's or some of the other JVMs did something with those calls. But in general, having user code be involved in the decision of which methods to JIT is the wrong approach.
I vaguely remember that in the 1.0 era, the plan was that JIT compilers would be third-party plug-ins to the Sun JVM. So you'd buy a compiler from, say, Borland, install it into your JVM installation, and then when java.lang.Compiler was used, it would load your shiny new Borland compiler, which would somehow magically interact with the JVM to compile bytecode. I never heard of this happening. There wasn't a published JVM API for the compiler to talk to, so it would have required a lot of close cooperation between Sun and the compiler vendor.
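For reference, this is roughly how java.lang.Compiler was meant to be driven from user code (the class shipped with JDK 1.0, was deprecated in Java 9, and has since been removed, so this only compiles on older JDKs). On Sun/Oracle JVMs every call was a no-op, so compileClass just returned false:

    public class CompilerDemo {
        public static void main(String[] args) {
            // Ask the (hypothetical plug-in) compiler to compile this class.
            // On Sun's JVMs the underlying native method did nothing,
            // so this always returned false.
            boolean compiled = Compiler.compileClass(CompilerDemo.class);
            System.out.println("compiled: " + compiled);

            // Toggle the compiler off and on -- also no-ops in practice.
            Compiler.disable();
            Compiler.enable();
        }
    }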
Note that this is under a proprietary license (see the top-level COPYRIGHT file). Given the rather messy history around the Java API, it might be legally problematic to look at this code.
I came across this post [1] in which the author discovers an undocumented feature in the JVM.
"Although the first stable release of Java used bytecode version 45.3, the JVM will actually accept classfiles with versions starting at 45.0. Furthermore, there’s an undocumented feature in the JVM where it parses Code attributes slightly differently when the version is 45.0 - 45.2.
In a normal classfile, the stack, locals, and code length fields of the Code attribute have lengths of 2, 2, and 4 bytes respectively, but in a pre-45.3 classfile, the JVM expects them to be 1, 1, and 2 bytes instead. Normally, this means that a pre-45.3 classfile produced by ASM will just crash when run on the JVM because the JVM encounters garbage data while parsing and rejects it.
However, if you are very careful, it is possible to construct a classfile that is valid when parsed with the 2,2,4 widths, and also valid when parsed with 1,1,2, but parses as different code in each case. This means that it is possible to craft a classfile that executes one piece of code when run on the actual JVM and displays a completely different set of fake code when viewed with reverse engineering tools"
TL;DR: the JVM contains an undocumented feature where it parses Code attributes slightly differently when the classfile version is 45.0 - 45.2. The stack, locals, and code length fields are normally 2, 2, and 4 bytes respectively; in pre-45.3 classfiles, the JVM expects them to be 1, 1, and 2 bytes instead.
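To make the two layouts concrete, here's a minimal sketch of reading the Code attribute header both ways (the class and field names are my own illustration, not from the JVM source):

    import java.io.DataInputStream;
    import java.io.IOException;

    class CodeAttributeHeader {
        final int maxStack;
        final int maxLocals;
        final long codeLength;

        CodeAttributeHeader(DataInputStream in, int major, int minor) throws IOException {
            if (major == 45 && minor < 3) {
                // Pre-45.3 layout: 1-byte stack, 1-byte locals, 2-byte code length.
                maxStack   = in.readUnsignedByte();
                maxLocals  = in.readUnsignedByte();
                codeLength = in.readUnsignedShort();
            } else {
                // Normal layout: 2-byte stack, 2-byte locals, 4-byte code length.
                maxStack   = in.readUnsignedShort();
                maxLocals  = in.readUnsignedShort();
                codeLength = in.readInt() & 0xFFFFFFFFL;
            }
        }
    }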
Judging from the downvotes, a good portion of readers didn't actually think it through. The answer:
In the 20th century, we commonly expressed the year by its last two digits -- this is why we talk about the "70s", "80s", and "90s". It's a small, unambiguous number, and it also happens to be "the number of years since 1900". So of course a computer in the mid-to-late 20th century would choose this base. It means the number is convenient for display, but also works for the machine, because "years since 1900" is a valid machine model as well. And it became a popular time format that others picked up.
Lo and behold, though, the world didn't end in 2000. What do we do? Well, we let it go over 99 and add 1900 to it for display purposes, so we can maintain backwards compatibility.
There. That's why we didn't choose any other of the 4.3 billion possible bases, or whatever.
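That convention is still visible in old Java APIs: java.util.Date.getYear() returns exactly this "years since 1900" value (deprecated since JDK 1.1, but it still works):

    import java.util.Date;

    public class YearBase1900 {
        @SuppressWarnings("deprecation")
        public static void main(String[] args) {
            int offset = new Date().getYear(); // e.g. 99 in 1999, 124 in 2024
            System.out.println("Raw value:      " + offset);
            System.out.println("Displayed year: " + (1900 + offset));
        }
    }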