Graal can also be used as an AOT compiler ("GraalVM Native Image") -- from Java bytecode (only!) to a native executable that runs independently of HotSpot. If the Java program so compiled is itself a Truffle interpreter, the result is a native interpreter for the Truffle language, one that contains an optimizing JIT (for the Truffle language only).
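A minimal sketch of that AOT step, assuming a GraalVM distribution with the native-image tool installed (HelloWorld is a placeholder class name):

```shell
javac HelloWorld.java      # ordinary Java bytecode
native-image HelloWorld    # AOT-compile the bytecode to a native executable
./helloworld               # runs standalone, without HotSpot
```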
To make things even more interesting: because Graal itself is written in Java, when it is used as a HotSpot JIT it must either compile itself (bootstrapped by HotSpot's Java bytecode interpreter) -- in which case it is both a Java program run by HotSpot and a component of HotSpot -- or be AOT-compiled into a native shared library, which HotSpot then loads like any other shared library.
You can use it by adding the following JAVA_OPTS:
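For example (illustrative flags; the exact set varies by JDK version and distribution):

```shell
# Enable the Graal JIT via JVMCI on a JDK build that ships it:
export JAVA_OPTS="-XX:+UnlockExperimentalVMOptions -XX:+UseJVMCICompiler"

# To use libgraal (Graal AOT-compiled into a native shared library),
# additionally pass -XX:+UseJVMCINativeLibrary.
```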
libgraal isn't a C2 JIT compiler - it's (the clue is in the name here) a Graal compiler.
And the person you're replying to already mentioned it:
> or it could be AOT-compiled into a native shared library, which is then loaded by Hotspot like any other shared library
One question I have: does GraalVM open up any possibilities for extending Smalltalk into a more multi-threaded domain, using its capabilities and the JVM's? I guess the one thing that has always held me back from doing more Smalltalk is watching that single thread peak at 12.5% of my machine's capacity, with the other 7 cores staying quiet. Or is the "single-threadedness" of Squeak too deeply baked into the implementation to work around with the JVM?
I know there were some multi-core experiments a few years ago with the RoarVM, but nothing seemed to make its way into the main implementation.
RE your question: Squeak's scheduler (also written in Smalltalk) would need to be extended with support for multi-threading. Since that requires a significant amount of work, it's probably out of the scope of the project at this point (unless someone is interested in working on this, of course).
However, it's already possible to do multi-threading from Smalltalk by going through another language, such as Java. We have some students doing just that with Ruby, to parse HTML content in multiple threads.
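As a sketch of what that looks like on the JVM side (plain java.util.concurrent; the toy parseTitle method stands in for real HTML parsing, and the interop call from Smalltalk into this code is not shown):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

class ParallelParse {
    // Stand-in for real HTML parsing work done on a JVM thread.
    static String parseTitle(String html) {
        int start = html.indexOf("<title>") + "<title>".length();
        int end = html.indexOf("</title>");
        return html.substring(start, end);
    }

    // Fan the pages out across a thread pool and collect results in order.
    static List<String> parseAll(List<String> pages) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors());
        try {
            List<Callable<String>> tasks = new ArrayList<>();
            for (String page : pages) {
                tasks.add(() -> parseTitle(page));
            }
            List<String> titles = new ArrayList<>();
            for (Future<String> f : pool.invokeAll(tasks)) {
                titles.add(f.get());
            }
            return titles;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        List<String> pages = List.of(
                "<html><title>a</title></html>",
                "<html><title>b</title></html>");
        System.out.println(parseAll(pages)); // prints [a, b]
    }
}
```

The Smalltalk side would only hand the page strings across the interop boundary and get the results back; all scheduling stays in the JVM, so Squeak's own green-thread scheduler is untouched.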
So far, I'd say our observations wrt RPython match the ones discussed in "Tracing vs. Partial Evaluation": language implementers have to do more work in Truffle, but probably get better peak performance in return (not talking about warmup or memory consumption here).
RE "a more optimized bytecode" set: Sista (an extension of the OpenSmalltalk VM) is doing something like that, too. I'd guess performance would be more or less the same using GraalVM, as Truffle produces highly specialized code. But, of course, this would need to be benchmarked. The advantage of the Sista approach is that it's managed at the image level, so specialized versions of methods are persisted as part of the image. AFAIK the GraalVM team is working toward persisting compiled-code caches, which is similar but at the level of the language implementation framework.
Hope this answers your questions!
I posted a link (vw-python3.html) to side-by-side measurements of individual Smalltalk and Python programs.
Goldman Sachs use Graal for their Slang language experiments.
I use Graal at Shopify.
There are others.
There are also a bunch of Java frameworks (e.g. Quarkus, Helidon, Micronaut; see "In the Highlights" section on the bottom of https://www.graalvm.org/ for more) that come with special support for GraalVM.
So far, there's been a lot of interest in the performance improvements that the Graal compiler provides, and in GraalVM Native Image, the toolchain that allows compiling Java apps and Truffle languages into native binaries.
But there's a lot more you can do with GraalVM.
The Dutch Police is using GraalVM's interop between Scala and R for their data science.
I'm sure there are other projects which explore the benefits of having a polyglot runtime, would love to hear about those efforts.
Goldman Sachs use it as an easy way to implement languages, to integrate existing C code, and possibly integrate Python in the future.
I use it for Ruby performance.
It is a big project, it supports many runtimes, and it already seems production-ready.