I know it supports executing LLVM IR - but that's not the same as a target platform.
Sorry, it seems you're right:
Specifically, that's a source platform, which is basically the opposite of a target platform.
> It has an LLVM IR backend [ie, LLVM IR as a target platform]
The quote in my previous comment was from yarg, and (as stated) was describing a source platform.
That implies some other runtime will execute them.
There are already plenty of WASM runtimes, including Graal itself, as others have pointed out.
For example, to let you run Java code in a browser.
It'd be much more direct to ship apps straight to the desktop, or to develop a kind of Graal browser. You could even fork Chromium to embed GraalVM inside Blink. Trying to put Java into the browser is a dead end, though: browser makers won't tolerate competing platforms they don't control.
What one wants is a .class-to-WASM converter, and there are already a couple of them to choose from, with TeaVM and CheerpJ being the most advanced.
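To make that concrete, here is a minimal sketch of the kind of input such a converter consumes: a plain, self-contained Java class with a static entry point. The class name and code are illustrative only, not taken from TeaVM's or CheerpJ's own examples; the point is that code avoiding reflection and dynamic class loading is the easy case for ahead-of-time .class-to-WASM translation.

```java
// Illustrative input for a .class-to-WASM converter such as TeaVM or
// CheerpJ. Class name and logic are hypothetical examples.
public class Fib {
    // Plain iterative Fibonacci: no reflection, no dynamic class
    // loading, so an ahead-of-time converter can translate it directly.
    public static long fib(int n) {
        long a = 0, b = 1;
        for (int i = 0; i < n; i++) {
            long next = a + b;
            a = b;
            b = next;
        }
        return a;
    }

    public static void main(String[] args) {
        System.out.println(fib(10)); // prints 55
    }
}
```

In TeaVM's case such a class is pointed at by the build plugin, which then emits JS or WASM; the details of the build setup vary between the two tools.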
Do they provide the kind of best-in-class whole-world-analysis optimisations that Graal does?
But we should take a look at what they produce some time!
I think in practice Graal's LLVM output is not terribly platform-agnostic, as LLVM lacks the expressiveness for some of the things Graal needs to do for deoptimisation.
Naturally, consuming them as yet another WASM runtime makes sense, but that isn't what the OP was asking about.
But you can't apply high-level optimisations once all you have is WASM bytecode and all your high-level structure is gone. And if you're JIT compiling, you don't have the time budget for those optimisations anyway.
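A hypothetical sketch of what gets lost (class names are mine, not from any real codebase): with whole-program analysis at the Java level, a compiler can see that an interface has exactly one implementation, devirtualize the call, and inline it. Once the program is lowered to WASM, that call site is just an indirect call through a function table, and the class hierarchy needed to prove the optimisation safe is gone.

```java
// Hypothetical example of a devirtualization opportunity that is
// visible at the Java level but not recoverable from WASM bytecode.
interface Shape {
    double area();
}

final class Square implements Shape {
    final double side;
    Square(double side) { this.side = side; }
    public double area() { return side * side; }
}

public class Devirt {
    // Virtual call site: a closed-world compiler can prove the receiver
    // is always Square, replace the virtual dispatch with a direct call,
    // and inline area(). In WASM this would already be an opaque
    // indirect call, so the optimisation can no longer be proven safe.
    static double totalArea(Shape[] shapes) {
        double sum = 0;
        for (Shape s : shapes) {
            sum += s.area();
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(totalArea(new Shape[] { new Square(2), new Square(3) }));
    }
}
```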
Maybe if we were talking about SubstrateVM, but even then, I don't get why that should justify increasing the code complexity of its backends.
Yeah that's what I was talking about.
> I don't get why that should justify increasing the code complexity of its backends
Backends are pretty well-isolated parts of the code. You can see how simple the LLVM one is, for example.