
Am I correct in reading the license on GitHub that this is all GPL? Albeit GPLv2, but still not some Oracle proprietary license.

"One VM to rule them all" reminds me of Parrot VM (for Perl 6). Looks like development slowed down a year or two ago. Back then, it got me excited as it looked like the "open source community" alternative to "Big Corporate Java".


Is anyone else so leery of Oracle's lawyers you want nothing to do with this? Even if it's awesome work, as I expect it is, I don't trust Oracle not to someday find a new way to screw me, whatever the text of the license.

(I'm having doubts if this comment belongs here. But it's my genuine reaction, for what it's worth.)

Wouldn't touch this tech with a 10-foot pole due to licensing concerns and lock-in.

LLVM is providing most of the interop without strings attached.

0% chance I'll ever use this or let one of my team members use it.

I don't know a single person that's said a single good thing about Oracle, including people that work there. I know a ton of people that have been burned by Oracle and their lawyers.

> Am I correct in reading the license on GitHub that this is all GPL? Albeit GPLv2, but still not some Oracle proprietary license.

Per https://github.com/oracle/graal#license, it appears that the parts you would want to distribute carry the Classpath Exception, which allows linking without affecting the license of your own code (same as the JVM). The FAQ also covers this: http://www.graalvm.org/docs/faq/.

"One VM to rule them all" goes all the way back to mainframes and their language environments, with UNCOL being one of the first attempts in 1958.


I was very interested in Parrot VM for a while too, for the same reasons. I always liked the idea of language interoperability and being able to use the best libraries of each language. This brings me to a question I've had for a while: why do we even have so many language-specific libraries for common tasks, instead of more "universal" libraries with bindings for each? Of course these exist, but I feel like this should almost be the default: language-specific libraries would just be wrappers that help integrate with the language (like making it more Pythonic for Python, etc.). That way you would have at most a few competing libraries for each task, with higher quality. It would also be easier to bootstrap new programming languages, since having few libraries is a big initial barrier.

There are fundamental semantic differences between the languages. An immutable language will always have a serious impedance mismatch with a mutable library. A library that uses GC won't work in a language without one, a specific example of the general principle that memory management strategies vary widely. A library that uses dynamic types will have a huge conversion boundary with a statically-typed language, and vice versa. A library that uses threads, or expects threads, will have a hard time working in an event-based environment that depends on code voluntarily yielding to maintain its responsiveness. A language that wants to provide guarantees about correctness or what have you is forced to give all those up to talk to a lowest-common-denominator library.

And that's just the top-level differences. It gets worse as you get into the details of API style and the costs of the indirection layers to convert into the local style, assuming conversion is even technically possible.

Even if you write to the lowest possible common denominator of C, or Rust with no runtime, you'll still encounter a significant subset of those issues. And of course we still will use such things; all serious languages can communicate in some manner with a C implementation of a library. (Not for any theoretical reason, really; C has more opinions than people often realize, but it's still the baseline current systems are built on.) But it can't be as good as a native implementation.
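To make the mutable/immutable impedance mismatch concrete, here's a toy Python sketch (the class and function names are invented for illustration): an "immutable" caller can't trust a mutable library's internal aliasing, so it pays for a defensive copy at every boundary crossing.

```python
import copy

class MutableStatsLib:
    """Stands in for a mutable-style library (hypothetical)."""
    def __init__(self):
        self._samples = []

    def add(self, x):
        self._samples.append(x)   # mutates internal state in place

    def samples(self):
        return self._samples      # exposes the internal mutable list

def immutable_view(lib):
    # The immutable side cannot assume the library won't mutate later,
    # so it deep-copies and freezes on every call.
    return tuple(copy.deepcopy(lib.samples()))

lib = MutableStatsLib()
lib.add(1); lib.add(2)
snapshot = immutable_view(lib)
lib.add(3)                        # a later mutation...
assert snapshot == (1, 2)         # ...does not leak into the snapshot
```

The copy is the conversion boundary: it preserves the immutable side's guarantees, but at a cost a native immutable library would never pay.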

>you'll still encounter a significant subset of those issues

Heck you will encounter those issues in a large project using only one language!

> This brings me to a question I've had for a while: why do we even have so many language-specific libraries for common tasks, instead of more "universal" libraries with bindings for each?

A lot of language-specific libraries incorporate foreign-language (often C) libraries and add functionality that fits the ergonomics of the target language, rather than being a thin wrapper around a language-neutral library. This is more than aesthetic when the target language has, for example, a different type system or different fundamental guarantees from the underlying library's language.

OTOH, sometimes there aren't C (etc.) libraries, and it's more straightforward to implement a library in (say) Python. But once you have a Python library, calling into it from Rust, C, Erlang, or even a broadly similar language like Ruby is, to put it mildly, non-trivial.

Languages usually have a foreign function interface (FFI for short) to interface with external libraries. That's not as easy as one might expect, even for languages that are "closer to the metal": I remember Pascal's and C's different parameter-passing conventions, left-to-right vs. right-to-left, and the question of who cleans up the stack... you need compiler directives and careful thinking. There's also the question of which memory manager you use, and the fact that DLLs might not "see" the same address space.

You mention Python, which is said to have an easy FFI with C. That's nice, but it's usually used for performance more than for reusing code.
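Even in Python's "easy" case, the calling convention still has to be spelled out by hand. A minimal ctypes sketch (assuming a POSIX system where the C library's `strlen` is loadable; the fallback to `CDLL(None)` is a Linux-ism):

```python
import ctypes
import ctypes.util

# Load the C library; on glibc systems find_library("c") resolves to
# libc.so.6, and CDLL(None) falls back to the running process's symbols.
libc = ctypes.CDLL(ctypes.util.find_library("c") or None)

# Declare the contract explicitly; get this wrong and you corrupt data
# silently, which is the commenter's point about conventions.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

assert libc.strlen(b"hello") == 5
```

Note that the Python string also had to become a `bytes` object before crossing the boundary: even the simplest FFI call involves a type conversion.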

Well, we do have some programming tasks that are almost exclusively performed through bindings on different platforms (cryptography, graphics/GUIs, scientific computing to some extent), but the hassle of maintaining bindings is enough that the “need barrier” is high: a given library has to be doing really important stuff that is also really hard to reimplement to end up in the set of tools that are almost always used via interop/bindings. This indicates a deeper problem hidden in your question: for a bunch of reasons (technical, personal, OSS-community-political), maintaining multi-platform bindings is a hassle. Things like Truffle/Graal aim to solve that by providing a common binding “middle” layer in various platforms’ runtimes. Things like libffi offer ways to make ad hoc binding easier. It’s a complicated problem.

This is exactly the case with JVM languages today: Every language has its own idiomatic abstractions, but everything is ultimately based on the underlying Java libraries.

> "One VM to rule them all" reminds me of Parrot VM (for Perl 6). Looks like development slowed down a year or two ago.

The concept lives on in the form of two "new" Perl 6 projects, namely MoarVM (started in 2012; a multi-language VM) and NQP (started around 2008; a VM abstraction and multi-language compiler toolkit).



These can appropriately be compared with Graal and Truffle. Imo their main weakness is that they're backed not by Oracle but by the Perl 6 project.


MoarVM/NQP compares very badly with Graal and Truffle. MoarVM is a ton slower and less flexible.

Thanks for replying.

I think MoarVM/NQP are strikingly similar in their approach based on what I've learned so far about Graal/Truffle (which isn't much; perhaps a few hours of exploration; eg reading/watching the One VM to Rule Them All, One VM to Bind Them paper/video). [1]

Given the large number of Oracle employees working on Graal/Truffle (40 in 2016 according to the video) it doesn't surprise me that it's currently a lot faster.

Did you use Graal/Truffle friendly benchmarks (just as was done in the video)? Have you tried MoarVM friendly benchmarks to see where MoarVM shines? In particular, have you accounted for the new JIT that recently landed after 3 years of non-master branch development? [2]

Would you please elaborate a little on how MoarVM is insufficiently flexible?

TIA for any reply.

[1] https://www.youtube.com/watch?v=FJY96_6Y3a4

[2] https://perl6advent.wordpress.com/2016/12/09/a-preview-of-th...

> I think MoarVM/NQP are strikingly similar in their approach

Does MoarVM use a meta-compilation technique? Pypy for example uses meta-tracing and Truffle uses Partial Evaluation to get from an interpreter specification to dynamically compiled code without duplicating the language logic. If yes, could you please point me to further information?

Meta-compilation is the primary innovation in Truffle. It allows languages to be composed together and compiled as one unit.

If you want to know more about the Truffle approach to meta-compilation see this PLDI paper from 2017[1].

[1] http://chrisseaton.com/rubytruffle/pldi17-truffle/pldi17-tru...

Thanks for the 2017 link.

It's NQP that does meta-compilation, not MoarVM (afaik).

I linked to NQP's (spartan) readme above.

Some other points/links that might be helpful:

* nqp is written in nqp.

* nqp has a first class grammar construct. See wikipedia's Perl 6 Rules page.[1] (nqp's grammar is defined using nqp's grammar construct. So is Perl 6's.)

* nqp and MoarVM implement a protocol for abstracting behavior (eg methods) and representation (memory layout and use) called 6model.[2]

To try to set your expectations appropriately, let me emphasize this note from the NQP readme: "NOTE: there's no end-user support for NQP and the behaviour can change without notice. It's a tool for writing Perl 6 compilers, not a low-level module for Perl 6 programmers."

NQP is a tool for writing compilers, not just Perl 6 compilers, and is in fact much more than that, but the Perl 6 project is currently very focused on the Perl 6 language and definitely doesn't want ordinary Perl 6 programmers thinking NQP is a tool for them.

Doc is sparse, again reflecting the focus on the Perl 6 language and the Rakudo compiler that builds atop NQP, analogous to your team focusing on just one of your guest languages.

The best doc for getting into NQP is not even in the NQP repo. It's the support materials for a 2-day workshop held in 2013. The abstract is: "This intensive 2-day workshop takes a deep dive into many areas of the Rakudo Perl 6 and NQP internals, mostly focusing on the backend-agnostic parts but with some coverage of the JVM and future MoarVM backends also. During the course, participants will build their own small compiler, complete with a simple class-based object system, to help them understand how the toolchain works."[3]

Beyond that, there is a doc directory in the nqp repo.[4]

Please write an update here on what you find out.


[1] https://en.wikipedia.org/wiki/Perl_6_rules

(nqp rules are Perl 6 Rules grammars but the general purpose DSL (!) for writing procedural code embedded in an nqp grammar is nqp's general purpose DSL rather than Perl 6's general purpose DSL.)

[2] https://github.com/perl6/nqp/tree/master/docs/6model

(MoarVM directly implements 6model -- MoarVM is named for "Metamodel On A Runtime". 6model is how one nqp based language is able to do magic like subclassing a class written in another language (whether nqp based or via an existing compiler) despite them having distinct OO semantics and implementation strategies.)

[3] https://github.com/edumentab/rakudo-and-nqp-internals-course

[4] https://github.com/perl6/nqp/tree/master/docs

What is the license of GraalVM?

There is a community edition (CE) and an enterprise edition (EE) of GraalVM. The community edition is distributed under an open source license. It is free to use in production and comes with no strings attached, but also no guarantees or support. The enterprise edition is available from the Oracle Technology Network under an evaluation license. It provides improved performance and security for production deployments. If you are interested in using the enterprise edition in production, please contact graalvm-enterprise_grp_ww@oracle.com.

The CE doesn't run on OSX though. Sigh!

If it really is, I’ll gladly start poking at it! That’s the only issue (licensing that is) that’s been pushing me away from Oracle these days.

> "One VM to rule them all"

Except a VM with a more liberal license could easily beat it in popularity.

Linux and Java have the same license as this.

Apparently the GPL was never an issue to those enjoying GNU/Linux, maybe you are enjoying your BSD desktop experience?

True, but on the other hand more people are using BSD than Linux, at least on the desktop/laptop, because OSX is based on it.

Doesn't OSX ship with more GPL code than BSD? It has a BSD kernel at its core, but it also has GNU tools and Safari (whose engine was forked from KHTML).

Yeah, but Apple doesn't mind sharing the source for those, if they make modifications.

These licenses prevent Graal VM from being used inside e.g. OSes and browsers with a more liberal license.

And how many of those changes have actually benefited FreeBSD?

None, but that is not the point. The point is: can we use the code without hindrance in whatever plans we have now or in the future? Or is building upon the code like going into a long one-way street?

The fact that some people will use the code without giving back is secondary to that.

That is not just my opinion, because if a more liberal library comes into existence (inevitable), it will prevail.

I agree that GPL is bound to long term irrelevance due to the way most companies deal with open source.

And on that day you will miss when GNU/Linux mattered.

But hey, we will always have freeware and public domain.

Not in performance.
