

Fast Enough VMs in Fast Enough Time (2013) - epsylon
http://tratt.net/laurie/blog/entries/fast_enough_vms_in_fast_enough_time

======
_halgari
I completely agree with this article. I was surprised by how fast I was able to
implement the guts of pixie
([http://github.com/pixie-lang/pixie](http://github.com/pixie-lang/pixie)) using RPython.

I had a REPL, with a JIT and a GC, done in about 40 hours of work (this is a
pet project and I'm a father, so I only hacked on it for about 3 hours a day
for a month). At one point I commented to my wife how shocked I was that I
started implementing the JIT one morning and about 2 hours later it was done.
From there it's just been an hour or two here and there tuning it. That's
something that would have taken me months to implement by hand. Making
polymorphic functions transparent (completely removable by the JIT) was about
4 lines of Python code.
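For a sense of how little interpreter code the JIT hooks take: in RPython you declare a JitDriver naming your "green" variables (the position in the guest program) and "red" variables (mutable interpreter state) and mark the dispatch loop, and the translator generates a tracing JIT for you. The toy accumulator machine below is my own illustration, not pixie's code; only the JitDriver declaration and the merge point are real RPython API.

```python
# Toy bytecode interpreter showing RPython's JIT hooks.
try:
    from rpython.rlib import jit
except ImportError:
    # Stand-ins so the sketch also runs under plain CPython.
    class jit(object):
        class JitDriver(object):
            def __init__(self, greens=None, reds=None):
                pass
            def jit_merge_point(self, **kwds):
                pass

# A made-up opcode set for illustration.
OP_INCR, OP_JUMP_IF_LT, OP_HALT = 0, 1, 2

# 'greens' identify a position in the guest program (constant per trace);
# 'reds' are the mutable interpreter state. This declaration plus the
# merge point below is most of what the translator needs to build a JIT.
driver = jit.JitDriver(greens=['pc', 'program'], reds=['acc'])

def interpret(program):
    pc = 0
    acc = 0
    while True:
        driver.jit_merge_point(pc=pc, program=program, acc=acc)
        op = program[pc]
        if op == OP_INCR:
            acc += 1
            pc += 1
        elif op == OP_JUMP_IF_LT:
            limit, target = program[pc + 1], program[pc + 2]
            if acc < limit:
                pc = target
            else:
                pc += 3
        elif op == OP_HALT:
            return acc
```

Running `interpret([OP_INCR, OP_JUMP_IF_LT, 100, 0, OP_HALT])` counts the accumulator up to 100; translated with `rpython -Ojit`, that loop gets traced and compiled to machine code.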

In addition, all the stuff in rlib is simple to use. Want FFI in your
language? Pull in the rlib/jit_libffi.py module and crank out a hundred lines
of Python code to interface with it. Want strings implemented via ropes? Just
one more module to import.

RPython (and the PyPy toolchain) are amazing; I wish more people would spend
time playing with them. On top of all that, the PyPy mailing list is very
active. Most of my questions were answered within a day, sometimes within an
hour.

Oh, and when I found a bug in PyPy, my patch was accepted and merged within 48
hours.

I can't say enough good things about this project.

~~~
616c
I have briefly followed your work on pixie and read the [R]Python code a few
times when I had down time after it was published on HN the first time. Very
inspiring stuff.

Out of curiosity, how long had you been coding Python, or programming in
general, prior? Your comment is so motivating, but as a father of a
1.5-year-old with no free time, I dream of being even half as productive as you.

------
th3iedkid
This article is a good overview of language implementation concerns; another
problem faced when creating new languages is language design, along with the
toolchain.

Engineering a language design involves many iterations, and we have found
projectional editors very useful, saving us a lot of time and effort when
building quick prototypes. One such platform I use often is JetBrains MPS.

With MPS we even get a good IDE ecosystem while actually iterating on the
design, so we learn what kind of GUI toolchain we will need once the
implementation internals are engineered.

Another benefit is that language design can proceed concurrently with the
toolchain and implementation: while the major design concerns are being
settled, toolchain concerns and others are already being prototyped, closing
in on development of an overall ecosystem.

I haven't designed many languages, but I have worked with DSL platforms,
especially projectional ones. I have found JetBrains MPS to be quite mature
nowadays compared to Eclipse Spoofax, but the former involved so much learning
that I haven't been able to invest the time and energy to experiment with the
latter.

Is there someone who has used both Spoofax and MPS who could add more detail
on design concerns? I would be very happy to share my experiences with MPS;
they may be limited, but it was wonderful for sure.

[1] JetBrains MPS:
[https://www.jetbrains.com/mps/](https://www.jetbrains.com/mps/)
[2] Eclipse Spoofax:
[http://strategoxt.org/Spoofax](http://strategoxt.org/Spoofax)
[3] Projectional Editors:
[http://martinfowler.com/bliki/ProjectionalEditing.html](http://martinfowler.com/bliki/ProjectionalEditing.html)

------
jeffreyrogers
This is a great article. I wonder how implementing a VM in RPython compares
performance-wise to using the LLVM toolchain. I know that Julia (which is a
very fast, dynamic language), for example, uses LLVM to provide its JIT
compiler. (Of course, Julia is also specifically designed for performance,
while most research languages won't be).

~~~
Dewie
> (Of course, Julia is also specifically designed for performance, while most
> research languages won't be).

Doesn't that depend on the kind of research that is being pursued? If the
research is about high-level, productive language features, performance is
going to be a secondary concern at best. If the research is about a language
being able to prove properties about programs - including low-level stuff like
memory safety, not just high-level semantic properties - then the language
might very well accommodate efficient implementations.

[http://en.wikipedia.org/wiki/ATS_%28programming_language%29](http://en.wikipedia.org/wiki/ATS_%28programming_language%29)

~~~
jeffreyrogers
Yeah, it depends on what you're trying to accomplish with the language. The
most interesting new languages I've seen lately have focused a lot on
performance (Rust, Julia, Nimrod, etc.), but the author of the linked article
is more interested in how different languages can be composed together, so for
him performance matters only insofar as the language remains usable.

My main point though was that part of Julia's high performance is due to
specific decisions by the language designers (e.g. the compiler knows a lot of
information about types even though the programmer can mostly ignore them),
while a language like Python wasn't designed for performance and is thus
harder to optimize.

------
ScottBurson
I think it's unfortunate that more people aren't aware of the benefits of
using Common Lisp as an implementation target for their new language. All you
have to do is parse your language into list structure and write macros to
translate it into Lisp. You get an interpreter _and_ a native-code compiler
for free! And there's a long list of language features for you to build on: GC
of course, plenty of built-in types (including lists, arrays, and several
kinds of numbers) -- and the crown jewel, CLOS, with multiple inheritance,
multiple dispatch, and of course, the MOP.
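To keep to this thread's language, here is the same recipe sketched in Python rather than Lisp: parse the surface syntax into list structure, translate each form into host-language code, and let the host compiler do the rest (in CL that role is played by macros plus COMPILE; here the built-in compile()/eval stand in). The tiny expression language is my own toy example.

```python
def parse(src):
    """Read one s-expression into nested Python lists."""
    tokens = src.replace('(', ' ( ').replace(')', ' ) ').split()

    def read(pos):
        tok = tokens[pos]
        if tok == '(':
            forms = []
            pos += 1
            while tokens[pos] != ')':
                form, pos = read(pos)
                forms.append(form)
            return forms, pos + 1
        try:
            return int(tok), pos + 1
        except ValueError:
            return tok, pos + 1

    form, _ = read(0)
    return form

def translate(form):
    """Turn list structure into a Python expression string."""
    if isinstance(form, int):
        return str(form)
    op, args = form[0], [translate(a) for a in form[1:]]
    return '(' + (' %s ' % op).join(args) + ')'

def compile_expr(src):
    # The host compiler hands us executable code for free.
    code = compile(translate(parse(src)), '<mylang>', 'eval')
    return lambda: eval(code)
```

With that, `compile_expr('(+ 1 (* 2 3))')()` evaluates to 7, and everything past the parser -- code generation, optimization, execution -- is borrowed from the host.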

I understand why a lot of people aren't interested in writing CL directly --
though I am still fond of it; it does suffer somewhat from historical baggage.
But if you're layering another language on top of it, none of that matters.
And it's a great platform.

~~~
techdragon
I keep thinking it would be a great idea, then I go and look for the 'right
lisp to run it on' and slink back out of the dark alley carefully before I run
sprinting back to something much more standard.

[http://en.wikipedia.org/wiki/Common_Lisp#Implementations](http://en.wikipedia.org/wiki/Common_Lisp#Implementations)

Only Movitz ever looked interesting enough to get me started on a
pre-prototype. But it was soon abandoned, as the usefulness didn't translate
to the more widely used Lisp implementations.

------
e12e
What a nice reminder to look more closely at implementing a language or two
with RPython. I remember being fascinated when I first read about a Prolog
implementation; I believe it was this one:
[https://bitbucket.org/cfbolz/pyrolog/src/653f1c4febf83b92d38...](https://bitbucket.org/cfbolz/pyrolog/src/653f1c4febf83b92d3891086fe8157b5f32fd9fb/prolog/doc.txt?at=default)

Paper:
[http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.103....](http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.103.1886)

Other(?) paper, more about PyPy and the JIT: "Jitting Prolog for Fun and Profit":

[http://bergel.eu/download/Dyla2010/schneider-prolog-jit-final.pdf](http://bergel.eu/download/Dyla2010/schneider-prolog-jit-final.pdf)

At any rate, I remember seeing how (seemingly) easily they'd implemented
something like Prolog (which, to my mind, is a complex thing -- but that has
more to do with the nature of Prolog-like systems, search, cuts, etc., than
with the language itself being particularly complex). Great to see that
someone who didn't initially come from within the PyPy sphere found a use for
it "in anger" -- and to see that it's a viable tool.

