

Numba - JIT specializing compiler for annotated Python and NumPy code to LLVM - albertzeyer
http://numba.pydata.org/

======
takluyver
The homepage is a bit dry, but for a look at what Numba is already capable of,
this blog post shows that Numba-translated Python code runs about as fast as
optimised Cython code (the current tool the SciPy world uses for fast code):

http://jakevdp.github.io/blog/2013/06/15/numba-vs-cython-take-2/

The downside is that Numba can't yet translate any old function you give it,
especially if it involves string manipulation (as the name suggests, the focus
is numeric). But it's still quite a young tool, and I'm optimistic that that
will improve.
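For readers who haven't tried it, the basic usage is just a decorator on a plain-Python numeric function. A minimal sketch (the function and data are illustrative, and it falls back to plain Python if Numba isn't installed):

```python
import numpy as np

try:
    from numba import jit  # compiles the function via LLVM on first call
except ImportError:
    jit = lambda f: f      # fallback: run as ordinary (slow) Python

@jit
def pairwise_dist(X):
    # Explicit loops: unworkably slow in CPython, fast once Numba compiles them.
    n, d = X.shape
    D = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(d):
                diff = X[i, k] - X[j, k]
                s += diff * diff
            D[i, j] = s ** 0.5
    return D
```

The point of the benchmark in the post is that this loop-heavy style, which reads like textbook pseudocode, becomes competitive with Cython once the decorator is applied.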

~~~
ninjin
As someone whose research applications have numerical code at their core,
I find Numba really exciting. Shaving off a constant factor means more
experiments etc.

But one of the nice things about NumPy is that it allows you to make very
complicated operations largely transparent thanks to broadcasting. Just
compare pairwise_numpy with pairwise_python in the blog post you linked. I
don't think anyone really favours the version that you can apply the Numba JIT
to.

I know I will cave eventually when I am desperate for speed; I already write
C99 extensions at times (though not for numerical code). I just wish there
were a way to use expressions, rather than going fully imperative, to gain
some of that speed.
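For concreteness, the expression style I mean is the broadcasting one-liner from the post (a sketch in the same spirit, not the post's exact code), which pays for its brevity with an intermediate n x n x d array:

```python
import numpy as np

def pairwise_numpy(X):
    # X[:, None, :] - X[None, :, :] broadcasts to shape (n, n, d);
    # summing the squared differences over the last axis and taking the
    # square root gives all pairwise Euclidean distances at once.
    return np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1))
```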

~~~
takluyver
No-one favours the pure Python version at the moment, because we're used to it
being unworkably slow. In terms of the actual code, I do prefer
pairwise_python in that example, because I can see roughly what's going on.
pairwise_numpy is kind of like writing in Perl - it's short, but hard to
understand. I use the SciPy stack, and I don't fully understand it. How does
slicing a 2D array into three dimensions work?
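The trick in question is `None` (i.e. `np.newaxis`) indexing, which inserts a length-1 axis that broadcasting then stretches. A small shape demonstration (the array here is just illustrative):

```python
import numpy as np

X = np.arange(6.0).reshape(3, 2)   # shape (3, 2)

# None in an index inserts a length-1 axis at that position:
a = X[:, None, :]                  # shape (3, 1, 2)
b = X[None, :, :]                  # shape (1, 3, 2)

# Broadcasting stretches the length-1 axes, so the subtraction
# produces every pairwise row difference at once:
diff = a - b                       # shape (3, 3, 2)
print(diff.shape)                  # (3, 3, 2)
```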

------
albertzeyer
I think somewhat related is the RPythonic project
(https://code.google.com/p/rpythonic/,
https://news.ycombinator.com/item?id=5927769).
It uses RPython to statically compile Python code to C, producing CPython
extension modules - so you end up with something similar to Numba.

I'm not sure which approach is better.

Some core PyPy people advise against using RPython for anything other than
what it was intended for (i.e. writing an interpreter). See:

http://mail.python.org/pipermail/pypy-dev/2013-June/011498.html

http://mail.python.org/pipermail/pypy-dev/2013-June/011503.html

~~~
takluyver
One area in which Numba probably wins is that it's aware of NumPy arrays as
types for the compiled functions. There's a lot of code that works with NumPy
arrays, and Numba integrates nicely with it.

~~~
compilercreator
Newer versions of PyPy also support a subset of numpy.

~~~
takluyver
PyPy does, but I don't think that the RPython translation and compilation
layer knows about the numpy semantics, so I don't think that helps RPythonic
make extension modules.

