

NumPyPy progress report - running benchmarks - jperras
http://morepypy.blogspot.com/2012/01/numpypy-progress-report-running.html

======
wookietrader
This is something like the holy grail for me.

I am a machine learning researcher, using the python/scipy/theano stack. One
performance problem is that I can get my models reasonably fast with
Theano--but as soon as I need a python for loop (e.g. by using Theano.scan),
performance dies. As soon as you have a couple of nested loops (e.g. due to a
Jacobian matrix instead of a vector), performance is virtually gone again.
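The slowdown described above can be sketched with plain numpy (a toy stand-in, not actual Theano code): computing the Jacobian of a hypothetical elementwise function f(x) = x**2 with nested Python loops pays interpreter overhead on every one of the n*n entries, while the vectorized form stays inside numpy's C loops.

```python
import numpy as np

def jacobian_loops(x):
    # Nested Python loops: O(n^2) trips through the interpreter,
    # which is what kills performance in scan-style code.
    n = x.shape[0]
    J = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                J[i, j] = 2.0 * x[i]  # d(x_i**2)/dx_i = 2*x_i
    return J

def jacobian_vectorized(x):
    # One numpy call; no Python-level loop at all.
    return np.diag(2.0 * x)

x = np.linspace(0.0, 1.0, 200)
assert np.allclose(jacobian_loops(x), jacobian_vectorized(x))
```

Both versions produce the same diagonal Jacobian; the point is only where the looping happens, in the interpreter or in compiled code.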

I'd love to use numpypy, and the way I see it not much is missing.

Keep up the good work! I wish I could fund you! :/

------
slug
This is good news, although the confusion between the Laplace transform and the
Laplace equation makes me think that before using their numeric code I should
check it thoroughly.

~~~
fijal
Fortunately for you, we're only dealing with things like addition,
multiplication, maybe sine, and even so we rely on the processor to do the job
:-) The rest is dealt with by the numerics people from the original numpy, and
we're simply reusing those parts. It's important for us to understand how
processors work and how to make sure we do _exactly_ the same computations, but
not much above that.

Besides, believe it or not, having a non-English maths background makes you
seem incredibly dumb (which might or might not be the case).

EDIT: I should maybe stress this point more. It's very important for us to get
_exactly_ the same results as the original numpy (or more numerically stable
ones), not just the same algorithms, so we won't be experimenting in that
field.

