
The first stable release of PyPy3 - pjenvey
http://morepypy.blogspot.com/2014/06/pypy3-231-fulcrum.html
======
Buetol
Wow, this is a very exciting moment for the Python world.

And they hadn't even reached their funding goal for "py3k in pypy" [1]. That's
dedication. I encourage everyone to fund this incredible project!

[1]: [http://pypy.org/py3donate.html](http://pypy.org/py3donate.html)

~~~
chrismonsanto
I have been checking the py3k branch on Hg every other day waiting for this
moment, what a pleasant surprise. Very, very exciting. Thanks all.

I donated a while back, will make another donation soon.

I would like to start using this immediately but I think I'll have to wait
until a 3.3 release for "yield from".
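
For context, `yield from` (PEP 380, added in Python 3.3) delegates to a
subgenerator; this release targets 3.2, so code like the second function below
needs the explicit-loop workaround:

```python
def inner():
    yield 1
    yield 2

# Python 3.2-compatible delegation: an explicit loop.
def flat_32():
    yield 0
    for x in inner():
        yield x

# Python 3.3+: "yield from" replaces the loop (and also forwards
# send()/throw() and the subgenerator's return value).
def flat_33():
    yield 0
    yield from inner()

print(list(flat_32()), list(flat_33()))  # [0, 1, 2] [0, 1, 2]
```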

------
rectangletangle
Awesome, I hadn't realized this project was quite this far along. If they get
PyPy 3.4/3.5 going with NumPy, it will make a _really_ nice package. Fast
Python code for the high-level logic, paired with fast low-level number
crunching. This could also help speed up the adoption of Python 3.

~~~
rch
Looks like they're over 80% of the way to hitting the funding goal for that
one too:

[http://pypy.org/numpydonate.html](http://pypy.org/numpydonate.html)

~~~
ma2rten
The problem is: even if numpy gets ported we still don't have scipy and a
million other packages which require C bindings.

~~~
rectangletangle
True. Though this will likely make PyPy more mainstream, and thus it'll
hopefully attain more community support.

------
thomasahle
I wish the community would just switch entirely to PyPy. Being able to write
slightly performance-sensitive code in plain Python is a huge win.

~~~
ngoldbaum
It makes sense to use pypy if you're writing pure python code. The second you
need a C extension, you're pretty much out of luck. This kills a lot of the
appeal for people in the scientific/analytics side of things, who make heavy
use of legacy C and Fortran routines.

~~~
dragonwriter
> The second you need a C extension, you're pretty much out of luck.

In theory, shouldn't CFFI be the foundation of the solution to that problem?

~~~
tych0
In practice it works pretty well. I am nearing completion of a rewrite of X's
XCB-based python bindings in cffi, and it has worked out quite nicely.
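
For anyone who hasn't tried it, cffi's ABI mode (no C compiler needed) looks
roughly like this - a minimal sketch calling libc's `strlen`, assuming a Unix
system; real bindings like the XCB ones are of course much larger:

```python
from cffi import FFI

ffi = FFI()
# Declare the C signature we want to call (ABI mode: the declaration
# is parsed at runtime, no compilation step involved).
ffi.cdef("size_t strlen(const char *s);")

# dlopen(None) opens the process's global symbol namespace,
# which includes libc on Unix.
C = ffi.dlopen(None)

print(C.strlen(b"hello"))  # 5
```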

------
tedunangst
Minor note: the openbsd support (at least for 2.x) is amd64 only. Building for
i386 at some point requires running a bootstrap process that doesn't fit in
memory.

~~~
hcarvalhoalves
> Building for i386 at some point requires running a bootstrap process that
> doesn't fit in memory.

Seriously, it takes more than 4gigs to build PyPy? Is that also necessary for
other platforms besides OpenBSD?

~~~
sitkack
4GB is literally nothing. My laptop has 16, most servers I use have 128+. 4GB
is netbook territory.

~~~
tekacs
... I think the implication is that more than 4GB would exceed the
pre-[PAE][1] memory limit[2]. A form of cross-compilation might work, though
the PyPy build isn't exactly a simple, 'classical' build process. :P

Edit: also, looking at your comments[3E] it looks like surely you know this
(sorry) so I'm now really not sure what you're getting at... :P

[1]:
[http://en.wikipedia.org/wiki/Physical_Address_Extension](http://en.wikipedia.org/wiki/Physical_Address_Extension)

[2]: and even with PAE you still need to split into multiple processes/address
spaces to do anything useful

[3E]:
[https://news.ycombinator.com/threads?id=sitkack](https://news.ycombinator.com/threads?id=sitkack)

~~~
sitkack
My point is that requiring a lot of RAM for a build is not a problem. Yes, it
would be nice to support low-end devices for PyPy compilation, but the
intersection of people on extremely constrained hardware and people doing PyPy
development who would need to build from source is, well, by definition zero.

32 bit is dead except for ARM, and it will be dead on ARM in 4 years.

~~~
tekacs
> 32 bit is dead except for ARM, and it will be dead on ARM in 4 years.

Uh... sure? ... but the parent post was about how building for 32 bit _today_
simply does not work and will not work.

Whilst it's not necessarily best to build for technology that's almost gone,
32-bit devices that people expect to run Python on will definitely be around
for quite a number of years yet - today's 32-bit ARM chips aren't going
anywhere for a while, and not every form factor (say, non-desktop) is well
suited to a 64-bit architecture. :/

~~~
sitkack
Remember we are talking about _building_, actually JITing a JIT using a
dynamic language _for_ a dynamic language.

I haven't run a 32 bit desktop or server system since 2004. 32 bit is quite
dead. In 4 years, only the cheapest ARM SoCs will be 32 bits. In embedded
devices, yes 32 bits will be around for a great long while.

------
zyngaro
I've just made a small donation.

------
wldcordeiro
This is awesome, now just to wait for a Python 3.4 PyPy release :D

~~~
Derbasti
And Numpy! And ctypes (for Matplotlib)!

Although I must say, numpypy is quite usable already!

~~~
sitkack
PyPy has had ctypes support for a great long while.
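
E.g. the stdlib ctypes route has worked on PyPy for years - a minimal sketch,
assuming a Unix libm:

```python
import ctypes
import ctypes.util

# Load the C math library and declare sqrt's C signature so ctypes
# marshals doubles correctly in both directions.
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(9.0))  # 3.0
```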

~~~
Derbasti
True. Not complete enough for Matplotlib, though.

~~~
rguillebert
I think you're talking about the C extension API.

------
johnrds
I created a simple Terminal instance that compares Python and PyPy in a
performance test:

[https://terminal.com/tiny/shkhWWkcEV](https://terminal.com/tiny/shkhWWkcEV)

(this lets you compare the performance on a real Linux system, without
installing anything)
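
The kind of micro-benchmark that shows the JIT off is a tight, call-heavy
pure-Python loop - something like this sketch, run once under `python` and once
under `pypy`:

```python
import time

# Recursive Fibonacci: lots of Python-level calls and integer
# arithmetic, which PyPy's tracing JIT compiles to native code.
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

start = time.time()
result = fib(24)
print(result, "in %.3fs" % (time.time() - start))
```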

~~~
codiator
PyPy seems to be 7x faster!

~~~
hyperbovine
On a silly piece of code that nobody would ever have any use for. I have tried
PyPy for "real" data and numerical tasks from time to time, and never have I
noticed any sort of speedup. Usually it's slower than CPython. Perhaps this
latest version will be different, who knows.

~~~
apendleton
I'm using it in production, and speedups tend to be on the order of 4-5x for
my app (the compute-intensive part involves hierarchical agglomerative
clustering of documents by text similarity, so it's data/numbers-heavy).
Obviously it'll depend on your individual application (and non-CPU-bound tasks
won't benefit much), but we switched to PyPy because it showed major
improvements in profiling of our app on production data (and we switched
around PyPy's 1.9 release, so it's even better now). It's not like everyone's
just imagining the speed improvements...

~~~
illumen
It's not like everyone's just imagining that it's slower for many work loads
either.

~~~
apendleton
Not disagreeing, but they implied that this benchmark only showed a speed
improvement because it's a toy, and that real workloads with real data are
_usually_ slower. That hasn't been the case in my experience.

------
chris_mahan
Excellent. I've been waiting for this for a long time.

------
voidlogic
How does the performance of PyPy and Jython compare?

~~~
rguillebert
Jython is usually slower than CPython, I believe, though it has no GIL.

~~~
rdtsc
Wonder if it can be faster under higher parallelism conditions. Multiple
threads doing some CPU intensive work?

~~~
sitkack
Jython can utilize threads as well as Java can, so on a many-core machine
Jython wins by a pretty large margin.
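
The difference is easy to see with CPU-bound threads - a minimal sketch; under
a GIL (CPython, PyPy) the two threads below effectively run serially, while
Jython can put them on separate cores:

```python
import threading

# CPU-bound work: count primes below a limit by trial division.
def count_primes(limit):
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

results = []
threads = [threading.Thread(target=lambda: results.append(count_primes(5000)))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # [669, 669]
```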

------
husio
Thank you.

------
derengel
I don't know or use Python, but why does an implementation that is trying to
be "superior" still have the GIL?

~~~
pekk
What DO you know or use? Did you think that the GIL was an obvious and stupid
oversight made by stupid people for no good reason?

~~~
glibgil
Obviously the GIL was shortsighted, yes. Leave the people out of it. The idea
was stupid. There was a reason, but it wasn't a good reason.

~~~
rguillebert
What would you replace it with?

~~~
glibgil
No GIL.

~~~
rspeer
Your username is apt, but novelty accounts aren't a thing here. What exactly
are you hoping to communicate?

~~~
glibgil
My name is Gil. Look at my comment history and apologize.

