And they didn't even reach their funding goal for "py3k in pypy". This is dedication. I encourage everyone to fund this incredible project!
I donated a while back, will make another donation soon.
I would like to start using this immediately but I think I'll have to wait until a 3.3 release for "yield from".
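For anyone unfamiliar with it, "yield from" (added in Python 3.3 via PEP 380) delegates a generator to a sub-generator, forwarding sends and exceptions and capturing its return value. A minimal sketch:

```python
def inner():
    yield 1
    yield 2
    return "done"  # this return value becomes the result of "yield from"

def outer():
    # Delegate to inner(); its yielded values pass straight through,
    # and its return value is bound to `result`.
    result = yield from inner()
    yield result

print(list(outer()))  # [1, 2, 'done']
```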
In theory, shouldn't CFFI be the foundation of the solution to that problem?
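For context: CFFI lets Python call into C libraries through declarations rather than through CPython's extension API. The same basic idea can be sketched with the stdlib's ctypes (used here instead of CFFI only so the snippet runs without extra packages):

```python
import ctypes
import ctypes.util

# Load the C math library and call sqrt() through the FFI.
# CFFI does the same kind of binding, but from C declarations
# (ffi.cdef) rather than manual restype/argtypes annotations.
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]
print(libm.sqrt(2.0))
```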
What purpose would that serve?
> Being able to just slightly performance sensitive code in python is a huge win.
I think you dropped a word from that phrase, but aside from that pypy does not work for everybody and everything (e.g. at best it's no slower for sphinx; it really doesn't like the way docutils works). It's not like pypy's a magic wand.
If PyPy became the official/canonical implementation, PyPy would receive more attention and third-party library compatibility would be a requirement. Complaints about Python's slowness would be somewhat less relevant, and Python might see wider adoption. The RPython toolchain would receive more attention and that could be useful to other languages. There are plenty of reasons, but PyPy is usually a free speedup for your Python application. Who's going to complain about that?
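The "free speedup" claim is easy to check for yourself: put a CPU-bound script in a file and time it under each interpreter (the naive fib here is just a stand-in workload, not a rigorous benchmark):

```python
import timeit

def fib(n):
    # Deliberately naive recursion: pure-Python, CPU-bound code
    # where PyPy's JIT typically shows the largest gains over CPython.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Run the same file under each interpreter and compare, e.g.:
#   python3 bench.py
#   pypy3 bench.py
elapsed = timeit.timeit(lambda: fib(18), number=50)
print(f"fib(18) x50 took {elapsed:.3f}s")
```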
> pypy does not work for everybody and everything
True, but as the official implementation of Python, compatibility with PyPy would then be a must, and this situation would be greatly improved.
GvR has done enough damage to Python with Python 3. I don't intend to encourage him to make any more changes. We Python web developers are better off using what we have (non-reference implementations, which don't hurt anyone), or just using Node.js.
The groundwork is done, and I think everyone who is going to support Py 3 without any extra prodding has already done so. Now we need the distros to come through and give that extra nudge to the maintainers that are still slacking, or encourage people to replace those libraries that refuse to update.
For my PyCon Russia talk, I pulled down the data for all 44,402 packages (as of May 31). 13.5% of all packages on PyPI support some version of Python 3. 75.5% of the top 200 packages by download count claim to support some Python 3 version (according to their setup.py classifiers). Additionally, 64% of the top 500 support some Python 3 version.
Another interesting thing I saw was that of those 44K packages, 44% of them have seen a release within the last 12 months (representing 82% of the last month's download share), and 22% of those packages released in the last year support some version of Python 3.
Seriously, it takes more than 4 GB of RAM to build PyPy? Is that also necessary on other platforms besides OpenBSD?
When you're compiling PyPy, it essentially has to load the entire Python interpreter's structure into memory so it can do its various analyses and annotations, so compiling PyPy takes a long time. I think for a while it was excluded from certain Linux distros because their package-build-farm machines couldn't handle it.
PyPy is written in RPython, a subset of the Python language. When it's 'compiled', the PyPy RPython source is run under CPython (or PyPy itself), which translates it into C code and builds a binary from that. Lots of tuning happens along the way so the JIT runs well on the target machine. This is why it takes a long while, and a lot of memory.
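To make the "subset" point concrete: RPython looks like ordinary Python but adds restrictions so the translator can infer static types. This is an illustrative (not official) example of code that stays within that kind of subset:

```python
def checksum(s):
    # RPython-friendly: `total` keeps a single inferable type (int)
    # for the whole function. Rebinding it to, say, a string later
    # would be rejected at translation time.
    total = 0
    for ch in s:
        total = (total + ord(ch)) % 255
    return total

# Because RPython is a subset of Python, the same code also runs
# unmodified on a normal Python interpreter:
print(checksum("pypy"))
```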
The build also prints a fractal while compiling. http://pypy.readthedocs.org/en/latest/faq.html#why-does-pypy...
Edit: also, looking at your comments it looks like surely you know this (sorry), so now I'm really not sure what you're getting at... :P
And even with PAE you still need to split into multiple processes/address spaces to do anything useful.
32 bit is dead except for ARM, and it will be dead on ARM in 4 years.
Uh... sure? ... but the parent post was about how building for 32 bit _today_ simply does not work and will not work.
Whilst it's not necessarily best to build for technology that's almost gone, there will definitely continue to be 32 bit devices that people expect to run Python on for quite a number of years yet - today's 32 bit ARM chips aren't going anywhere for a while, and not every form factor (say, non-desktop) is well suited to a 64-bit architecture. :/
I haven't run a 32 bit desktop or server system since 2004. 32 bit is quite dead. In 4 years, only the cheapest ARM SoCs will be 32 bits. In embedded devices, yes 32 bits will be around for a great long while.
Although I must say, numpypy is quite usable already!
(this lets you compare the performance on a real Linux system, without installing anything)
PyPy aims to be (and is in many cases) faster than CPython.
The advantage with Jython isn't a performance one: it's the ability to call Java code directly.
Ruby also has a GIL.
MRI has a GIL; major alternative implementations (JRuby, Rubinius) do not.
OTOH, addressing the downsides of a GIL are not the only reasonable motivations for an alternative implementation, so there's no reason that a better-than-stock Python (or Ruby) fundamentally must remove the GIL (the current "MRI" used to be an alternative, YARV, to the old MRI, and both had GILs.)