

Ruby 1.9, massive boost in threading performance - ashleyw
http://www.tonyspencer.com/2008/07/28/ruby-could-replace-my-python-crawler-pretty-soon/

======
koblas
Dug into this a bit more and did a quick writeup here:

<http://www.skitoy.com/p/python-vs-ruby-performance/172>

Bottom line: well over 70% of the time goes to rand() and 25% to list
overhead; threading is drowned out by that noise.
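
A minimal sketch (not the linked benchmark itself) of how you might check
that claim: time rand() alone against rand() plus list building and a sort.
The iteration count N is an assumption, picked just to run quickly.

```ruby
require 'benchmark'

# Assumed iteration count for illustration only.
N = 200_000

# Time the random-number generation by itself.
rand_time = Benchmark.realtime { N.times { rand(1000) } }

# Time the same generation plus list append and a sort,
# roughly the shape of the crawler benchmark's inner loop.
list_time = Benchmark.realtime do
  list = []
  N.times { list << rand(1000) }
  list.sort!
end

puts format("rand only:       %.3fs", rand_time)
puts format("rand + list:     %.3fs", list_time)
```

If rand() really dominates, the two timings end up close together, and any
threading difference would be buried in the gap.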

------
jwecker
In 1.9 Ruby is also adding "Fibers" and built-in Actor patterns (i.e.,
Erlang-like concurrency) - more exciting IMO.
<http://www.infoq.com/articles/actors-rubinius-interview>
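
For a feel of what Fibers look like, here's a tiny sketch of the 1.9 Fiber
API: a cooperative coroutine that yields control explicitly instead of being
preempted like a Thread.

```ruby
# A Fiber that lazily produces Fibonacci numbers. Each resume runs the
# body until the next Fiber.yield, which hands a value back to the caller.
fib = Fiber.new do
  a, b = 0, 1
  loop do
    Fiber.yield a
    a, b = b, a + b
  end
end

values = 8.times.map { fib.resume }
p values  # => [0, 1, 1, 2, 3, 5, 8, 13]
```

Because the fiber only runs when resumed, there's no locking or scheduler
involved - which is why they're so much cheaper than threads.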

------
tx
Does Ruby 1.9 have a web page or something? Ruby's site has very little
information on how the project is going, and I am not involved enough to
follow the mailing lists.

Questions like: when is Rails support coming? Will native threads be
supported?

------
richcollins
With Ruby threads you get the worst of both worlds. They are preemptive and
user-level.

1.9 will introduce native threads, which aren't much better. Each native
thread requires megabytes of memory for its stack. Co-routines require only
64k of memory.

Concurrency in Rubinius should require even less memory overhead as it is
stackless.

~~~
tx
_1.9 will introduce native threads, which aren't much better._

What? They're not only "better", they're actually _threads_, i.e. able to
run in parallel, you know? What are Ruby 1.8 threads good for, except for
sitting on sockets?

~~~
tesseract
Producer/consumer where there are multiple I/O bound producers (examples that
come to mind: RSS reader, web spider, multiple-file search). They can also be
a useful abstraction for things like waiting for events from multiple sources,
or running quasi-realtime simulations.

I agree, though, that real threads would be a significant improvement. Or
better yet, MxN threads like GHC.
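
The producer/consumer shape described above can be sketched with Threads and
a Queue (on 1.9 you'd `require 'thread'` first for Queue). The URL list and
the sleep standing in for a blocking network read are assumptions for
illustration - in a real spider the producers would block on sockets, which
is exactly where 1.8's green threads still help.

```ruby
# Placeholder work items; a real crawler would use actual URLs.
URLS = %w[a b c d e f]
queue = Queue.new

# One I/O-bound producer per item; sleep simulates a blocking fetch.
producers = URLS.map do |url|
  Thread.new do
    sleep 0.01
    queue << "fetched #{url}"
  end
end

# A single consumer drains the queue as results arrive.
results = []
consumer = Thread.new { URLS.size.times { results << queue.pop } }

producers.each(&:join)
consumer.join
p results.size  # => 6
```

Queue handles the locking, so the consumer simply blocks on pop until a
producer pushes something.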

------
marijn
I'm not sure how using 20 threads tests threading performance, but there's
probably some issue that Ruby used to have that I'm not aware of.

It is impressive though, that to beat the given Ruby time on my machine in
SBCL, I actually had to add optimize declarations. Of course, I have no idea
what kind of machine the published numbers were from, and I'm too lazy to try
and install Ruby 1.9 myself, but it seems that the old 'Ruby use -> slowness'
implication no longer holds.

[edit] Hold on, the Ruby 1.8 test, which takes 22 seconds in their figures,
takes 4.3 on my machine. So that would mean Ruby 1.9 is like super-sonic ultra
fast. At least on this benchmark. Which is mostly testing the speed of the
sorting routine which I suppose is written in C. So what _are_ we talking
about, anyway? I'll shut up now.

~~~
swombat
_it seems that the old 'Ruby use -> slowness' implication no longer holds._

That's a fallacy in the first place (at least for web apps)... Most web
application code spends most of its time waiting for the db to respond, not
number-crunching.

~~~
marijn
That does not mean performance is irrelevant, does it? Heavily used web apps
benefit a lot from a fast runtime (it lets you push back the point where you
have to split across servers quite a bit), and being _able_ to do CPU-
intensive stuff in the same environment when you need to, instead of having
to bust out C or Java or whatever, is very pleasant.

~~~
ovi256
You can do CPU-intensive tasks using a general purpose Python API with
SciPy/NumPy. Yes, it's implemented in C.

------
henning
[http://shootout.alioth.debian.org/gp4/benchmark.php?test=all...](http://shootout.alioth.debian.org/gp4/benchmark.php?test=all&lang=yarv)

Overall it looks like Ruby 1.9 is 2-4x faster, which is impressive but not
massive.

------
pavelludiq
I'm curious about Stackless Python, and how do the two compare?

------
tokipin
I heard callcc and eval were on the table for modification due to their
effects on performance. Anyone familiar with the details?
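
For context, here's a tiny callcc sketch - capturing a continuation and using
it as an early exit. (On 1.9 callcc moved out of core into the 'continuation'
library.) The `first_over` helper is just an illustrative name; supporting
continuations everywhere is part of what reportedly made them a performance
target.

```ruby
require 'continuation'  # callcc lives here from 1.9 on

# Returns the first number above limit, or nil. The captured continuation
# acts as a non-local jump straight out of the iteration.
def first_over(limit, numbers)
  callcc do |escape|
    numbers.each { |n| escape.call(n) if n > limit }
    nil
  end
end

p first_over(10, [3, 7, 12, 20])  # => 12
```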

