

Asynchronous Processing in Python Using Redis - rlander
http://richardhenry.github.com/hotqueue/tutorial.html

======
samratjp
Looks a lot like GitHub's Resque (Ruby) <https://github.com/defunkt/resque>.
Hopefully this Python variant will get all the good stuff that comes with
Resque (like a Sinatra app to see your queues - very handy).

------
wildmXranat
I still like a generic job server like gearman better. You can do sync or
async, back the queue with memory, MySQL, Tokyo Cabinet ... and if you need
PHP clients talking to Python consumers, no problem either. Those libraries
exist.

------
unoti
I use Python + Redis via HotQueue
(<http://pypi.python.org/pypi/hotqueue/0.2.1>) for all the processing on
flurbils.com. All of the scoring for the game comes in from Second Life via
HTTP requests that are handed off immediately to a background handler through
Redis. It's let me seriously level out load spikes, and as a result I've been
able to push off upgrading the capacity of my servers.
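A minimal in-process sketch of that hand-off pattern, using only the standard library. With HotQueue the queue below would be a Redis-backed list instead, so the producer (the web request handler) and the consumer can run as separate processes; the names `score_queue` and `handle_score` are illustrative, not from the original post.

```python
import queue
import threading

# Stand-in for a Redis-backed HotQueue: requests enqueue work cheaply
# and return at once; a background worker drains the backlog.
score_queue = queue.Queue()
results = []

def handle_score(item):
    # Placeholder for the real background work (scoring, DB writes, ...).
    results.append(item * 2)

def worker():
    while True:
        item = score_queue.get()
        if item is None:          # sentinel: shut the worker down
            break
        handle_score(item)
        score_queue.task_done()

t = threading.Thread(target=worker)
t.start()

# The request handler only does this cheap put(), then responds:
for score in [1, 2, 3]:
    score_queue.put(score)

score_queue.join()                # wait for the backlog to drain
score_queue.put(None)
t.join()
```

The point of the pattern is that the expensive work never happens inside the request/response cycle, which is what flattens the load spikes described above.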

I've used other libraries for doing this in the past. Being able to switch
backends with some other library would be nice, I suppose, but there are
literally only about six lines of code in the software that are specific to
hotqueue.

This has been working very well in production, and I'm very pleased with it.
Using background processing to level out load peaks, plus migrating to the
Apache worker MPM, has let me handle loads on the same hardware that I
wouldn't be able to touch with a plain synchronous approach.

Starting off using Redis as a queue is a great way to start with Redis,
because everybody can use a queue for something. Then once Redis is running in
production, it becomes far easier to start using it for more and more things.

------
freyrs3
Anyone have some insight on how this would compare performance-wise with
Celery + RabbitMQ?

~~~
simonw
I'm using Celery with the Redis backend for <http://lanyrd.com/>

~~~
frankwiles
We're also using that with several clients.

