Maybe it is just me, but I can never bring myself to set up Celery and RabbitMQ and all for some Django apps that could sorely use it. There seem to be so many moving parts; last I checked there were like 3-4 different services to tie everything together...
Anyone know of a simpler system or technique for achieving the same thing? Or perhaps, a better way to go about using Celery.
IMHO, setting up RabbitMQ and Celery, especially with django-celery, ends up being much simpler and less painful than rolling any kind of custom queue/async processing. In terms of moving parts, for most setups, you can put Celery and RabbitMQ under supervisord and almost never have to worry about them.
If you don't need all the features of RabbitMQ, or just don't want to deal with it, you can use any of the other broker backends: Redis (if you don't need persistence) or even your default DB, through the Django ORM.
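For anyone wondering what the Redis option looks like in practice, here's a rough sketch of the settings you'd add in the django-celery / Celery 2.x era (the `BROKER_*` names follow that generation's config conventions; adjust host/port/db to your setup):

```python
# settings.py -- sketch for using Redis as the Celery broker (Celery 2.x style)

INSTALLED_APPS += ("djcelery",)

BROKER_BACKEND = "redis"   # use the Redis transport instead of AMQP/RabbitMQ
BROKER_HOST = "localhost"  # Redis server
BROKER_PORT = 6379
BROKER_VHOST = "0"         # Redis database number
```

Then `python manage.py celeryd` picks this up like any other broker.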
Yes, but I think what he was referring to was message acknowledgements, which Redis doesn't have. If a worker reserves a message, and is abruptly killed, then the message is lost.
A "quick and dirty" solution is to use django-celery with the django-kombu (http://pypi.python.org/pypi/django-kombu/0.9.3) backend. This stores the tasks in your database, rather than having to run a separate broker like RabbitMQ. I've been using this for a few months and it's worked pretty well.
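To make the "quick and dirty" route concrete, the setup is roughly this (a sketch assuming the django-celery + django-kombu combination from that version; the transport string is the part to double-check against the django-kombu docs):

```python
# settings.py -- sketch: tasks queued through the Django ORM, no separate broker

INSTALLED_APPS += ("djcelery", "djkombu")

# Route messages through the database via django-kombu's transport
BROKER_BACKEND = "djkombu.transport.DatabaseTransport"
```

After a `syncdb` to create the queue tables, the workers poll the database instead of a broker. Fine for modest volumes; at high task rates a real broker like RabbitMQ or Redis will hold up better.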
The only moving parts are the broker (RabbitMQ, Redis, etc), your existing app (already moving), and some workers --- which for a simple case like yours is "python manage.py celeryd".
It's worth it, in my opinion. Celery is a best-of-breed tool (even judged outside of Python), much like SQLAlchemy.
If you're already using redis, use that for your broker. I agree that configuration can appear overwhelming, but it's pretty straightforward. To say that we've had success with Celery would be a big understatement, it's fantastic.
I felt absolutely the same way, but once we started using redis (for other reasons) I finally took the plunge.
We've been quite happy with it so far. It has a bit of a learning curve, but once you buy into the way it does things suddenly a lot of code you thought you had to write disappears. I was just looking at our async code and really was surprised at how short it is.
bryanh, it may seem daunting at first, but everything fits together quite nicely. You don't even have to know anything about Erlang in order to set up Celery.
Nowadays you can install RabbitMQ + Erlang through a package manager like apt-get/homebrew.
After making sure that RabbitMQ is running, add "djcelery" to your project's settings.py + syncdb, add a task (the API is very straightforward and intuitive), run celeryd and voila =) [1].
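The "add a task" step really is about this small. A minimal sketch using the decorator API from that era (Celery 2.x shipped it under `celery.decorators`; the `add` task here is just the usual toy example, not anything from the parent's project):

```python
# tasks.py -- minimal task sketch, Celery 2.x decorator API

from celery.decorators import task

@task
def add(x, y):
    # Runs inside a celeryd worker process when called via .delay()
    return x + y
```

Calling `add.delay(2, 2)` from a view queues the work and returns immediately; the worker picks it up from the broker.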
The other services (like celerybeat/camqadm) are for monitoring/administration purposes.
Besides, RabbitMQ is not even required since there are the so-called ghetto queues (db, redis).
But if you're looking for something even simpler, I suggest you take a look at hotqueue[2]
You can't really compare Celery with beanstalkd or gearman. In fact, Celery supports beanstalk as a transport, and adding gearman support would be simple. You could rather compare celery with the Python clients for these services.
Also, the default configuration is good enough in most cases.