

Ask HN: Is async request processing possible with Python Django? - anon_founder

I started with Python Django to build my web app since it was mostly short-lived HTTP requests rendering UI. As the product matured, I now realize that I need to support a lot more web service calls (with no UI component). The web service calls need to be processed and responded to asynchronously for two reasons: 1) my code may call another web service in turn to get some data to service the call, and I don't want to hold the original thread or process while doing so, and 2) certain requests can only be responded to after certain events happen on the backend, and I don't want to hold the thread/process blocking and waiting for that event.

The more I think about it, the more it sounds like I may need something like nginx for my web service calls (from whatever I have heard about nginx). Can Python Django really be used in this manner with an appropriate web server and a set of middleware components? If so, can you recommend a particular stack choice? If not, what are my options, and can you educate me on the pros and cons of those choices?
======
gane5h
Two ways you can do this:

1. Long polling: It's definitely possible to have thousands of long-lived
requests with something like gunicorn/tornado. Remember to turn off buffering
in your front-end nginx proxy if you want to use long polling.

2. Async with web hooks: Gather the request payload, push it to a job queue,
and return the response. Process the job queue at your leisure and then call
the web hook when complete. You can use celery, beanstalk, or my personal
choice, rq (and django-rq).
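The web-hook flow in option 2 can be sketched with just the standard library; the in-process queue and worker thread here stand in for celery/rq and a separate worker process, and `callback` stands in for the HTTP POST back to the client's web hook:

```python
import queue
import threading

jobs = queue.Queue()
results = []

def worker():
    # stands in for a celery/rq worker process
    while True:
        job = jobs.get()
        if job is None:
            break
        payload, callback = job
        result = payload.upper()   # "process" the job
        callback(result)           # call the client's web hook
        jobs.task_done()

def handle_request(payload, callback):
    # the view: enqueue the work and return immediately (e.g. HTTP 202)
    jobs.put((payload, callback))
    return "202 Accepted"

threading.Thread(target=worker, daemon=True).start()
status = handle_request("hello", results.append)
jobs.join()  # demo only: wait so we can inspect the result
```

The key point is that the view returns before the work is done; the response to the original request is just an acknowledgement, and the real result arrives later via the web hook.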

~~~
anon_founder
Thanks gane5h. Can you point me to an example that shows how to use django
framework for long-lived requests via gunicorn/tornado?

~~~
gane5h
There're many examples on GitHub, here's one I found just by searching:
[https://github.com/tbarbugli/django_longpolling](https://github.com/tbarbugli/django_longpolling)
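The core of that kind of example is a view that blocks until the awaited backend event fires, then responds. A minimal stand-alone sketch of the idea (a real app would use per-client channels and something cross-process like Redis pub/sub rather than a single module-level Event):

```python
import threading

# One Event per "channel"; a real app would key these per user/resource.
channel_ready = threading.Event()
channel_data = {}

def publish(message):
    # called by the backend when the awaited event happens
    channel_data["message"] = message
    channel_ready.set()

def long_poll_view(timeout=30):
    # the view body: block until data arrives or the timeout expires
    if channel_ready.wait(timeout):
        return {"status": 200, "body": channel_data["message"]}
    return {"status": 204, "body": ""}  # no content; client re-polls

# simulate the backend event arriving shortly after the client connects
threading.Timer(0.1, publish, args=("hello",)).start()
response = long_poll_view(timeout=5)
```

With gunicorn/tornado (or gevent workers) that blocking wait doesn't pin an OS thread per request, which is what makes thousands of concurrent long polls feasible.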

------
ilhackernews
This sounds relatively simple. I would use Celery for this. I would recommend
reading this tutorial: [http://sebastiandahlgren.se/2012/11/13/using-celery-for-asynchronous-messages-in-django/](http://sebastiandahlgren.se/2012/11/13/using-celery-for-asynchronous-messages-in-django/)

It's pretty simple and straightforward.

~~~
anon_founder
Thanks. Google returned a lot of pages suggesting using celery for async
processing. I also read the tutorial you have provided a link to. Using
celery makes sense for deferring processing to another process/time. I'm still
trying to figure out if I can get away with not returning any response in my
view's request-handling function (and instead return something that says I will
post a response later... and hold the original HTTP request).

------
jkarneges
You want this: [http://blog.fanout.io/2013/04/09/an-http-reverse-proxy-for-realtime/](http://blog.fanout.io/2013/04/09/an-http-reverse-proxy-for-realtime/)

~~~
anon_founder
This is very interesting and sounds simple enough to get started. It seems to
solve my long-poll problem fine. The other issue I'm dealing with is high-
latency network I/O holding up the process/thread. Does Pushpin have a
solution for that? I'm dreaming, but this is what I want to tell Pushpin: "I
want to invoke this outbound API... Hold this HTTP request, go invoke this API
and call me back when the call is complete...". Basically an outbound proxy
with similar benefits. I looked at gevent/greenlets to make Python
concurrency work, but it all seems a bit unnatural to me.

~~~
jkarneges
Hmm, that's definitely outside the scope of pushpin, but it's an interesting
idea for a different project perhaps.

I guess you want a proxy that you could tell to make an HTTP request on your
behalf but would return immediately. And then it would make an HTTP request
back to you once it has received the remote response.

Another of our projects is Zurl, which is used for async outbound HTTP. You'd
have to write some zmq handler code but potentially it could be appropriated
for what you're trying to do.
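The submit-and-call-back pattern described above can be sketched with a thread pool; this is only an illustration of the idea, and `fetch_async`/`on_done` are hypothetical names, not Zurl's API:

```python
# Submit an outbound call, return immediately, and get called back
# when the response arrives.
from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=10)

def fetch_async(fetch, url, on_done):
    # run fetch(url) in the background; invoke on_done(result) when finished
    future = pool.submit(fetch, url)
    future.add_done_callback(lambda f: on_done(f.result()))
    return future  # the caller is not blocked

results = []
fetch_async(lambda u: "response for " + u, "http://example.com", results.append)
pool.shutdown(wait=True)  # demo only: wait so we can inspect the result
```

In the real proxy version, `fetch` would be an actual HTTP request and `on_done` would be an HTTP request back to your app, so your request thread never waits on the slow remote call.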

