
Asynchronous Programming in Python: Asyncio - submeta
http://djangostars.com/blog/asynchronous-programming-in-python-asyncio/
======
jacob019
Most of our stack is built around gevent, it works great just about everywhere
and requires little to no thought about the asynchronous nature of the code.
Performance has been incredible compared to the old thread-based code it
replaced. Asyncio seems nice in theory, being baked in and all, but we tried
it with some sample apps and it just requires too much thought to stay
asynchronous; it takes a total paradigm shift, like Twisted. It's easy to
slip up and have parts of the code become synchronous, and the code gets hairy.
We decided to just stick with what we know.

~~~
deathanatos
I'm assuming your stack looks like mine: gevent has monkey patched everything.

1. fork'ing just doesn't work properly; something about the event loop in the
child hangs.

2. You can't easily spawn an honest-to-god hardware thread anymore. (Since
the typical call is monkey patched!) Yes, you can get the original threading
module out of gevent's hands, but…

3. …it doesn't compose well. Libraries, in order for gevent to play nice,
need special "if gevent then …" sections.

~~~
jacob019
No need for forking or threads; that's the point of async. Yeah, there are
limitations when CPU bound, but that's Python. Fire up a few processes and
load balance.

#3 hasn't been an issue for us; monkey patching works well. We've switched to
pure-Python versions of a few libraries. When one isn't available, we use a
pool of instances.

~~~
sametmax
Of course you need threads even with async. When your code blocks for CPU
reasons, like handling a heavy HTTP request with lots of calculations, you
don't want to stop serving other requests.
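A minimal stdlib sketch of that idea: hand the CPU-heavy part to a worker thread via `run_in_executor` so the event loop keeps serving other coroutines. (The names `heavy_calculation` and `handle_request` are illustrative, not from the thread.)

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def heavy_calculation(n):
    # CPU-bound work that would block the event loop if run inline
    return sum(i * i for i in range(n))

async def handle_request(n):
    loop = asyncio.get_running_loop()
    # Hand the blocking work to a thread; other coroutines keep running
    with ThreadPoolExecutor() as pool:
        return await loop.run_in_executor(pool, heavy_calculation, n)

print(asyncio.run(handle_request(10)))  # 285
```

For truly CPU-bound work a `ProcessPoolExecutor` sidesteps the GIL the same way; the `await` line is identical.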

~~~
jacob019
It depends. If the calculation can be split into nice chunks, you can throw in
some gevent.sleep(0) calls to prevent them from hogging the process. We use
Python for handling web requests; anything heavy tends to be handed off to
another process. It's rare for us, but yeah, sometimes you need threads.
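The chunking trick looks like this in stdlib asyncio terms (a sketch, not the poster's code; in gevent the yield point would be `gevent.sleep(0)` instead of `await asyncio.sleep(0)`):

```python
import asyncio

async def chunked_work(items, chunk_size=100):
    total = 0
    for i, item in enumerate(items):
        total += item * item
        if i % chunk_size == 0:
            # Yield to the event loop so other coroutines aren't starved;
            # in gevent code this would be gevent.sleep(0)
            await asyncio.sleep(0)
    return total

print(asyncio.run(chunked_work(range(1000))))
```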

------
robbiep
Question from a mediocre programmer: what are the advantages of this over,
say, a messaging queue and microservice architecture for an application (i.e.
Web service built on flask, but the details don't really matter)

For example, would it be a bad idea to put my email module on an async
function as opposed to in a different application?

~~~
JustSomeNobody
Async and microservices aren't really meant to accomplish the same goals.
What I mean is, these are architectural choices. While at some level they may
_look_ interchangeable, they really aren't.

There's no 100% correct way to say when to use one or the other, but here's
one way I would think of them.

Microservices are used to help define the architecture of your system overall.
This is more ... high level. Async would be more at a process level.

So, for your example, you could have a flask microservice that exposed an
interface that accepted a list of email addresses and a message to send to
each. The request handler for that could then call a method using asyncio to
send the emails. This would allow the microservice to remain responsive and
wait for more requests.
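The handler described above could be sketched roughly like this, with `send_email` as a hypothetical stub standing in for a real SMTP call:

```python
import asyncio

async def send_email(address, message):
    # Hypothetical stub for a real SMTP call; the sleep
    # simulates waiting on network I/O
    await asyncio.sleep(0.01)
    return f"sent to {address}"

async def handle_request(addresses, message):
    # Fan out all the sends concurrently; total wait is roughly
    # one send, not one per address
    return await asyncio.gather(
        *(send_email(a, message) for a in addresses)
    )

results = asyncio.run(handle_request(["a@example.com", "b@example.com"], "hi"))
print(results)
```

`gather` preserves input order in its results, so the microservice can map outcomes back to addresses.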

~~~
robbiep
Point taken. Much appreciated, thank you.

------
Goopplesoft
There will be interesting community impacts from Python's asyncio -- I'd
venture it's going to create a lot of fragmentation in the realtime application
ecosystem. The issue seems to be that there isn't an agreed way to consume
derivative APIs, as the flow control and interfaces around asyncio are super
clunky. Further, the all-or-nothing nature of asyncio (don't block the thread)
means a significant portion of applications will need some rewrite to
support asyncio (web frameworks, SQLAlchemy, and such).

There are already two fairly large ecosystems around asyncio network libs +
HTTP frameworks, which are large rewrites of existing libraries
([https://github.com/klen/muffin](https://github.com/klen/muffin) and
[https://github.com/aio-libs](https://github.com/aio-libs)). It seems like
having a whole framework/ecosystem for loop lifecycle management, async libs,
state/comm, etc will unfortunately be needed. I hope we find a way to work
around this although I doubt there is a good way.

~~~
cat199
> It seems like having a whole framework/ecosystem for loop lifecycle
> management, async libs, state/comm, etc will unfortunately be needed.

... Twisted has been around for ages for exactly this reason, as a
'framework'. Unfortunately this means it's not 100% ported to Python 3 yet,
but I suspect (not a dev, etc.) that as Twisted becomes more mature and 2.x
dies off, there might be more overlap between these two things, and perhaps
other similar projects.

~~~
sametmax
I use Twisted with Python 3 all the time, especially with crossbar.io. Reports
of Twisted not being ported to Python 3 are greatly exaggerated.

------
rajathagasthya
Looking at the
[http://www.dabeaz.com/coroutines/Coroutines.pdf](http://www.dabeaz.com/coroutines/Coroutines.pdf)
recommended in the article, coroutines remind me of partial functions in
Python. Can folks who know more about Python than I do explain when to use
coroutines vs partial functions? Is the difference only in the "generator"
part of it?

~~~
pdonis
A partial function in Python is just a wrapper around the underlying function
that automatically supplies whatever arguments you passed in to
functools.partial. It doesn't change anything about how the function is
executed.

A coroutine gets executed differently from an ordinary function (or an
ordinary generator, for that matter); when you call it using the await syntax,
the Python interpreter can choose to run some other code while it is waiting
for your function to return a result from I/O (for example, a network
request). A normal function call doesn't allow that. (A generator might, if
you use yield from and do a number of other things, but you have to do them by
hand, whereas the await syntax does it all for you automatically inside the
interpreter.)
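The distinction can be seen directly (a sketch; `add` and `add_async` are made-up illustrations): calling a partial runs the function immediately, while calling a coroutine function only creates a coroutine object that a loop must drive.

```python
import asyncio
import functools

def add(a, b):
    return a + b

# partial just pre-binds an argument; calling it runs add right away
add_five = functools.partial(add, 5)
print(add_five(3))  # 8

async def add_async(a, b):
    await asyncio.sleep(0)  # a point where the loop may run other code
    return a + b

# Calling a coroutine function doesn't execute the body;
# it returns a coroutine object to be scheduled on a loop
coro = add_async(5, 3)
print(type(coro).__name__)  # coroutine
print(asyncio.run(coro))    # 8
```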

~~~
ATsch
This is slightly misleading, considering async functions are simply
generators; nothing but parsing the keywords is actually implemented in the
interpreter. The whole logic is in the asyncio library.

This is a great talk where the presenter goes into how asyncio works in detail
and builds a mini version of the asyncio library:

[https://youtu.be/M-UcUs7IMIM](https://youtu.be/M-UcUs7IMIM)
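In the spirit of that talk, the core scheduling idea fits in a few lines: plain generators as tasks, and a round-robin loop that resumes each one until it finishes. (A toy sketch, not the talk's actual code.)

```python
from collections import deque

def countdown(name, n, log):
    # A plain generator standing in for a coroutine:
    # each yield hands control back to the scheduler
    while n > 0:
        log.append((name, n))
        n -= 1
        yield

def run(tasks):
    # Minimal round-robin "event loop"
    queue = deque(tasks)
    while queue:
        task = queue.popleft()
        try:
            next(task)          # resume until the next yield
            queue.append(task)  # still alive: reschedule
        except StopIteration:
            pass                # finished: drop it

log = []
run([countdown("a", 2, log), countdown("b", 1, log)])
print(log)  # interleaved: [('a', 2), ('b', 1), ('a', 1)]
```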

~~~
pdonis
_> async functions are just simply generators and nothing but parsing the
keywords is actually implemented in the interpreter._

As I understand it, async functions have an extra flag that ordinary
generators don't have, and the interpreter does some different things if that
flag is present. But I have not dug deeply into the source code.
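That flag is inspectable from Python, so the claim is easy to check: the compiler sets `CO_COROUTINE` on async functions' code objects, which ordinary generators don't carry.

```python
import inspect

async def coro_fn():
    pass

def gen_fn():
    yield

# Async functions carry the CO_COROUTINE flag on their code object;
# ordinary generator functions do not
print(bool(coro_fn.__code__.co_flags & inspect.CO_COROUTINE))  # True
print(bool(gen_fn.__code__.co_flags & inspect.CO_COROUTINE))   # False
print(inspect.iscoroutinefunction(coro_fn))                    # True
```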

------
d0mine
The curio and trio libraries demonstrate pure async/await models:
[https://vorpus.org/blog/some-thoughts-on-asynchronous-api-design-in-a-post-asyncawait-world/](https://vorpus.org/blog/some-thoughts-on-asynchronous-api-design-in-a-post-asyncawait-world/)

------
orf
A lot of people seem to have trouble wrapping their heads around async
programming and I'm not sure if this article will help that much.

What are the most confusing aspects of using async/await? Is it the
implementation (i.e the syntax in JS, C# or Python) or the theory behind it?

~~~
bullen
I think it's because async. means nothing if you don't specify how, where and
when something is async. For starters you can't use languages that don't share
memory between threads if you wish to see any improvements by adding async.
threading to your non-blocking network code. So Python, Go, Ruby, Javascript,
Erlang are all out of the argument when talking about async.

Only C (C++, etc.) and Java (C#, etc.) families of languages that share memory
between threads can scale better with async. on multicore processors when it
comes to network related performance.

Edit: Please comment if you downvote.

Edit: Now I can't comment, the site tells me I'm writing too fast.

~~~
kjksf
FYI: Go, Python and Ruby do share memory between threads.

JavaScript doesn't have threads so the point is moot.

Only Erlang doesn't share memory between processes.
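For the Python case, a small sketch shows two threads mutating the very same object, i.e. memory really is shared between threads in the same process:

```python
import threading

counter = {"n": 0}
lock = threading.Lock()

def worker():
    # Both threads mutate the very same dict: memory is shared
    for _ in range(1000):
        with lock:
            counter["n"] += 1

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter["n"])  # 2000
```

(The GIL serializes bytecode execution, but it does not make compound updates atomic, hence the lock.)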

~~~
bullen
Ok, didn't know that, can you point to some pages that explain how you can
reference the same memory from two threads in the same process with these
languages?

~~~
kornish
Here's a snippet of Go code, with comments, which spins up different
goroutines (lightweight thread primitives) to read and write values in a
shared map: [https://gobyexample.com/mutexes](https://gobyexample.com/mutexes)

~~~
bullen
Can goroutines run on separate cores? Because I'm pretty sure Go explicitly
decided not to share memory between threads because Google thinks async. is
too hard to understand for programmers...

~~~
geoka9
You probably mean channels - yes they let you avoid dealing with
synchronization of shared memory access explicitly, but they still use mutexes
underneath, so you can use those directly if you want (instead of/along with
channels).

------
fafhrd91
What's good about asyncio is that it allows you to replace the default event
loop. I am working on an asyncio event loop based on tokio-rs (Rust). It could
bridge asyncio with code written in Rust, so potentially it should be possible
to mix Python and Rust code at a higher level than just a simple C extension.

[https://github.com/PyO3](https://github.com/PyO3)

------
sideproject
Apologies for digressing a bit, but is there something like asyncio for PHP?
I've been looking into this for quite some time, but the standard answer I get
is to use queues. Would love to hear about any recent projects related to this.

~~~
Lazare
You're probably looking for ReactPHP
([http://reactphp.org/](http://reactphp.org/)), or possibly Amp
([http://amphp.org/](http://amphp.org/)).

There's also a collection of resources here:
[https://github.com/elazar/asynchronous-php](https://github.com/elazar/asynchronous-php)

~~~
sideproject
Hm. Never heard of Amp! But it looks promising! Thanks.

