
I don't understand Python's Asyncio - ingve
http://lucumr.pocoo.org/2016/10/30/i-dont-understand-asyncio/
======
jerf
I think I've made the point a few times that I don't like this style of
programming at all, because the coroutine layer turns into an Inner Platform
[1] replicating all the control-flow structures the original language has,
which then has to integrate with the original language which causes more than
twice the complexity to emerge. But it's hard to bash together an example of
how that happens in a comment, when it's all faked up and easy to dismiss.
This is a great example of the complexity that can emerge. Some of that is
incidental and will be fixed in future releases, but some of that is pretty
fundamental, such as the initial list of all the sorts of things you need to
learn about to use asyncio. Languages like Erlang or Go, which have had the
event loop simply embedded into the language from day one, have a much shorter
list of such things you have to know, and they do all the things that this
complicated list of asyncio concepts does.

But I will also admit some of this complexity is definitely Python-specific.
I've been using Python since 1.5.2 was still the version you'd be most likely
to encounter and I've liked it as a language for a while, but one release at a
time, the language has been getting more and more complicated. By the time
asyncio came around, the language was quite complicated, and sticking asyncio
on top while integrating it with everything else is really a mess. Python has
too darned many features at this point. I don't know exactly when it jumped
the shark feature-wise, because individually they all make sense, but the sum
total has gotten quite unwieldy. Watching new programmers try to learn Python,
a language I used to suggest to such people as a good language to start with,
has been a bit dispiriting lately.

[1]: [https://en.wikipedia.org/wiki/Inner-platform_effect](https://en.wikipedia.org/wiki/Inner-platform_effect)

~~~
rumcajz
Why can't Python have something like [http://libmill.org](http://libmill.org)
?

If it's doable in C, why not in Python?

~~~
rdtsc
It has those. It has the greenlet library, which is used by eventlet and
gevent. Many use those, and they've been around for many years. But Guido and
others decided that's not acceptable. So we've gone the Twisted route (because
everyone knows Twisted is easy and fun) and now we have yields, coroutines,
futures, awaits and all the other mess.

~~~
WaxProlix
Lots of conversation between Armin and a few others about these approaches in
reddit's /r/python thread for this article. I happen to think that gevent got
a _ton_ right, and the consensus seems to be that this 'new' approach is sort
of half-baked, at least vis-a-vis python implementation.

[https://www.reddit.com/r/Python/comments/5a6gmv/i_dont_under...](https://www.reddit.com/r/Python/comments/5a6gmv/i_dont_understand_pythons_asyncio_armin_ronachers/)

------
coleifer
I've been a gevent user for a long time and Python's decision to "bless"
twisted by adopting its patterns was a watershed moment for me, and basically
was the beginning of the end of my belief that I'd ever adopt Python 3.

User jerf's comment that asyncio "more than [doubles] the complexity" is
absolutely correct. Watch this video of Guido talking about tulip...or
struggling to talk about tulip, rather. It's clear the dude is out of his
depth and my god the recent changes to the language show that the inmates are
now running the asylum... Seems like Python, in its effort to chase the
latest fads, is no longer the language I would endorse to someone new to
programming. Whether you think that's a meaningful litmus test or not, the
staggering amount of _crap_ that's infiltrated the language now completely
flies in the face of the zen of python's statement that there should be one
and preferably only one way of doing things. Fuck. I'm going to go code some
lua now.

[https://www.youtube.com/watch?v=1coLC-MUCJc](https://www.youtube.com/watch?v=1coLC-MUCJc)

~~~
hyperbovine
Oh come on. Hyperbole much? You're acting as if it's impossible to write any
sort of meaningful code in 3.x without embracing asyncio. A few 2to3-ish
language changes excepted, you can punch the exact same code into the Python
3.x interpreter that you have been for the past 15 years.

~~~
carapace
Not much.

I agree with coleifer (and others) that Python 3 is losing "the zen". The new
f-string stuff is a perfect example: a bad, retrogressive idea that never
should have been approved.

I have no plans to adopt or use Python 3.

~~~
tedmiston
I have mixed feelings about f-strings. Having the string literal bound to the
variable name feels kind of backwards -- one could inadvertently break
interpolated strings by renaming a variable. On the other hand, the str.format
syntax is a bit verbose, even in shorthand notation.

That said, I use Python 3 every day for almost everything without needing to
use every shiny new feature.

~~~
Too
It's Python we are talking about, right? Refactoring a variable name in a
non-type-safe language has always been prone to break something. If you trust
your IDE to do refactoring at that level, I would also trust it to rename
inside the string.

------
plq
The general sentiment in this thread seems to be that Guido et al "blessed"
the Twisted way, it's a pity and now using asyncio and friends is the only way
to suspend a Python stack in a concurrent program besides using OS-level
threads.

This is not correct. Green threads, as a programming paradigm, are just a sub-
optimal (but cheaper) way of doing preemptive multitasking. Yes, switch is
explicit, but user code can't know what will switch. So you are not supposed
to treat green threads any differently from OS threads, i.e. you still need
your mutexes, semaphores, etc. if you want to avoid race conditions. Again, that's
because the caller doesn't know whether a function will yield execution to the
next queued task at some point. Guido explains this point pretty well in one
of his PyCon keynotes.

So green threads may be great but they don't bring anything new to the table.

However, Twisted-style concurrency (aka cooperative multitasking) is a
different paradigm. In Twisted, you know that you only have one thread running
at a time, so you actually don't need any thread synchronization primitives
when accessing non-local state. This simplifies things a great deal. Yes, not
having to spawn a thread for every single concurrent IO operation has other
great benefits, but that's not the reason why CPython now has a blessed event
loop -- it's cooperative-multitasking-the-paradigm.

Before asyncio, there was no standard way of doing cooperative multitasking.
Now there is and it's baked right into the language. Use it if you like it. If
not, the old ways of doing things work just fine in Python 3.

I'll admit that the concurrency model in Python 3.4 was not perfect. However,
what we have with Python 3.5 and up looks quite polished.

~~~
ntoshev
I disagree that the code can't know what will switch with green threads. They
switch on I/O and when explicitly requested. An example of using this to get
rid of concurrency control: [http://www.underengineering.com/2014/05/22/DIY-NoSql/](http://www.underengineering.com/2014/05/22/DIY-NoSql/)

If you want to make sure that there is no context switching in a certain part
of your code, you can do it ASSERT-style, something like:

    
    
    # Enable this only if you use atomic(); put it in a module that is
    # imported before gevent.
    from contextlib import contextmanager

    in_transaction = False

    if __debug__:
        import greenlet
        old_switch = greenlet.greenlet.switch

        class _greenlet(greenlet.greenlet):
            def switch(self, *args, **kwargs):
                if in_transaction:
                    raise Exception('Context switch during atomic block')
                return old_switch(self, *args, **kwargs)

        setattr(greenlet, 'greenlet', _greenlet)

    @contextmanager
    def atomic():
        """
        Ensure that a function or a block of code is atomic; raise if it isn't.

        Usage:

            @atomic()
            def myTransaction(...):
                ...

            or

            with atomic():
                ...
        """
        global in_transaction
        in_transaction = True
        try:
            yield
        finally:
            in_transaction = False

~~~
plq
And how is this not a mutex?

~~~
ntoshev
This only runs in the debug version of the code. I hope your mutexes run in
production, too.

~~~
lmm
So you rely on hoping your tests exercise all possible paths, because if not
you get silent race conditions in prod?

~~~
ntoshev
If you want, you can run it in production too (it would give you a warning and
won't prevent race conditions, but there is almost no performance penalty).
The point was that this is very different from a mutex.

------
stygiansonic
For background, the author founded the Flask project [0] among others, and
contributed to many other well-known Python projects.

0\. [http://lucumr.pocoo.org/projects/](http://lucumr.pocoo.org/projects/)

~~~
dom0
Also [https://www.palletsprojects.com/](https://www.palletsprojects.com/)

Btw. does someone know why some things were moved from Pocoo to Pallets?

~~~
loulouxiv
Armin wanted to step back a little from these projects and moved them to
Pallets in order to make them more community-driven.

------
anacrolix
I agree with Armin that Python is becoming burdensomely complicated. Since
Python 3.2, the language has started to accumulate a lot of features; Guido
seems to be saying yes to everything. Furthermore, the Python 3 fiasco has
severely affected Python's popularity. It would be in much greater use now if
the clean-up had been done more gradually.

~~~
krick
I don't quite see why it's being downvoted. Like it or not, that's what is
happening: Python's 2.4-2.7 success was largely due to its simplicity. It more
or less did correspond to the values defined in `import this`. It wasn't
particularly performant or "safe/foolproof", and it wasn't even that powerful
as a language. It lacked (and still lacks) a good error-reporting system. But
it was lean, easy to get started with, agile enough, expressive enough. It was
always possible to make something unintelligible using reflection and magic
methods, but that is easily distinguishable from "how stuff should be
written", and otherwise you could be quite sure there wouldn't be any surprises.

This is why Python is still used for its purposes: CLI utils, bots/crawlers,
tf/pandas/scikit-learn, REST APIs on top of Flask.

Then hard times came. First, the years-long story (still not concluded) of the
transition between 2 and 3. Then all this stuff. Sure, there is such a thing
as progress; stuff is invented for some purpose. But now, in 2016 and at v3.6,
Python isn't what it has been loved for. Not easy-to-start-with, nor simple.
The return/yield fuck-up shown in the article is an absolutely huge deal, for
example, and it is not about asyncio per se. Async stuff is always
complicated; it wouldn't be that bad if that were all this was about.

If some 5 years ago one would use Python just because "why not, I just need to
get stuff done", now it's quite likely that after struggling with all these
micro-nuisances they would go with golang/js/php/whatever instead.

~~~
int_19h
The thing about Python is that unlike most other languages, you don't _have_
to deal with that complexity. You can still write Python 2.7 style code, no
problem.

This approach to async, though, is just a language feature that's becoming
mainstream right now. C# has it, ES7 has it, C++ has a working paper on it
etc. Python actually had the benefit of watching how things work out elsewhere
before implementing it all.

~~~
thijsvandien
The idea that complexity doesn't matter if you don't use it sets Python on the
path to becoming the next C++. Pick the subset of the language that you like.
Then carefully watch your libraries, because if they use a different subset,
you still have to deal with what's in there (and Python is a library heavy
language!). It goes against everything Python stands (stood) for. Definitely
not what I'm looking for.

~~~
int_19h
This is not at all a new problem for Python. In fact, it's possibly more of a
problem for Python than it is for C++, because Python's dynamic nature and
exposure of many internal mechanisms makes it possible to do some pretty crazy
stuff in the libraries.

For example, speaking of async - even 2.7 already has Twisted, and an
ecosystem of libraries around it.

The only two ways I can see it being solved are either by making it more of a
toy language (which is great if you're just writing short scripts, but that's
not really what it's supposed to be about), or by having a very centralized
"best practices" enforcement that basically forces libraries to conform
through peer pressure, like Java -- which has its own disadvantages aplenty.

------
earthnail
I just got started on Python's asyncio. It took me a long time to understand
how to get things done, but after about a week of doing small example projects
in the evenings, it finally 'clicked'. For the record, I never worked with
gevent before, and have only done a small webserver with Tornado a long time
ago, so I can't comment on how it compares to other solutions. One thing that
made me wait for asyncio is the fact that, despite being experimental, it has
a lot more of an "official" feeling to it.

I'm trying to stay away entirely from the Python 3.4 way with coroutine
decorators, and am using only await and async in Python 3.5. The async code I
wrote has to live in parallel with regular synchronous Python code in a large
scientific code base, but migrating our custom database adapter to an
asynchronous codebase without breaking old synchronous code was surprisingly
easy.

Debugging is, in my opinion, a pain. Stacktraces can be extremely long and
very hard to understand. The only profiler that seemed to give useful results
was pprofile (in sampling mode). I also still don't fully understand why
there's both Futures and Tasks - I probably didn't spend enough time
understanding the difference, but that just means the author of this blog post
is right. Mixing asyncio with threads and/or processes, however, is
surprisingly easy and elegant.

I hope the Python developers will have the courage to break backward
compatibility in the asyncio module, and will remove the old yield from and
@coroutine way of doing things. That would probably help a lot in reducing
confusion. There's still not a lot of information about asyncio when you
google for it, so the amount of existing code and examples that it would
invalidate would not be too high.

All in all, we are very happy with asyncio. We use it mainly to add
concurrency to small sections of our code base. By default, all our code is
synchronous, with some heavy I/O-bound functions exposing async versions, too.
asyncio allows us to parallelise these sections without the use of a thread
pool, and thanks to Futures/Tasks and queues, it's actually very easy to do
this in a "streaming" fashion if the order of processing of the outcome of
your concurrent tasks matters. Add to that the executors which allow you to
run stuff in sub-processes when you're CPU-bound (instead of I/O), and it
makes for a fairly solid tool.
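
The "streaming" idea described above can be sketched with asyncio.as_completed
(a rough illustration with a dummy coroutine standing in for real I/O; the
names here are made up):

```python
import asyncio

async def fetch(item):
    # dummy stand-in for an I/O-bound coroutine (e.g. a database read)
    await asyncio.sleep(0.01)
    return item * 2

async def main(items):
    # as_completed yields futures in completion order -- results can be
    # consumed in a "streaming" fashion as each task finishes
    tasks = [asyncio.ensure_future(fetch(i)) for i in items]
    results = []
    for fut in asyncio.as_completed(tasks):
        results.append(await fut)
    return results

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
results = loop.run_until_complete(main([1, 2, 3]))
print(results)
```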

~~~
takeda
> I also still don't fully understand why there's both Futures and Tasks - I
> probably didn't spend enough time understanding the difference, but that
> just means the author of this blog post is right.

Because they are essentially the same.

Task just extends Future and adds extra functionality (for example, keeping
track of tasks scheduled in a given event loop).

A Task is created when you call ensure_future() or loop.create_task(). You are
not supposed to create one directly, so if you're wondering whether you should
use Future or Task, you should use Future.
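
A minimal illustration of the relationship (Python 3.5-era API):

```python
import asyncio

async def work():
    await asyncio.sleep(0.01)
    return 42

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)

# ensure_future wraps the coroutine in a Task, which subclasses Future
task = asyncio.ensure_future(work())
print(isinstance(task, asyncio.Task), isinstance(task, asyncio.Future))

result = loop.run_until_complete(task)
print(result)
```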

------
doh
So I'm not the only one. I wrote a larger-ish project using asyncio and it is
a pain. The syntax is very unfamiliar (especially in 3.5 with async/await),
the documentation is confusing and it's very hard to debug in general. Also
it's very hard to combine/stack multiple IO-heavy events (make 5 calls to
these URLs, and as each one finishes, run its follow-up tasks in parallel).
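
For what it's worth, the "run things as soon as whichever call is done" part
is expressible with asyncio.wait and FIRST_COMPLETED, though it's not obvious
from the docs. A rough sketch with dummy coroutines in place of real URL
fetches:

```python
import asyncio

async def call(url, delay):
    # dummy stand-in for an HTTP request
    await asyncio.sleep(delay)
    return url

async def main():
    pending = {asyncio.ensure_future(call(u, d))
               for u, d in [('a', 0.05), ('b', 0.01), ('c', 0.03)]}
    finished = []
    while pending:
        # wake up as soon as *any* call completes, not when all do
        done, pending = await asyncio.wait(
            pending, return_when=asyncio.FIRST_COMPLETED)
        for fut in done:
            finished.append(fut.result())
    return finished

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
order = loop.run_until_complete(main())
print(order)
```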

~~~
greglindahl
That last part is exactly what my webcrawler does with asyncio plus threads:
[https://github.com/cocrawler/cocrawler](https://github.com/cocrawler/cocrawler)

This is the part that sends work to a thread:
[https://github.com/cocrawler/cocrawler/blob/master/cocrawler...](https://github.com/cocrawler/cocrawler/blob/master/cocrawler/burner.py)

I agree that this was confusing in the docs. Docs can be improved. It really
helped that this isn't my first crawler written using cooperative
multitasking.

------
lyschoening
There are a few new concepts that need to be learned when beginning work with
asyncio as well as some confusing naming choices (who would expect that a
coroutine is not the same as a coroutine function?). Some of the tooling is
also lacking. Armin points out, no doubt with a look at server frameworks,
that there is no elegant way to access the context of a task.

That being said, as someone who started working with asyncio in Python 3.5
none of it feels particularly difficult to understand. Asyncio needs more
work, sure, but the API so far is relatively straightforward.

The overloading of iterators/generators is a bit odd, but it's the same
approach that was taken with JavaScript -- and it's nothing anyone working
with Python 3.5 or above will be exposed to. If I recall correctly, Python 3.6
will even
feature async generators. Any developers diving into this using that release
won't have to ask themselves why they can't 'yield' from within an 'async'
function.
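
The naming confusion is easy to demonstrate: `async def` defines a coroutine
*function*, and calling it produces a *coroutine* object that runs nothing
until a loop drives it:

```python
import asyncio

async def f():
    return 1

# f itself is a coroutine function; calling it does not run the body --
# it merely returns a coroutine object
print(asyncio.iscoroutinefunction(f))
coro = f()
print(asyncio.iscoroutine(coro))

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
print(loop.run_until_complete(coro))
```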

~~~
the_mitsuhiko
I don't find it hard to write asyncio code, but I find it hard to generically
interface with other people's asyncio code, and that is down to lacking basic
support like a context object or logical call contexts, or just the lack of
agreed-upon usage patterns.

It does not help that asyncio evolves in the stdlib and changes with every
major Python version. It might be less of an issue if it were pip-installable,
I suppose. Right now, writing utility code for asyncio means targeting many
different things.

~~~
sitkack
I was talking with David Beazley and he had some of the same confusions around
asyncio. I think it would be nice to revisit it with a round table and fix it
in 3.7.

~~~
webmaven
This should definitely happen, if at all possible.

------
erikb
From what I learned through youtube videos and meetups it seems to solve a
problem that doesn't exist (getting all these patterns into python) and in
return doesn't solve the problem people hope it would solve (multiprocessing
made easy and pythonic). That's why people who want to pretend to be smart
(like me a few years back) find this totally attractive, emperor's-new-clothes
style. Nobody really understands it, so they can act like they're really doing
something meaningful with it.

How did I get to this painful conclusion? Well, as I said, I was (and probably
still am, to some degree) just like that. And to me it looks nearly
irresistibly interesting. But at the same time I also don't know what I would
use it for, or what it would actually improve for me. And since the need to
pay my rent forced me to use my time more practically, I didn't get around to
looking at asyncio more in depth. Both these things together make me believe
it's not really solving a real problem.

~~~
pekk
The name "asyncio" includes the substring "io" and is clearly about
asynchronous I/O, so it's hard to see why anyone would think its purpose was
"multiprocessing made easy."

------
kamyarg
Python core developer Brett Cannon wrote a nice (and long) post about
understanding it a couple of months ago [1]; it might help. After reading it I
felt like I could use asyncio, but never got the chance to try.

[1] [http://www.snarky.ca/how-the-heck-does-async-await-work-in-python-3-5](http://www.snarky.ca/how-the-heck-does-async-await-work-in-python-3-5)

------
orf
I'm working on some Asyncio stuff right now. Asynchronous programming seems
pretty natural to me, but other people do struggle to wrap their heads around
somewhat confusing terminology: tasks, co-routines, awaitables, event loops.
Underneath the terminology the theory is pretty simple.

And boy is it powerful. If you ever find yourself doing network requests in a
loop (for url in list: requests.get(url)) then a small bit of refactoring and
a sprinkling of asyncio will speed this up immensely.

But it's not just for network calls, you can `await` on threads and processes.
It's a joy and I think it's one of the best things in Python right now.
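
The refactoring has roughly this shape. This sketch uses a dummy coroutine
where a real version would use an async HTTP client such as aiohttp; the URLs
are placeholders:

```python
import asyncio

async def get(url):
    # dummy stand-in for an async HTTP GET
    await asyncio.sleep(0.01)
    return (url, 200)

async def main(urls):
    # the serial `for url in list: requests.get(url)` loop becomes one
    # gather: every request is in flight at the same time
    return await asyncio.gather(*(get(u) for u in urls))

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
results = loop.run_until_complete(main(['u1', 'u2', 'u3']))
print(results)
```

gather also preserves input order, which makes the refactor from a serial loop
mostly mechanical.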

~~~
sidlls
Have you tried comparing the performance of asyncio based network requests
versus multithreaded requests? And also compared the relative complexity of
the code?

I have never used asyncio in Python, mainly because the very use case you
described is solved with multithreading -- though that doesn't mean it's
solved best that way, of course.

~~~
Animats
Python's multithreading is insanely inefficient, because of the Guido van
Rossum Memorial Boat Anchor. Anything in Python can mess with the innards of
anything else at any time, including stuff in other threads. (See
"setattr()"). There's no such thing as thread-local data in Python. This
implies locking on everything. CPython has one big lock, the infamous Global
Interpreter Lock. Some other implementations have more fine-grained locks, but
still spend too much time locking and unlocking things. One Python program can
thus use at most one CPU, no matter how many threads it has.

This basic problem has led to a pile of workarounds. First was
"multiprocessing", which is a way to call subprocesses in a reasonably
convenient fashion. A subprocess has far more overhead than a thread; it has
its own Python interpreter (some code may be shared, but the data isn't) and a
copy of all the compiled Python code. Launching a subprocess is expensive. So
it's not a good way to handle, say, 10,000 remote connections.

Now there's "asyncio", which is the descendant of "Twisted Python". That was
mostly used as a way for one Python instance to service many low-traffic
network connections. The new "asyncio" is apparently more general, but
hammering it into the language seems to have created a mess.

After the Python 3.x debacle, which essentially forked the language, we don't
need this.

~~~
the_mitsuhiko
> There's no such thing as thread-local data in Python.

There is. threading.local in all aspects is thread local data.

> Now there's "asyncio", which is the descendant of "Twisted Python". That was
> mostly used as a way for one Python instance to service many low-traffic
> network connections. The new "asyncio" is apparently more general, but
> hammering it into the language seems to have created a mess.

I think the mess was created before 3.5. Had the whole thing started out with
the async keywords we might have been spared `yield from` which is a beast in
itself and a lot of the hacky machinery for legacy coroutines. I do think
however we can still undo that damage.
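
For reference, the isolation threading.local does provide is a per-thread
attribute namespace:

```python
import threading

local = threading.local()
local.value = 'main'
seen = {}

def worker():
    # a fresh thread starts with an empty threading.local namespace
    seen['had_value'] = hasattr(local, 'value')
    local.value = 'worker'      # does not affect the main thread's copy
    seen['worker_value'] = local.value

t = threading.Thread(target=worker)
t.start()
t.join()

print(seen['had_value'], seen['worker_value'], local.value)
```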

~~~
Animats
_threading.local in all aspects is thread local data._

You can still pass data attached to threading.local to another thread. Another
thread may be able to get at threading.local data with setattr(). There's no
isolation, so all the locking is still needed.

This is a hard problem. There's real thread-local data in C and C++, but it's
not safe. If you pass a pointer to something on the stack to another thread,
the address is invalid and the thread will probably crash trying to access it.
C++ tries to prevent you from creating a reference to the stack, but the
protection isn't airtight. In Rust, the compiler knows what's thread-local, as
a consequence of the ownership system. Go kind of punts; data can be shared
between coroutines, but the memory allocation system is mostly thread-safe.
Mostly. Go's dicts are not thread-safe, and there's an exploit involving slice
descriptor race conditions.

~~~
the_mitsuhiko
> You can still pass data attached to threading.local to another thread.

You can in most languages. Rust is the only one I know of with enough
information to prevent that.

------
balloob
At home-assistant.io we just migrated our core from being based on threads +
locks to use asyncio. We managed to keep a full backwards compatible API
available so we can slowly migrate other parts of the system over to async.

Asyncio has a steep initial learning curve (especially in our hybrid setup)
but it's well worth it. We target low-resource computers like the Raspberry Pi
and using async over threads has sped things up a lot.

The biggest catch is that while writing code you have to think about every
function that you use: is it doing I/O, is it a coroutine, or is it
callback/async friendly?
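
That "callback/async friendly" check usually ends with blocking functions
being pushed onto an executor. A minimal sketch of the hybrid pattern (the
function names are made up):

```python
import asyncio
import time

def blocking_read():
    # stand-in for a blocking call (e.g. a serial-port or legacy DB read)
    time.sleep(0.01)
    return 'data'

async def main(loop):
    # blocking functions that aren't coroutine/callback friendly can be
    # run on the default thread-pool executor and awaited like coroutines
    return await loop.run_in_executor(None, blocking_read)

loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
result = loop.run_until_complete(main(loop))
print(result)
```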

~~~
ddorian43
... why do you need locks with threads and not with asyncio?

~~~
the_mitsuhiko
You need locks in asyncio as well.

~~~
xapata
Only if you have multiple threads/processes/coroutines sharing resources.

~~~
DasIch
"Only"? You need locks in exactly the same scenarios as you do when you use
multiple threads.

~~~
xapata
Good point. I should have said that asyncio allows patterns that use fewer
shared resources.

Actually... I take that back. Asyncio is solving a different problem than the
locking of shared resources.

------
yladiz
Maybe I'm not the intended audience for this article, but I don't really have
a problem with a lot of the warts of asyncio. It really is pretty simple to
use if you don't have your mind set hard on a different way of thinking. It
wasn't as simple in Python 3.4, but then the async/await keywords became part
of the language in Python 3.5 and simplified things. It took me a few days to
understand asyncio's complexities: the event loop usage and how it works with
async/await, how an async function is different from a normal function, and a
few other things. Once I understood those complexities (which I don't mind,
because it is a new part of the language) it was easy. Granted, I feel like
it's easier to just work around issues rather than complain about them, but I
didn't find particularly many issues with asyncio.

As an aside, not to pick on Armin, but he also complained about Python 3 in
ways that may have been valid but amounted more to him liking Python 2 better,
a fact he hid behind lengthy blog articles about the particular Python 3
things he doesn't like. I do find it a little strange that he complains about
these things publicly rather than trying to fix the things he doesn't like (if
they are fixable), especially given his reputation in the community and his
Python projects like Flask, because it makes him seem whiny and solves none of
the issues that he's presenting.

~~~
the_mitsuhiko
> I do find it a little strange that he does complain about these things
> publicly rather than trying to fix the things he doesn't like (if they are
> fixable), especially given his reputation in the community and his Python
> projects like Flask, because it makes him seem whiny and solves none of the
> issues that he's presenting.

It's easy to write code, it's harder to write specs and design systems, and
it's hardest to convince others. I'm very bad at the last part. My only real
attempt to improve Python 3 that went anywhere was getting the u prefix back.
My suggestions for bytestrings were not very popular, for instance.

~~~
yladiz
Okay, fair enough: it's harder to think about a problem than to write the
solution to it, at least in software development generally. But it would be
better if you did try to talk to the Python developers in a constructive
manner and get whatever you feel is wrong with Python fixed. Writing blog
posts is fine, and I'm not saying that voicing your opinion is bad, but there
is a threshold between voicing an opinion and actually doing something about
it, and while I feel you do contribute to the Python community, you don't
contribute to Python itself even though you have the ability to, and after
years of blog posts you've crossed that threshold. Why you don't attempt to
contribute, I'm not sure -- lack of patience for the process, inability to
convince others, the difficulty of designing a system that fixes the issues
you're finding. The biggest thing is that if you want something fixed and have
the ability to fix it (which I feel you do), don't just sit around and
complain -- do something about it. The best way to convince a developer is to
write code to prove your point.

------
jdnier
The post mentions in passing David Beazley's curio[1] project. Is anyone using
curio for async programming instead of asyncio?

The curio docs are fun to read through and didn't leave me feeling lost.
They're also full of understandable examples of using the new async/await
syntax.

I've made some simple scripts with curio but have found I keep hesitating to
take on the asyncio docs to learn it "the real way". Any thoughts on whether
curio might be a plausible alternative to asyncio?

[1]
[http://curio.readthedocs.io/en/latest/tutorial.html](http://curio.readthedocs.io/en/latest/tutorial.html)

~~~
scribu
Since it's polling for I/O events and doesn't use threads [1], it does indeed
sound like a plausible alternative for doing non-blocking I/O.

I think I'm going to take a stab at using it, since asyncio doesn't have a
mature ecosystem around it anyway.

[1] [http://curio.readthedocs.io/en/latest/index.html#under-the-covers](http://curio.readthedocs.io/en/latest/index.html#under-the-covers)

------
r4pha
I had a similar experience with Python 3's asyncio. I have worked with gevent,
which has an arguably less "elegant" interface (with monkey patching, for
instance), but with which it is so much easier to be productive. I was
frustrated at having difficulties understanding and using asyncio. The author
is a better Python programmer than I am, so I suppose there is really a
problem there.

------
mangeletti
The one thing that I think is absolutely lovely is the syntax:

    
    
        async def spam(eggs):
            ...
    

It can't get much better than that.

My hope is that the implementation will become:

A) simpler

B) more optimized

C) out of the box functional (e.g., no need to manage event loops and other
things yourself)

D) unified (e.g., the zen of python even states that there should be one way
to do something)

I also strongly agree with the comment @RodericDay made herein, pertaining to
the new typing syntax and the addition of yet another string formatting syntax
(a dangerously unexplicit one).

~~~
voltagex_
Have a look at the way C# does it. I think it comes pretty close.

------
denfromufa
I find that Armin's blog posts about Python 3 are generally destructive, not
constructive. A constructive way would be to post suggestions on the
python-ideas and python-dev mailing lists, or report bugs and feature requests
on the issue tracker.

Python is an open-source project and there is not a single core developer
working full-time on the language, unlike his beloved Rust.

~~~
raymondh
I've always found Armin's posts to be intellectually honest and thought
provoking. And unlike your ad-hominem comment, he itemizes his concerns and
provides details (such as the performance comparison with David Beazley's
curio project).

At the very least, it is a warning sign that a notable and highly experienced
Python expert is having a hard time grappling with the best practices (or even
workable practices) for a significant new feature set: "I know at least that I
don't understand asyncio enough to feel confident about giving people advice
about how to structure code for it."

As far as I can tell, not a single respondent to this thread has indicated
the contrary -- none have said that they feel confident enough to give people
advice on how to structure code with asyncio.

At the very least, that means that we have a documentation and communication
problem which is either intrinsic to the new API or something that will work
itself out over the next few years.

~~~
ubernostrum
For what it's worth, I read Armin's critiques, but I take them with a grain of
salt; it's clear that what he wants from a language, and what other people
want from Python, diverged a while back and are probably irreconcilable at
this point. That doesn't mean he doesn't have good points, but does mean that
I read his articles through a lens of "the language he really wants probably
is never going to be Python again".

~~~
the_mitsuhiko
> I read his articles through a lens of "the language he really wants probably
> is never going to be Python again"

I have always had the opinion that what I would like Python to be is never
going to happen. This is not something new with Python 3.

------
jftuga
I wrote a fast tcp port scanner using Python 3.5 and
concurrent.futures.ThreadPoolExecutor.

See lines 123-124:

[https://github.com/jftuga/universe/blob/master/tcpscan.py](https://github.com/jftuga/universe/blob/master/tcpscan.py)

I have used this under Linux, OS X and Windows. It's cool to add the Thread
Count field in Task Manager and then see something I wrote use so many
threads! I am more of a sysadmin, so this code could be better - but it seems
to work very well. :-)
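
For readers who want the shape of this approach without reading the whole
script, a minimal sketch (hypothetical helper names, not the actual tcpscan.py
code) looks something like:

    
        import socket
        from concurrent.futures import ThreadPoolExecutor
    
        def check_port(host, port, timeout=0.5):
            """Return the port if a TCP connect succeeds, else None."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return port
            except OSError:
                return None
    
        def scan(host, ports, workers=100):
            # Each connect blocks in a worker thread; the pool runs many at once.
            with ThreadPoolExecutor(max_workers=workers) as pool:
                results = pool.map(lambda p: check_port(host, p), ports)
            return [p for p in results if p is not None]
    
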

~~~
hueving
quick github tip: if you click the line number on the left you can get a link
directly to the line you are referring to.

extra tip: after clicking the line, press 'y' on your keyboard and you'll get
a link to the file in its state at the current commit, so future commits won't
break your old hyperlinks.

~~~
jftuga
cool, thanks

------
adolgert
The Fluent Python book has a nice set of chapters on coroutines, futures, and
asyncio. They present not the whole of what it does, but one way to do it,
which helps.

------
scardine
Armin's criticism of Python 3 is nothing new.

He is the author of several popular projects, including the web framework
"Flask". This makes him a very respected person in the Python community -
personally, I love his taste for interface design.

I wish he could interact better with the core team, because some of his rants
are not as constructive as they might be.

~~~
coleifer
Whoa... Armin has never, to my recollection, been malicious in his criticisms
of Python 3. He simply disagrees with some of the core team's decisions. And
he's far from the only one.

It's not Armin's duty to interact in any way with the core team. Hell, the
core team should be working to please Armin if you ask me. He's your target
user, and if he's disappointed with your product, it's not his fault.

I think Armin does a wonderful job providing a voice for those of us who are
increasingly disenchanted with Python 3.

~~~
solipsism
We know it's you Armin!

------
991821911
I still prefer Twisted, it feels more natural to me. But this is something
that one cannot say aloud in many Python circles.

~~~
bogomipz
>"But this is something that one cannot say aloud in many Python circles"

Can you explain why this is? What's the central issue?

~~~
X86BSD
Twisted ROCKED. I remember when it came out, everyone I knew thought it was
pretty kick-ass, at least those who developed in Python.

Did something happen that it's now fallen out of favor?

~~~
rspeer
By now, people expect libraries to have thorough documentation on the Web, not
just an O'Reilly book.

The fact that the port of Twisted to Python 3 is slow-going, and far from
complete, also suggests that there are corners of the code that developers
don't even understand anymore.

------
omginternets
What's missing from the asyncio docs, IMHO, is a section on common asyncio
idioms and patterns. I understand the pieces individually, but I struggle to
get a more global, Gestalt understanding of the system.

~~~
the_mitsuhiko
The problem there is that not everybody who develops asyncio/tulip agrees on
the patterns.

~~~
omginternets
You're probably right! I just (for once!) wish a few opinionated people would
tell me what they think!

------
coldtea
> _Since I'm not clever enough to actually propose anything better, I just
> figured I'd share my thoughts about what confuses me instead, so that others
> might be able to use that in some capacity to understand it._

Coming from a well-known Python developer, this gives off a somewhat passive-
aggressive vibe (which might not be meant at all, just sayin').

------
poletopole
I didn't need to understand everything about asyncio to use it successfully.
It's definitely a lifesaver when it comes to making lots of requests; hours of
waiting turn into seconds.
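
The speedup comes from overlapping the waits. A minimal sketch, with
`asyncio.sleep` standing in for a network request:

    
        import asyncio
        import time
    
        async def fetch(i):
            await asyncio.sleep(0.1)  # stand-in for one slow request
            return i
    
        async def main():
            # 50 "requests" waiting concurrently: ~0.1s total, not ~5s
            return await asyncio.gather(*(fetch(i) for i in range(50)))
    
        start = time.perf_counter()
        results = asyncio.run(main())
        elapsed = time.perf_counter() - start
    
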

------
sametmax
Actually, using asyncio tools is quite easy and straightforward. However,
writing an asyncio lib, or God save you, an asyncio framework, is really,
really hard.

------
shad42
The complexity is the reason why I came back to gevent, especially since it
now supports Python 3...

Gevent monkey patching isn't perfect but it works and gets you closer to how
an event loop should be used with standard libs IMO, closer to Go.

------
marmaduke
No one has to use this stuff if they don't want to.

Another complexity in Python is metaclasses. I've written metaclasses which
generate data descriptors, and was grateful that the feature was there when I
needed it, but I also needed to look at the data model reference constantly
and wrote 200%-coverage tests.

------
carapace
Coroutines are goto.

That's not a joke or anything, the semantics of coroutines are the semantics
of goto statements. All this async business is just the old spaghetti sneaking
back in while people are distracted by the nomenclature.
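
The claim is easiest to see with plain generators: every `yield` is a
suspension point where control jumps somewhere else entirely. A minimal
round-robin scheduler (an illustration, not an asyncio internal):

    
        # Two generator-based coroutines; each `yield` hands control back
        # to the scheduler, so execution order is not top-to-bottom.
        def worker(name, steps, log):
            for i in range(steps):
                log.append(f"{name}:{i}")
                yield  # suspension point: control "jumps" elsewhere
    
        log = []
        tasks = [worker("a", 2, log), worker("b", 2, log)]
        while tasks:                  # round-robin driver
            t = tasks.pop(0)
            try:
                next(t)
                tasks.append(t)
            except StopIteration:
                pass
    

Reading the `worker` source alone tells you nothing about the order the lines
actually run in; the log comes out interleaved.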

~~~
1st1
Then for-loops are goto as well.

~~~
y11
A for loop is localized. I've seen Python code bases that use so many
interdependent _iterators_ that the code is non-refactorable and the
developers just pray that it does the right thing.

That's plain iterators. Now people throw in coroutines.

------
gdamjan1
It's too late to complain now.

I did like gevent/greenlet a lot, but the wider community was, for many years,
unreceptive to it.

Now asyncio is in the stdlib, including the language changes for coroutines,
which is better than the status quo was.

------
morty16
From early on in the article

> asyncio.get_event_loop() returns the thread bound event loop, it does not
> return the currently running event loop.

How can these be different objects? In order to ask for the thread-bound event
loop, you must be in the thread, right? When/why would you expect anything
else?

fyi, I don't have any background with asyncio/twisted.
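
They can differ because the thread-bound loop is just a slot in a
thread-local policy, while "running" is a property of whichever loop is
currently driving a coroutine; nothing forces them to be the same object. A
sketch (using the `get_running_loop` accessor that modern asyncio exposes):

    
        import asyncio
    
        bound = asyncio.new_event_loop()
        asyncio.set_event_loop(bound)      # the thread-bound loop
    
        other = asyncio.new_event_loop()   # a second loop, never bound
    
        async def which():
            return asyncio.get_running_loop()
    
        # Drive a coroutine on the *unbound* loop: the running loop and
        # the thread-bound loop are genuinely different objects.
        running = other.run_until_complete(which())
        other.close()
        bound.close()
    
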

------
brettcannon
If people would like some more illustrative examples for using asyncio there's
[https://asyncio.readthedocs.io](https://asyncio.readthedocs.io)

------
ddorian43
So I guess it's OK, for now at least, to stay with gevent? Does asyncio have
any pros (except that it's explicit) compared to gevent?

~~~
rspeer
People don't write articles named "I don't understand gevent" because nobody
expects to understand gevent.

Armin's complaints are for library writers. You will do much better using an
async framework with support from the core language (asyncio) than a
monkeypatch.

------
woah
I'm not really that well versed in Python, but wasn't it supposed to be a
language for beginners?

~~~
pdonis
_> wasn't it supposed to be a language for beginners?_

No. It was supposed to be a language in which programmers at widely different
skill levels, from beginner to expert, could be productive. Easy to pick up
the basics, but also easy to use more advanced techniques when you find you
need them.

------
dschiptsov
This is what happens when coders rush to code without really understanding
what they are doing and why. What is worse, they borrow "features", without
understanding them, from amateur JavaScript projects or C#.

What a bloated mess. This is clearly second-system syndrome, as described in
The Mythical Man-Month.

In the good old times, futures were macros on top of the delay and force
special forms, and explicit message passing à la Erlang would do the job.
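
For readers who don't know the Scheme reference: `delay`/`force` build a
future out of a memoized thunk, which can be sketched in a few lines of
Python (an illustration, not the Scheme primitives themselves):

    
        def delay(fn):
            """Wrap a computation without running it (Scheme's `delay`)."""
            cache = []
            def force():
                # Run at most once; later forces return the cached value.
                if not cache:
                    cache.append(fn())
                return cache[0]
            return force
    
        promise = delay(lambda: 6 * 7)
        value = promise()   # forced here, on first use
    
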

~~~
orf
What a confused comment. You're saying it's bad because it's borrowed hastily
from other languages, and a better solution is... borrowing features from
Erlang?

~~~
dschiptsov
Not at all. Erlang's approach to concurrency has been well-researched and
validated (Akka).

Async, await and friends are mere standardized kludges - popular syntactic
sugar without clear semantics or a real-world connection (explicit message
passing mimics how biological systems do self-regulation).

So called enterprise languages, especially C++ are full of similar stuff
(kludges).

~~~
orf
> popular syntactic sugar without clear semantics and real world connection

Uhh... the point of them is that it's as close to the semantics of synchronous
code as possible. That's the 'real world connection' - your single-threaded
code can become asynchronous with just a few keyword changes. Rather than
"sendRequest()" you do "await sendRequest()".
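
Concretely, the sync and async versions differ only by the `async`/`await`
keywords (hypothetical `send_request` helpers, with `asyncio.sleep` standing
in for the network):

    
        import asyncio
    
        def send_request_sync():
            return "response"            # blocking version
    
        def handler_sync():
            return send_request_sync()
    
        async def send_request():
            await asyncio.sleep(0.01)    # stand-in for network I/O
            return "response"
    
        async def handler():             # same body, plus two keywords
            return await send_request()
    
        sync_result = handler_sync()
        async_result = asyncio.run(handler())
    
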

------
agumonkey
I consider A. Ronacher a very, very competent guy. I felt really bad about not
mastering asyncio; now I'm laughing green.

