
PEP 492 – Coroutines with async and await syntax - 1st1
http://python.org/dev/peps/pep-0492
======
bayesianhorse
I am detecting a high level of "Py3K denial syndrome" in this discussion.

The greatness of Python would have been impossible without rejecting virtually
all feature requests. To move the vision of Python forward, it was necessary
to depart from old syntax.

Python 3 is happening. Deal with it.

~~~
captainmuon
From my perspective (mostly scientific computing), Python has been killed
after 2.7 by its creators. Python 3 might as well be Perl 6.

I know this is just the situation in my corner, and that it looks rosier for
other users - e.g. for web development, system administration, etc..

The thing is, we have decades' worth of legacy code - large libraries and small
config scripts - that no one is going to rewrite. All Python versions up to
2.7 were mostly backwards-compatible, and we came to depend on that. That the
changes in 3.x were breaking seems totally unnecessary (except for the
string/unicode thing), which also gives it a psychological component in my
opinion. Give us back the stupid print statement, and I bet this alone would
massively increase adoption.

And if someone makes a patch to Python to import a Python 2 module in Python 3
(old syntax, old str/unicode, and a copy of the old stdlib - and it doesn't
matter if it is 10% slower or never gets merged upstream), I'd be willing to
pay $$ for it.

~~~
ant6n
is there any PEP that makes parens around function call params optional? Tried
to find it, no luck.

~~~
actsasbuffoon
That would be tricky. Ruby has optional parens for function calls, but that
makes it hard to get a reference to a method without calling it. We have to
write "foo.method(:some_method_name)", while the Python equivalent is simply
"foo.some_method_name".

You might be able to require parens for methods with no arguments but make
them optional for calls with arguments. It might require some clever hacks in
the parser, but I suspect it's possible. The bigger question is whether or not
the result would be Pythonic.

~~~
ant6n
Maybe they could just re-introduce "print x" as syntactic sugar for calling a
print function - like in Matlab, where the operators (+, *, ...) are just
syntactic sugar for calling functions add, mul, and so on.

~~~
DrJosiah
The Python 3.x print function and 2.x print statement automatically call the
__str__ method on an object to get a representation to print to screen. The
result is then sent to the appropriate file handle for the actual printing
(which can be overridden).
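
A minimal sketch of both mechanisms (the class name here is hypothetical, just for illustration):

```python
import io

class Greeting:
    """Hypothetical class: print() calls str() on it, which invokes __str__."""
    def __str__(self):
        return "hello"

buf = io.StringIO()
# The file= argument overrides the destination file handle (sys.stdout by default)
print(Greeting(), file=buf)
captured = buf.getvalue()
```

`captured` holds what would otherwise have gone to the terminal, newline included.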

------
sylvinus
New features like this are exactly what we need to encourage more people
(including myself...) to invest time porting their projects to Python 3! I
look forward to seeing this approved.

~~~
endymi0n
Exciting but toothless "features" like this are exactly what we need to
encourage more people to invest time porting their projects to languages with
proper concurrency primitives like Go, Rust or Scala... :)

------
rdtsc
Looks good, but coming from someone who used Twisted for a few years, I had
found deferreds to be messy and switched to yield-type co-routines
([http://twistedmatrix.com/documents/10.2.0/api/twisted.intern...](http://twistedmatrix.com/documents/10.2.0/api/twisted.internet.defer.inlineCallbacks.html)),
but eventually I found those pretty verbose as well. Small demo examples
always look clean and fun, but large applications at the top level will end up
looking basically like this:

    
    
        y = yield f(x)
        z = yield g(y)
        w = yield h(z)
        ...
    

In this case it would be async/await instead of pure yields.
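
For comparison, a sketch of the same chain under the proposed syntax (f, g, h here are hypothetical stand-ins, not real library calls):

```python
import asyncio

# Hypothetical stand-ins for the f, g, h calls above
async def f(x): return x + 1
async def g(y): return y * 2
async def h(z): return z - 3

async def top_level(x):
    # Same shape as the yield-based chain, with await in place of yield
    y = await f(x)
    z = await g(y)
    w = await h(z)
    return w

result = asyncio.run(top_level(1))
```

The verbosity is identical; only the keyword changes.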

The worst thing was having to hunt for Twisted version of libraries. "Oh you
want to talk XMPP? Nah, can't use this Python library, have to find the
Twisted version of it". It basically split the library ecosystem. Now
presumably it will be having to look for async/await version of libraries that
do IO.

~~~
nostrademons
I found the same, working with AppEngine NDB in an app that talked to a dozen
or so backends. It's still a lot better than the Node.js (pre-ES6) style of:

    
    
      f(x, function(y) {
        g(y, function(z) {
          h(z, function(w) {
            do_something_with_w(w);
          });
        });
      });
    

What would be a better syntax? The Java/C++ way of threads & hidden shared-
state concurrency is a total mess; it hides all the potential yield points, so
you never know when your flow of control might block or what shared state
might need locking. Channels in Golang are better - they at least have some
syntactic support - but the reification of the yield point into a concrete
channel can end up creating a fair bit of boilerplate in the common case where
you make a bunch of one-off async RPCs to remote services. Maybe Erlang has it
right, where the pid is an implicit channel you can send messages to - but
then you still need to pass that into any async library function, and it gives
you no typing discipline. Maybe we really need something like futures where
the syntax:

    
    
      y <- f(x)
      z <- g(y)
      w <- h(z)
    

means "wait for the promise returned by the expression f(x) to resolve,
returning control to the executor, and then assign the result to y."

The splitting-the-library-ecosystem thing is a big part of it too (and why ES6
won't magically fix the Node ecosystem), but that's why Python is putting
async into the language & standard library itself. At least then the stdlib
will support it, and there will be strong social pressure to use the same
concurrency mechanism in all libraries.

~~~
jonesetc
> y <- f(x)

> z <- g(y)

> w <- h(z)

So you would basically want "<-" instead of "yield from"? Or is there
something additional I'm missing? My main objection is that it's very much
opposed to general Python principles. Think "||" vs "or", "&&" vs "and", and
"test ? value : alternate" vs "value if test else alternate".

~~~
nostrademons
Yeah, it's just syntactic sugar. "<-" is chosen to match roughly equivalent
constructs in other languages, like Erlang, Go, or CSP. The reasoning behind
it is that 'yield' is common enough to warrant special syntactic sugar - there
are other precedents, like allowing dict['value'] instead of
dict.__getitem__('value'), or the @ matrix-multiplication operator that was
just added in Python 3.5.
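
Both precedents can be shown in a few lines (the toy `Mat` class is hypothetical, just to demonstrate the dunder-method mapping):

```python
d = {"value": 42}
# dict indexing is sugar for the __getitem__ dunder method
assert d["value"] == d.__getitem__("value")

class Mat:
    """Toy 1x1 'matrix' showing that @ is sugar for __matmul__ (Python 3.5+)."""
    def __init__(self, v):
        self.v = v
    def __matmul__(self, other):
        return Mat(self.v * other.v)

product = (Mat(3) @ Mat(4)).v
```

In both cases the operator form is pure surface syntax over an ordinary method call.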

------
bayesianhorse
At first look I don't like this. "async def" feels ugly, and I am not sure
this is just because of the unfamiliarity.

I don't believe coroutines/async routines will ever be practical without
decorators, so you might as well keep them and not disturb one of the most
basic syntax rules of Python.

~~~
1st1
> I don't believe coroutines/async routines will ever be practical without
> decorators

Why do you believe so?

Have you seen the implementation of asyncio.coroutine? Have you read the PEP
thoroughly and saw the downsides of using a decorator/generators?

~~~
RickHull
For those of us who haven't, can you make the downsides clear? (A link or text
reply would be fantastic.)

~~~
1st1
Please read the Rationale of the PEP.

edit: I can't explain topics better than I did so in the rationale section of
the PEP. If I could, I would have explained it better in that section in the
first place ;)

As for the asyncio.coroutine decorator -- it's just a very simple wrapper that
makes sure the decorated object is a generator function. If it's not, it wraps
it in one.

It also does some magic to enable debug features, but with some serious
shortcomings (this is also explained in the PEP in great detail).

My point is: there is absolutely no other value in that decorator. It does
nothing fundamental; it just papers over the warts. The documentation,
tooling, "easier to spot", etc. arguments are unfortunately weak.

The PEP makes coroutines in python a first-class language concept, with all
the benefits you can have from it (better support in IDEs, tooling, sphinx,
less questions on StackOverflow).

Disclaimer: I'm the PEP author. I'm also python core developer, and I
contributed to asyncio a lot.
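
The core idea of that wrapper can be sketched roughly as follows (a simplified illustration, not the real asyncio implementation; the debug machinery is omitted):

```python
import functools
import inspect

def coroutine(func):
    """Sketch of the idea behind asyncio.coroutine: ensure the decorated
    object is a generator function; if it's a plain function, wrap it."""
    if inspect.isgeneratorfunction(func):
        return func

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # 'return' inside a generator sets StopIteration.value,
        # which is exactly what 'yield from' hands back to the caller
        return func(*args, **kwargs)
        yield  # unreachable; its presence makes wrapper a generator function

    return wrapper

@coroutine
def plain():
    return 42

def caller():
    result = yield from plain()  # works even though plain() has no yield
    return result
```

This is why the decorator adds no fundamental capability: it only makes a plain function usable in a `yield from` chain.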

------
orf
This is the kind of feature that will get people really interested in moving
code to Python 3. Everyone said it was lacking a blockbuster feature, this
could be it.

~~~
saurik
As someone still using Python 2 (due to the very large number of libraries and
tools I rely on for Python 2), I tend to look at these kinds of features more
with the eye of "why not implement this for the language people are actually
using more often, aka Python 2, rather than trying to use it to fight an
ideological battle to move people to a language people aren't, aka Python 3?".
The mentality just makes me unhappy with the attempts to somehow force
everyone to move, rather than suddenly excited to use Python 3.

I thereby ask: is there something about Python 3 that makes this feature
extremely easier to implement or extremely easier to integrate? If not,
wouldn't it be more interesting to have "more impact" by improving the lives
of a larger number of people? Python 3 was maybe an interesting experiment,
but given how well Ruby 1.8->1.9 went and how poorly Perl 5->6 has been, it
seems like people should be learning some lessons and fighting a more winnable
battle.

Maybe the people who want to encourage people to migrate to Python 3 should be
spending their time not on providing "killer features" but on narrowing the
gap between the two versions - not by backporting features to Python 2.7, but
by making it easier to use code written for Python 2 in Python 3. It should
not have taken until 3.3 to see the u'' syntax return for compatibility with
Python 2. :(

The current strategy of "try to convince an army of people to port a bunch of
code from Python 2 to Python 3 while people are at the same time still often
writing code for Python 2", which is what we are seeing being asked of people
in other posts here over the last few days (there was a post about this for
Debian) just seems like a waste of effort leading to tons of lost ground to
alternative languages.

~~~
nas
Open source developers work on what they like to do. Guido was the primary
developer of asyncio, funded by dropbox, as I understand. The asyncio library
depends on some Python 3 features that other developers have implemented. I
believe there is some kind of 2.7 backport of asyncio. No one is stopping
development but the core Python team is not interested in releasing new
versions of the 2.x line.

The gap has been getting smaller. Python 3.5 will include %-style formatting
for byte strings. I helped implement that feature since I feel it will make
porting code easier (plus it makes some programming tasks, like network
protocols, easier). The strict handling of bytes/unicode is what really trips
up people from porting and there is just no good way to further smooth that
path, IMHO.

I feel the upgrade path has been handled badly. People were wildly optimistic
about how fast people could port code and much more effort should have been
spent on making the transition easier. A little too much purity instead of
practicality. For example, u'foo' style strings were not accepted until Python
3.3. That just made porting more difficult for no great reason.

Given the massive amount of Python 2.x code in the world, I fully expect that
version to live much longer than most people expect. There is still old
Fortran and Cobol code out there for example. Businesses don't want to rewrite
a working system. Someone is going to keep making releases of the 2.x branch.
Maybe it will be a fork.

At this point, I'd bet money on Python 3 succeeding. The vast majority of
users still use 2.7 but there is steady progress of porting. The 3.x branch
has received a lot of new and useful improvements and that is where all the
development is happening. The memory savings from the new string
representation will help my projects a lot. As I said elsewhere, asyncio is
really sweet. The core team needs to keep the improvements coming. More
carrots and less sticks, IMHO.

~~~
xorcist
But it is not old code that is somehow stuck with Python 2. Lots of new code
is written with Python 2 every day. It wouldn't surprise me if there were more
lines of Python 2 than Python 3 produced today.

With many important code bases such as Ansible and Django you still need to
keep a Python 2 around. (Yes, I know Django works on 3, but fewer people use
it that way and performance is slightly lower.) So you are dual-stacked for
the foreseeable future if you choose the Python 3 route.

So the Python ecosystem is a little more complicated than "old" Python 2 code
versus "new" Python 3 code. Should I put my consultant hat on I would still
advise people to start new developments with Python 2, but to keep an eye out
for future compatibility issues. That recommendation hasn't changed in five
years.

~~~
mherrmann
You're probably right that more lines of Python 2 than Python 3 are written
today. But I bet it's like that for virtually every programming language's
latest version.

Python 2 will be kept around until 2020. Obviously there will still be
companies using it after that date. But they have been given ample notice.

I use Django with Python 3 and it's great. Admittedly, I don't need a lot of
performance though.

I really like Python 3 mainly for its more explicit treatment of Unicode.

I believe the only way for Python to remain a success is if the community
adopts Python 3 and puts in the effort to migrate away from 2. Once the
community is there, organizations will follow.

------
Sidnicious
This looks great, but it'll be even better with utilities that let you await
multiple tasks in parallel:

    
    
      a, b, c = atools.join([f(), g(), h()])
    

and select among multiple async tasks (e.g. to implement a timeout):

    
    
      with atools.select({"res": f(), "timeout": timeout(10)}) as tasks:
          res, which = await tasks

~~~
1st1
There is an equivalent to 'atools.join' in asyncio. And I suspect that we'll
see libraries like 'atools' in the wild if the PEP is accepted. If those
libraries end up being popular, we'll add one of them to the standard library!

I like the idea of 'a, b = await foo(), bar()' syntax. I'll think about how we
can implement it and integrate it into the PEP.
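
The existing asyncio equivalent can be sketched like this (f, g, h are hypothetical coroutines standing in for the ones above):

```python
import asyncio

# Hypothetical coroutines standing in for f(), g(), h() above
async def f(): return 1
async def g(): return 2
async def h(): return 3

async def main():
    # asyncio.gather plays the role of the proposed atools.join:
    # it runs the awaitables concurrently and returns results in order
    return await asyncio.gather(f(), g(), h())

a, b, c = asyncio.run(main())
```

For the select/timeout case, asyncio.wait with a timeout argument covers similar ground.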

Thanks!

------
sametmax
I read it, but I don't see the benefit of it vs yield from and coroutines. Can
somebody explain ?

~~~
ceronman
This is explained in the _Rationale and Goals_ [1] section of the PEP:

    
    
        it is easy to confuse coroutines with regular generators, since they share
        the same syntax; async libraries often attempt to alleviate this by using
        decorators (e.g. @asyncio.coroutine [1] );
    
        it is not possible to natively define a coroutine which has no yield or
        yield from statements, again requiring the use of decorators to fix
        potential refactoring issues;
    
        support for asynchronous calls is limited to expressions where yield is
        allowed syntactically, limiting the usefulness of syntactic features, such
        as with and for statements.
    

[1] [https://www.python.org/dev/peps/pep-0492/#rationale-and-
goal...](https://www.python.org/dev/peps/pep-0492/#rationale-and-goals)
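
The first two points can be illustrated in a few lines (a sketch; the function bodies are hypothetical):

```python
import asyncio
import inspect

def regular_generator():         # at the def site, this ...
    yield 1

def old_style_coroutine():       # ... and this look identical; only the body
    yield from asyncio.sleep(0)  # (or a decorator) tells them apart

# Under PEP 492 the intent is explicit in the syntax itself,
# and no yield statement is needed anywhere in the body:
async def new_style():
    return "done"

result = asyncio.run(new_style())
is_coro = inspect.iscoroutinefunction(new_style)
```

With `async def`, tools can tell a coroutine from a generator without running or decorating it.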

~~~
sametmax
I just said I read it. Copying and pasting it doesn't make it any easier to
understand.

I just can't see why somebody would want to introduce a whole new syntax just
out of dislike for a decorator and 2 edge cases that twisted/tornado/greenlet
lived with for years. So I thought I was missing something. Especially since
we already have ways to do async, and "there should be one way to do it" is as
important as keeping the number of builtins down.

So I'm asking again: am I missing something, or is this again one of these "I
don't like it, so let's do it like in language X" claims, like we had for
brackets, lambdas and the like?

~~~
jacobolus
Here are some comments from python people (from the mailing list):

Victor Stinner: """As a contributor to asyncio, I'm a strong supporter of this
PEP. IMO it should land into Python 3.5 to confirm that Python promotes
asynchronous programming (and is the best language for async programming? :-))
and accelerate the adoption of asyncio "everywhere". ¶The first time Yury
shared with me his idea of new keywords (at Pycon Montreal 2014), I understood
that it was just syntax sugar. In fact, it adds also "async for" and "async
with" which adds new features. They are almost required to make asyncio usage
easier. "async for" would help database ORMs or the aiofiles project (run file
I/O in threads). "async with" helps also ORMs."""
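
The "async for" feature mentioned here can be sketched as follows (the `Rows` class is hypothetical, imitating the kind of async iterator a database cursor might expose):

```python
import asyncio

class Rows:
    """Hypothetical async iterator, the kind of object 'async for' consumes
    (think of a database cursor fetching rows without blocking the loop)."""
    def __init__(self, data):
        self.data = list(data)
    def __aiter__(self):
        return self
    async def __anext__(self):
        if not self.data:
            raise StopAsyncIteration
        await asyncio.sleep(0)  # simulate waiting on I/O for the next row
        return self.data.pop(0)

async def main():
    out = []
    async for row in Rows([1, 2, 3]):
        out.append(row)
    return out

rows = asyncio.run(main())
```

Each iteration can await I/O transparently, which is what makes the construct attractive for ORMs and file wrappers.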

Marc-Andre Lemburg: """Thanks for proposing this syntax. With async/await
added to Python I think I would actually start writing native async code
instead of relying on gevent to do all the heavy lifting for me ;-)"""

Guido: """I'm in favor of Yury's PEP too, but I don't want to push too hard
for it as I haven't had the time to consider all the consequences."""

etc.

For more: [https://mail.python.org/pipermail/python-
ideas/2015-April/th...](https://mail.python.org/pipermail/python-
ideas/2015-April/thread.html#33007)

~~~
sametmax
Thank you, that helped.

------
pjtr
This reminds me very much of C# async and await (although "async for" and
"async with" are very nice additions). It seems to me many other languages are
now basically adopting wholesale these same features from C#.

I'm curious, is this just my warped perception? Did this terminology and
feature set previously exist in programming language theory or practice? I
know F# had "async" for a while before C#. Was there "prior art" that looked
basically the same in Lisp, Haskell, Scala, ML, etc. or in some research
language long before that?

~~~
kevingadd
C# has essentially put a common name to many existing CS features and
popularized particular forms of them.

A bunch of modern C# features are actually lifted from F#, which had them
first on the .NET platform. Sometimes the C# version is simplified or has some
of the sharp edges filed off (which makes it less powerful, but safer).
Async/await is definitely one of these.

~~~
didibus
I love C#'s names for things. It often takes academic names of non-obvious
meaning and just replaces them with a common word whose meaning is obvious.

------
hendzen
And here's a proposal for something quite similar in C++, written by the
author of Boost Asio:

[http://www.open-
std.org/jtc1/sc22/wg21/docs/papers/2015/n445...](http://www.open-
std.org/jtc1/sc22/wg21/docs/papers/2015/n4453.pdf)

------
jMyles
The "async with" is syntactically pretty similar to crosstown_traffic from
hendrix (based on Twisted):

[http://hendrix.readthedocs.org/en/latest/crosstown_traffic/](http://hendrix.readthedocs.org/en/latest/crosstown_traffic/)

------
anon4
Since

a. this is functionally equivalent to coroutines implemented via library
support through generators

b. Python will never ever get rid of the GIL, i.e. won't have real parallelism
(given that the JVM/CLR Python implementations are basically dead and PyPy is
chasing the pype dream of STM, we're left with CPython the only actual
implementation)

Is there any reason to add new syntax? Other than wishful thinking that one
day someone will write a Python runtime with proper JIT and parallelism
support?

~~~
ynik
async/await is useful for more than just 'real parallelism': its main use is
coordinating concurrency. The GIL isn't a problem when dealing with async IO
or async calls into C libraries.
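
This point can be demonstrated on a single thread (a sketch using asyncio.sleep as a stand-in for real network I/O):

```python
import asyncio
import time

async def fake_io(delay):
    # Stands in for a network call; while it waits, the event loop
    # runs other coroutines on the same thread, GIL notwithstanding
    await asyncio.sleep(delay)
    return delay

async def main():
    start = time.monotonic()
    # Three 0.1s waits overlap, so the total is ~0.1s rather than ~0.3s
    results = await asyncio.gather(fake_io(0.1), fake_io(0.1), fake_io(0.1))
    elapsed = time.monotonic() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
```

The GIL only serializes CPU-bound bytecode; waits release the thread back to the loop.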

------
ksec
Off topic: does anyone know if there is anything similar for Ruby in the
works?

P.S - Ruby's issues system is hard to follow.

~~~
dmgbrn
Yes and no. Ruby has Fibers, but they're roughly where Python is with
generators as coroutines. They're wonderful when working with EM, and one time
I wrote a massively overengineered Ruby Warrior program that allowed you to do
things like "move(:left) until feel(:left).wall?" instead of doing one thing
each time a method was called. This is exactly what makes coroutines so
attractive in simulation programs. Still, Fibers are clumsy and
counterintuitive and require some abstraction to produce a useful interface.
Also, there are some problems related to overflowing the Fiber stack,
especially in ridiculously deep Rails stacks, not sure if those have been
resolved yet.

I'm not sure that special syntax is necessary for widespread adoption of
Fibers, since Ruby syntax is flexible enough to provide a sort of DSL for
working with Fibers.

------
jhgg
Although this is great and all, what does this do compared to green threads
(gevent, eventlet, etc...)?

~~~
nchammas
I'm not sure from what perspective you were looking to compare asyncio to
green threads (performance, readability, etc.) but there is an interesting
blog post [0] by the lead architect of Twisted about the fundamental
difference in programming model between explicit yielding (e.g. asyncio) and
implicit yielding (e.g. green threads, regular threads). It may give you some
answers.

PEP 492, from what I understood, is more about streamlining the syntax, and
the major differences between asyncio (pre this proposed syntax) and green
threads in terms of programming model still apply.

[0]
[https://glyph.twistedmatrix.com/2014/02/unyielding.html](https://glyph.twistedmatrix.com/2014/02/unyielding.html)
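
Glyph's core argument can be sketched: with explicit yielding, the only places another coroutine can run are the awaits you can see, so code between them needs no locks (a toy illustration, not from the blog post):

```python
import asyncio

counter = 0

async def bump():
    global counter
    # With explicit yielding, control can only switch at an await,
    # so this read-modify-write is atomic on the event loop without a lock.
    # Under implicit yielding (green threads), a switch could happen
    # between these two lines.
    current = counter
    counter = current + 1
    await asyncio.sleep(0)  # the one visible suspension point

async def main():
    await asyncio.gather(*(bump() for _ in range(100)))

asyncio.run(main())
```

All 100 increments land because no coroutine can be preempted mid-update.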

~~~
nas
Glyph is a smart guy and has been doing async code for a very long time (at
least in computing terms). It would be wise to consider his arguments.

I've done a project making heavy use of Python 3's asyncio feature. It is much
nicer than doing the same job with threads or call-backs, in my experience.
I'd welcome nicer syntax but the current functionality seems to work well.

I can't understand why someone would design a language in 2009 (Node.js) that
uses callbacks. For simple programs it works but it doesn't take long to end
up with an unmaintainable mess (in my experience).

~~~
pekk
Why is it wise not to consider opposing arguments? Because people who disagree
with Glyph are not "smart guys"?

~~~
nas
Quite the logic you have going on there. Certainly you should also consider
other arguments.

------
mangeletti
Dear Python,

I love you.

------
carapace
The Python 2.7 code freeze is totally working! All the horrid new BS is
getting piled into 3.* and Python 2.7 is becoming _sweet_ FORTRAN. Viva la
BDFL! So wise! ;-)

~~~
carapace
Downvotes? Folks I am a Python fanboy, I love it, am devoted to its success,
and I'm delighted to use it every day in my work and personal life.

I meant the above comment _in earnest_, not sarcastically. Python 3.* lets
the people who like that sort of thing have their cake and eat it too (and I
am one of the people who drools at new shiny tech.) But the fact that Python
2.7 is stable for the foreseeable future is something to celebrate in my
opinion, not condemn.

------
Animats
Python 3 does not need cool new features right now. Python 3 needs to get its
existing libraries working properly. People aren't using Python 3 because the
out-of-the-box experience sucks. We covered this yesterday on HN.[1]

[1]
[https://news.ycombinator.com/item?id=9378898](https://news.ycombinator.com/item?id=9378898)

~~~
scrollaway
Who made you BDFL?

