
Anecdotally, no. From where I sit, the multiprocessing and threading modules still seem to be the main way to do concurrency in Python.

From where I sit (I've been soloing Python projects for a few months now), for IO-bound tasks asyncio is the way to go, and threading is very rarely needed, if ever. It is very simple to place a bunch of tasks into a list and asyncio.gather them, or to add timeouts to async methods.
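A minimal sketch of the pattern described above, gathering a list of coroutines with an overall timeout (the `fetch` coroutine is a hypothetical stand-in for real IO-bound work):

```python
import asyncio

async def fetch(i: int) -> int:
    # Hypothetical placeholder for an IO-bound call (e.g. an HTTP request).
    await asyncio.sleep(0.01)  # simulate network latency
    return i * 2

async def main() -> list[int]:
    # Put the tasks in a list and gather them concurrently,
    # with a timeout applied to the whole batch.
    tasks = [fetch(i) for i in range(5)]
    return await asyncio.wait_for(asyncio.gather(*tasks), timeout=5)

results = asyncio.run(main())
print(results)  # [0, 2, 4, 6, 8]
```

If any single task should time out independently, `asyncio.wait_for` can wrap each coroutine inside the list instead of wrapping the gather.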

For CPU-intensive tasks, asyncio can hand work off to a process pool, which is easily interchangeable with a thread pool (just change an import alias).
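My reading of the "change an import alias" trick: the executors come from concurrent.futures and share an interface, so `run_in_executor` accepts either one. A sketch, using a trivial `blocking_work` function as a stand-in:

```python
import asyncio
# Swap this alias to ProcessPoolExecutor for CPU-bound work
# (process pools require the submitted function to be picklable):
from concurrent.futures import ThreadPoolExecutor as Executor

def blocking_work(n: int) -> int:
    # Placeholder for a blocking or CPU-heavy function.
    return sum(range(n))

async def main() -> int:
    loop = asyncio.get_running_loop()
    # run_in_executor offloads the call to the pool and returns an awaitable,
    # so the event loop stays free while the work runs.
    with Executor() as pool:
        return await loop.run_in_executor(pool, blocking_work, 1000)

result = asyncio.run(main())
print(result)  # 499500
```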

We actually tried switching to asyncio in production, but got worse CPU utilisation and worse latency (vs greenlet). After some benchmarking and tuning we got slightly better latency, but CPU utilisation was still worse, so we ended up switching back to just using greenlet :(

I would not recommend using asyncio for CPU-intensive tasks. Process pools are much better for that.
