I'm making an argument about how threads are used (in real life) in web development, an area where it's trivial to make concurrency and synchronization someone else's problem. Despite this, I've heard a number of hypesters throw around the idea that this scenario is an example of where threading fails and moving to async is required.
I agree with you that this is a weak argument, and I hope to see people understand better the difference between:
a) an application that NEEDS to handle huge amounts of concurrent users (because most of them are idle for most of their lives), and
b) an application that spends a non-trivial amount of time using the CPU, and therefore does not need more than a few threads to fully utilize the CPU
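To make the contrast concrete, here's a minimal sketch (in Python, purely illustrative, not from the original discussion): case (a) is cheap to serve with an event loop, because thousands of idle "connections" cost almost nothing while they wait; case (b) only needs a small worker pool, roughly one worker per core, to keep the CPU busy.

```python
import asyncio
import concurrent.futures

# Case (a): many concurrent, mostly-idle clients. One event loop
# juggles thousands of coroutines; each sleeping "connection"
# consumes almost no resources while it waits.
async def idle_client(i):
    await asyncio.sleep(0.1)  # simulates a long-polling chat client
    return i

async def serve_many_idle(n):
    return await asyncio.gather(*(idle_client(i) for i in range(n)))

# Case (b): CPU-bound work. A handful of workers is enough to
# saturate the CPU; adding more threads buys you nothing. (In
# CPython the GIL means you'd reach for processes to get true
# parallelism, but the shape of the solution is the same.)
def cpu_task(n):
    return sum(i * i for i in range(n))

def serve_cpu_bound(jobs, workers=4):
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(cpu_task, jobs))

if __name__ == "__main__":
    # Thousands of idle connections on a single thread:
    results = asyncio.run(serve_many_idle(10_000))
    print(len(results))
    # A few workers chewing through CPU-bound jobs:
    print(serve_cpu_bound([100_000] * 8))
```

Neither tool is wrong; they just solve different problems, which is exactly why conflating the two cases muddies the argument.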
These are different cases, and while those of us with a good grasp of the subject understand the difference, a lot of people have conflated the two, and then further conflated the thread-synchronization problems of each case as well.
It was things like the comparison to Node (which I don't use) and the comment about how well async had worked for you in browser JS (which implicitly somewhat demerited server-side async) that made me think you might have been reaching for something more ambitious with this post.
I explicitly referenced a chat server (lots of concurrent, mostly-idle requests) as a case where I'd personally use an async solution.
There are certainly middle-ground cases where the question is muddier (and more religious, likely), and I just wanted to set the record straight that Rails itself is not really in the middle. I'm glad that you agree :)