
I disagree that it doesn't add much to most programs... it absolutely does. Most programs these days are deployed web applications, and using async/await can dramatically affect the number of users/requests a server can support.
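Roughly what that buys you, as a sketch (hypothetical ASP.NET Core controller, all names made up): the await hands the request thread back to the pool while the SQL is in flight, so the same box can field other requests in the meantime.

    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Mvc;

    public class OrdersController : Controller
    {
        private readonly AppDbContext _db; // hypothetical EF context
        public OrdersController(AppDbContext db) => _db = db;

        // Sync: the request thread sits blocked for the whole SQL round trip.
        public IActionResult GetSync(int id)
        {
            var order = _db.Orders.Find(id);
            return Json(order);
        }

        // Async: the thread is released at the await and reused for other
        // requests until the query completes.
        public async Task<IActionResult> GetAsync(int id)
        {
            var order = await _db.Orders.FindAsync(id);
            return Json(order);
        }
    }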



Of all the .Net web apps I've worked with, I've seen an app max out IIS threads exactly once, but I've seen plenty of web apps with shitty SQL queries maxing out the SQL server.

I've admittedly never had the 'millions of requests per second' clients, but the number of available threads has never been remotely the problem.

Back in 2006/2007 we had one app maxing out the IIS threads, but that was back when there were 100 threads max (as far as I can remember, might be wrong), and the actual underlying problem was slow SQL queries. The temp fix was to up the IIS thread count, but once we fixed the SQL problem, there was no need for those extra IIS threads any more.

And even then, what will implementing async/await get you that you couldn't fix with a load balancer and a web farm? A web farm is cheaper, easier and more effective. I think there's a Jeff Atwood piece somewhere on Coding Horror, or maybe a Joel one, to that exact effect: it's cheaper to throw hardware at it than dev time. There is a certain scale at which async/await actually fixes things; the vast majority of C# web apps won't ever get close to it.

Every client I've taken on with performance problems was fixed by monitoring SQL and fixing the bad queries. Or it was stupid loops calling moderately expensive SQL that ran the same query over and over. Or something that could easily be cached. Or the same query being called several times in a stack where you could just pass the result down. One of those clients even had a previous dev who went async/await happy for extra 'performance', and it did zip. I ended up ripping most of it out; as I said, it infects code up and down the stack and complicates it. All the 'optimized' async code was taking 1-2 milliseconds to run while the problem SQL was taking 5-10 seconds.
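The repeated-query problem usually looks something like this (a made-up sketch; db, Orders, customerIds and Process are hypothetical, and it assumes using System.Linq and an EF context in scope):

    // Before: one SQL round trip per customer -- the classic N+1 pattern.
    foreach (var customerId in customerIds)
    {
        var orders = db.Orders.Where(o => o.CustomerId == customerId).ToList();
        Process(orders);
    }

    // After: a single query, grouped in memory, result passed down the stack.
    var ordersByCustomer = db.Orders
        .Where(o => customerIds.Contains(o.CustomerId))
        .AsEnumerable()
        .GroupBy(o => o.CustomerId);

    foreach (var group in ordersByCustomer)
        Process(group.ToList());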

In a web app I can see the need if you're doing lots of external requests, though.


What it gets you is not having to run as many servers... I've worked on IIS apps that required 6+ servers to handle high load, and that was before async was available. In similar situations, async could reduce the servers needed to 3-4. But agreed, in most situations it may not be needed... However, if you're in an org and want multiple apps per server/cluster, then it benefits all the other applications.

I've seen far more issues with people not understanding how to use static and how it applies in a web application.
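The classic version of that mistake, as a made-up sketch: a static field in a web app is one instance per process, shared by every request, not per user.

    using System.Collections.Generic;

    // Hypothetical example: this looks like per-user state, but a static
    // field is shared by every request in the process -- a race condition
    // and a data leak between users.
    public static class CartHelper
    {
        public static List<string> Items = new List<string>(); // BAD: one list for all users

        public static void Add(string item) => Items.Add(item);
    }

    // Per-user state belongs in session (or passed explicitly per request),
    // e.g. HttpContext.Session in ASP.NET.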


But most programs written are internal corporate apps at SMBs, and async/await isn't needed at that scale.



