Personally I think the Async/Await model is too limiting. For example, let's talk about this:
var result = await client.GetStringAsync("http://msdn.microsoft.com");
From the perspective of the caller, that code is still blocking. It solves the problem of threads being blocked, but that's only one of the problems you have.
The real and more complete alternative would be to work with a well-designed Future/Promise framework, like the one in Scala [1], which is also usable from Java [2]. Doing concurrent computations by means of Futures/Promises is like working with Lego blocks.
Let me exemplify with a piece of code that's similar to what I'm using in production. Let's say that you want to make 2 requests to 2 different endpoints that serve similar results. You want the first one that completes to win (like an auction), but in case of a returned error, you want to fall back to the other one (type annotations and extra verbosity added for clarity):
val url1 = "http://some.url.com"
val url2 = "http://other.url.com"

// initiating the first concurrent request
val fr1: Future[Response] = client.get(url1)
// initiating the second concurrent request
val fr2: Future[Response] = client.get(url2)

// the first falls back to the second in case of error
val firstWithFallback: Future[Response] =
  fr1.fallbackTo(fr2)
// the second falls back to the first in case of error
val secondWithFallback: Future[Response] =
  fr2.fallbackTo(fr1)

// pick a winner
val finalResponse: Future[Response] =
  Future.firstCompletedOf(firstWithFallback :: secondWithFallback :: Nil)

// process the result
val string: Future[String] = finalResponse.map(r => r.body)

// in case both endpoints failed, log the error and
// fall back to a local copy
val stringWithFallback: Future[String] = string.recover {
  case ex: Exception =>
    logger.error(ex)
    File.read("/something.txt")
}
Given an HTTP client that's based on NIO, the above code is totally non-blocking. You can do many other crazy things. For example, you can wait for several requests to complete and get a list of results in return. Or you can try the first URL and, if it fails, try the second, and so on, until one of them succeeds.
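Both of those compositions are one-liners over plain `scala.concurrent` Futures. A minimal, runnable sketch — the `fetch` below is a hypothetical stand-in for an NIO client's `get`, completing immediately with a canned body or a failure:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Hypothetical stand-in for an NIO-backed client.get: completes
// with a canned body, or fails for URLs containing "bad".
def fetch(url: String): Future[String] =
  if (url.contains("bad")) Future.failed(new RuntimeException(s"failed: $url"))
  else Future.successful(s"body of $url")

// wait for several requests and get a list of results in return
val allResults: Future[List[String]] =
  Future.sequence(List("http://a", "http://b").map(fetch))

// try each URL in order until one of them succeeds
def firstSuccess(urls: List[String]): Future[String] =
  urls match {
    case Nil         => Future.failed(new RuntimeException("all URLs failed"))
    case url :: rest => fetch(url).recoverWith { case _ => firstSuccess(rest) }
  }

println(Await.result(allResults, 1.second))
println(Await.result(firstSuccess(List("http://bad", "http://a")), 1.second))
```

`Future.sequence` flips a `List[Future[A]]` into a `Future[List[A]]`, and `recoverWith` chains the next attempt only when the previous future failed.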
In the context of web apps, you can prepare Future responses this way and return them when they are ready. This works great with the asynchronous response support in Java Servlets 3.0. Or, with a framework such as Play Framework 2.x, you can simply return a Future[Response] straight from your controllers [3].
I suspect that you could do something very similar in C# - GetStringAsync returns a Task<string>, and you don't have to await it right away or one by one. So you can do:
var urlTask1 = client.GetStringAsync(url1);
var urlTask2 = client.GetStringAsync(url2);
// the params overload keeps the Task<string> type
// (an explicit new Task[] would degrade it to plain Task)
var firstDone = await Task.WhenAny(urlTask1, urlTask2);
These also return Task<T> objects that you can work with further or await. I'd be very surprised if you can't do the equivalent, also in a totally non-blocking way.
I haven't used tasks heavily yet so I'm not sure how to implement everything in your example. But I believe it's all possible.
Async/await is "just" syntactic sugar for tasks. So I'm not sure what you mean by "still blocking from the perspective of the caller". When you call an async method, you get back a Task. When you use await on a task, the compiler rewrites your code to introduce continuations. If you need some more powerful Task features hidden by the syntactic sugar, you always still have the option of using tasks explicitly.
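Scala has the same kind of sugar, for what it's worth: a for-comprehension over Futures reads linearly but desugars to flatMap/map continuations, much like the compiler rewrite described above. A small runnable sketch (the `get` below is a hypothetical stub, not a real HTTP call):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// hypothetical stub standing in for a real HTTP call
def get(url: String): Future[String] = Future.successful(s"body of $url")

// Looks linear, like two awaits in a row...
val linear: Future[Int] =
  for {
    a <- get("http://one")
    b <- get("http://two")
  } yield a.length + b.length

// ...but is sugar for explicit continuations:
val rewritten: Future[Int] =
  get("http://one").flatMap(a => get("http://two").map(b => a.length + b.length))

println(Await.result(linear, 1.second) == Await.result(rewritten, 1.second))
```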
The interesting thing about async / await is that for the coder, the code looks linear, but it doesn't execute that way. So a gap is opened up between "the perspective of the coder" and the perspective of the machine.
I think that bad_user means that if you await one client.GetStringAsync and then await a second client.GetStringAsync, the second GetStringAsync only starts after the first one completes, so the first get "blocks" the second, "from the perspective of the caller". I.e., no parallelism.
Of course, you don't have to do that. Code elsewhere in this thread.
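In Scala-Future terms, the distinction is simply when each future is started (again, the `get` below is a hypothetical stub):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// hypothetical stub standing in for a real HTTP call
def get(url: String): Future[String] = Future.successful(s"body of $url")

// Sequential: the second get is only initiated inside the
// continuation, i.e. after the first one has completed.
val sequential: Future[(String, String)] =
  get("http://first").flatMap(a => get("http://second").map(b => (a, b)))

// Concurrent: both requests are initiated up front,
// and only then combined.
val f1 = get("http://first")
val f2 = get("http://second")
val concurrent: Future[(String, String)] =
  f1.flatMap(a => f2.map(b => (a, b)))

println(Await.result(concurrent, 1.second))
```

The same trap and the same fix apply to C# tasks: call the async methods first, store the tasks, then await them.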
[1] http://docs.scala-lang.org/sips/pending/futures-promises.htm...
[2] http://doc.akka.io/docs/akka/snapshot/java/futures.html
[3] http://www.playframework.com/documentation/2.1.0/ScalaAsync