Well, async I/O is good. However, the reason it's super hot on the internetz right now is that people don't understand that race conditions and synchronization are something you _cannot avoid_. Your program either needs coordination or it doesn't. If it does, then some kind of synchronization will be used. Dependencies and synchronization are not something that only occurs when accessing shared collections. Whenever the semantics of the algorithm require ordering and/or transactionality, and that algorithm is concurrent and somehow distributed (between threads, processes, machines, you name it), you will end up using lock/barrier/etc. semantics.
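The classic minimal case, sketched in Python (made-up names, just for illustration): a shared counter whose semantics require an atomic read-modify-write, so a lock shows up whether you like it or not.

```python
import threading

counter = 0
lock = threading.Lock()

def locked_increment(n):
    """Increment the shared counter n times under a lock."""
    global counter
    for _ in range(n):
        # The algorithm's semantics demand that read-modify-write be atomic,
        # so some coordination primitive is unavoidable here.
        with lock:
            counter += 1

threads = [threading.Thread(target=locked_increment, args=(100_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# counter == 400_000 only because the lock serializes the updates;
# drop the lock and the result is anybody's guess.
```

The language didn't create that lock; the problem did.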
No language will solve this issue by itself, because it stems from the problem you're trying to solve, not from the language constructs.
Also, on a side note: I love what Clojure does with transactions and STM, but it essentially converts the problem into one that is easier to handle in 99% of cases and hard to handle in the rest (transaction rollbacks, etc.).
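Not Clojure, but the shape of the trick can be sketched in Python: an optimistic compare-and-set cell with a retry loop. `Ref` and `transact` are hypothetical names for illustration, not any real STM API; a real STM coordinates whole transactions, and the retry/rollback path is exactly the hard 1%.

```python
import threading

class Ref:
    """A tiny optimistic-concurrency cell, loosely in the spirit of an STM ref.
    (Hypothetical helper, not a real STM implementation.)"""
    def __init__(self, value):
        self._value = value
        self._version = 0
        self._lock = threading.Lock()

    def read(self):
        with self._lock:
            return self._value, self._version

    def compare_and_set(self, expected_version, new_value):
        with self._lock:
            if self._version != expected_version:
                return False  # someone else committed first; caller must retry
            self._value = new_value
            self._version += 1
            return True

def transact(ref, fn):
    """Apply fn to the ref's value, retrying if another commit intervened.
    This retry loop is the 'rollback' that is easy 99% of the time."""
    while True:
        value, version = ref.read()
        if ref.compare_and_set(version, fn(value)):
            return

ref = Ref(0)
threads = [threading.Thread(target=transact, args=(ref, lambda v: v + 1))
           for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# all 8 increments commit, some possibly after one or more retries
```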
Anyway: every single programmer needs to be comfortable with thread-level concurrency, as you'll face the same problems at higher abstraction levels, where they'll be more difficult to recognize. He's not saying threads are the solution: he's saying that these 'modern' approaches don't make your life as easy as you would think. Abstracted coordination makes simple programs simpler, but if you don't understand the underlying principles, then suddenly your ocean of callbacks will fail somewhere, and you won't understand why.
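A minimal sketch of that, using Python's own async machinery (hypothetical `withdraw` example): a check-then-act race across an `await` point. Single-threaded, no locks anywhere, and the invariant still breaks, because the same ordering problem from the threaded world just moved up an abstraction level.

```python
import asyncio

balance = 100

async def withdraw(amount):
    global balance
    if balance >= amount:        # check
        await asyncio.sleep(0)   # yields to the event loop; another task runs here
        balance -= amount        # act: the check may be stale by now

async def main():
    # Both withdrawals pass the check before either subtracts,
    # so the balance can go negative despite there being only one thread.
    await asyncio.gather(withdraw(80), withdraw(80))

asyncio.run(main())
```

If you only ever learned "async means no race conditions", nothing in that code looks wrong.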