This project is actually an example of how they are slow-moving and lumbering. If they implemented state-of-the-art learning algorithms in silicon it would be amazing and potentially revolutionary and they'd have tons of customers. That would be deep convolutional neural nets. Instead they've gone with spiking neural nets, which have approximately nothing to do with the current state of the art in artificial neural networks.
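For anyone unfamiliar with the distinction: a spiking neuron communicates in discrete, binary spike events over time, rather than the continuous-valued activations a convolutional net uses. A minimal sketch of a leaky integrate-and-fire (LIF) neuron, the textbook spiking model, with illustrative parameter values (not anything specific to this chip):

```python
def lif_run(inputs, tau=0.9, threshold=1.0):
    """Simulate one leaky integrate-and-fire neuron over a
    sequence of input currents.

    The membrane potential leaks (decays by `tau`) each step,
    accumulates the input, and emits a binary spike when it
    crosses `threshold`, then resets. Contrast with a conv-net
    unit, which outputs a real-valued activation every step.
    Parameters here are toy values for illustration.
    """
    v = 0.0
    spikes = []
    for current in inputs:
        v = tau * v + current     # leaky integration of input
        if v >= threshold:        # threshold crossing -> spike
            spikes.append(1)
            v = 0.0               # reset membrane potential
        else:
            spikes.append(0)
    return spikes

print(lif_run([0.4, 0.4, 0.4, 0.0, 0.6, 0.6]))  # → [0, 0, 1, 0, 0, 1]
```

Information is carried in spike timing and rates, which is why spiking hardware can be very power-efficient but also why standard gradient-based training doesn't transfer over cleanly.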
Furthermore, you might be misled by their PR storm. In fact this chip doesn't implement learning at all. The learning is the important part! This chip is merely an accelerator for running pre-trained neural networks, and because of the spiking architecture those neural networks are doomed to perform poorly.
This cool research (just like Watson) seems like a loss-leader for selling consulting services which tailor the tech to very narrow, specific AI purposes. They're all breathlessly announced as though revolutionary, but none seem generalizable. Still, investing in so many small bets increases their exposure to a big win in the long run...
My understanding is that part of the appeal of this chip is the power consumption. 100 mW sounds pretty stingy to me, though I'm not terribly knowledgeable here either. So I think they're trading off capability for power consumption by implementing a spiking net rather than a convolutional one.