Sure, but that's the deal. I'll buy the latest Nvidia 1080 card as soon as I can, but renting these custom chips by the minute would be a far better option for me.
GPUs also have the nice side effect of being great for playing games. Purely as a guess, I'd think that the gaming market is bigger than the AI researcher market.
In a future where AI is everywhere, Nvidia hopes it can sell GPUs by the hundreds and thousands to large data centers. You can make a lot more money a lot faster selling hardware that way, and judging from how much they talked about it at their recent conference, Nvidia is very interested.
I would be surprised if Nvidia weren't working on their own specialised chips too, though Google have the advantage of already having the software specs to build for.
> Purely as a guess I'd think that the gaming market is bigger than the AI researcher market.
Machine learning isn't just targeting the AI researcher market, though -- it's widely used by a huge number of companies and, of course, by many of Google's most important products. I would argue that those markets combined are larger than gaming.
Yup, I assume they're gonna keep them in-house as a competitive advantage for a while. I doubt they'll do that forever, though; the most valuable part of Nvidia's CUDA is the ecosystem, and I think Google knows that.