Hacker News

I don't think GPUs are a particularly good solution for this; they aren't the future, and they won't be around for mass deployment much longer.



It seems the author is down the 'deep learning' rabbit hole.

>> It does this at the cost of requiring many, many times the computing power. But much of this computing cost can be parallelized and accelerated effectively on the GPU, so with GPU cores still increasing exponentially, at some point it's likely to become more effective than CPUs.

So can any matrix operation. Sadly, there aren't that many algorithms that can be efficiently expressed as one.
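To make the parallelism concrete (a toy sketch, not from the article): each output row of a matrix product depends only on one row of the left operand, so rows can be computed completely independently. That's the structure a GPU exploits; here a stdlib thread pool stands in for thousands of GPU threads.

```python
# Illustrative only: pure-Python matmul with one task per output row.
from concurrent.futures import ThreadPoolExecutor

def matmul_row(row, B):
    # One output row: dot product of `row` with each column of B.
    return [sum(a * b for a, b in zip(row, col)) for col in zip(*B)]

def parallel_matmul(A, B):
    # Rows are independent, so they can all run concurrently.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda r: matmul_row(r, B), A))

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(parallel_matmul(A, B))  # [[19, 22], [43, 50]]
```

A real GPU kernel would go further and parallelize over individual output elements, but the independence argument is the same.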


That's quite a statement. What will replace GPUs for the ever-increasing amount of ML work being done?


TPU-like chips, though similar hardware can also be (partially) integrated into GPUs, as is the case with the latest NVIDIA/AMD GPUs.


There's nothing special about the TPU. The latest GPUs are adding identical hardware, and the name "GPU" is a misnomer now, since some of those cards aren't even intended for graphics (no monitor output). GPUs will be around for a very long time, just not doing graphics.


Yep. Just the core idea of attacking memory latency with massive parallelization of in-flight operations, rather than large caches, makes sense for a lot of different workloads, and that probably isn't going to change.
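A rough simulation of that idea (my own sketch, with asyncio standing in for a GPU's resident threads and a fixed sleep standing in for memory latency): if every "load" stalls for the same latency, issuing them all at once makes the total wait roughly one latency instead of N latencies. No cache required, just enough work in flight.

```python
# Illustrative only: latency hiding via many concurrent in-flight "loads".
import asyncio
import time

LATENCY = 0.05  # simulated memory latency in seconds (arbitrary choice)

async def load(addr):
    await asyncio.sleep(LATENCY)  # stalled waiting on "memory"
    return addr * 2               # the fetched value

async def run(n):
    start = time.perf_counter()
    # All n loads are in flight simultaneously, so their stalls overlap.
    results = await asyncio.gather(*(load(i) for i in range(n)))
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(run(16))
print(f"16 loads took {elapsed:.3f}s; serial would be ~{16 * LATENCY:.2f}s")
```

The GPU version of this is hardware thread scheduling rather than an event loop, but the arithmetic is the same: throughput comes from overlap, not from avoiding the stall.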




