
You're right that GPU is a misnomer these days. The top-end chips are really accelerators for HPC and scientific computing in general: good for training and inference, but also for non-AI workloads such as molecular dynamics and other physics simulations, and for accelerating FFTs and linear solvers. Their applications are much broader than AI/ML.
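For what it's worth, this kind of acceleration is nearly drop-in from Python these days. A minimal sketch, assuming CuPy and a CUDA-capable GPU are available (the array sizes here are arbitrary):

    import cupy as cp

    # Data generated directly on the GPU
    x = cp.random.rand(1 << 20)   # a long 1-D signal
    A = cp.random.rand(512, 512)  # a small dense system
    b = cp.random.rand(512)

    X = cp.fft.fft(x)            # FFT on the GPU (cuFFT under the hood)
    y = cp.linalg.solve(A, b)    # dense linear solve on the GPU

    print(X[:3], y[:3])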



Depends on the chips. I've been hearing from people that NVidia's (due to various design changes that favor low precision over high precision) are becoming less and less useful for traditional HPC, so there's a move to switch to AMD (with the caveat that AMD's software stack is still not great). https://news.ycombinator.com/item?id=40655965 suggests Apple's hardware has similar issues.
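If you want to see where a given card falls, the FP64:FP32 throughput ratio is easy to measure yourself. A rough sketch using CuPy (matrix size and repetition count are arbitrary; datacenter parts tend to land near 1:2, consumer parts far lower):

    import time
    import cupy as cp

    def gflops(dtype, n=4096, reps=10):
        # Time n x n matrix multiplies at the given precision
        a = cp.random.rand(n, n).astype(dtype)
        b = cp.random.rand(n, n).astype(dtype)
        a @ b; cp.cuda.Stream.null.synchronize()  # warm-up
        t0 = time.perf_counter()
        for _ in range(reps):
            a @ b
        cp.cuda.Stream.null.synchronize()         # wait for the GPU
        return 2 * n**3 * reps / (time.perf_counter() - t0) / 1e9

    print("fp32:", gflops(cp.float32))
    print("fp64:", gflops(cp.float64))

On a consumer GeForce card the fp64 number typically comes out at a small fraction of fp32, which is exactly the complaint from the traditional-HPC side.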



