Ask HN: Will there be chips that are more efficient than GPUs for AI computing?
2 points by jolieli 3 months ago | 2 comments



TPUs, LPUs, and other custom ASICs already exist. I also assume Nvidia's AI-focused offerings include optimizations that make them better suited to these usage patterns, and that they avoid spending silicon on the typical graphics rendering pipeline.

My understanding is that the biggest issue is keeping the model fed with data, especially during training across multiple GPUs/nodes, which is required for the bigger models.
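To make that concrete, here's a rough roofline-style sketch of when a matmul is limited by raw compute vs. by how fast you can feed the chip from memory. The peak FLOPs and bandwidth numbers are assumed, ballpark figures for a current datacenter GPU, not vendor specs:

    # Rough roofline estimate: compute-bound vs. memory-bound matmul.
    # Peak numbers are assumed, order-of-magnitude figures only.
    PEAK_FLOPS = 1e15   # ~1 PFLOP/s dense BF16 (assumed)
    PEAK_BW    = 3e12   # ~3 TB/s HBM bandwidth (assumed)

    def arithmetic_intensity(m, n, k, bytes_per_elem=2):
        """FLOPs per byte moved for an (m x k) @ (k x n) matmul in BF16."""
        flops = 2 * m * n * k                          # one multiply-add per term
        traffic = bytes_per_elem * (m*k + k*n + m*n)   # read A, read B, write C
        return flops / traffic

    ridge = PEAK_FLOPS / PEAK_BW  # intensity needed to saturate the ALUs

    # A big training GEMM vs. a GEMV-like single-token decode step
    for shape in [(4096, 4096, 4096), (1, 4096, 4096)]:
        ai = arithmetic_intensity(*shape)
        bound = "compute-bound" if ai > ridge else "memory-bound"
        print(f"{shape}: {ai:.1f} FLOP/byte vs ridge {ridge:.1f} -> {bound}")

Large batched matmuls come out compute-bound, while small-batch inference-style workloads are memory-bound, which is why so much of the specialization in AI chips goes into memory and interconnect bandwidth rather than just more ALUs.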


I would assume so. Bitcoin mining moved to ASICs.




