Hacker News

I don't understand why people blame AI for buying up DDR5 DRAM - isn't the AI buildout mostly interested in HBM? Or is fab capacity being diverted from DDR DRAM to HBM production?




Inference. You don't need GPUs (and their HBM) for inference, so the demand spills over into conventional DRAM. Frontier labs are eking out progress by scaling up inference-time compute, while pre-training scaling has kind of stalled / is giving diminishing returns (for now).
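One common form of "scaling up inference-time compute" is best-of-N sampling: generate many candidate answers and keep the highest-scoring one, trading more compute per query for better output. A minimal sketch of the idea, using hypothetical `generate` and `score` stand-ins for a model and a verifier:

```python
import random

def generate(prompt: str, rng: random.Random) -> str:
    # Stand-in for an LLM call: returns one random candidate answer.
    return f"{prompt}-candidate-{rng.randint(0, 9)}"

def score(answer: str) -> float:
    # Stand-in for a reward model / verifier that rates an answer.
    return float(answer.count("7"))

def best_of_n(prompt: str, n: int, seed: int = 0) -> str:
    # More samples (larger n) means more inference-time compute
    # spent per query, in exchange for a better expected answer.
    rng = random.Random(seed)
    candidates = [generate(prompt, rng) for _ in range(n)]
    return max(candidates, key=score)
```

With a fixed seed, raising `n` can only improve (or match) the best score found, which is the whole bet behind inference-time scaling.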


