Hacker News

Modern hardware is not one thing. GPU programming does not give you a realistic model of CPU hardware: even though CPUs have gained more cores and wider SIMD, they remain heavily cache-oriented and branch-prediction-oriented, while GPUs are not oriented around either.


