
While I agree with the sentiment here... this has been Intel's "last excuse to buy Intel over AMD" argument: if you buy a fast enough video card but play your games at 1080p on a really high-refresh-rate monitor, gaming performance was better on Intel.

So AMD focuses on it here to say "look, your very last excuse for choosing Intel over AMD is no longer valid."

Of course I do more "non-gaming" than gaming, so it wasn't very important to me in the first place, and I don't spend enough on a graphics card for this to matter. But I want a lot of cores for fast compilation and great multi-tasking with containers and virtual machines.



AMD's software for controlling the CPU doesn't work with virtualisation enabled. They have been ignoring requests to fix it for years.


What does it control on the CPU? Overclocking? I've never needed to control a CPU via software...


Yes, of course: for non-gaming dev and media work these will be beasts. I was just confused about why they were so focused on 1080p gaming performance benefits. Buying a whole new setup when you're already running a relatively modern CPU would be a waste of money for gaming. At 1080p you're probably already killing it in framerate, and at 2K+ the benefits just aren't there.


At 1080p with a 2080 Ti (AMD's testing rig), the GPU is not a bottleneck, so any performance difference = CPU performance difference.

Going higher than 1080p, the GPU starts to become the bottleneck, so you're no longer comparing CPU-to-CPU but (CPU+GPU)-to-(CPU+GPU).

Testing game performance at 1080p is a well-known and well-accepted way to compare CPU performance.
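
To make the bottleneck point concrete, here's a toy model (my sketch, not AMD's methodology; all the millisecond figures are invented) that treats frame rate as limited by whichever of the CPU or GPU takes longer per frame:

    # Toy bottleneck model: fps ~ 1000 / max(cpu_ms, gpu_ms).
    # All per-frame times below are made-up illustrative numbers.

    def fps(cpu_ms: float, gpu_ms: float) -> float:
        """Approximate FPS given per-frame CPU and GPU times in ms."""
        return 1000.0 / max(cpu_ms, gpu_ms)

    # Two hypothetical CPUs: one simulates a frame in 5 ms, the other in 6 ms.
    cpu_fast_ms, cpu_slow_ms = 5.0, 6.0

    # Hypothetical GPU render times: cheap at 1080p, expensive at 4K.
    gpu_ms_by_resolution = {"1080p": 3.0, "1440p": 6.5, "4K": 14.0}

    for res, gpu_ms in gpu_ms_by_resolution.items():
        print(f"{res}: fast CPU {fps(cpu_fast_ms, gpu_ms):.1f} fps, "
              f"slow CPU {fps(cpu_slow_ms, gpu_ms):.1f} fps")

    # 1080p: fast CPU 200.0 fps, slow CPU 166.7 fps  <- CPU gap fully visible
    # 1440p: fast CPU 153.8 fps, slow CPU 153.8 fps  <- GPU starts to hide it
    # 4K:    fast CPU 71.4 fps,  slow CPU 71.4 fps   <- CPU gap invisible

At 1080p the GPU finishes each frame first, so the CPU difference shows up directly in the framerate; at 4K both CPUs end up waiting on the GPU and look identical.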


Because at 1080p you can actually compare the performance maybe?



