
Gamers will happily buy it.

I remember paying something like $800 for an Athlon 700 MHz CPU (the cartridge one) in 1999.



Gamer and programmer here! From the perspective of a gamer with a "large but not infinite budget" over the past ~8 years: I play Counter-Strike, where any FPS stutter is unacceptable. I also enjoy prettier games like BF5. My current system is an i7 8700K, 32GB RAM (just because), and a 1080 Ti.

Intel has always been the go-to. The #1 priority is single-thread performance, first and foremost. Second is at least 4 cores: most modern games can utilize at least 4, but it's also important to give the OS and other programs like Discord plenty of cores.

While Ryzen Gen 1 and Gen 2 have been amazing values, for gaming performance Intel has still been king. When you compare AMD to Intel FPS for FPS, Intel nearly ALWAYS wins.

CSGO is especially reliant on single-thread performance, but this goes for most games. It's worth noting, too, that while games can use multiple cores, I don't believe most engines scale to 8+ cores very well.


Historically, the only reason Intel has won on absolute top gaming FPS is that their raw single-threaded performance has beaten AMD's, since most games are still bad or ineffective at using multiple threads. For the first time in many processor generations this may actually not be true, because of Intel's stumble in their 10 nm transition.


That changed slightly with Ryzen: AMD closed the gap on single-threaded IPC (close enough, anyway), but the new issue with Zen 1 and Zen+ was memory/cache/inter-CCX latencies. Zen+ solved most of the memory latency issues but didn't improve cache/CCX latencies much.

Supposedly Zen 2 solved most of that (and some game benchmarks, like CSGO, suggest it really did). We'll see how it actually pans out, since there's still the issue of inter-CCX latency (and now even cross-chiplet latency).


Windows 10 1903 has scheduler changes (intra-CCX bias?) that seem to offer a significant performance uplift in games (10+%).


It doesn't solve all of it, however. If your program has more than $number_of_cores / 2 threads, you'll cross the CCX boundary at some point. On Zen 2, that instead changes to $number_of_cores / 4 (CCX boundary) or $number_of_cores / 2 (chiplet boundary).
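
For a concrete illustration, here's a minimal Linux/pthreads sketch of keeping a group of threads inside one CCX by pinning each to a core. The assumption that cores 0-3 share a CCX is just that, an assumption; check `lscpu -e` or hwloc for your chip's real topology.

    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdio.h>

    /* Pin the calling thread to one core. Assumes Linux, and assumes
     * cores 0-3 share a CCX -- verify against your actual topology. */
    static int pin_to_core(int core)
    {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);
        return pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }

    static void *worker(void *arg)
    {
        int core = *(int *)arg;
        if (pin_to_core(core) != 0)
            fprintf(stderr, "failed to pin to core %d\n", core);
        /* ... work loop stays on this core, so no CCX hop ... */
        return NULL;
    }

    int main(void)
    {
        pthread_t t[4];
        int cores[4] = {0, 1, 2, 3};  /* assumed: all on one CCX */
        for (int i = 0; i < 4; i++)
            pthread_create(&t[i], NULL, worker, &cores[i]);
        for (int i = 0; i < 4; i++)
            pthread_join(t[i], NULL);
        return 0;
    }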

Inter-CCX communication requires hopping over the Infinity Fabric bus, which (in the case of Zen 1; no newer benchmarks) increases thread-to-thread latency from ~45ns to ~131ns. I'm sure it was reduced in Zen+ and is probably closer to 100ns by now. However, I'm not sure if inter-chiplet communication will be the same (e.g. its own IF link) or worse (I/O die overhead).

Hopefully someone runs the same inter-thread communication benchmarks on Zen 2.
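
For anyone who wants to try it themselves, here's a rough sketch of such a ping-pong micro-benchmark: two threads pinned to specific cores bounce an atomic flag back and forth. The core IDs are placeholders; compare a same-CCX pair against a cross-CCX (or cross-chiplet) pair. Compile with -pthread.

    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdatomic.h>
    #include <stdio.h>
    #include <time.h>

    #define ITERS 1000000

    static atomic_int flag = 0;  /* 0: ping's turn, 1: pong's turn */

    static void pin(int core)
    {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);
        pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }

    static void *pong(void *arg)
    {
        pin(*(int *)arg);
        for (int i = 0; i < ITERS; i++) {
            while (atomic_load_explicit(&flag, memory_order_acquire) != 1)
                ;  /* spin until ping hands over */
            atomic_store_explicit(&flag, 0, memory_order_release);
        }
        return NULL;
    }

    int main(void)
    {
        /* Placeholder core IDs -- pick same-CCX vs. cross-CCX pairs
         * based on `lscpu -e` output for your CPU. */
        int ping_core = 0, pong_core = 4;

        pthread_t t;
        pthread_create(&t, NULL, pong, &pong_core);
        pin(ping_core);

        struct timespec a, b;
        clock_gettime(CLOCK_MONOTONIC, &a);
        for (int i = 0; i < ITERS; i++) {
            atomic_store_explicit(&flag, 1, memory_order_release);
            while (atomic_load_explicit(&flag, memory_order_acquire) != 0)
                ;  /* spin until pong replies */
        }
        clock_gettime(CLOCK_MONOTONIC, &b);
        pthread_join(t, NULL);

        double ns = (b.tv_sec - a.tv_sec) * 1e9 + (b.tv_nsec - a.tv_nsec);
        /* Each iteration is a full round trip; halve for one-way. */
        printf("one-way latency: ~%.0f ns\n", ns / ITERS / 2);
        return 0;
    }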


Recovering CSGO player here. I got a beefy box (TR 1950X, dual 1070 Ti’s, NVMe, etc.) for ML and crypto mining, and gaming inevitably followed. That plus low-ping internet immediately boosted my Elo ranking and I started having more fun. Life in general became less fun, since my sleep was suffering. That and the toxic CSGO community have kept me away, but I still relish the palpable advantage I enjoyed with better gear.


The trick is to play random matches and gradually add people you enjoy playing with. We started doing this a year ago, and now we have a small Discord server with a few dozen people who are all fun to play with. It's best to recognize when things are frustrating but not verbalize it 24/7, as it lowers the team's morale.

Otherwise, yeah it can be terrible.


As another CSGO player, I am seriously considering one of the high-end Ryzen 3000 CPUs. If the performance is as advertised, it looks like AMD will be the single-thread king for at least 6 months.


Before you pull the trigger, wait to see what the latency between the chiplets/memory does to framerates. We'll know once benchmarks are out, but remember to look not just at the average framerate but at the minimums too; you can have a high framerate with terrible stuttering.
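
As a toy illustration of why averages hide stutter, here's a sketch that computes average FPS and the "1% low" from a list of frame times (all the numbers are made up for the example):

    #include <stdio.h>
    #include <stdlib.h>

    /* Average FPS vs. "1% low" FPS from frame times in milliseconds.
     * Sample data is invented purely for illustration. */
    static int cmp_desc(const void *a, const void *b)
    {
        double x = *(const double *)a, y = *(const double *)b;
        return (x < y) - (x > y);  /* descending: worst frames first */
    }

    int main(void)
    {
        /* Mostly smooth 7 ms frames with an occasional 50 ms stutter. */
        double ft[1000];
        for (int i = 0; i < 1000; i++)
            ft[i] = (i % 100 == 0) ? 50.0 : 7.0;

        double total = 0;
        for (int i = 0; i < 1000; i++)
            total += ft[i];
        printf("average FPS: %.0f\n", 1000.0 / (total / 1000.0));

        qsort(ft, 1000, sizeof(double), cmp_desc);
        double worst = 0;
        int n = 10;  /* worst 1% of 1000 frames */
        for (int i = 0; i < n; i++)
            worst += ft[i];
        printf("1%% low FPS: %.0f\n", 1000.0 / (worst / n));
        return 0;
    }

With this data the average works out to roughly 135 FPS, while the 1% low is 20 FPS -- a "fast" system that stutters visibly.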


I doubt gamers will be a big market for that chip. You don't get a whole lot of increased capability/FPS with a high-end chip compared to a mid-range chip when the GPU is generally the limiter. But I do think they are going to sell a ton of 3600-3800 chips.


Came to say the same... the 3600 (non-X) is extremely competitive in gaming, and is pretty likely to have some good overclocking headroom with a good water cooler. Personally, I'm very much looking forward to the 3950X, and it will probably be my choice (even though that means waiting yet another 2 months to upgrade) unless something significant happens soon with the next Threadripper. The 3950X is likely to be a very sweet spot, carrying me for 5 years and more.

I've said in other comments that my 4790K is getting a bit old at this point: not slow for most stuff, but definitely hungry for more cores for a lot of tasks, and I'm looking to break past 32GB of RAM. I'd also been considering Epyc or even Xeon, as older/used Xeons can be very well priced. Guess I'm waiting until September.


> I've said in other comments that my 4790K is getting a bit old at this point: not slow for most stuff, but definitely hungry for more cores for a lot of tasks, and I'm looking to break past 32GB of RAM. I'd also been considering Epyc or even Xeon, as older/used Xeons can be very well priced.

I’m in nearly the exact same boat. I’d like to have ECC RAM the second time around for my home server, which the Zen chips reportedly support, though I don’t see people using it. I’d also like better power usage. I think I’m going to wait one more year.


Just got a used Dell, dual 8-core CPUs and 128GB ECC ... main purpose is for a NAS and it'll sit in the garage because of the noise. I may look into what CPU upgrades are available and maybe throw some heavier workloads at it.

For now, planning on just playing around with it. I haven't decided if I'll be running Windows or Linux as the base OS yet.


Is it a rack server? I found the power usage too high on those servers compared to more traditional servers.


It's a standard 2U enclosure... haven't tested the power usage, but it's a relatively current Intel CPU (E5000 series, iirc), so it should idle reasonably well.


Well, at first gamers said dual-core chips were useless. Then that quad-core chips were useless. Now they're testing the waters with octa-core chips.

Game developers have always made good use of the available resources. They'll use the extra power available. The newest techniques they have, like work-stealing queues, can scale to a large number of cores.
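
For a flavor of the technique, here's a toy work-stealing sketch (mutex-based for clarity; real engines use lock-free deques like Chase-Lev): each worker pops jobs off its own queue and, when that runs dry, steals from a sibling's.

    #include <pthread.h>
    #include <stdio.h>

    #define WORKERS 4
    #define JOBS_PER_WORKER 8

    /* Each worker owns a deque: it pops from the bottom (LIFO, cache
     * friendly), while thieves take from the top (FIFO). */
    typedef struct {
        int jobs[64];
        int top, bottom;
        pthread_mutex_t lock;
    } deque_t;

    static deque_t queues[WORKERS];

    static int pop_own(deque_t *q, int *job)
    {
        pthread_mutex_lock(&q->lock);
        int ok = q->bottom > q->top;
        if (ok) *job = q->jobs[--q->bottom];
        pthread_mutex_unlock(&q->lock);
        return ok;
    }

    static int steal(deque_t *q, int *job)
    {
        pthread_mutex_lock(&q->lock);
        int ok = q->bottom > q->top;
        if (ok) *job = q->jobs[q->top++];
        pthread_mutex_unlock(&q->lock);
        return ok;
    }

    static void *worker(void *arg)
    {
        int id = (int)(long)arg, job;
        for (;;) {
            if (pop_own(&queues[id], &job)) {
                printf("worker %d runs job %d\n", id, job);
                continue;
            }
            int stole = 0;  /* own queue empty: raid the siblings */
            for (int v = 0; v < WORKERS && !stole; v++)
                if (v != id && steal(&queues[v], &job)) {
                    printf("worker %d steals job %d\n", id, job);
                    stole = 1;
                }
            if (!stole) return NULL;  /* nothing left anywhere */
        }
    }

    int main(void)
    {
        pthread_t t[WORKERS];
        for (int i = 0; i < WORKERS; i++) {
            pthread_mutex_init(&queues[i].lock, NULL);
            for (int j = 0; j < JOBS_PER_WORKER; j++)
                queues[i].jobs[queues[i].bottom++] = i * 100 + j;
        }
        for (int i = 0; i < WORKERS; i++)
            pthread_create(&t[i], NULL, worker, (void *)(long)i);
        for (int i = 0; i < WORKERS; i++)
            pthread_join(t[i], NULL);
        return 0;
    }

The payoff is load balancing: an idle core never sits still while another core's queue is full, which is exactly what lets job counts scale with core counts.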

So games and gamers will use the extra cores. It's much less of a jump from 4 cores to 16 than from 1 to 2.

Give it a year or two.


In recent games made with Unity, a lot of workloads, like scheduling the GPU, are offloaded to separate threads with (almost) no developer intervention. Future games will extensively utilize the job system, which provides safe and efficient multithreading. Not sure where Unreal and the other leading engines stand, but things seem to be looking very good for high-core-count CPU owners.


Pretty bleak for Unreal. UE4 uses only one core at a time and is often the limiting factor before the GPU.


Gamers will probably not be a big market for that chip, but it might be appealing for gamers with a large budget. Unless Intel has something big hidden up their sleeves (doubtful when they don't even plan to release their next mobile line until holiday 2019), that 16-core chip will likely have the best single-threaded performance on the market. Plus, it has to be a highly binned part to have the same TDP as the 12-core model even with a slightly higher boost clock. I for one am very interested to see overclocking results.


Just look at the cost of gaming GPUs (including the cost of watercooling?). Not to mention that a CPU can have a slower upgrade cycle than a GPU (since a CPU upgrade will usually mean upgrading the motherboard, possibly the RAM, and who knows what else while you're there), so getting a top GPU is not at all cheap in the long run.

No, it's not worth it IMO, but some people spend crazy amounts chasing a few extra FPS.


More recently, the 9900K's MSRP was $500, but it was sold for $600 at launch due to scarcity. People wondered who would even buy it at that price, but gamers (myself included) happily did, and it sold out for months.


It probably sold out because of extremely limited stock though, not because of how high demand was...



