
PC gaming is doing well, but anecdotally the number of PC gamers using tablets and laptops seems to be driving that as much as, if not more than, traditional desktop/tower form factors. Admittedly many "portable" gaming GPUs are still discrete chips, but in a laptop or tablet form factor they certainly aren't discrete boards anymore.



There's a large market for gaming-oriented portables, I won't disagree. Still, the traditional tower market is shrinking in every vertical EXCEPT gaming, where it has actually grown over the past couple of years.

Personally, I detest laptops as anything but machines for on-the-go productivity - I don't want to replace a full system just to swap a CPU or GPU, or pay a huge premium for portability that I simply don't need in a gaming system (not to mention the performance compromises you ALWAYS make with lower-power or thermally limited components). I can see an argument for students or other people with more mobile lifestyles, but in my house there are three desks with gaming rigs right next to each other in the family room for my wife, my daughter, and myself - the need for portability simply isn't there.

The discrete GPU still isn't going anywhere :)


I think the question is how big the niche remains. The diminishing year-over-year performance gains from one GPU generation to the next seem to be lengthening the upgrade cycle, to the point where I find my towers outlasting any need to upgrade their GPUs. The last GPU replacement I did (a tower ago and a couple of years back) was because the GPU fried itself, not for any perceived performance gain. Sure, it was a benefit in that case that one failed component wasn't enough to sink that particular Ship of Theseus, but I don't think I would otherwise have replaced that GPU before replacing the entire tower. That's unlike the tower before it, where a GPU upgrade was a magical thing with drastic quality improvements.

I use a tower myself currently, but I had an (early) gaming laptop in college and have lately been considering a return to laptop gaming, given all the progress that has been made since the last time I tried it. Partly that's because one of the few ways left to differentiate PC gaming from console gaming is the freedom of more mobile gaming experiences. (Nintendo's Switch, of course, says "hello" with its docking-based approach to the console. If Nintendo is right, the future of even console gaming is probably mobile, too.)

Anyway, yes, the discrete GPU is still around today. I'm just suggesting it isn't guaranteed to stay. As someone who has been using PCs since the 386 era, there are past versions of me that would have been surprised that the sound card was reintegrated into motherboards, even for use cases like 5.1+ speaker setups. Dolby Atmos support on a Windows PC is a "free" "app" you download that asks your video card (!) to pass certain sound data through over its HDMI port(s) (or that supports existing 5.1+ mainboard outputs, or headphones, if you pay the license fee). There's a PC world where that would have seemed unimaginable without an extra expansion board or three and some weird cable juggling. With the diminishing returns of discrete GPU cards over time (despite AMD and nVidia marketing), it does feel like their importance, even to gamers, could come to an end the same way it did for sound cards.


I want to argue it the other way around. The CPU, memory, motherboard, sound card, etc. could just as easily become extensions of the graphics card, integrated into and absorbed by it.

Why should the design stay as it is?

The GPU is already a parallel design; it just needs to be able to handle generic CPU tasks and connect to existing media such as hard drives. Integrate system memory onto the card, add sound features, and so on.



