True... it’s crazy, but what’s interesting is that the 5090’s core count is 2x that of the 5080 (these are rumored specs, of course). So with the 5090 you’re effectively getting two 5080s in almost every sense: 2x the RAM, 2x the cores, 2x the bandwidth, for 150W more draw (450 vs 600), i.e. only about 1/3 more wattage.
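A quick back-of-the-envelope check of that "two 5080s for 1/3 more power" framing. The numbers below are placeholders based on the rumors quoted above, not confirmed specs:

```python
# Sanity check of the "2x the card for ~1/3 more wattage" claim,
# using rumored/placeholder figures -- nothing here is confirmed.
rtx_5080 = {"vram_gb": 16, "power_w": 450}   # assumed rumored specs
rtx_5090 = {"vram_gb": 32, "power_w": 600}   # assumed rumored specs

vram_ratio = rtx_5090["vram_gb"] / rtx_5080["vram_gb"]    # 2.0x the RAM
power_ratio = rtx_5090["power_w"] / rtx_5080["power_w"]   # ~1.33x the draw

print(f"VRAM ratio:  {vram_ratio:.2f}x")
print(f"Power ratio: {power_ratio:.2f}x (~{(power_ratio - 1) * 100:.0f}% more wattage)")
```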
So if they don’t gimp it too much for AI/ML things it could be a beast.
I think it’ll cost north of $3k, and given supply constraints and so on, street prices will likely be closer to $4k+.
10 years? Multiple vendors have 1600W PSUs on the market right now.
What's more, you'll need an AC to play PC games. My home office room is ~6 m², and even the current 500W machine I have there can noticeably bump the temperature under load.
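A toy calculation of why a 500W box warms a small room so quickly. This assumes a ~6 m² floor, 2.5 m ceilings, and heats only the air with zero leakage, so it's a gross upper bound (real walls and ventilation soak up most of it), but it makes the point:

```python
# Idealized estimate: 500W dumped into a small, perfectly sealed room.
power_w = 500                      # assumed steady draw under load
room_volume_m3 = 6 * 2.5           # assumed floor area x ceiling height
air_heat_capacity = 1200           # ~J per m^3 per Kelvin for air

joules_per_hour = power_w * 3600
temp_rise_per_hour = joules_per_hour / (room_volume_m3 * air_heat_capacity)
print(f"~{temp_rise_per_hour:.0f} K/hour of warming with zero heat loss")
```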
The PSU also powers the motherboard (keep in mind all those power-hungry USB-C PD peripherals), CPU, cooling (fans and AIO pumps), storage, and RAM. Furthermore, when building a PC you'd want a buffer of a few hundred watts in case you later want to add another GPU or two to spare PCIe slots (particularly for ML work), or simply to future-proof the PC for more power-hungry GPUs down the road; a rough budget sketch follows below.
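A minimal power-budget sketch of that sizing logic. Every wattage here is an illustrative assumption, not a measured figure for any specific build:

```python
# Rough PSU sizing: sum the per-component draw, then add headroom.
components_w = {
    "gpu": 600,                  # rumored top-end draw
    "cpu": 250,                  # assumed high-end desktop CPU under load
    "motherboard_ram_ssd": 80,   # assumed
    "fans_aio_pump": 30,         # assumed
    "usb_pd_peripherals": 60,    # assumed
}

load_w = sum(components_w.values())
future_gpu_buffer_w = 450        # headroom for a second GPU later
recommended_psu_w = load_w + future_gpu_buffer_w

print(f"Estimated load: {load_w} W")
print(f"Suggested PSU:  >= {recommended_psu_w} W (with second-GPU headroom)")
```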
(USA) I guess I'd better call an electrician to install one of those huge 240V stove/clothes-dryer outlets in my PC room so I can run a bigger PSU for my dual 5090s.
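There's real arithmetic behind the joke: a standard US 15A/120V branch circuit tops out at 1800W, or about 1440W sustained under the usual 80% continuous-load rule of thumb. A rough sketch, with the system wattage being a made-up illustrative number:

```python
# How much headroom does a standard US outlet leave for a big PC?
circuit_voltage = 120
circuit_amps = 15
circuit_max_w = circuit_voltage * circuit_amps   # 1800 W peak
continuous_limit_w = circuit_max_w * 0.8         # ~1440 W sustained (rule of thumb)

dual_gpu_system_w = 2 * 600 + 400   # assumed: two 600W GPUs + rest of the system
print(f"Continuous circuit limit: {continuous_limit_w:.0f} W")
print(f"Hypothetical dual-GPU PC: {dual_gpu_system_w} W -> over the limit")
```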
As someone who had a 3090 because that's all I could get my hands on at the time, I would not recommend anyone consider the XX90 products. Leave those for the over-the-top, money is no object, "I'm just flexing on you" builds. XX80 should be considered the limit for people who are building a reasonable gaming PC.
I ended up switching to a 4080. It certainly doesn't sip power and is also quite expensive, but the difference is large enough that I no longer feel like I need to find a way to send the heat output directly outside during the summer.
Nobody needs it. It's a luxury product. If you don't want a 600W GPU, just get a different one. I only recently upgraded from the 1070 that served me well for almost a decade, and didn't really notice much of a difference even though I splurged a bit.
Buying top-end GPUs these days seems like a total waste of money to me unless you have very specific needs. For playing any mainstream game at 1440p you'll be completely fine with a mid-tier previous-gen card.
Well, think of it this way: why should GPU manufacturers only build for the lowest common denominator? If someone wants a 600W chip, shouldn't they be able to get one?
On the other hand, game studios obviously can't design their games around such chips.