I would love to get some insight from Nvidia engineers on what happened here.
The 3 series and before were overbuilt in their power delivery system. How did we get from that to the single-shunt-resistor, incredibly dangerous fire-hazard design of the 5 series? [1] Even after the obvious problems with the 4 series, Nvidia actually doubled down and made it _worse_!
The level of incompetence here is actually astounding to me. Nvidia has some of the top, best-paid EE people in the industry. How the heck did this happen?
Either you’re not going to get a choice, or you’ll need a 240v three phase outlet installed and possibly an expanded electrical panel.
Nvidia has been all about “as fast as possible, no compromise” for a very long time now. So was Intel, until the Pentium 4 forced a big reset.
At a certain point this stuff is just totally untenable. Throwing more amps at it stops being workable.
Apple seems to be able to get a decent fraction of the performance using drastically less power. I have the impression Intel is doing the same, though I’m less sure on that one.
Given how parallel graphics problems are, maybe it’s time to give up on 100 cores that go uber fast and behave like a space heater, and move to 250 cores that go quite fast but use 1/4 of the power each.
Plenty of people are working hard to tackle the performance-per-watt problem while others simultaneously tackle the absolute performance problem. There's no one single metric you can focus exclusively on that's going to satisfy everyone's use cases. Obviously not everyone needs a high-end enthusiast product like the 5090 and plenty of people are going to be seeking out a middle-of-the-road product that focuses on performance-per-watt instead, which will be accommodated by the lower-tier products in the lineup as well as their competitors' products.
> Given how parallel graphics problems are, maybe it’s time to give up on 100 cores that go uber fast and behave like a space heater, and move to 250 cores that go quite fast but use 1/4 of the power each.
It wasn't too long ago that people were saying the same things about single-core and dual-core NetBurst CPUs and how we need to start considering dual-core and quad-core for consumer CPUs. And now we have consumer CPUs with 64 cores and more, so aren't we moving in the direction you want already? Performance-per-watt is improving, parallelism is improving, and also absolute performance is improving too in addition to those other metrics.
Idk but if computers are drawing enough power to melt themselves just to play ray traced Minecraft or write boilerplate Python code, perhaps we've gone wrong somewhere.
My boss has a few neighbors who are Nvidia and former Nvidia employees, and they're all driving their shiny, fancy sports cars and living the high life thanks to the stock they got. It doesn't surprise me that churn and/or retirement is cutting into their workforce. Probably not enough to be substantial, but enough to count for something.
In general, poorly designed cables use steel instead of tin-plated copper conductors. The differences become more relevant at higher power levels, as energy losses within the connectors increase. The temperatures then rise above the limits of the poorly specified, unbranded plastic insulator, and a melted mess eventually trips the power supply to shut off.
I would assume a company like NVIDIA wouldn't make a power limit specification mistake, but cheap ATX power-supplies are cheap in every sense of the word.
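The heating mechanism described above is just I²R loss in the pin's contact resistance. A back-of-envelope sketch (the resistance values here are illustrative assumptions, not measurements of any real connector):

```python
# Rough I^2*R estimate of heat dissipated in a single connector pin.
# Contact resistance values below are illustrative assumptions.

def pin_heat_watts(current_a: float, contact_resistance_ohm: float) -> float:
    """Power dissipated as heat in one pin: P = I^2 * R."""
    return current_a ** 2 * contact_resistance_ohm

# A decent copper-alloy pin might see on the order of 5 milliohms of
# contact resistance; a worn, cheap, or poorly mated pin can be 10x that.
good_pin = pin_heat_watts(9.0, 0.005)  # ~0.4 W per pin
bad_pin = pin_heat_watts(9.0, 0.050)   # ~4 W per pin, in a tiny plastic housing

print(f"good pin: {good_pin:.2f} W, degraded pin: {bad_pin:.2f} W")
```

Since heating scales with the square of current, pushing more amps through the same pins makes a marginal contact go from warm to melting very quickly.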
der8auer did an analysis of the melted connector. It was nickel & gold, not steel.
Buildzoid's analysis showed Nvidia removed all ability to load-balance the wires, which they had with the 30xx connector that introduced it, and they had even fancier solutions for the multi 8/6-pin connectors before that.
The specification also has an incredibly low safety factor, much much lower than anything that came before.
They seem to be penny pinching on shunt resistors on a $2000 GPU. Their engineering seems suspect at this point, not worthy of a benefit of the doubt.
Nickel & gold plating is common, but if the pin were solid nickel or gold (rare, because that would be expensive and stupid)... you would be looking at 3 to >5 times worse performance than a same-sized plated-copper plug with a beryllium-copper leaf spring.
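The "3 to >5 times worse" range roughly matches standard handbook bulk resistivities, which you can check directly (values below are typical 20 °C figures; real alloys vary):

```python
# Back-of-envelope check of the claim using bulk resistivities
# (ohm-meters, typical handbook values at 20 C; alloys vary).
resistivity = {
    "copper": 1.68e-8,
    "gold":   2.44e-8,
    "nickel": 6.99e-8,
    "steel":  1.43e-7,  # plain carbon steel; stainless is even worse
}

for metal, rho in resistivity.items():
    ratio = rho / resistivity["copper"]
    print(f"{metal:>6}: {ratio:.1f}x copper's resistance")
```

Solid nickel comes out around 4x copper and steel well past 5x, which is why platings stay thin and the bulk of a decent pin is a copper alloy.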
Have a gloriously wonderful day... I could be wrong, but it has been awhile =3
1: https://www.youtube.com/watch?v=kb5YzMoVQyw