I chuckled a little at this because I used to wonder the same thing until I had to actually bring up a GDDR6 interface. Basically, the reason GDDR6 can run so much faster is that we assume everything is soldered down, not socketed/slotted.
Back when I worked for a GPU company, I occasionally had conversations with co-workers about how ridiculous it was that we put a giant heavy heatsink on the CPU and a low-profile cooler on the GPU, which these days produces way more heat! I'm of the opinion that we should make mini-ATX-shaped graphics cards so that you can bolt them behind your motherboard (though you would need a different case, one with standoffs in both directions).
Memory and NVMe storage are dense enough to make anything larger obsolete for the practical needs of most users. Emphasis on ‘practical’, not strictly ‘affordable’.
The only advantages of larger form factors are the ability to add cheaper storage, additional PCIe cards, and aftermarket cosmetic changes such as lighting. IMO, each of these needs represents a specialist use case.
I'd really like to find a way to daisy-chain them, but I know that's not how multi-gigabit interfaces work.
Raspberry Pi HATs are cool. Why not big mini-ITX HATs?
Yes, I just want to go back to the time of the Famicom Disk System, the Sega CD or the Satellaview.
Missed that one. I still mourn BTX.
Is there a case like this that isn't part of a prebuilt computer?
I think by the late 2000s, though, their discrete GPUs on laptops were problematic because they got so hot they detached from the motherboards or fried themselves. In a lot of cases, these systems shipped with both a discrete GPU and an Intel processor with an integrated iGPU.
This happened to a ThinkPad T400 I had a while ago: the NVIDIA graphics stopped working and I had to disable it and enable the Intel GPU in the BIOS (maybe even blind).
IIRC this snafu was what soured relations between Apple and NVIDIA.
That was indeed around 2008-2010, but the issue was not that the chips fried themselves or got too hot. The issue was the switch to lead-free solder: thermal cycling led to the BGA balls cracking apart, as the lead-free solder formulation could not keep up with the mechanical stress. It hit the entire NVIDIA lineup at the time; it was just orders of magnitude more apparent in laptops, as these typically underwent far more frequent and far more rapid thermal cycles than desktop GPUs.
> IIRC this snafu was what soured relations between Apple and NVIDIA.
There have been multiple issues: the above-mentioned "bumpgate", patent fights regarding the iPhone, and finally the fact that Apple likes to do a lot of the driver and firmware development for their stuff themselves. Without that deep, holistic understanding of everything, Apple would stand no chance of having the kind of power efficiency and freedom from nasty bugs that they have compared to most Windows laptops.
I guess the main problem with that is the cycle we have, where most people have a single graphics card with one or two monitors plugged in, and the power users either have two of them to drive four screens, or daisy-chain two cards together to drive one monitor.
But in the latter two cases, it's usually a problem of a bottleneck between the card and the main bus, and of course, if you were making the motherboard, you'd have a lot more say in how that is constructed.