I don't understand why they can't apportion a reasonable amount (30%?) of cards to people who are willing to send in actual ID and want to buy a card at retail, so that at least there's some effort to beat the scalpers, who are going to immediately charge 2x on eBay.
These high-end cards have 2 or 3 fans. Aren't they super noisy ... like having a jet engine in your living room? Or do they go very quiet (turn off the fans) when there is no load (e.g. not playing games, just doing desktop tasks like web browsing)?
What's the use case for these high-end graphics cards? Is it for professional use, or do gamers actually buy them? Which games require such high-end graphics?
Actual use cases? Ludicrously quick AI inferencing, media transcoding, 3D render offloading, SIMD-bound simulations and anything that uses CUDA. Dedicated graphics will dominate in these scenarios, and it's such a large market that the 4090 is hardly the tip of the iceberg. Nvidia's server-grade AI chips are genuine monsters. People definitely use these cards, and there's a demand to scale them for datacenters too.
Gaming still drives the sales, but you'd be surprised by what you can do with a $500 GPU plugged into a Linux box.
Is there really datacenter demand for these anymore?
In addition to the Nvidia EULA against "datacenter use", they (allegedly) shut down the last dual-slot 3090 (with a server-compatible cooling configuration) because it was eating into higher-margin datacenter SKUs[0].
I don't see a lot of datacenters making use of a 3.5U card with challenging power and cooling requirements but I certainly could be wrong.
What’s the point of the infamous “No Datacenter Deployment. The SOFTWARE is not licensed for datacenter deployment, except that blockchain processing in a datacenter is permitted.” clause if it only applies to Windows?
I wonder if it's the other way around... maybe this rule was grandfathered in from a time when Microsoft feared companies offering "DirectX as a Service" or whatever. It doesn't really seem like the sort of thing Nvidia would care about, especially considering the licensing of their Unix drivers and considerable enterprise support efforts.
If you're making multiple images in a batch, you should be able to run those in parallel. Or have one GPU training concepts (e.g. textual inversion) while the other renders; or if you're doing style transfer, you could do frames in parallel.
Hmm, ok, valid point. If I can run two instances of the software & configure each to use a separate GPU, I should be OK. I'll have to see if I can try that out somehow.
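If the software is CUDA-based, one common way to do this is to launch one process per card and use `CUDA_VISIBLE_DEVICES` to mask which GPU each process can see; inside each process, the assigned card then shows up as device 0. A minimal sketch (the `echo` stands in for whatever your actual render command is):

```shell
# Launch one worker per GPU. CUDA_VISIBLE_DEVICES restricts which devices a
# process can enumerate, so each instance treats its assigned card as device 0.
# Replace the echo with the real command, e.g. `python generate.py ...`.
for gpu in 0 1; do
  CUDA_VISIBLE_DEVICES=$gpu sh -c 'echo "worker using GPU $CUDA_VISIBLE_DEVICES"' &
done
wait  # block until both workers finish
```

This works for any app that enumerates GPUs through the CUDA runtime; no change to the software itself is needed, since the masking happens at the driver level.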
"Had" to buy a 3090 at msrp during the pandemic, because everything else had big markups on it.
It turns out, being able to max everything out, have raytracing, and play at 4K/60 is just a much fancier but equally vapid way of accessorizing playing videogames.
No game "requires" a really high-end graphics card, but some can certainly take advantage of it if you're into pretty, high-resolution, high-framerate graphics.