Despite High Price, Nvidia's RTX 4090 Sells Out Fast on Launch Day (pcmag.com)
17 points by rntn on Oct 14, 2022 | 23 comments



I don't understand why they can't set aside a reasonable portion (30%?) of cards for people who are willing to send in actual ID and buy at retail, so that at least there's some effort to beat the scalpers who are going to immediately charge 2x on eBay.


These high-end cards have 2 or 3 fans. Aren't they super noisy ... like having a jet engine in your living room? Or do they go very quiet (turn off the fans) when there is no load (e.g. not playing games, just doing desktop tasks like web browsing)?


I bought the new 4090 and it doesn't seem any louder than the 3080 Ti it's replacing.

Quite often the GPU turns the fans off entirely when the PC is idle or doing low-intensity tasks (e.g. web browsing, word processing).

Even at max fan speed it's not much louder than other fans you might find in a PC.


The fans definitely quiet down during low load.


What's the use case for these high-end graphics cards? Is it for professional use, or do gamers actually buy them? Which games require such high-end graphics?


Actual use cases? Ludicrously quick AI inferencing, media transcoding, 3D render offloading, SIMD-bound simulations and anything that uses CUDA. Dedicated graphics will dominate in these scenarios, and it's such a large market that the 4090 is hardly the tip of the iceberg. Nvidia's server-grade AI chips are genuine monsters. People definitely use these cards, and there's a demand to scale them for datacenters too.

Gaming still drives the sales, but you'd be surprised by what you can do with a $500 GPU plugged into a Linux box.
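
For a concrete feel for the CUDA side of that list, here's a minimal sketch (assuming PyTorch and a CUDA-capable card; the model is just a stand-in) of pushing an inference workload onto the GPU:

  # Minimal sketch: run a tiny inference workload on the GPU with PyTorch.
  import torch

  device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
  # A stand-in for a real model; any torch.nn.Module works the same way.
  model = torch.nn.Sequential(
      torch.nn.Linear(4096, 4096),
      torch.nn.ReLU(),
      torch.nn.Linear(4096, 1000),
  ).to(device)
  with torch.no_grad():
      batch = torch.randn(64, 4096, device=device)  # fake input batch
      logits = model(batch)                         # runs on the GPU if available
  print(device, logits.shape)  # e.g. "cuda torch.Size([64, 1000])"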


Is there really datacenter demand for these anymore?

In addition to the Nvidia EULA clause against "datacenter use", they (allegedly) shut down the last dual-slot 3090 (the one with a server-compatible cooling configuration) because it was eating into higher-margin datacenter SKUs[0].

I don't see a lot of datacenters making use of a 3.5-slot card with challenging power and cooling requirements, but I certainly could be wrong.

0 - https://www.crn.com/news/components-peripherals/gigabyte-axe...


The NVIDIA datacenter-use clause is present in the license agreement for the Windows driver, but not in the Linux one.


Do you have a reference?

What’s the point of the infamous “No Datacenter Deployment. The SOFTWARE is not licensed for datacenter deployment, except that blockchain processing in a datacenter is permitted.” clause if it only applies to Windows?



Wow, thanks!

This is fascinating and now I'm really confused. Who is using CUDA on Windows in any datacenter application?


I wonder if it's the other way around... maybe this rule was grandfathered in from a time when Microsoft feared companies offering "DirectX as a Service" or whatever. It doesn't really seem like the sort of thing Nvidia would care about, especially considering the licensing of their Unix drivers and considerable enterprise support efforts.


>do gamers actually buy this?

They do. It's not like we can't push these GPUs to the limit: 4K, ray tracing, 120 frames per second.

And like I said when it was launched, $1,599 for a 4090 with a ~608mm² die on TSMC 4N plus GDDR6X isn't expensive at all. Whether one can afford one is a different matter.


Lots of people are playing with Stable Diffusion, and the 4090 has much more RAM and speed for generating images.


Yeah, I was actually looking to pick up a second 3090 recently to run Stable Diffusion faster, but apparently it doesn't support multi-GPU.


If you're making multiple images in a batch, you should be able to run those in parallel across the two cards. Or have one GPU training concepts (e.g. textual inversion) while the other renders, or if you're doing style transfer you could do frames in parallel.
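
A rough sketch of that, assuming you can launch the generator as a script (generate.py and its arguments here are placeholders for whatever you actually run): start one process per card and pin each to its own GPU with CUDA_VISIBLE_DEVICES.

  # Rough sketch: two independent Stable Diffusion runs, one per GPU.
  # CUDA_VISIBLE_DEVICES makes each process see only its own card.
  import os
  import subprocess

  prompts = ["a castle at sunset", "a castle at dawn"]
  procs = []
  for gpu_id, prompt in enumerate(prompts):
      env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu_id))
      procs.append(subprocess.Popen(
          ["python", "generate.py", "--prompt", prompt],  # placeholder script/args
          env=env,
      ))
  for p in procs:
      p.wait()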


Hmm, OK, valid point. If I can run two instances of the software and configure them to use separate GPUs I should be OK. I'll have to see if I can try that out somehow.


"Had" to buy a 3090 at msrp during the pandemic, because everything else had big markups on it.

It turns out that being able to max everything out, have ray tracing, and play at 4K/60 is just a fancier but still vapid way of accessorizing playing video games.

I really enjoy it though :)


No game "requires" a really high-end graphics card, but some can certainly take advantage of it if you're into pretty, high-resolution, high-framerate graphics.


Playing games at 4k and at high frame rates.


I think the main use case of buying a 4090 on day one is to show that you have money to burn.


Scalpers be scalping; let's see if it holds for 2-3 months.


I'd love to see CUDA benchmarks.

I do want more RAM, though. That's more important than more speed.
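
In the meantime, even something as crude as timing a big fp16 matmul (a rough sketch assuming PyTorch; nowhere near a proper benchmark) gives a ballpark for raw throughput:

  # Crude throughput check: time repeated fp16 matmuls on the GPU.
  import time
  import torch

  assert torch.cuda.is_available()
  n, iters = 8192, 50
  a = torch.randn(n, n, device="cuda", dtype=torch.float16)
  b = torch.randn(n, n, device="cuda", dtype=torch.float16)
  torch.cuda.synchronize()
  start = time.perf_counter()
  for _ in range(iters):
      c = a @ b
  torch.cuda.synchronize()
  elapsed = time.perf_counter() - start
  tflops = 2 * n**3 * iters / elapsed / 1e12
  print(f"{elapsed:.3f}s for {iters} matmuls, ~{tflops:.1f} TFLOPS fp16")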




