Nvidia Short Sellers Lose $2.3B in One Day as Stock Soars (bloomberg.com)
36 points by mfiguiere 11 months ago | 24 comments



It's a shame that their consumer products seem to be stagnating, whether because no resources are going there or because they genuinely can't get a reasonable uplift.

The current generation of GeForce is essentially just the last gen with marginal improvements, scaled up in power draw, and doubling down on a technology so poorly adopted that most people can't even tell the difference. [1]

[1]: https://www.youtube.com/watch?v=2VGwHoSrIEU


Not to mention a massive, consumer-hostile price hike for no good reason.

Ironically, the 4090 might be the only one approaching appropriate pricing for its performance; it is a monster, after all. The rest of the 4000 series are units traditionally a grade or two lower on the scale, masquerading as higher-tier parts, for whatever reason.


There is a reason: chips are costlier to manufacture now on a per-transistor basis. As Jensen says, Moore’s Law in its original economic sense is dead.


Not anywhere close to proportional to the cost of manufacturing the chips.

Nvidia's high prices are a reflection of them charging what they can, plus the inventory risk from all the 3000-series cards still sitting on retail shelves post-crypto.


> 4000 series are units traditionally a grade or two lower on the scale but masquerading as higher tier parts

Can you give some reasoning for this? Reddit certainly feels this way, but I can't find anything other than "feels like"s for reasoning. If you compare die SKUs across generations, the 3000 series got a "bump" (xx80 -> xx102) while the 1000, 2000, and 4000 series are consistent (xx80 -> xx104, xx80 Ti -> xx102, xx60 -> xx106).


I mean, it is going to be 'feels like' because there is no qualification that sets the card tiers apart. It has more to do with the fact that, as a gamer, you could realistically upgrade to a mid-range card every second generation and be set for AAA games until it was time to repeat. The pandemic shortage completely destroyed that and made gamers realize how much they are at the mercy of a very specialized market -- which wasn't a problem until desktop GPU chips became really useful for things besides gaming, like crypto mining and AI.

Suddenly the gamers got put on the back burner and started paying the real market value for what are essentially magic tiny supercomputers used exclusively for entertainment by (comparatively) rich young men.

Oh, and gamers are notoriously belligerent about anything that involves choice, taste, or change, or any critique of those. This is the community that would regularly bully, berate, and harass others over a choice of hardware brand -- and call SWAT teams to grief other players.

This is not meant to be insulting or to minimize how much it sucks to have your hobby made worse and sometimes impossible -- I am suffering too, and it sucks, but I also acknowledge the truth of the matter. From the outside, all parties in the GPU drama look terrible.


Gamers can be pretty toxic, but I feel this post is projecting a little much. It's not like the market in general is broken. AMD is releasing great cards and Intel is catching up fast. The value proposition of GeForce cards is just not there.


How dumb can you be to short the company that has basically a monopoly over AI infrastructure? These folks are probably AI skeptics, and they are having a reality check right now.


I wouldn’t do it, but the price-to-earnings ratio is around 170 for Nvidia and 26 for Alphabet, so you could argue that the AI revolution is already priced in for a considerable time, especially if companies start to use TPUs on cloud providers or if AMD actually intends to compete in the space.
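
A rough back-of-the-envelope of what those multiples imply, taking the P/E figures above at face value (the "sane" target multiple of 17 is an illustrative assumption, not the commenter's number):

    # Earnings growth implied by a P/E re-rating at a constant share price.
    # Since P/E = price / earnings, at a fixed price:
    #   new_pe = old_pe / growth  =>  growth = old_pe / new_pe
    nvda_pe = 170
    goog_pe = 26
    sane_pe = 17  # assumed "sane" multiple, for illustration only

    print(f"{nvda_pe / goog_pe:.1f}x earnings to match Alphabet's multiple")  # ~6.5x
    print(f"{nvda_pe / sane_pe:.1f}x earnings to reach a P/E of 17")          # ~10x

That ~10x is the same figure other commenters cite below as what the current price is predicting.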


The short answer is that its P/E ratio is obscene (even more so today), and last quarter the earnings reports were kind of "doom and gloom," not just for NVDA but for AMD, DELL, HPE, etc.


NVDA is the new TSLA, regardless of how great their business is doing.


I’m out of touch on the GPU market. Is AMD/ATI competitive these days? Are there any other noteworthy players?


AMD GPUs are not as good as Nvidia's. The other issue that AMD and any other competitor has to face is that much of the new AI software uses CUDA, which as far as I know runs only on Nvidia hardware.


ATI hasn't existed as a brand since 2010. Every GPU since then has had AMD branding.

Typically, AMD GPUs have a slight edge over NVIDIA GPUs in the price/performance metric, but NVIDIA simply offers higher performance options overall.

But NVIDIA cards also absolutely demolish AMD in compute power needed for AI.


Even if the AI skeptics have good reason to be right, I can't imagine a scenario where the AI bubble collapses this quickly. If it's a bubble, it seems like the beginning of one.


If Intel can figure out GPUs (admittedly a big if), nvda is fucked. AMD less so, because they're still reliant on expensive TSMC fabs. There are tons of reasons to believe nvda will not retain its monopoly on the data center, and they need to if they're going to 10x earnings, which is what the current price is predicting.


Or they’re true AI believers, and believe AI-assisted chip design will erode NVIDIA’s edge.


Whether they are right or wrong, we will know in the next few years.


The situation with Nvidia right now is giving me flashbacks to Tesla a few years ago.


I just sold all my NVDA today. My hypothesis came true: NVDA was going to power the AI revolution and see substantial upside.

The reason I sold is that the price is just too high. In order to get to a sane P/E ratio, their profits need to grow something like 10x. I think that’s a little too optimistic.



Does anyone know how tied OpenAI is to CUDA? I would imagine it is not locked in. Seems like that would be an easily avoidable mistake.


NVIDIA/CUDA acceleration is the gold standard for AI/ML, and when there is a ton of investment money coming in (especially when far more is coming in, much faster than usual), the safe bet is to stick with the standard. Additionally, CUDA has the backing of NVIDIA, which is huge and has a ton of resources poured into this, compared to any other competitor. On a risk-adjusted basis, NVIDIA is a solid choice to standardize on and stick with.


Everyone is designing their own bespoke AI silicon now. CUDA is just a backend for things to run on top of -- like how, if a whole industry uses a certain power-tool brand, everyone in that industry keeps backup batteries for it and inertia takes hold. However, the vast majority of people working in AI couldn't write 'hello world' in CUDA. The language that makes AI and ML run is Python. If some other hardware came along that could seamlessly take over and let all the Python engines work just as fast, no one would notice. That is my perspective, anyway.
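
To make that concrete, here is a minimal PyTorch sketch (assuming PyTorch is installed; the fallback chain is illustrative). Typical model code never touches CUDA directly, so the backend choice is confined to one line:

    import torch

    # Pick whichever accelerator backend is present; the model code
    # below is identical whether it runs on CUDA, Apple's MPS, or CPU.
    if torch.cuda.is_available():
        device = torch.device("cuda")   # NVIDIA, via the CUDA backend
    elif torch.backends.mps.is_available():
        device = torch.device("mps")    # Apple Silicon
    else:
        device = torch.device("cpu")

    model = torch.nn.Linear(512, 512).to(device)
    x = torch.randn(8, 512, device=device)
    y = model(x)  # same Python regardless of backend

Whether a rival backend could actually match CUDA's performance and kernel ecosystem is, of course, the contested part.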



