
I would not bet against Nvidia right now.

Yes, H100s are getting cheaper, but I can see the cheap prices drawing in a wave of fine-tuning interest, which will result in more GPU demand for both training and inference. Then there’s the ever-growing need for bigger data centers for foundation-model training, which the article describes as completely separate from the public auction prices of H100s.

I don’t think the world has more GPU compute than it knows what to do with. I think it’s still the opposite: we don’t have enough compute, and when we do, that will simply drive another cycle of GPU demand.




I don’t think I’m betting against Nvidia. I’m betting against Nvidia being worth $3.3 trillion.


Still a bad bet. Their moat is deeper now than it was in 2022. The engineers you would need to poach are all paid well over $1 million per year now. The number of people worldwide capable of writing quality CUDA code to optimize transformer language models is likely fewer than 10,000, and I’m being very generous. Nvidia holds a significant portion of that group, and some of the others you’ll never find on the market at all, since they hide behind Discord profile pictures and mental illness.


The only reason that is hard is that CUDA is general-purpose. You don’t need a graphics + GPGPU platform to run transformers; their forward pass reduces to a small set of dense linear-algebra primitives, as the sketch below shows. Nvidia is eating its own moat.
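
To make that concrete, here is a minimal sketch of a decoder block in plain NumPy (single head; layer norm and causal masking omitted; all weight names are made up for illustration). The entire forward pass is a handful of matmuls, one softmax, and one elementwise nonlinearity, none of which requires a general-purpose graphics stack:

    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax over the given axis.
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def decoder_block(x, Wq, Wk, Wv, Wo, W1, W2):
        # Self-attention: three projection matmuls, one softmax,
        # then two more matmuls (attention-weighted sum + output projection).
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        scores = (q @ k.T) / np.sqrt(q.shape[-1])
        x = x + softmax(scores) @ v @ Wo
        # Feed-forward: two matmuls around an elementwise ReLU.
        x = x + np.maximum(x @ W1, 0.0) @ W2
        return x

    # Usage: 8 tokens, model width 64, FFN width 256.
    d, f = 64, 256
    rng = np.random.default_rng(0)
    x = rng.standard_normal((8, d))
    Ws = [rng.standard_normal(s) * 0.1 for s in
          [(d, d), (d, d), (d, d), (d, d), (d, f), (f, d)]]
    y = decoder_block(x, *Ws)  # shape (8, 64)

That narrow op set is exactly why specialized inference silicon and compilers can target transformers without reimplementing CUDA’s generality.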



