
You are assuming they are paying retail price, which they certainly are not.



Wouldn't it still be $15bn? If I manage to buy $20 worth of gold for $10 through a special deal, is it not still $20 worth of gold?
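
To put numbers on the distinction being argued here, a toy sketch in Python (the figures just reuse the gold analogy; they are not Meta's actual terms):

    # Toy sketch of the two numbers at stake: what you paid (cost basis)
    # vs. what the hardware is worth at list price (market value).
    # Figures reuse the gold analogy above, not Meta's actual terms.
    units = 2                  # "ounces" of gold (or GPUs)
    list_price = 10.0          # market price per unit
    paid_per_unit = 5.0        # discounted price per unit

    market_value = units * list_price      # what it's "worth": 20.0
    budget_spent = units * paid_per_unit   # what left the bank: 10.0
    print(market_value, budget_spent)      # 20.0 10.0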


Used GPUs cannot be sold for the same price that new GPUs are bought.


Does anyone know where this hardware gets trickled down once decommissioned?


Maybe eBay? Not much good though, as Nvidia doesn't provide drivers for those to the public.


Drivers for the H100 are available right on their website


Really? I had no idea. From what I knew, they didn't. My bad then.


No, but they still got $15bn worth, regardless of discount.


you're making very good and clear points, but it's still not clear whether Zuck is referring to the budget spent or the street value received


Your comment would make sense if there weren't a 340,000 figure in your parent comment.


Not always. For high-demand products they could pay more to guarantee supply and delivery dates.

Some people will pay more to be first in line.


it's not going to be an order of magnitude difference. It's a significant investment in hardware.


Even if that were true, how much of a discount do you suppose they can get?

Given that GPU production is mostly sold out, giving Meta a bigger discount would simply mean losing money that other purchasers would pay.


Given the demand why wouldn't Nvidia be able to charge sticker price?


You can afford to take a hit to your profits when you can simply ramp up production for retail sales. It looks great for shareholders too.


They can't just ramp up production though. Isn't TSMC booked for years by them, Apple, Intel and AMD?


Nobody really knows. It certainly suits them for everyone to believe there is some secular reason, some supply crunch; it even suits AMD and Intel.

Presumably all the chip supply issues regarding autos have been resolved, and yet prices have risen 30% in a decade, and there’s no reversal.


We know OpenAI and Azure were struggling to get enough GPUs. That was implied not just by their words but also by their actions. And these two companies are the most aggressive and are making the most money out of this AI wave; if GPUs were available, they would have been able to buy them.


Volume customers always get special pricing.


What makes you think they are getting a good discount?

What are they going to do? Buy AMD, yeah right.

Nvidia's sales are only limited by the number of wafers they can get from TSMC.


>What are they going to do? Buy AMD, yeah right.

Build their own? It's what Microsoft, Google, and AWS are doing.

>Nvidia's sales are only limited by the number of wafers they can get from TSMC.

No, they're limited by the cost per operation versus Facebook building their own. The cloud providers have already decided it's cheaper to do it themselves. Sure, they'll keep buying GPUs for general public consumption, but that may eventually end too.
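
The build-vs-buy argument boils down to a cost-per-operation comparison, roughly like this back-of-the-envelope sketch (every number below is a made-up placeholder; only the shape of the calculation matters):

    # Hypothetical build-vs-buy comparison; all figures are placeholders.
    def cost_per_pflop_hour(capex, useful_years, power_kw, kwh_price, pflops):
        hours = useful_years * 365 * 24
        total_cost = capex + hours * power_kw * kwh_price  # purchase + lifetime energy
        return total_cost / (hours * pflops)

    buy_nvidia = cost_per_pflop_hour(capex=30_000, useful_years=4,
                                     power_kw=0.7, kwh_price=0.08, pflops=1.0)
    build_own  = cost_per_pflop_hour(capex=20_000, useful_years=4,
                                     power_kw=0.5, kwh_price=0.08, pflops=0.8)
    print(f"buy: ${buy_nvidia:.2f}/PFLOP-hour, build: ${build_own:.2f}/PFLOP-hour")

Whichever side of that comparison comes out cheaper at their scale is the one they'll keep funding.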


At some point Google Cloud, AWS, Alibaba Cloud, Apple, etc. are going to make their own specialized chips (Google has already tried a bit with their TPUs).

There is no value in the Nvidia part by itself; only the raw compute power is interesting.

If tomorrow it's AMD, or a chip out of China, that's perfectly fine.

I wouldn't miss the CUDA toolkit mess.


If raw power per dollar were all that's interesting, we'd all be running 7900 XTX clusters like geohot does in his tinybox.

We are not, because there's clearly value in the CUDA ecosystem.


There certainly is a lot of value in the CUDA ecosystem, today. The problem is that when all the big companies are buying up hundreds of thousands of GPUs, that doesn't leave much for anyone else.

Sane business people will look to decentralize their compute over time and not be reliant on a single provider. AMD will be able to take advantage of that and they've already stated that is their focus going forward.

ROCm/HIP are getting incrementally better, the MI300X has 192GB and its benchmarks are looking good; the only problem is that nobody has access to AMD's higher-end hardware today. That's why I'll have MI300X available for rent soon.
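
For anyone wondering why rented MI300X capacity wouldn't mean rewriting everything: PyTorch's ROCm builds expose AMD GPUs through the same torch.cuda interface, so device-agnostic code along these lines should run unchanged on either vendor (a minimal sketch; the layer sizes are placeholders):

    import torch

    # "cuda" here also covers ROCm: PyTorch's ROCm builds report AMD GPUs
    # through the same torch.cuda API, so no code changes are needed.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = torch.nn.Linear(4096, 4096).to(device)
    x = torch.randn(8, 4096, device=device)
    with torch.no_grad():
        y = model(x)
    print(y.shape, torch.cuda.get_device_name(0) if device == "cuda" else "cpu")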


That's a big issue in AMD land, imho. Anyone can pick up a $200 GPU (talking about the RTX 3050), which behaves like a scaled-down A100, and get started playing around with CUDA. You can't really do that with AMD GPUs: their cheapest officially supported GPU is the 7900 XTX, and that has a different architecture from the data center ones.
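
And "playing around with CUDA" on a cheap consumer card really can be this simple (a minimal sketch using Numba's CUDA support from Python; assumes a CUDA-capable GeForce with the Nvidia driver and CUDA toolkit installed):

    import numpy as np
    from numba import cuda

    @cuda.jit
    def add_one(x):
        i = cuda.grid(1)      # global thread index
        if i < x.size:
            x[i] += 1.0

    data = np.zeros(1024, dtype=np.float32)
    d_data = cuda.to_device(data)                  # copy to the GPU
    threads = 256
    blocks = (data.size + threads - 1) // threads
    add_one[blocks, threads](d_data)               # launch the kernel
    print(d_data.copy_to_host()[:4])               # -> [1. 1. 1. 1.]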


I agree. Maybe one idea would be to make 7900 XTXs available for rent (cheaply) too.


That's another thing. I have some stuff I'd like to try, but I can't even find places where I could quickly rent a GPU without applying for quotas.


That is indeed an issue, and I am actively working on it.


Nvidia has a vested interest in FB being beholden to their chips, so much so that it's worth giving them a discount to ensure it happens; and human nature being what it is, a face-saving discount has to be offered.


use less



