
According to Observer, they are: Microsoft, Meta, Google, and Amazon.

Other big buyers are: Oracle, CoreWeave, Lambda, Tencent, Baidu, Alibaba, ByteDance, Tesla, xAI.

https://observer.com/2024/06/nvidia-largest-ai-chip-customer...




"Who could it be... Hmm... Such a tough nut to crack..."

(Even without a report on this it would be obvious)


Meta can be confirmed as one, since they've literally mentioned their infra investments and billions in capex increases through the end of 2025 in every earnings call this year.


I guess Apple is using their custom silicon?

That was a major payoff for Apple - I wonder if any of the other FAANGs will actually be able to follow suit.


Apple uses TPUs on Google Cloud Platform. https://www.cnbc.com/2024/07/29/apple-says-its-ai-models-wer...


And a weird deal with OpenAI (which I think would show up as Microsoft for the actual physical hardware). https://openai.com/index/openai-and-apple-announce-partnersh...


For training, but for inference they apparently use their own chips.


Except in cases where responses are outsourced to OpenAI. All the ChatGPT-based inference likely happens on Nvidia hardware too.


Is there evidence that Apple is training a model large enough to require a huge amount of compute?


https://arxiv.org/abs/2407.21075

AFM-server was trained on 8192 TPUv4 chips

Someone more versed can say if that is huge or not.


It is a far larger scale than most high-performance clusters offer.
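
Rough back-of-envelope, assuming Google's published figure of ~275 TFLOPS peak bf16 per TPUv4 chip (a spec number, not sustained throughput):

    # Back-of-envelope: aggregate peak compute of the AFM-server cluster.
    # Assumes ~275 TFLOPS peak bf16 per TPUv4 chip (published spec figure).
    chips = 8192
    tflops_per_chip = 275
    total_eflops = chips * tflops_per_chip / 1e6
    print(f"~{total_eflops:.2f} EFLOPS peak bf16")  # ~2.25 exaFLOPS

That bf16 peak is in the same ballpark as the headline figures of the largest public supercomputers, though those are usually quoted in fp64, so it's not a like-for-like comparison.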


At one point over 50% of their server revenue came from hyperscalers, which are exactly the same four listed.


Utterly surprising


Where is Apple in all of this?


Apple historically dislikes NVIDIA and would likely rather use their own in-house chip team. They also rely on NVIDIA indirectly by virtue of using OpenAI in the upcoming iOS release.


They don't like the high margins? :P


TBF Apple dislikes any other company they work with. They're seething with rage any time they cannot do something in-house.


I wonder if the split happened with Jobs or after Jobs? I thought Jobs was good at relationships with everyone else in Silicon Valley (Intel, ATI, NVIDIA, even Microsoft).


Apple dropped Nvidia after a few years of Nvidia falsifying thermal specifications on its GPU chips.

It drove Apple crazy, both with the high failure rate of MacBooks where the GPU was desoldering itself and the general problem of a hot-as-fuck bottom case. Nvidia also refused to pay out damages to Apple, from what I recall.


To add to that, NVIDIA tried throwing OEMs under the bus when the issues cropped up. It wasn't just Apple that was affected.

Then a few years later they made up, but NVIDIA didn't want to partner on drivers, so they had another rift.


IIRC it was with Jobs. Apple wanted to develop their own drivers for their chips from the ground up, and NVIDIA was very secretive about their tech, so things went south.


> NVIDIA was very secretive of their tech

Oh the irony for Apple to dislike others being secretive...


In their own right, they contribute more than many other companies, though. Their kernel is open source, they have given away secret sauces like Grand Central Dispatch, and they have allowed complex technologies like mDNS (Bonjour), AirPrint, and multipath networking to be implemented freely and used widely in a vendor-agnostic manner.

macOS is 1000 times better for talking to UNIX systems than Windows, and it is POSIX compliant.

Lastly, they are not hindering the development of Asahi Linux, and did nothing when their devices were reverse engineered. On the contrary, they left a couple of ways open for the Asahi guys to boot their distribution directly.

They are not a band of saints, but they are not underhanded like a couple of others.


They're shipping queries off to ChatGPT, so I guess this ends up as Nvidia cards on Azure?


Apple hates Nvidia, so they wouldn't buy directly from Nvidia.


Kind of embarrassing for Google to be on that list, no? Shouldn't their in-house TPUs be cost-advantageous for them?


No, because GPUs are not only for AI. They are MATMUL machines, and MATMUL is useful well beyond AI and tensor applications.

Some of us use them in double-precision mode.
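
A minimal sketch of what that looks like in practice (assuming PyTorch and a CUDA-capable GPU; the matrix sizes are arbitrary):

    # The same GPU matmul primitive serves both AI and HPC-style workloads;
    # only the dtype changes. Assumes PyTorch and a CUDA-capable device.
    import torch

    a = torch.randn(4096, 4096, dtype=torch.float64, device="cuda")
    b = torch.randn(4096, 4096, dtype=torch.float64, device="cuda")
    c = a @ b  # fp64 matmul on the GPU, as in scientific computing

On most consumer cards fp64 throughput is a small fraction of the low-precision rate, which is why the datacenter parts matter for this crowd.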


Yes, but demand for these chips went through the roof because of AI. If Google is on this list it's because they're using them for AI, not because they've got a secret project rendering an insane number of 3D images or something.


Everything from materials simulation to weather forecasting uses GPUs very actively and effectively, and has for a long time.

There’s a whole world using GPUs to accelerate things.


Right, I'm not arguing against that.

I'm saying that Meta and Amazon and Microsoft are all buying these chips in insane numbers for AI—their usage for all other types of GPU activity is at least an order of magnitude less. That's why Nvidia skyrocketed to become the most valuable company over just a few years.

For Google to be on that short list of whales would either mean that they for some reason have a much larger demand for GPUs for non-AI purposes than any of the others have for AI purposes (doubtful) or that they're using GPUs for AI.


Those are likely for Cloud, used by clients.


Google has addressed this. They offer GCP as an AWS alternative.

They'd rather offer their clients what they need than push them onto their own products.


I understand why the state of affairs is what it is; the point is that it's pathetic. Google, an AI hardware manufacturer[0], has to eat a direct competitor's not-insubstantial margins in order to offer their customers, external and internal, a viable product.

[0] And a supposed software powerhouse.


Microsoft has Linux on Azure.

Amazon doesn't force everyone onto Graviton cores, and offers competing cores.

It’s just the nature of business.



