During COMPUTEX, CEO Jensen Huang announced that major system builders such as ASRock Rack, ASUS, GIGABYTE, Ingrasys, Inventec, Pegatron, QCT, Supermicro, Wistron and Wiwynn will deliver cloud, on-premises, embedded and edge AI systems using NVIDIA GPUs and networking.
They are not moving. This title doesn't match the details of the announcement:
> Computer Industry Joins NVIDIA to Build AI Factories and Data Centers for the Next Industrial Revolution
> COMPUTEX—NVIDIA and the world’s top computer manufacturers today unveiled an array of NVIDIA Blackwell architecture-powered systems featuring Grace CPUs, NVIDIA networking and infrastructure for enterprises to build AI factories and data centers to drive the next wave of generative AI breakthroughs.
The title, "Nvidia is moving to build AI data centers" is misleading.
I read the press release, and it doesn't sound like NVIDIA is building data centers. It looks like NVIDIA is helping other companies build and design equipment for data centers. This is a very different thing.
In the end, it means Nvidia is partnering with everyone and their mother to build AI infrastructure. Nvidia doesn't care whether big CSPs (Azure, AWS, GCP), smaller CSPs (CoreWeave) or Taiwanese HW builders buy its HW.
It also implies that Nvidia can build AI factories, which also require SW, on its own. Nvidia DOESN'T need Big Tech at all for AI. Nvidia can address any enterprise in the world with AI enterprise solutions on-prem (100% Nvidia) or in the cloud (CSPs used for renting).
Big Tech is developing its own chips not because Nvidia makes large margins in that market, but because Nvidia has a platform that can totally disrupt Big Tech in AI: unlike Intel, AMD, Qualcomm, etc., Nvidia doesn't need Big Tech to run it.
But Jensen isn't stupid: today he is all friends with the large CSPs so that they build out Nvidia infrastructure. That way Nvidia systems spread everywhere, Nvidia earns tons of money and can bundle Nvidia Enterprise SW for free. If Nvidia built infrastructure to rent out itself, that would mean a lot of CapEx, so better to let the CSPs build it while you focus on mindshare and spreading Nvidia SW solutions.