
Nvidia CEO says Google is the only customer building its own silicon at scale - lawrenceyan
https://www.cnbc.com/2019/08/15/nvidia-ceo-google-is-the-only-customer-with-silicon-at-scale.html
======
jonplackett
Nvidia really lucked out with VR, crypto and AI all happening at once and all
just happening to need exactly what they make (surely they can’t have
deliberately planned for all of that?)

I wonder if consumer VR would have fared better if gamers didn’t have to
compete with miners and data centres for chips and had more reasonably priced
cards a few years back.

~~~
varelse
Sure, but the really lucky break IMO was a decade of INTC repeatedly punching
itself in the face w.r.t. manycore computation. I was at Nvidia from 2001 to
2011 (and now back) and I spent 2006-2011 as one of the very first CUDA
programmers. It was pretty obvious within a week or two that this technology
was going to be huge, and I more or less made my career with it.

But instead of stepping up to the plate and igniting a Red Queen's Race that
would have benefited everyone, INTC first tried to discredit the technology
repeatedly, then they built an absolutely dreadful series of decelerators that
demonstrated how badly they didn't understand manycore. Eventually, they gave
up, and now they're playing catch-up by buying companies that get within
striking distance of NVDA rather than building really cool technology from
within.

Now if someone threw a large pile of money at AMD again, things could get
really interesting IMO. But the piles of stupid money seem biased towards
throwing ~$5M per layer at the pets.com of AI companies these days.

~~~
jcranmer
Nvidia's coup was in getting people to switch to a different programming model
and rewrite their code to achieve the necessary performance. Intel, especially
in upper management, is stuffed with people who assumed that was impossible.
And faced with new competition from GPUs, the only acceptable response to
management was the many-x86, "you don't have to rewrite your code to get
performance" approach which didn't actually work out.

It's not that Intel doesn't have people that recognize the issues, but rather
the people who do have that foresight are drowned out by people who don't
realize the game has changed. Intel, to be fair, does have the best
autovectorizer--but designing vector code from scratch in a purpose-built
vector language is still going to produce better results, as shown when ispc
beat the vectorizer.

But Nvidia can also get drunk on its own kool-aid, just as Intel has been.
Nvidia's marketing would have you believe that switching to GPUs magically
makes you gain performance, but if your code isn't really amenable to a vector
programming style, then GPUs aren't going to speed your code up, and the shift
from CPU-based supercomputers to GPU-based supercomputers isn't going to leave
you happy. There's still room for third-way architectures, and that space is
anyone's game.
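A minimal Python sketch of the "amenable to a vector programming style" distinction above (a hypothetical illustration of my own, not code from any vendor): an elementwise kernel has independent iterations and maps naturally onto a GPU, while a loop-carried dependency serializes unless you rewrite it, e.g. as a parallel scan.

```python
def saxpy(a, x, y):
    # Elementwise: output i depends only on x[i] and y[i], so every
    # iteration can run in parallel. This is the shape GPUs are built for.
    return [a * xi + yi for xi, yi in zip(x, y)]

def prefix_sum(x):
    # Loop-carried dependency: step i needs the result of step i-1.
    # Written this way it serializes; a GPU only helps after a rewrite
    # into a parallel scan.
    out, acc = [], 0
    for xi in x:
        acc += xi
        out.append(acc)
    return out
```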

~~~
hos234
People are all full of foresight until their foresight doesn't work. Which is
most of the time when working at the cutting edge of Tech. That's when
companies crash and burn. Intel has not, despite missing the boat on big calls
some 10 to 15 times now. What does that say about Intel? People think
missing a call is a sign of bad management.

Bad management is when your company evaporates because you make one bad call.

Management gets points for surviving and then fighting back despite being
wrong. And when you look at Intel's history, there are few companies on the
planet who have managed to do that multiple times. They have a good mix of
people who know what they are doing technically AND people who do whatever it
takes to keep the company from sinking when those bad technical calls happen.

If Nvidia survives whatever their next bad call may be, expect them to start
looking more and more like Intel.

~~~
FighterMafia
"Bad management is when your company evaporates because you make one bad
call."

That's a pretty low bar!

------
verdverm
A well known Googler once told me that if you aren't building your own
silicon, you don't know security. This is one of my main reasons for being all
GCloud.

~~~
dogma1138
Or you don’t have the resources that Google has.

This statement holds water only to a handful of companies that can actually
afford to build their own silicon.

~~~
Shebanator
Poor Amazon and Microsoft, unable to afford to build their own silicon.

~~~
positr0n
Doesn't AWS make their own NICs to support the security features in their
custom network stack?

~~~
dogma1138
Do they build their own NICs for security, or because it’s the cheapest way of
segregating network traffic?

As in, relying on logical separation of client traffic rather than physical
separation, so they don’t need as many physical ports, switches and, most
importantly, network cables (often the highest actual cost in many data
centers as far as networking goes)?

~~~
Hikikomori
You can get that with VLANs.

Buying gold-plated cables?

~~~
positr0n
Not at AWS's scale. You can only have 4094 VLANs.

~~~
Hikikomori
Per switch. Much, much more if you use VXLAN. My point is that they're not
doing it for separation, as normal NICs are capable of that.
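Back-of-envelope ID-space arithmetic behind this subthread, assuming the standard 12-bit 802.1Q VLAN ID and 24-bit VXLAN VNI field widths:

```python
VLAN_ID_BITS = 12    # 802.1Q VLAN tag ID field
VXLAN_VNI_BITS = 24  # VXLAN network identifier field

usable_vlans = 2 ** VLAN_ID_BITS - 2  # IDs 0 and 4095 are reserved
vxlan_segments = 2 ** VXLAN_VNI_BITS

print(usable_vlans)    # 4094
print(vxlan_segments)  # 16777216
```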

~~~
dogma1138
Per LAN, not necessarily per switch; either way, getting past the 12-bit tag
limit takes a lot of work.

I’m not sure if anyone actually uses VXLAN yet.

You haven’t actually made a point, because you haven’t provided proof that
what Amazon did for AWS wasn’t done because of operational requirements.

------
jjtheblunt
Is Apple not counted because they're not favoring Nvidia at the moment?

~~~
dmoy
I think Apple is not counted because this article was only about datacenter
stuff, and Apple doesn't factor in there yet.

It's not to say that Apple doesn't matter to nvidia as a customer or potential
customer, just that it has nothing to do with this article because Apple
doesn't have as much datacenter footprint... yet.

But just as a point of scale, Apple spending $10B on datacenters globally over
5 years is less than Google spends in a single year in just the US.

~~~
dmoy
And on the flip side, if this was an article about consumer gpus for
computers, then Apple would probably feature in the article as a customer (or
obvious non-customer), whereas nobody would even consider mentioning Google.

------
giacaglia
For people that are interested in Nvidia and AMD's strategy, there is an
interesting podcast about it: [https://ark-invest.com/research/podcast/nvidia-
podcast-crypt...](https://ark-invest.com/research/podcast/nvidia-podcast-
crypto-implosion)

~~~
nootka
> Notably forecasted the global Crypto crash of 2k18... an opinion against the
> grain in his field.

Was his field /r/bitcoin? I'm pretty sure the entire world saw that one
coming.

~~~
penagwin
Hindsight is 20/20 and all, but I'm sure most everyone knew it would crash
(just like today's stock market); the speculation was on when.

~~~
edmundsauto
Even for someone who correctly predicted the when, how can we know if it was
luck or skill?

~~~
mrlala
>how can we know if it was luck or skill?

Easy, it's luck.

------
shmerl
Hopefully, higher competitive pressure on Nvidia, including from Intel joining
the high-end GPU market, will finally push Nvidia to upstream their Linux
driver, or at the very least unblock Nouveau so it can use GPU reclocking
properly.

Google mentioned open source drivers as one of the reasons for picking AMD
GPUs for Stadia.

~~~
sipherhex
There are many good reasons to embrace better integration with open source. I
think benefits like helping users with custom/broad distro needs and
increasing the velocity of collaboration with the ecosystem and partners
outweigh factors like 'competition.'

The cogs do turn, albeit slower than people prefer.

[https://www.phoronix.com/scan.php?page=news_item&px=NVIDIA-O...](https://www.phoronix.com/scan.php?page=news_item&px=NVIDIA-
Open-GPU-Docs)

------
qes
The only customer of NVIDIA, contracting NVIDIA to fab chips for them.

Microsoft's also been developing and using ASICs in Azure. I guess they're
just not contracting NVIDIA for any part of the process.

~~~
PeCaN
>The only customer of NVIDIA, contracting NVIDIA to fab chips for them.

Nvidia doesn't own fabs.

The article says nothing about Google contracting Nvidia to make chips for
them, only that Nvidia is not particularly concerned about competition from
Google.

------
tylerl
It'd be interesting to see where Google fits in the list of the world's
largest computer manufacturers. I suspect that if they made their numbers
public they'd be way higher up than anyone expected.

