
Chip Hall of Fame: Nvidia NV20 - rbanffy
https://spectrum.ieee.org/tech-history/silicon-revolution/chip-hall-of-fame-nvidia-nv20
======
jacquesm
Nvidia is one heck of a pivot. I remember buying the most powerful card I
could find in 2006 or so to simulate my windmill blades before cutting them.
The voxel array was so large that doing it all in software was next to
impossible, but with the Nvidia card it was fast enough to allow real-time
simulation with 1 mm^3 voxels. Those runs shaved months off our schedule and
saved half a forest of trees, sparing us runs that we would otherwise have had
to abort and that would have ended up as unusable scrap.

Today all the focus is on machine learning and mining crypto, but there are
some excellent use cases for these cards that have nothing to do with either.

Another one, much closer to playing games, is 'serious gaming': simulating
real-world events to train first responders, giving them access to scenarios
that would be far too expensive, disruptive, or complex to set up in real life.

~~~
ibeckermayer
Nice, encouraging to hear that massively parallelized operations are being
used for more than trendy buzzwords (ML, crypto) and gaming. Engineering
simulations are less sexy than "AI" but can hugely increase efficiency.

~~~
m_mueller
You may also like the fact that Switzerland has been running its national
weather prediction service on GPUs for a few years now (the first, and AFAIK
still the only, national weather service to do so).

------
pjmlp
Not a single word about Renderman?!!

Before the GeForce 3 was a thing, those of us with access to NeXT machines
already had a glimpse of what it would mean to write shaders.

[https://en.wikipedia.org/wiki/Pixar_RenderMan](https://en.wikipedia.org/wiki/Pixar_RenderMan)

[https://en.wikipedia.org/wiki/RenderMan_Shading_Language](https://en.wikipedia.org/wiki/RenderMan_Shading_Language)

"The RenderMan companion : a programmer's guide to realistic computer
graphics.", 1990

I am also missing 3DFx's work on shading languages before they got acquired by
NVidia.

~~~
IrishJourno
I'm the editor of the Chip Hall of Fame: if you follow the link in the
citation where Pixar/Toy Story is mentioned, it takes you to a whole Spectrum
feature about the development of Renderman :) But this article was intended to
be tightly focused on the creation of the NV20, and how it enabled real time
shaders, so I didn't have room to do more than mention Pixar in passing!

~~~
pjmlp
Fair enough, thanks for clarifying it.

The linked story is actually quite interesting.

------
charleyma
Nvidia is my favorite example of "the next big thing that started out as a
toy", à la A16Z's Chris Dixon
([http://cdixon.org/2010/01/03/the-next-big-thing-will-start-o...](http://cdixon.org/2010/01/03/the-next-big-thing-will-start-out-looking-like-a-toy)).

It literally started off as a device for playing games, and its use cases
have expanded significantly since then.

------
gattr
Let's not forget that before fully-programmable shaders in NV20, NVidia had
register combiners [0] in GeForce 256 (NV10), allowing more flexible texture
application & blending compared to the older (sequential) pipeline.

[0]
[http://www.nvidia.in/object/registercombiners.html](http://www.nvidia.in/object/registercombiners.html)
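
For anyone who never used them: the register combiners were exposed through
the GL_NV_register_combiners OpenGL extension. Below is a rough, from-memory C
sketch of wiring up a single general combiner so that the fragment color
becomes primary_color * texture0; the entry points and enums are recalled from
old code rather than checked against the spec, so treat it as an illustration,
not a reference.

    /* Sketch (from memory) of GL_NV_register_combiners setup:
       one general combiner computes spare0 = primary_color * texture0,
       the final combiner then passes spare0 through as the fragment color. */
    glEnable(GL_REGISTER_COMBINERS_NV);
    glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

    /* General combiner 0, RGB portion: sum = A*B + C*D.
       A = primary (vertex) color, B = texture unit 0, C = D = 0. */
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                      GL_PRIMARY_COLOR_NV, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                      GL_TEXTURE0_ARB, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_C_NV,
                      GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_D_NV,
                      GL_ZERO, GL_UNSIGNED_IDENTITY_NV, GL_RGB);

    /* Route the sum into the spare0 register, discard the partial products. */
    glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                       GL_DISCARD_NV, GL_DISCARD_NV, GL_SPARE0_NV,
                       GL_NONE, GL_NONE, GL_FALSE, GL_FALSE, GL_FALSE);

    /* Final combiner computes A*B + (1-A)*C + D; with B = 1 (inverted zero)
       and C = D = 0 it simply outputs spare0. */
    glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO,
                           GL_UNSIGNED_INVERT_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_ZERO,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);

Everything is still a fixed A*B + C*D structure that you configure rather than
program, which is exactly the kind of hardwired arrangement the NV20's
programmable shaders did away with.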

------
sien
It's disappointing that the write-up doesn't include something on Molnar's
work on PixelFlow at UNC:

[https://dl.acm.org/citation.cfm?id=133994.134067](https://dl.acm.org/citation.cfm?id=133994.134067)

~~~
IrishJourno
I'm the Chip Hall of Fame's editor: the citations are intended to be brief and
really tightly focused introductions to the development and legacy of a
particular integrated circuit, so I'm afraid we just didn't have the scope to
include Molnar's other work!

~~~
sien
Fair enough.

The thing is that PixelFlow was directly relevant to the NV20; it's pretty
much the commercialisation of what was developed there.

But kudos to you, it's a really good write up generally.

~~~
IrishJourno
Thanks!

------
k__
I remember the GeForce3 as the point where GPUs got much more expensive again.

With the GeForce2 there were some budget versions; I had the MX.

Not so with the 3: there was only one model, and none of my friends had
enough money for it.

~~~
smcl
Ha, I just noticed the significance of this timing. As a teenager I could just
about afford a GF2 MX when it came out, but never ended up being able to
afford any of the subsequent couple of generations (the GF4 MX was barely a
step up). I lost interest in PC gaming shortly after and never really
returned, so even when I could afford a decent graphics card later I never got
one.

~~~
k__
Luckily this was also the time when people played the same games for years.

I played HL&Mods until 2004 or something. So I didn't really "need" a better
GPU.

------
agumonkey
Always interesting to think about how Nvidia struggled through its first two
generations of graphics chips, the NV1 and NV2 (a partnership with SEGA), but
made its first big dent in the market with the NV3.

~~~
bigger_cheese
I can remember, growing up, my Dad purchased a new Pentium 133; this must have
been around 1995/1996. It was an absolutely cutting-edge machine at the time
and had an "S3 Trio" PCI card in it. I have vague memories of playing Quake on
it and being blown away by the graphics. A few years later, 3dfx Glide cards
were the cutting edge for gaming.

I started university around that time and stopped following the
gaming/computer hardware scene, but it really felt to me like Nvidia came from
absolutely nowhere.

~~~
carroccio
3dfx cards with the VGA passthrough were the real life-changers, thanks to
advanced texture rendering. Still, polygons were not anti-aliased. It was a
real paradigm change, and 3dfx captured a majority (>80%) of the market. If
you remember the game Turok with and without a Voodoo, you lived those days :)

~~~
MrBuddyCasino
That was a huge improvement, unmatched by anything that came after. Maybe the
move to multi-core CPUs comes close.

------
snvzz
Fair. But keep in mind NVIDIA's a scummy company.

[https://www.youtube.com/watch?v=H0L3OTZ13Os](https://www.youtube.com/watch?v=H0L3OTZ13Os)

