
The history of the modern graphics processor - kaptain
http://www.techspot.com/article/650-history-of-the-gpu/
======
rayiner
> The evolution of the modern graphics processor begins with the introduction
> of the first 3D add-in cards in 1995, followed by the widespread adoption of
> the 32-bit operating systems and the affordable personal computer.

What? No.

The evolution of the modern graphics processor begins with the development of
commercial 3D systems in the 1980's. Jim Clark founded SGI in 1982 based on
his work into hardware acceleration of geometry computations for 3D at
Stanford. By the mid-1980's, SGI workstations were able to handle 3D modeling
and animation locally:
[http://en.wikipedia.org/wiki/Silicon_Graphics#IRIS_2000_and_...](http://en.wikipedia.org/wiki/Silicon_Graphics#IRIS_2000_and_3000_series).

You can find some of Clark's papers on the hardware here:
[http://www.computer.org/csdl/mags/co/1980/07/01653711.pdf](http://www.computer.org/csdl/mags/co/1980/07/01653711.pdf)
(1980);
[http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.359....](http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.359.8519)
(1982). I can't find a good reference describing the rasterization side of the
pipeline, though the second paper describes it very briefly on page 132.

There's a really neat website going into details of early 3D consumer chips:
[http://vintage3d.org](http://vintage3d.org). It's very interesting, though
again it should be noted that these products came out in the 1990's, well
after hardware acceleration of 3D was well-established in the workstation
market.

~~~
greggman
I could be wrong but didn't Evans and Sutherland have 3D accelerated graphics
in the 70s for military flight simulators? Or is my memory bad?

~~~
rayiner
My first inclination was to start with E&S, but my understanding is that their
3D technology in the 1970's was based on vector displays:
[http://archive.computerhistory.org/resources/text/Evans_Suth...](http://archive.computerhistory.org/resources/text/Evans_Sutherland/EvansSutherland.3D.1974.102646288.pdf).
That's clearly hardware-accelerated 3D, but not quite how a modern GPU works,
because it doesn't involve rasterizing triangles into a framebuffer. Of
course, this was before my time, so I'd be happy to be corrected...

------
adwf
It's an interesting topic in that I remember just how much of a revolution the
original Voodoo card was to gaming (for me anyway). These new cards didn't
just enable slightly prettier graphics like a new card would today, they
enabled entire new _genres_ of game. If anything, the art quality actually
took a step back for a while (some of those sprites were gorgeous), but the
new 3D effects more than made up for it.

I imagine products like the Oculus Rift will hopefully have the same effect on
gaming. New ideas that we could barely imagine before will now be within
reach.

~~~
dualogy
> they enabled entire new genres of game

Such as? There had been great, enjoyable DOS-based 3D games for years, ever
since Wolfenstein, so I'm curious which 'new genres' you're thinking of that
the introduction of a 3D graphics processor enabled.. ;)

~~~
penguindev
Right. Even Unreal had software rendering (and looked damn good - water, sky,
fog).

------
jasode
A nice complementary text (programmer's perspective) to the history of the
hardware is the epic stackexchange answer:
[http://programmers.stackexchange.com/a/88055](http://programmers.stackexchange.com/a/88055)

~~~
ARothfusz
Very nice. Though I wonder why the author thought we (3Dfx) were "strongly
against OpenGL"? One of our founders, Gary Tarolli, worked on Iris GL[1][2].
We weren't against it, but our pixel pipeline was in the wrong order to
implement it precisely. Plus there was the GL tradition of handling things in
software that your hardware didn't implement, and we hated the "slow path".
Ok, maybe we were against it a bit :-) It's all coming back to me now.

[1] [http://tech-insider.org/unix/research/1990/1211.html](http://tech-insider.org/unix/research/1990/1211.html)
[2] [http://techpubs.sgi.com/library/manuals/1000/007-1210-060/pd...](http://techpubs.sgi.com/library/manuals/1000/007-1210-060/pdf/007-1210-060.pdf)

------
mpweiher
More wrong: the 6845 was not used in the Apple II.

~~~
joezydeco
Correct. It was used in some of the 80-column addon cards, but Woz drove the
video signal directly from a sea of TTL gates. It was pretty amazing stuff for
the time.

------
Lennu
It always amazes me how bad the graphics from the 90s and early 2000s look
now, even though at the time everybody thought they were great and amazing (at
least I did).

We probably think today's graphics look good too, but after another 15 years
of development they won't seem so good anymore.. AMD/ATI and NVIDIA have done
great work over the years.

~~~
bluedino
Typically the things that previously made graphics 'good' were simply resolution
and color depth. Early 8-bit home computers were often limited to just 1 or 2
colors per sprite. Machines such as the NES increased this to 3 or 4 colors
per sprite, which makes a dramatic difference.

Along came something like the SNES, which, although it shared roughly the same
resolution as the NES (256x224 or something along those lines), had access
to 32,000 colors instead of the 64 available to the NES, and it could use up
to 16 per sprite. That's the main reason why SNES screenshots look so much
better than NES screenshots.
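As a rough sketch of the arithmetic behind those numbers (assuming the
commonly cited specs: the SNES uses 15-bit RGB, which is where the ~32,000
figure comes from):

```python
# Rough palette arithmetic for the consoles mentioned above.
# Assumes the commonly cited spec: SNES color is 15-bit RGB,
# i.e. 5 bits per channel.

def palette_size(bits_per_channel: int, channels: int = 3) -> int:
    """Distinct colors an n-bits-per-channel RGB format can encode."""
    return 2 ** (bits_per_channel * channels)

snes_colors = palette_size(5)  # 5 bits each for R, G, B
print(snes_colors)             # 32768 -- the "32,000 colors" above

# A 4-bits-per-pixel sprite format indexes 2**4 = 16 palette entries
# (in practice one entry is usually transparency, hence "16 per sprite").
print(2 ** 4)                  # 16
```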

Once 3D came along, graphics looking 'good' became more complicated than just
resolution and color depth. The original Playstation suffered from texture
maps that showed artifacts, became distorted, and had edges that appeared to
'jump' around.
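For the curious, that texture distortion comes from affine texture mapping:
the PS1 interpolated texture coordinates linearly in screen space rather than
doing the perspective-correct divide. A minimal sketch of the difference,
with made-up depth values purely for illustration:

```python
# Sketch: why affine (screen-space linear) texture mapping distorts.
# We interpolate a texture coordinate u along a screen-space edge whose
# endpoints lie at different depths z.

def affine_u(u0: float, u1: float, t: float) -> float:
    # Linear in screen space -- what the PS1 hardware did.
    return u0 + (u1 - u0) * t

def perspective_u(u0: float, z0: float, u1: float, z1: float, t: float) -> float:
    # Perspective-correct: interpolate u/z and 1/z linearly in screen
    # space, then recover u by dividing.
    u_over_z = (u0 / z0) + ((u1 / z1) - (u0 / z0)) * t
    one_over_z = (1 / z0) + ((1 / z1) - (1 / z0)) * t
    return u_over_z / one_over_z

# Edge from a near vertex (z=1) to a far vertex (z=10), u running 0 -> 1.
t = 0.5
print(affine_u(0.0, 1.0, t))                  # 0.5: texture sampled halfway
print(perspective_u(0.0, 1.0, 1.0, 10.0, t))  # ~0.091: far half compressed
```

The gap between the two values is the visible "swimming": as a triangle
rotates, the affine result stays pinned to screen space while the correct
result shifts with depth.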

As the hardware became faster and fixed those issues, more techniques were
added for lighting, shading, and a hundred other enhancements, which got us to
where we are today.

------
stox
Some interesting history here in a discussion of 3DFX:
[https://www.youtube.com/watch?v=3MghYhf-GhU](https://www.youtube.com/watch?v=3MghYhf-GhU)

~~~
purplequark
If you go to
[http://www.computerhistory.org/collections/oralhistories/](http://www.computerhistory.org/collections/oralhistories/)
, and click on the Videos tab, you can view it with a sync'd/searchable
transcript. It's a beta feature right now, and I know they'd appreciate
feedback.

Incidentally, while only a few of the oral histories have videos up, there are
500 or so transcripts up, well worth a perusal.

------
blt
Ha ha, look at the capacitors soldered across the through-hole ICs on that
first ATI board from 1987. Only took a couple years to get to sophisticated
SMT boards.

------
aw3c2
An article like that should really cite it's sources.

~~~
Svenstaro
its

