
SGI's $250,000 Graphics Supercomputer from 1993 – Onyx RealityEngine² [video] - bane
https://www.youtube.com/watch?v=Bo3lUw9GUJA
======
danieltillett
I had the fun of using this system when I was doing my undergraduate thesis. I
had done some protein modelling work and ran straight into the problem that
there was no way to get the beautiful models into my thesis - we had no
printer.

I came up with what I thought was the genius idea of photographing the
monitor; the problem was that the CRT screen was so curved my models all
came out distorted. I ended up wheeling the whole SGI computer out into the
hall and setting my camera up with a 500mm lens (borrowed from the graphics
unit) at the other end of the hall (maybe 50m away). Worked great.

~~~
yaakov34
These early SGI machines did not have good support for printing. Even if you
did have a printer, the only way to print was to take a screenshot, which had
much lower resolution than laser printers and did not look good.

My first paid job - this was after the 10th grade in high school - was writing
a kind of printer driver for one of these machines. One of the researchers at
a local university had the same problem you did, and heard about this kid who
was supposed to be good with computers :)

The SGI GL manuals (this was before it became OpenGL) included all the
mathematical formulas they used to display the graphics - the rotation and
perspective matrices to transform the coordinates, the vector cross products
to calculate shading, and so on. I took these and implemented a subset of GL
which outputted PostScript commands to a text file. This was then sent
straight to a laser printer (Apple, I think). I didn't implement everything
that SGI did, of course - no smooth shading and I think I could handle only
the simplest types of occlusion. But it was good enough to handle the models
that the researchers needed to print.

I still remember the SGI manuals in big 3 ring binders. This is what I learned
linear algebra from - thanks guys.

~~~
timthorn
I spent a week at SGI UK in 1994. Everyone had at least an Indy workstation as
their machine, even for secretarial purposes. But still there were a couple of
486 PCs acting as print servers.

~~~
sureaboutthis
I worked at a remote office for SGI in 1992. I called my boss at the home
office complaining I didn't have a computer to test software with. He
apologized for having dropped the ball and immediately ordered a 16 processor
system for me.

I don't remember which model that was, but I do remember almost falling out
of my chair, as it was the top-of-the-line system at the time.

------
deerpig
I have a slightly newer Infinite Reality R10000 sitting beside my desk here in
Phnom Penh. A professor in Berkeley gave it to us and somehow we managed to
get it to this side of the world. It uses enough power to run an apartment
block and yes, it is noisy. I will eventually gut it and install a six-node
Linux cluster inside.

I used these boxes in Hong Kong in the early 90's, then later in Japan in the
late 90's for a number of projects and I was still using an Indy as my main
desktop until 2001.

SGI created hardware that almost took your breath away; you knew you were
seeing a future that not many people had the privilege to see in person
back then. To me, having the box sitting next to me every day, with the
"Infinite Reality" label on top, reminds me of those days when anything
seemed possible and all of it was magical. I miss that sense of wonder and
infinite possibilities...

~~~
AriaMinaei
I wonder if it's still possible to put together hardware that's a few years
ahead of current high-end products.

Like, can you arrange, say, ten flagship graphics cards for realtime rendering?
Do we have game engines that can scale to that number?

~~~
erikpukinskis
I would approach it like this:

1) find some graphics problems which people say are not possible on any near-
term hardware

2) study the algorithms and identify low level calculations which, if you
could do orders of magnitude more of them, would allow you to solve the
problem.

3) get a bunch of FPGAs and try to design a machine which can (very slowly)
run that architecture

4) once you’ve got it working, slowly replace the FPGAs with ASICs

5) build a box with 16-64 of everything.

I would avoid polygons, since the current architectures are all extremely good
at filling polygons. SDFs and raytracing are where you may find the “not on
current gen” problems.
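
For concreteness, the inner loop of an SDF renderer is tiny; a minimal
sphere-tracing sketch in C (the sdf here is just a unit sphere, a stand-in
for a real composed scene):

    /* Minimal sphere tracing over a signed distance field (sketch).
       sdf() is a placeholder scene: a unit sphere at the origin. */
    #include <math.h>

    static float sdf(float x, float y, float z)
    {
        return sqrtf(x*x + y*y + z*z) - 1.0f;
    }

    /* March from origin o along unit direction d; return the hit
       distance, or -1.0f on a miss. */
    float sphere_trace(const float o[3], const float d[3])
    {
        float t = 0.0f;
        for (int i = 0; i < 128; i++) {
            float dist = sdf(o[0] + t*d[0], o[1] + t*d[1], o[2] + t*d[2]);
            if (dist < 1e-4f)
                return t;       /* close enough: call it a hit */
            t += dist;          /* safe step: nothing is nearer than dist */
            if (t > 100.0f)
                break;          /* left the scene bounds */
        }
        return -1.0f;
    }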

~~~
a1369209993
> SDFs and raytracing

An easy one would be: have each GPU raytrace a (say) 320x240 scene, each
offset by fractions-of-a-pixel[0] or multiples-of-a-screen from each other,
then have a final GPU stitch them together into a full-res video.

0: If you do this with 60x1080 resolution, you might be able to replace the
final GPU with a dumb hardware multiplexer, though that would make
compositing painful at best.
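
For the fractions-of-a-pixel variant, the final stitch is just an
interleave; a sketch under assumed conventions (k*k row-major 8-bit
buffers, tile (i,j) rendered with a (i/k, j/k)-pixel offset):

    /* Interleave k*k sub-pixel-offset low-res buffers into one
       full-res image (grayscale for brevity; per-channel is the same). */
    static void stitch(unsigned char *full, int fw, int fh,
                       unsigned char *const tiles[], int k)
    {
        int lw = fw / k;                /* low-res tile width */
        for (int y = 0; y < fh; y++)
            for (int x = 0; x < fw; x++) {
                /* the tile whose offset matches (x mod k, y mod k)
                   rendered the sample that lands on full-res (x, y) */
                const unsigned char *t = tiles[(y % k) * k + (x % k)];
                full[y * fw + x] = t[(y / k) * lw + (x / k)];
            }
    }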

~~~
jamesfmilne
That’s literally what we used to do for our cluster colour grading systems for
films.

We had hardware that would merge DVI from up to 8 GPUs in separate nodes
and produce a single image.

------
corysama
I was very fortunate. This is the machine on which I originally learned
OpenGL. It was an undergrad lab job where I was paid $5/hour to eventually
write
[http://paulbourke.net/geometry/polygonise/marchingsource.cpp](http://paulbourke.net/geometry/polygonise/marchingsource.cpp).
But in order to do that, I first had to learn how to 3D at all.

The most fun I had with the machine was playing GLQuake at 1280x1024 on a T1
line and bragging about it in chat, where everyone thought I was lying. Of
course, a year later the 3dfx Voodoo 1 would come out and deliver
approximately 1/4 the fill rate for 1/1000 the price.

Besides that, it was a fine machine. Lots of cores that I didn’t know how to
utilize. Occasional bugs in the OpenGL implementation that might have been me
accidentally triggering edge cases. Command line compiles, gdb, lots of core
dumps.

SGI really lost its way after this. Years later they would be showing at
SIGGRAPH with multi-wall-sized displays, but without the programmable shading
that would be all the rage not just right then, but for the next two
decades... They went so far as to publish papers showing that with hundreds of
passes and temporary buffers using fixed-function shading, you could
eventually get mostly the same results as the new-fangled programmable BS...

The brilliant engineers of SGI soon moved on to become founding members of
Nvidia, ATI, Imagination and other graphics tech companies.

~~~
sureaboutthis
If you mean in 1993, you mean GL, not OpenGL, I think. I worked at SGI in
1992.

~~~
analbeads01
OpenGL was introduced in '92 and was basically IRIS GL with bits removed.
Anyway, by the time Quake was released in 1996, OpenGL was very much a thing.

------
jaysonelliot
The VRML demo at 18 minutes in really took me back:
[https://youtu.be/Bo3lUw9GUJA?t=18m](https://youtu.be/Bo3lUw9GUJA?t=18m)

I remember how convinced some people were that VRML was going to be the future
of the Web. 3D graphics in your browser, all coded with a simple markup
language! How could it fail?

Ultimately it was just too complicated for the average web surfer to navigate,
especially in the '90s. For a while, though, it really looked like the visions
of virtual worlds that cyberpunk fiction loved might show up in your Netscape
browser.

~~~
twic
VRML! I wrote a modest amount of VRML by hand in the early '00s. One
project was a 3D model of a castle, which I worked on in large part because
it was something I could do together with my then-girlfriend: she would
describe how the castle should look, and I would code it up. I learned a
lot about how to use parametric prototypes, so I had a kit of parts I could
deploy rapidly to speed up the feedback loop.

~~~
hpcjoe
I wrote a code called genvrml which converted molecular models from any
format (using Babel at the time for conversion) to VRML for display. I
converted a bunch of my models with it. I still have them here. And if I
dig around, I might even have genvrml as well.

I did believe that this was fairly game-changing tech. I showed my 3D
models sticking 2m out from a rear-projected Reality Wall powered by an IR
engine in the Detroit office.

But SGI was beset with inane leadership, and we began our terminal decline
mere weeks after I was hired ABD from grad school. I finished up my thesis
and developed an autoinstaller for IRIX at the same time, writing the
manuals in LaTeX.

------
harel
Flashback scene begins.

When I was much younger, my dad (who has a print business) was tasked with
printing panels for an F-16 fighter cockpit, to be used in a simulator of
the aircraft. The simulator was developed by a private company. One
Saturday he took me to their office. They had one of these and gave me a
demo of it. Until then, all I knew were standard consumer computers
(Commodore, PC, Mac, etc.). In another pretty big room they had a lot of
Amiga computers used to make the 3D models. I then had a go at the
simulator: 180-degree screen, a real cockpit, and pretty amazing 3D
graphics for the time. Flew my jet through a mountain tunnel. I think the
only thing that impressed me as much as the simulator itself was the SGI.
I had never experienced that kind of futuristic computing power.

Flashback scene ends...

~~~
sureaboutthis
In 1992 I was SGI's systems engineer for McDonnell-Douglas in St. Louis,
mainly for their flight simulators. If you think what you saw was something ...

~~~
harel
I'm sure my early-teen eyes saw only a glimpse. The one I was shown was
designed for the Israeli air force. This was the closest I got to sitting
in an F-16 cockpit, and the closest I got to supercomputing power that even
films couldn't show me at the time. I do, however, remember them touting
the $250K price tag.

------
slobotron
Always loved the IRIX Motif and 4dwm aesthetics, and was lucky to have had
access to some Indigo machines in school.

I was really impressed by how responsive the file manager was, e.g. the
immediate appearance of icons for files created from the terminal. It also
kept track of which folders were open in other windows, giving them a
different appearance; it felt like one was interacting with actual objects.

I've tried numerous times to get the same feel on Linux; there used to be a
project called 5dwm to recreate the Indigo Magic Desktop experience. I
should check if it is still around for an additional kick of nostalgia...

~~~
Fnoord
I don't know about the SGI Indigo, but the SGI Indy had relatively low latency
[1]

[1] [https://danluu.com/input-lag/](https://danluu.com/input-lag/)

------
ChuckMcM
Fun times. You could have been the envy of all your engineer buddies with
one of those bad boys all to yourself. What a difference 25 years makes.

Personally, I find that learning about these machines is a good way to
better understand computer architecture, because each one was, during its
reign, some really smart bunch of people's idea of what was state of the
art. You can see the evolution of ideas that way.

~~~
blattimwind
What I think is quite interesting about this era of machines is where the
power goes. First off, there aren't actually many heatsinks in the machine,
and those few that are used are really tiny. This means that the power used
must be distributed rather evenly among all components, even on the CPU
boards. The CPUs can pull at most 10-15 W with those heatsinks, while the
big ASICs spread around everywhere can probably sink perhaps 3-5 W each
with no extra heatsink. Compare this to modern systems, where the
(single-chip) chipset has a TDP of ~5 W, while the CPU(s) run anywhere from
50 to 100+ W.

So the gap between high-power and low-power components was way, way smaller
back then, and I'd wager that a huge chunk of the power in these SGI
machines went into their high-speed bus systems.

Another big difference is how the power system was laid out. Here are the
specs of an SGI Onyx2 PSU [1]:

    1750 W @ 230 V
    3.45 V, 375 A
    5 V, 85 A
    12 V, 22 A
    5 V, 1 A (probably standby power?)

In that day and age, most of everything in there would've run off 3.3 V. So
the PSU _directly_, centrally provides that voltage, and due to the large
currents, droop and the like have to be considered carefully in the entire
system design (it almost certainly uses remote sensing at the backplane,
perhaps even sampling various points in the system). This makes sense,
because there are _many_ components and the difference in power between
them is kinda small.
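
Back-of-the-envelope, the per-rail maxima work out to (simple arithmetic
from the figures above):

    3.45 V x 375 A ~= 1294 W
       5 V x  85 A  =  425 W
      12 V x  22 A  =  264 W

So the 3.45 V rail alone accounts for roughly three quarters of the 1750 W
rating (the rails can't all be loaded to their maxima at once, which is why
the per-rail figures sum past the total).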

Nowadays, a computer PSU provides essentially two voltages: 12 V and 5 V
standby. It still provides "legacy" power, like the 5 V and 3.3 V rails,
but in modern designs these are derived with simple buck converters from
the 12 V rail(s). Because a modern computer has few components requiring a
lot of power, each of them gets its own local converter fed from 12 V to
reduce losses.

[1]
[https://web.archive.org/web/20170314114558/http://sgistuff.n...](https://web.archive.org/web/20170314114558/http://sgistuff.net/hardware/systems/images/onyx2-psufront-2200.jpg)
[https://web.archive.org/web/20170314160011/http://sgistuff.n...](https://web.archive.org/web/20170314160011/http://sgistuff.net/hardware/systems/images/onyx2-psuback-2200.jpg)

~~~
ebikelaw
When Intel released the Pentium with a mandatory active heatsink, the
industry magazines called it a big scandal. Now, of course, we have smaller
CPUs drawing 150 W or more.

------
duncanawoods
Great video!

I'm not one for nostalgia, but I have found myself cooing over old
computers the way most people react to babies. A BBC B can bring a tear to
the eye. If you are in the UK then [http://www.tnmoc.org/](http://www.tnmoc.org/)
is well worth a visit. It's like a theme park for nerds.

Working at one of those noisy brutes must have been harsh; it brings back
flashbacks of years spent in server rooms. Kids with their cloud computing
don't know what they missed!

~~~
mothsonasloth
I'm a member of TNMOC and go maybe 2-3 times a year. The Acorn machines'
RISC OS always fascinates me.

~~~
duncanawoods
Excellent idea, definitely something I want to support. I have just joined
too.

I remember using an Archimedes after years of fighting to get my games to run
well on a BBC Micro. Wow - it felt like cheating!

------
gdubs
When I was about 13 years old I had my mom call up SGI to ask if they could
send us any information. This was the time of dialup internet / AOL, and info
came to our house primarily via snail-mail. A few weeks later a thick package
came in the mail, full of beautifully printed brochures and booklets on their
machines of the day; Indigo, Indy, Onyx... I treasured those pamphlets, and
dreamed of maybe one day getting to actually use an SGI.

------
mark_l_watson
I used two of these in the 1990s: one for Nintendo video game development
(thanks Nintendo!) and one for a Virtual Reality system for Disney (thanks
Disney!). Awesome tools, but the functionality is now more or less commodity
gaming PCs.

------
martyvis
We started using SGI Personal IRIS systems, which I think used 33 MHz MIPS
R3000 processors, in the late 80s. I think they might have been able to
address a heady 32 MB of RAM. We were developing a graphical operations
control system at the steelworks where I worked as an engineer. I remember
them costing something like $25-30K each, so to equip enough developers we
had to add serial terminals and X terminals. It was actually this aspect
that got me involved in finding open-source X Window System solutions that
allowed us to repurpose standard Intel PCs as terminals. Fun times. But the
SGIs were ahead of everything else out there. While we didn't have a real
work need for their 3D capability, it did let you see what the future could
hold.

------
dev_dull
The scanlines on the CRT monitor reminded me how good the refresh rates
were on that hardware back then. How did we move backwards so spectacularly
with LCD panels? It's only now that reasonably priced 144 Hz monitors are
hitting the market.

~~~
rasz
What good would a high-refresh LCD have done you in 1995-2005? We went from
no acceleration at all to barely being able to run modern games at 40 fps
(CoD2 on a 7800 GTX in 2005).

CRTs need high refresh rates because they flash the picture content at you.
LCDs do not flash; they maintain a steady picture.

~~~
saltcured
The main reason for 144 Hz refresh back then was not to tame normal CRT
flicker. It was to get 72 Hz refresh of a stereoscopic pair. You had to use
really fast phosphors to make this work and not have severe blurring/ghosting
of the left and right images. Monitors intended for non-stereoscopic use could
have slower phosphors which helped reduce the flickering effect at lower
refresh rates.

This rapid refresh helped preserve persistence of vision for each eye; if
you had instead subdivided 72 Hz into 36 Hz per eye, you would have gotten
very annoying strobe effects, visible to most people.

No application was rendering at 144 Hz. They might be lucky to sustain
20-30 Hz, but the framebuffer held both a left and a right image, and the
video output switched back and forth between them on each frame scan.
Complex visualizations often got down to 12 Hz or so, and that was
considered acceptable, as people would still perceive motion, as in an
animated cartoon.
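
In later OpenGL terms this is quad-buffered stereo: the application redraws
two back buffers at its own pace while scan-out alternates left/right
fields at 144 Hz. A minimal sketch (the scene and swap helpers are
hypothetical; a stereo-capable visual is assumed):

    /* Quad-buffered stereo: draw each eye into its own back buffer. */
    #include <GL/gl.h>

    enum eye { LEFT_EYE, RIGHT_EYE };
    extern void render_scene_from(enum eye e); /* hypothetical scene code */
    extern void swap_buffers(void);            /* platform-specific (GLX etc.) */

    void draw_stereo_frame(void)
    {
        glDrawBuffer(GL_BACK_LEFT);            /* left-eye back buffer */
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        render_scene_from(LEFT_EYE);

        glDrawBuffer(GL_BACK_RIGHT);           /* right-eye back buffer */
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        render_scene_from(RIGHT_EYE);

        /* After the swap, video scan-out keeps alternating left/right
           fields at 144 Hz (72 Hz per eye) no matter how slowly the
           application re-renders. */
        swap_buffers();
    }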

------
localhost
We had one of these in Bryan Jones' laboratory in the Chemistry Department
at the University of Toronto. We also had the 3D polarized shutter-based
glasses; it was cool to see small molecules docking into the active sites
of enzymes right in front of your face. Of course, none of that stuff
amounted to anything, and in hindsight it was a gigantic waste of money.

In my lab, we only had one of the SGI Indy machines - it was fun surfing the
web on the original NCSA Mosaic browser on that pizza box machine.

------
mstade
Man, this brings back memories. I still have an O2 and an Octane system in
storage. Been meaning to take them out and set them up again at some point.

My father was generous enough to give me an O2 in the late 90s so I could
learn 3D graphics, since I was interested in making special effects for
movies. We didn’t have any software then but found this thing called Blender
which was available for free and worked on IRIX. Still have a bunch of blender
books and manuals from back then. Good times!

~~~
XnoiVeX
:-) I have a fully working O2 setup somewhere in the attic. Could not get
myself to get rid of it. Also have an Indy that won't boot up.

------
Keyframe
I had the pleasure of working with various SGI machines back then (video
and animation). Being around an Onyx was always a special feeling, even
though I preferred the look of the older Crimson chassis. SGI even made a
poster calendar, set something like 40-odd years into the future, taking
advantage of the periodicity of calendar years. Nice gimmick, nice poster,
which I sadly lost.

------
PeterStuer
At one point our lab bought an O2 and an Indigo. Mostly these were fashion
statements, as we didn't do much graphics work. I can't remember them being
remarkable performance-wise for normal workloads compared to the Sun
machines that were our staple.

The machines most certainly stood out in a time when the only options you
got for an exterior finish were beige and beige.

~~~
gcb0
SGI was the preferred way for departments to blow their yearly/quarterly
budget so it wouldn't be cut next year.

------
blincoln
In the mid-90s, a friend of mine described an SGI demo where the player flew a
speederbike through a forest. Apparently it was quite impressive for the time.
Every time I come across footage of old SGI software, I hope it's in there,
but it never is. Has anyone run across a video of this?

------
a-dub
I had an Indy on my desk when I was in my first paid job in high school. I
loved that thing. We also had a Challenge named challenger. I blew it up once.

Years later I interviewed at Google. One of my interviewers turned out to be
one of Jim Clark's original grad students. That was pretty crazy.

------
russellbeattie
Years ago, Weird Stuff Warehouse (R.I.P.) had a shell of the larger closet-
sized Onyx system sitting in the back, and I always thought it would be fun to
do something with it, but couldn't think of anything. With a shell of a Cray,
you could at least make a couch out of it...

~~~
jsjohnst
It’s possible that was my old machine. I donated a lot of old SGI hardware to
Weird Stuff in 2008-2009 before moving from SV up to SF.

------
mnutt
My father worked in sales for SGI through most of the 90s; what started as
him bringing me to the office a few times on weekends turned into me asking
to go to the office most weekends. ("Don't you have any work you need to
finish up?")

Every employee had an Indy or Indigo2 in their office and most (us
included) had one at home too. Very few of the machines at the office had
passwords. I never really got very far past playing with the 3D tech demos
or surfing the web on Mosaic and Netscape, but it was a pretty formative
part of my career in technology.

------
some1else
Could anyone use an SGI Indigo2 Impact (the purple one)? I've got one in
Slovenia, ready to part with it for 100€ + shipping. My mother threw away
the screen, keyboard, and mouse (thinking I wouldn't need them), so it's
just sitting around. The box is still fully operational, with Photoshop for
IRIX installed.

------
wslh
The main goal of going to exhibitions like COMDEX [1] was to play a little
with these computers. Those were strong feelings!

[1]
[https://en.m.wikipedia.org/wiki/COMDEX](https://en.m.wikipedia.org/wiki/COMDEX)

------
deng
Oh, who could forget the "not PS/2" connectors. I also vividly remember the
sysadmin telling me to never, ever disconnect the keyboard while the
machine was running, or something inside might blow. Not sure if he was
just paranoid, but I never tried either... I also remember one German shop
buying a slightly used Onyx for half a million Deutsche Mark, then
realizing they did not have the key for turning it on, so they had to short
the thing. And it was so loud you had to wear earplugs...

~~~
sp332
Early Macs didn't have hot-pluggable keyboards and mice either. It was one of
those things where it would usually work, until one time some fluctuation
would just brick the controller chip.

------
redm
Interestingly, my Indy IRIX "learning" workstation, circa 2001, is selling
for more today on eBay than I bought it for back then. I guess it really
has become a collector's item.

------
uptown
I used these in the computer lab at RPI back in the 90s to do 3D animation.
It’s crazy to think how much smaller and faster today’s tech is compared to
these behemoths.

~~~
Annatar
And yet, this smaller tech comes directly from SGI: lots of former SGI
hardware engineers ended up at NVIDIA.

------
herf
CMU's graphics lab had a Crimson when I was there as an undergrad - many hours
late at night! These models had an accumulation buffer (a way to
combine/average screenspace image buffers), which was fun for all kinds of
tricks, like motion blur. At the time, it felt amazing to get to use all that
fill rate.
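
That buffer survives in OpenGL as glAccum; a minimal motion-blur sketch
(the scene helper is hypothetical), averaging n sub-frames rendered at
slightly different times:

    /* Accumulation-buffer motion blur: average n sub-frames spread
       across a shutter interval starting at time t. */
    #include <GL/gl.h>

    extern void render_scene_at(double t);  /* hypothetical scene code */

    void render_motion_blurred(double t, double shutter, int n)
    {
        glClear(GL_ACCUM_BUFFER_BIT);
        for (int i = 0; i < n; i++) {
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            render_scene_at(t + shutter * i / n);
            glAccum(GL_ACCUM, 1.0f / n);     /* add 1/n of this frame  */
        }
        glAccum(GL_RETURN, 1.0f);            /* write the average back */
    }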

Just five years later, the 3dfx Voodoo2 did most of what that machine did
(without the accumulation buffer).

------
jacebot
When I was younger I was really into CGI. I remember thinking, man, if I
ever get to work on one of those to do some 3D rendering, that will be
something.

------
Annatar
I had the server version of this system as my private server in my basement
at home; the electricity bill was peppery.

I must have torn it apart and put it back together about ten times,
studying the hardware, and must have installed IRIX 6.5.30 on it just as
many times.

------
eveningcoffee
It is about $425,000 in inflation-adjusted dollars. If we adjust it to
real-estate prices, then even more.

This price is no wonder considering the amount of silicon it contains.

What I do wonder is what would be the performance of the same amount of the
silicon today.

~~~
coldtea
> _What I do wonder is what would be the performance of the same amount of the
> silicon today._

Well, aren't there several comparable models from IBM and co?

------
shimfish
I have a vivid memory of having to carry one of these up a flight of stairs.

I used the machine as an undergrad, doing some coding as part of the
university's driving simulator project.

------
kilroy123
So what would be the modern-day equivalent to this machine?

~~~
rasz
The CPU and 3D capabilities were at the level of a late-1999/early-2000
gamer PC.

------
akhilcacharya
I wonder how usable this would be with a modern browser.

------
dis-sys
I tried to search online to see whether DOOM has been ported to Onyx RE2, but
couldn't find any video proof. That is really strange.

~~~
saltcured
I remember seeing Quake ported to the CAVE environment and demonstrated on
an Onyx (well, actually an Origin with reality engines) with a
3-wall-and-floor projection room...

------
rwmj
Is the backplane 32 bit VMEbus with an additional central connector? [Edit: 32
bit, not 64 bit]

------
kyberias
How can we reach this guy? His site currently doesn't work and has a security
problem.

~~~
wincy
I think he mentions in the video that his site is down and how to get ahold of
him but I don’t care to watch it again to verify.

~~~
jsjohnst
iris.cc is the website mentioned.

------
pcurve
Can you imagine companies spending an insane amount of money on these, only
to have them become obsolete in 3 years?

~~~
rasz
No, because it didn't happen; commodity hardware got there ~7 years later.
You were buying a window into the future, allowing your researchers to do
things mere mortals could only read about in science fiction books.

------
browsercoin
I'm shocked by how crazy the graphics were back then... I mean, seeing the
metal logo, I couldn't quite believe this was possible in 1993. I still
have the CGW magazine from this era.

------
liftbigweights
We had these and a few Sun systems in our computer lab storage room. We
called them the dinosaurs. It's amazing how things changed from 1993 to the
mid 2000s: from technology rock stars to useless junk in a little more than
a decade.

The "lab master" bribed us with pizza to help him clear the storage room.
We quickly found out how heavy these things were and what a raw deal we had
gotten.

~~~
wowtip
I heard somewhere that SGI put a big chunk of iron/lead inside the case to
make the machines heavier and add to the impression of expense when
handling them. Not sure if that was true, or if it's just an urban legend.

~~~
qubex
They used cast iron extensively.

------
qaq
Could be converted to a cool mini-fridge

~~~
octorian
Let's please stop doing that to interesting vintage computers. Well, at least
to working (or fixable) ones. I cringe every time someone shows off that sort
of project.

~~~
qaq
While I was not suggesting doing that to this particular model, I would
imagine a good chunk of them end up in a landfill.

------
corerius
We had a maxed out one of those in the company lab. I seem to remember that it
was on the loud side.

What a tremendous waste of money.

------
iamgopal
The true victory, and an underappreciated skill, of first-world countries
is their ability to profit at scale from massively overpriced machines.

------
justinator
Why is the dude wearing untied board shorts and no shoes? Is that his
signature thing?

~~~
pvg
It's traditional in some parts of Canadia.

[https://www.youtube.com/watch?v=6vgoEhsJORU](https://www.youtube.com/watch?v=6vgoEhsJORU)

------
mattl
This looks good. A previous video on this channel said there was no WWW in
1993 and I stopped following things. Have things improved?

~~~
dodoid
Hi there!

I am Dodoid, the creator of this video. I created an account here to ask
you this: where did I make that mistake? I'm fully aware that there was a
World Wide Web in 1993, and if I made the mistake of saying there was not,
I have a post-release correction system and can add a correction to the
video.

Anyway, I hope things have improved.

Thanks for the interest, -Dodoid

~~~
axilmar
Thanks for the awesome video!!!!

