I came up with what I thought was the genius idea of photographing the monitor; the problem was that the CRT screen was so curved my models were all distorted. I ended up wheeling the whole SGI computer out into the hall and setting my camera up with a 500mm lens (borrowed from the graphics unit) at the other end of the hall (maybe 50m away). Worked great.
My first paid job - this was after the 10th grade in high school - was writing a kind of printer driver for one of these machines. One of the researchers at a local university had the same problem you did, and heard about this kid who was supposed to be good with computers :)
The SGI GL manuals (this was before it became OpenGL) included all the mathematical formulas they used to display the graphics - the rotation and perspective matrices to transform the coordinates, the vector cross products to calculate shading, and so on. I took these and implemented a subset of GL which outputted PostScript commands to a text file. This was then sent straight to a laser printer (Apple, I think). I didn't implement everything that SGI did, of course - no smooth shading and I think I could handle only the simplest types of occlusion. But it was good enough to handle the models that the researchers needed to print.
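The core of such a translator is small. As a rough illustration (not the original code; the page mapping, constants, and names here are my own), in C: apply the perspective divide from the GL manuals to each edge, then emit PostScript moveto/lineto/stroke commands:

  /* Hypothetical sketch of a GL-subset-to-PostScript translator:
   * perspective-project 3D line segments and print PostScript.
   * All constants (page center, scale) are illustrative. */
  #include <stdio.h>

  typedef struct { double x, y, z; } Vec3;

  /* Project a camera-space point onto the page; 'd' is the
   * distance from the eye to the projection plane. */
  static void project(Vec3 p, double d, double *px, double *py) {
      *px = 306.0 + 200.0 * (d * p.x / p.z);  /* letter-page center */
      *py = 396.0 + 200.0 * (d * p.y / p.z);
  }

  static void ps_line(Vec3 a, Vec3 b, double d) {
      double x0, y0, x1, y1;
      project(a, d, &x0, &y0);
      project(b, d, &x1, &y1);
      printf("%.2f %.2f moveto %.2f %.2f lineto stroke\n", x0, y0, x1, y1);
  }

  int main(void) {
      /* one edge of a wireframe cube, 5 units in front of the eye */
      Vec3 a = { -1, -1, 5 }, b = { 1, -1, 5 };
      printf("%%!PS-Adobe-2.0\n0.5 setlinewidth\n");
      ps_line(a, b, 1.0);
      printf("showpage\n");
      return 0;
  }

Rotation and shading would slot in as extra transforms before project(), exactly as the manuals' matrices describe; occlusion is indeed the hard part.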
I still remember the SGI manuals in big 3 ring binders. This is what I learned linear algebra from - thanks guys.
I don't remember which model that was but I do remember almost falling out of my chair as it was the top of the line system at the time.
I used these boxes in Hong Kong in the early 90's, then later in Japan in the late 90's for a number of projects and I was still using an Indy as my main desktop until 2001.
SGI created hardware that almost took your breath away; you knew you were seeing a future that not many people had the privilege to see in person back then. To me, having the box sitting next to me every day, with the "InfiniteReality" label on top, reminds me of those days when anything seemed possible and all of it was magical. I miss that sense of wonder and infinite possibilities...
In the last 3-5 years there has been a clear revival of the cyberpunk subculture online. Many related hobbyist websites have appeared, new cyberpunk-inspired independent art, music, and games are being made, new communities are forming, etc.
Themes include a general nostalgia for the '80s, especially vintage computers, and for the early, pre-Web-1.0 '90s.
The reason? It's easy to see: the lost future that never came...
It will come slowly at first, and then all at once.
Nobody is denying it; indeed, we see many related developments.
However, it's not only about "the lost future that never comes"; I think it's also that people are increasingly alienated by the current state of computing. Primarily, it's about the lost mentality of the future.
Cyberpunk promised a future where computing is the disruptive technology. Since 2006, the ever-increasing clock speeds have come to a halt. Since 2013, the general performance of Intel processors has remained nearly constant. PC sales keep declining. No major breakthrough in practical operating systems has been made beyond Unix (http://herpolhode.com/rob/utah2000.pdf).
A future where anything seemed possible and all of it was magical is definitely gone. But the new generation of developers, armed with decentralization, P2P, cryptography, and trustless systems, whether ultimately successful or not, would bring the Internet back to its cyberpunk ideals, revive the dream, and push history forward.
Like, can you arrange, say, ten flagship graphics cards for realtime rendering? Do we have game engines that can scale to that number?
Sidenote: I've read that John Carmack and id Software liked to develop on workstations that were "ahead of the curve" in that way. It gave them an edge: they could develop future games for hardware that didn't yet exist, knowing that consumer PCs would eventually catch up.
I think what made these SGI computers really amazing is that at the time there was no such thing as accelerated 3D graphics in the consumer market (or much real-time 3D, for that matter). They also had a cool Unix operating system with a UI that was way ahead of anything you could get on a consumer PC. I can also imagine it was a much, much more comfortable development environment than, say, MS-DOS, which didn't even have multitasking.
Add the other features, like reliability (esp. hot-swapping), serviceability, and security (Trusted IRIX), and you had some incredible machines. I always wanted inexpensive hardware with hot-swap, RAID, and something like NUMAlink connecting it. Never quite got that. One company did make a NUMA for AMD and Intel:
Unfortunately I wouldn’t say it feels like the future, more like a normal CentOS Linux desktop.
You’ll struggle to get a PC whose BIOS can handle much more than that too.
We used to build clusters for the same thing in the past; that was largely standard supercomputing stuff, but very similar to how the InfiniteReality machines were used. I believe our software once ran on Onyx machines in the dim and distant past.
So in short I wouldn’t say having loads of GPUs is enough to make it feel futuristic.
1) find some graphics problems which people say are not possible on any near-term hardware
2) study the algorithms and identify low level calculations which, if you could do orders of magnitude more of them, would allow you to solve the problem.
3) get a bunch of FPGAs and try to design a machine which can (very slowly) run that architecture
4) once you’ve got it working, slowly replace the FPGAs with ASICs
5) build a box with 16-64 of everything.
I would avoid polygons, since the current architectures are all extremely good at filling polygons. SDFs and raytracing are where you may find the “not on current gen” problems.
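To make the SDF point concrete, here is a minimal sphere-tracing sketch in C (the scene and shading are my own illustrative assumptions): each ray advances by exactly the distance the SDF reports, a per-pixel workload that doesn't map onto polygon-fill hardware at all:

  /* Minimal sphere tracer: march rays against a signed distance
   * function and write a grayscale PGM image to stdout. */
  #include <stdio.h>
  #include <math.h>

  /* signed distance to a unit sphere centered at (0, 0, 3) */
  static double sdf(double x, double y, double z) {
      z -= 3.0;
      return sqrt(x*x + y*y + z*z) - 1.0;
  }

  int main(void) {
      const int W = 320, H = 240;
      printf("P2\n%d %d\n255\n", W, H);
      for (int j = 0; j < H; j++) {
          for (int i = 0; i < W; i++) {
              /* unit ray through pixel (i, j) */
              double dx = (i - W/2) / (double)H;
              double dy = (j - H/2) / (double)H;
              double dz = 1.0;
              double n = sqrt(dx*dx + dy*dy + dz*dz);
              dx /= n; dy /= n; dz /= n;
              double t = 0.0;
              int hit = 0;
              for (int k = 0; k < 64 && t < 20.0; k++) {
                  double d = sdf(t*dx, t*dy, t*dz);
                  if (d < 1e-4) { hit = 1; break; }
                  t += d;  /* the sphere-tracing step */
              }
              printf("%d\n", hit ? (int)(255.0 / (1.0 + t)) : 0);
          }
      }
      return 0;
  }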
An easy one would be: have each GPU raytrace a (say) 320x240 scene, each offset by a fraction of a pixel or a multiple of a screen from the others, then have a final GPU stitch them together into a full-res video.
0: If you did this with 60x1080 resolution, you might be able to replace the final GPU with a dumb hardware multiplexer, though that would make compositing painful at best.
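A sketch of the two distribution schemes, assuming a hypothetical 4x4 grid of GPUs (the grid size and names are mine):

  #include <stdio.h>

  typedef struct { double ox, oy; } Offset;

  /* "multiples-of-a-screen": GPU g renders one 320x240 tile of a
   * 1280x960 frame; the stitcher copies each tile into place. */
  static Offset tile_offset(int g) {
      Offset o = { (g % 4) * 320.0, (g / 4) * 240.0 };
      return o;
  }

  /* "fractions-of-a-pixel": every GPU renders the same 320x240
   * view, jittered by sub-pixel amounts; the stitcher interleaves
   * the 16 sample grids into one 1280x960 frame. */
  static Offset jitter_offset(int g) {
      Offset o = { (g % 4) / 4.0, (g / 4) / 4.0 };
      return o;
  }

  int main(void) {
      for (int g = 0; g < 16; g++)
          printf("GPU %2d: tile (%4.0f,%4.0f)  jitter (%.2f,%.2f)\n",
                 g, tile_offset(g).ox, tile_offset(g).oy,
                 jitter_offset(g).ox, jitter_offset(g).oy);
      return 0;
  }

The jitter variant is the more interesting one: the stitching GPU does no rendering, just a fixed interleave, which is why a dumb multiplexer almost suffices.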
We had hardware that would merge DVI from up to 8 GPUs, in separate nodes, and produce a single image.
It shows up to the host computer as one really big GPU. Of course, you're going to get worse performance than just a single Titan V because it can handle any game already and there's inevitably going to be latency added by doing work over NVLink/NVSwitch. Those massive GPU products are targeted toward offline rendering or machine learning applications, not so much realtime simulation.
The most fun I had with the machine was playing GLQuake at 1280x1024 on a T1 line and bragging about it in chat, where everyone thought I was lying. Of course, a year later the 3DFX Voodoo 1 would come out and deliver approximately 1/4 the fill rate for 1/1000 the price.
Besides that, it was a fine machine. Lots of cores that I didn’t know how to utilize. Occasional bugs in the OpenGL implementation that might have been me accidentally triggering edge cases. Command line compiles, gdb, lots of core dumps.
SGI really lost its way after this. Years later they would be showing at SIGGRAPH with multi-wall-sized displays, but without the programmable shading that would be all the rage not just right then, but for the next two decades... They went so far as to publish papers showing that with hundreds of passes and temporary buffers using fixed-function shading, you could eventually get mostly the same results as the new-fangled programmable BS...
The brilliant engineers of SGI soon moved on to become founding members of Nvidia, ATI, Imagination and other graphics tech companies.
It’s called “the University of Silicon Graphics.”
You are probably misremembering things. GLQuake came out in 1997, after 3DFX Voodoo in 1996, not the other way around.
I remember how convinced some people were that VRML was going to be the future of the Web. 3D graphics in your browser, all coded with a simple markup language! How could it fail?
Ultimately it was just too complicated for the average web surfer to navigate, especially in the '90s. For a while, though, it really looked like the visions of virtual worlds that cyberpunk fiction loved might show up in your Netscape browser.
It was exactly that.
I participated in an SGI demo around 1995. It included a technology demo and a grand vision from some guy whose name I don't remember.
They had a few of these bad boys (the bigger, refrigerator-sized models) and a very fast internet connection, running the Netscape browser. They wanted us to start developing for this new thing called Java that would let code run everywhere.
They demoed what they thought the internet would look like in 3-5 years:
* 3d browsers with VRML. HTML is replaced with 3d virtual rooms with 3d surround audio.
* you can attach pieces of Java code to the VRML and freely interact with objects
* people would buy stuff on the internet using micropayments, possibly with fancy new digital currencies like DigiCash (this was still hand-waving). There was also this thing called E-gold a few years later.
* the demo included a virtual kitchen design app, virtual rooms with water fountains and Java-controlled robots, and fish that responded to their environment.
It was all mindblowing technologically, but none of the ideas were new to me because I had been reading Snow Crash (1992) and others. Jaron Lanier and VPL Research were also known to work with these things.
In fact, it looks like we keep trying to bring the same vision alive. Every wave of attempts gets us a little bit closer but always falls a little short.
I did believe that this was fairly game-changing tech. I showed my 3D models sticking 2m out from a rear-projected Reality Wall powered by an IR engine in the Detroit office.
But SGI was beset with inane leadership, and we began our terminal decline mere weeks after I was hired ABD from grad school. I finished up my thesis and developed autoinstaller code for IRIX at the same time, writing the manuals in LaTeX.
There are some modern VRML-esque things now; A-Frame and ReactVR come to mind.
Thanks for the link to A-Frame though! This looks great - I am loving the declarative nature of it, and the scene-inspector seems like a really useful tool.
I don't want to have to navigate a space in order to find something. Navigating space is a limitation of physical objects; replicating it in an online environment is an error. It's like if automobiles were designed so that steering and speed were controlled with reins.
We have tons of really popular game titles, and millions of people playing or watching or chatting as tournaments unfold. We fill convention centers with people playing games, and stream webcam feeds of them playing. But it's all still games: World of Warcraft, Fortnite, Minecraft, EVE Online.
So, what if it wasn't a game? What if it was more than video conferencing? What if there was more going on, and an immersive system of interaction closer to a first-person game engine was as relevant, in social terms, as Facebook, Twitter, Bitcoin, Uber, Amazon, Google, and GitHub, along with email, chat, and file sharing, all as a single service?
What if all your home automation and mobile devices were integrated? What if it gave your GPS location to the world, and authenticated you selectively as context sensitive pseudonyms, or conversely your authentic legal identity, securely and as preferred?
How much of that is possible? And how much of it do we actually want? But most of all, if corporations collaborating with governments, or an open community of enthusiasts, put this in your hands, would you trust it? Would you put all your eggs in one basket like that?
How many IP addresses would a system like this require, to keep each context secure from another, to preserve pseudonyms, and not leak GPS location information? What about background audio, and geolocation by timing attacks? Plus the basics of platform security, knowing what we see every day in terms of zero days getting dropped, and leaving people out in the open?
It's cool to think about, but our technology as it exists, is a drafty outhouse, compared to the brick shithouse we really need.
When I was much younger, my dad (who has a print business) was tasked with printing panels for an F-16 fighter cockpit, to be used in a simulator of the aircraft. The simulator was developed by a private company. One Saturday he took me to their office. They had one of these and gave me a demo of it. Until then, all I knew were standard consumer computers (Commodore, PC, Mac, etc.). In another pretty big room they had a lot of Amiga computers used to make the 3D models.
I then had a go at the simulator: a 180-degree screen, a real cockpit, and pretty amazing 3D graphics for the time. Flew my jet through a mountain tunnel.
I think the only thing that impressed me as much as the simulator itself was the SGI. I had never experienced that kind of futuristic computing power.
Flashback scene ends...
 http://www.system16.com/hardware.php?id=832 - Namco Magic Edge Simulator Hardware at System-16, The Arcade Museum
 https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/199400... - Vertical Motion Simulator Familiarization Guide (pdf) (1993) - NASA Vertical Motion Simulator
One day, the din of the machinery paled and withered under a thunderous earth-shaking draconian roar - whiplashing my head out of the rack toward the windows - an F/A-18 tilted sideways at the edge of ground effect ripping a minimum radius turn back to from whence it came ... without ever leaving the airfield. Sigh. There is no simulator for That.
I was really impressed by how responsive the file manager was with regard to the immediate appearance of icons for files created from the terminal. It also kept track of which folders were open in other windows, giving them a different appearance; it felt like one was interacting with actual objects.
Tried numerous times to get the same feel on Linux; there used to be a project called 5dwm to recreate the Indigo Magic Desktop experience, I should check if it is still around for additional kick of nostalgia...
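As far as I know, IRIX delivered those updates through SGI's FAM (File Alteration Monitor) daemon; on Linux the closest kernel primitive is inotify. A minimal sketch of the mechanism, error handling omitted:

  /* Watch the current directory and report files the moment they
   * appear or disappear, the way a file manager updates its icons. */
  #include <stdio.h>
  #include <sys/inotify.h>
  #include <unistd.h>

  int main(void) {
      char buf[4096]
          __attribute__((aligned(__alignof__(struct inotify_event))));
      int fd = inotify_init1(0);
      inotify_add_watch(fd, ".", IN_CREATE | IN_DELETE);
      for (;;) {
          ssize_t len = read(fd, buf, sizeof buf);  /* blocks for events */
          for (char *p = buf; p < buf + len; ) {
              const struct inotify_event *ev =
                  (const struct inotify_event *)p;
              if (ev->len)
                  printf("%s %s\n",
                         (ev->mask & IN_CREATE) ? "created" : "deleted",
                         ev->name);
              p += sizeof *ev + ev->len;
          }
      }
  }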
Personally I find that learning about these machines is a good way to better understand computer architecture, because each one was, during its reign, some really smart bunch of people's idea of what was state of the art. You can see the evolution of ideas that way.
So the gap between high-power and low-power components was way, way smaller back then, and I'd wager that a huge chunk of power in these SGI machines went into their high-speed bus systems.
Another big difference is how the power system was laid out. Here are the specs of an SGI Onyx2 PSU:
1750 W @ 230 V
3.45 V, 375 A
5 V, 85 A
12 V, 22 A
5 V, 1 A (probably standby power?)
Nowadays, a computer PSU provides essentially two voltages: 12 V and 5 V standby. It still provides "legacy" rails like 5 V and 3.3 V, but in modern designs these are derived with simple buck converters from the 12 V rail(s). Because a modern computer has only a few components requiring a lot of power, each of them gets its own local converter fed from 12 V to reduce losses.
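Multiplying those rails out shows how lopsided the old design was; the 3.45 V rail alone carries about 1.3 kW. A quick back-of-the-envelope check:

  /* Per-rail power of the Onyx2 PSU, from the figures quoted above. */
  #include <stdio.h>

  int main(void) {
      const char *rail[] = { "3.45 V", "5 V", "12 V", "5 V standby" };
      const double volts[] = { 3.45, 5.0, 12.0, 5.0 };
      const double amps[]  = { 375, 85, 22, 1 };
      double total = 0;
      for (int i = 0; i < 4; i++) {
          double w = volts[i] * amps[i];
          total += w;
          printf("%-14s %7.0f W\n", rail[i], w);
      }
      /* ~1988 W if every rail peaked at once, vs. the 1750 W rating;
       * as usual for PSUs, the rails can't all max out together. */
      printf("%-14s %7.0f W\n", "sum of maxima", total);
      return 0;
  }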
 https://web.archive.org/web/20170314114558/http://sgistuff.n...
 https://web.archive.org/web/20170314160011/http://sgistuff.n...
Not one for nostalgia, but I have found myself cooing over old computers the way most react to babies. A BBC B can bring a tear to the eye. If you are in the UK then http://www.tnmoc.org/ is well worth a visit. It's like a theme park for nerds.
Working on one of those noisy brutes must have been harsh. It brings back flashbacks of years spent in server rooms. Kids with their cloud computing don't know what they missed!
I remember using an Archimedes after years of fighting to get my games to run well on a BBC Micro. Wow - it felt like cheating!
CRTs need a high refresh rate because they flash the picture content at you. LCDs do not flash; they maintain a steady picture.
Having this rapid refresh helped give persistence of vision for each eye; if you had subdivided 72 Hz into 36 Hz per eye, you would have had very annoying strobe effects visible to most people.
No application was rendering at 144 Hz. They might have been lucky to sustain 20-30 Hz, but the framebuffer held both a left and a right image, and the video output switched back and forth between them on each frame scan. Complex visualizations often dropped to 12 Hz or so, and that was considered acceptable, as people would still perceive motion, as in an animated cartoon.
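That "framebuffer holds both eyes" arrangement survives today as quad-buffered stereo in OpenGL. A minimal sketch of the per-frame pattern, using GLUT for window setup (GLUT_STEREO needs driver support, and the eye separation here is a made-up constant):

  /* Quad-buffered stereo: draw each eye into its own back buffer;
   * scan-out alternates eyes regardless of the app's render rate. */
  #include <GL/glut.h>

  static void draw_eye(double eye_offset) {
      glLoadIdentity();
      glTranslated(eye_offset, 0.0, -5.0);  /* crude eye separation */
      glutWireTeapot(1.0);
  }

  static void display(void) {
      glDrawBuffer(GL_BACK_LEFT);
      glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
      draw_eye(-0.03);
      glDrawBuffer(GL_BACK_RIGHT);
      glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
      draw_eye(+0.03);
      glutSwapBuffers();  /* the output flips eyes on every scan */
  }

  int main(int argc, char **argv) {
      glutInit(&argc, argv);
      glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
      glutCreateWindow("stereo sketch");
      glMatrixMode(GL_MODELVIEW);
      glutDisplayFunc(display);
      glutMainLoop();
      return 0;
  }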
In my lab, we only had one of the SGI Indy machines - it was fun surfing the web on the original NCSA Mosaic browser on that pizza box machine.
My father was generous enough to give me an O2 in the late 90s so I could learn 3D graphics, since I was interested in making special effects for movies. We didn’t have any software then but found this thing called Blender which was available for free and worked on IRIX. Still have a bunch of blender books and manuals from back then. Good times!
The machines most certainly stood out in a time where the only options you got for an exterior finish were beige and beige.
Years later I interviewed at Google. One of my interviewers turned out to be one of Jim Clark's original grad students. That was pretty crazy.
Every employee had an Indy or Indigo2 in their office and most (us included) had one at home too. Very few of the machines at the office had passwords. I never really got very far past playing with the 3D tech demos or surfing the web on Mosaic and Netscape, but it was a pretty formative part of my career in technology.
Is it an SGI thing or was it just sitting in a box as a spare all this time?
Just five years later, the 3dfx voodoo2 did most of what that machine did (without the accumulation buffer).
The "lab master" bribed us with pizza to help him clear the storage room. Quickly found out how heavy these things were and what a raw deal we got.
Not sure if that was the case, or if it is just an urban legend?
I must have torn it apart and put it back together about ten times, studying the hardware, and must have installed IRIX 6.5.30 on it just as many.
The price is no wonder considering the amount of silicon it contains.
What I do wonder is what the performance of the same amount of silicon would be today.
Well, aren't there several comparable models from IBM and co?
I used the machine as an undergrad, doing some coding as part of the university's driving simulator project.
 Pacific Gas and Electric, a California utility company
What a tremendous waste of money.
I am Dodoid, the creator of this video. Created an account here to ask you this. Where did I make that mistake? I'm fully aware that there was a world-wide-web in 1993, and if I made the mistake of saying that there was not, I have a post-release correction system and can add a correction to the video.
Anyway, I hope things have improved.
Thanks for the interest,