Fall of Voodoo (tedium.co)
128 points by discreditable 7 months ago | 105 comments



The 3dfx timeline gives you an idea of how much the computer field has slowed down and consolidated. Voodoo was released in 1996, into a market with half a dozen competitors. Voodoo 2 in 1998, the same year as the first viable NVIDIA card (Riva TNT). NVIDIA released the GeForce 256 in 1999, cementing its dominance. 3dfx released its last card in 2000, just four years after its first. NVIDIA is still dominant today, 18 years after the GeForce established that dominance.

Think back on how much consumer computing changed from 1996 to 2000. Four years back from the present, in contrast, gets you the Haswell-based MacBook Pro and iPhone 5s (a perfectly fine setup today).


Most of the other companies are still there though. PowerVR still makes 3D chips, but only for the embedded market (e.g. Android phones). Matrox gave up on 3D and now only makes specialized cards for 8 and 16+ monitor displays (airports, kiosks). IIRC, even Trident switched to embedded graphics.

The thing that bothers me is that if you want a PCI-E card on a consumer system, you only have nVidia/AMD. Intel only does embedded now, with none of their offerings on a card (like the old i740). There is no Ryzen 7 with an APU, so if you're building a small developer box, you have to buy a discrete graphics card and can't rely on embedded (unless you want to drop to the Ryzen 5, which was just released with an APU).

There are rumors that Intel might be getting back into this space. If anyone could try to compete with the big two, it's probably them.


I'm not sure what rumors you've heard but what's not a rumor is that Intel is releasing CPUs with integrated AMD GPUs: https://www.anandtech.com/show/12207/intel-with-radeon-rx-ve...

That suggests Intel isn't going to be competitive in the graphics space in the near future.


I believe you might be thinking of Intel hiring Raja Koduri this past November. It seems Raja Koduri will be heading up new products in the graphics arena, but that is at least two or three years into the future. https://www.pcper.com/reviews/Processors/Intel-Kaby-Lake-G-L...


Not on-die integrated though.

Intel seems to have made or licensed a high-speed, short-distance data bus that sits between the Intel CPU and the AMD GPU, which are mounted side by side on a small board.

Would be hilarious btw if this showed up on desktops as a dual socket motherboard, with one socket for the CPU and another for the GPU. Likely with one massive cooler covering them both.


iPhones also had a PowerVR GPU as part of the A-series SoCs. In the iPhone 8/X (A11) it was replaced with Apple's own GPU design.


That design isn't custom but appears to use PowerVR-licensed IP, given the way some press releases were worded. I find it highly unlikely Apple could design a market-leading GPU compatible with their existing products in such a short time. Their CPU took a lot longer. My guess is the A11 is a tweak of the PVR design and subsequent versions will introduce a ground-up redesign.

PVR owns a lot of TBDR patent IP, since they mostly went it alone on tile-based deferred renderers when most of their competitors were IMRs. It looked like a smart decision as desktop GPUs were hitting a memory bandwidth constraint, until GDDR and ultra-wide buses started being used and GPUs got better at early-z rejection.

But on mobile, TBDR works great since mobile has different power and heat constraints than desktop.
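For anyone who hasn't run into the distinction: roughly speaking, a TBDR first bins geometry into screen tiles, resolves depth for each tile in small on-chip buffers, shades only the fragments that survive, and writes each finished tile out once, while an IMR touches the external depth/color buffers for every fragment it draws. Below is a minimal software sketch of that idea, purely illustrative; the names TILE, Tri, Bin, bin_tris and render_tile are made up, and real hardware is vastly more involved.

    /* Purely illustrative sketch of tile-based deferred rendering in
     * software -- not PowerVR's (or anyone's) actual hardware pipeline. */
    #include <stddef.h>
    #include <stdint.h>

    #define TILE 32
    #define MAX_PER_TILE 256

    typedef struct { float x[3], y[3], z[3]; uint32_t color; } Tri;
    typedef struct { const Tri *t[MAX_PER_TILE]; int n; } Bin;

    static float edgefn(float ax, float ay, float bx, float by, float px, float py)
    {
        return (px - ax) * (by - ay) - (py - ay) * (bx - ax);
    }

    /* Pass 1: bin each triangle into every tile its bounding box touches. */
    void bin_tris(const Tri *tris, int ntris, Bin *bins, int tiles_x, int tiles_y)
    {
        for (int i = 0; i < ntris; i++) {
            const Tri *t = &tris[i];
            float minx = t->x[0], maxx = minx, miny = t->y[0], maxy = miny;
            for (int v = 1; v < 3; v++) {
                if (t->x[v] < minx) minx = t->x[v];
                if (t->x[v] > maxx) maxx = t->x[v];
                if (t->y[v] < miny) miny = t->y[v];
                if (t->y[v] > maxy) maxy = t->y[v];
            }
            int tx0 = (int)minx / TILE, tx1 = (int)maxx / TILE;
            int ty0 = (int)miny / TILE, ty1 = (int)maxy / TILE;
            if (tx0 < 0) tx0 = 0;
            if (ty0 < 0) ty0 = 0;
            if (tx1 >= tiles_x) tx1 = tiles_x - 1;
            if (ty1 >= tiles_y) ty1 = tiles_y - 1;
            for (int ty = ty0; ty <= ty1; ty++)
                for (int tx = tx0; tx <= tx1; tx++) {
                    Bin *b = &bins[ty * tiles_x + tx];
                    if (b->n < MAX_PER_TILE)
                        b->t[b->n++] = t;
                }
        }
    }

    /* Pass 2: resolve one tile entirely in local buffers, then flush once. */
    void render_tile(const Bin *bin, int tx, int ty, uint32_t *fb, int fb_w)
    {
        float depth[TILE][TILE];
        const Tri *vis[TILE][TILE];   /* front-most triangle per pixel */

        for (int y = 0; y < TILE; y++)
            for (int x = 0; x < TILE; x++) {
                depth[y][x] = 1e30f;
                vis[y][x] = NULL;
            }

        /* (a) depth-only pass: visibility is settled before any shading. */
        for (int i = 0; i < bin->n; i++) {
            const Tri *t = bin->t[i];
            for (int y = 0; y < TILE; y++)
                for (int x = 0; x < TILE; x++) {
                    float px = tx * TILE + x + 0.5f, py = ty * TILE + y + 0.5f;
                    float w0 = edgefn(t->x[1], t->y[1], t->x[2], t->y[2], px, py);
                    float w1 = edgefn(t->x[2], t->y[2], t->x[0], t->y[0], px, py);
                    float w2 = edgefn(t->x[0], t->y[0], t->x[1], t->y[1], px, py);
                    float area = w0 + w1 + w2;
                    if (area == 0.0f)
                        continue;
                    if ((w0 < 0 || w1 < 0 || w2 < 0) && (w0 > 0 || w1 > 0 || w2 > 0))
                        continue;     /* mixed signs: outside the triangle */
                    float z = (w0 * t->z[0] + w1 * t->z[1] + w2 * t->z[2]) / area;
                    if (z < depth[y][x]) {
                        depth[y][x] = z;
                        vis[y][x] = t;
                    }
                }
        }

        /* (b) "shade" only the survivors, write the tile out in one go. */
        for (int y = 0; y < TILE; y++)
            for (int x = 0; x < TILE; x++)
                fb[(ty * TILE + y) * fb_w + (tx * TILE + x)] =
                    vis[y][x] ? vis[y][x]->color : 0;
    }

The bandwidth win is in render_tile: the external framebuffer gets written once per pixel per tile, and the per-tile depth buffer never has to leave the chip at all.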


> I find it highly unlikely Apple could design a market-leading GPU compatible with their existing products in such a short time.

Nobody noticed, but Apple started working on their own GPU all the way back with the iPhone 5s (and its A7 SoC).

It wasn't a completely custom design, but it wasn't an off-the-shelf PowerVR either. Until that point, Apple always disclosed the exact model of GPU they were using in their SoC; then suddenly their documentation started referring to "the Apple A7 GPU" and "the Apple A8 GPU". Wikipedia does give a model number for the GPU, but if you trace it back you find an AnandTech article going through the logic of speculating that it must be a PowerVR 6 series. And all roads lead back to that speculation.

For later SoCs, AnandTech actually invented a new six-core PowerVR variant, because PowerVR wasn't listing one, yet die shots of the new iPad showed a six-core GPU.

With the release of the iPhone 5s, Apple also switched from the PowerVR-supplied driver to their own written in-house.

It appears that Apple incrementally swapped out parts of the PowerVR design until they arrived at the A11 SoC GPU, which they declared as their own, with none of the original PowerVR parts remaining. Though the result appears to be very close to the original design. (I suspect PowerVR accidentally gave Apple a perpetual patent license at some point before realizing what Apple was up to.)

They didn't redesign the GPU overnight, it was designed and released incrementally over many years under our very noses. Nobody noticed.


Funny thing is that my first 3D accelerator was a PowerVR, and unlike the 3Dfx solution it used the PCI bus to shuffle the finished frame to the video card for output.


In 1998 3dfx was incredibly dominant in the market. I don't think that after the release of Voodoo 2 many people would have bet on another company being the market leader. Riva TNT was ok, but it wasn't a breakthrough for Nvidia.


TNT was the first viable Voodoo 2 competitor, though it didn't quite catch up. The year after, NVIDIA released GeForce, which was indisputably better. 3dfx never recovered after that.


3dfx also misread the trends. I remember something about them staying on 16 bit graphics for way too long.


Yeah, I was a pretty serious PC gamer (Quake/Q2/Q3) back during the rise and fall of 3dfx. Pretty much everyone I knew went from software rendering to 3dfx-chipset cards but then jumped ship when the Riva TNT2 cards came out, primarily because they just produced much better visuals due to support for 32-bit color while the 3dfx offerings were all limited to 16-bit.

There was still a bit of a split in the market back then between those who went TNT2 (amazing, at the time, image quality) and those who went to the Voodoo3 (good raw speed, but limited to 16-bit color and thus noticeable banding artifacts)... 3dfx still held on to a bit of a niche there for a while longer, but then once the GeForce and GeForce 2 came out 3dfx was just dead in the water with nothing that could compete anywhere on the performance versus image quality axes.


Color depth was never the problem; 3dfx used dithering comparable to ~22-bit depth (according to the company itself).

https://www.vogons.org/viewtopic.php?f=46&t=36548
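For anyone wondering what that dithering buys: the trick is to add a small, position-dependent offset to each channel before quantizing, which trades visible banding for fine noise. Here is a generic ordered-dither sketch in C, illustrative only and with made-up function names, not 3dfx's actual hardware filter:

    /* Ordered (Bayer) dithering from 24-bit RGB down to RGB565. */
    #include <stdint.h>

    static const int bayer4[4][4] = {
        { 0,  8,  2, 10},
        {12,  4, 14,  6},
        { 3, 11,  1,  9},
        {15,  7, 13,  5},
    };

    /* Quantize one 8-bit channel to 'bits' bits with an ordered dither:
     * add a position-dependent offset of roughly +/- half a quantization
     * step before truncating. */
    static unsigned dither_channel(unsigned v, int bits, int x, int y)
    {
        int levels = (1 << bits) - 1;
        int step = 255 / levels;                       /* quantization step */
        int d = (int)v + (bayer4[y & 3][x & 3] * step) / 16 - step / 2;
        if (d < 0) d = 0;
        if (d > 255) d = 255;
        return (unsigned)(d * levels / 255);
    }

    /* Pack one dithered pixel into 16-bit RGB565. */
    uint16_t rgb888_to_rgb565_dithered(uint8_t r, uint8_t g, uint8_t b, int x, int y)
    {
        unsigned r5 = dither_channel(r, 5, x, y);
        unsigned g6 = dither_channel(g, 6, x, y);
        unsigned b5 = dither_channel(b, 5, x, y);
        return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
    }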

Texture memory capacity, the texture size limit (256x256!) and the lack of S3TC/DXT1 compression were the real culprits. TNT shipped with 16MB; in TNT2, 32MB came standard. Texture quality played the biggest role in the perceived visual difference. Not to mention TNT performance dropped by half in 32-bit mode, and TNT2 fared only slightly better.

Quality is great, until you learn both Nvidia and ATI repeatedly cheated in benchmarks and games, dropping quality below the old 3dfx limits. ATI's Quake 3 case was quite dramatic: https://techreport.com/review/3089/how-ati-drivers-optimize-...


The year after the TNT was the TNT2 (which was a great chipset).

Geforce didn't come out until Oct 1999 and even then it was only one chipset. The rest of the product range came the following quarter (2000).

I skipped that generation and jumped to the Geforce 2 (this time first party rather than a 3rd party card with nvidia GPU), which was also an awesome card.

As for the fall of 3dfx, I don't think the rise in popularity of DirectX and OpenGL helped their case, as that really levelled the playing field. After that, specifically targeting Voodoo cards felt more like a chore than an enhancement.


> I skipped that generation and jumped to the Geforce 2 (this time first party rather than a 3rd party card with nvidia GPU)

Wait, nVidia made their own cards? I can't ever recall seeing one that wasn't made by a partner.


I had an NV-branded GF3, which was sold or given to developers since they had some issues with supplying enough chips to OEMs. Looked like this: http://www.ixbt.com/video/images/leadtek-tdh/geforce3-ref-fr... (and judging by the URL name it was made by Leadtek).


AFAIK, Nvidia have never fabricated their own cards, but they do design the reference boards that most OEMs use as a basis for their own cards, and they did at one point release cards directly under the Nvidia brand (the cards were fabricated by Foxconn using Nvidia's designs).


Yeah I had an nvidia branded Geforce 2. Didn't realise it was fabricated by a third party


Oh man, the Riva TNT was the first 3D card I ever had, back when popular opinion said that hybrid 2D/3D cards would never be as good as a dedicated 3D card.

So amazing to get 30fps with fullscreen trilinear filtering. It really felt like the future... and then you jump back to the present and your graphics card is basically a renderfarm that you send geometry and shaders off to and it does the rest.


That's pretty much standard in every industry, though. Competition will kick underperformers out of the market and some amount of consolidation is inevitable. It's not like Nvidia doesn't have any competition nowadays.


Not entirely true. Software has natural-monopoly characteristics. The only real way around it has been alternative markets.

Apple couldn't budge Microsoft in the desktop market and they went "around" them by going into the mobile market.


And even then they might have remained a bit player if Microsoft had not flinched and thrown out their existing PocketPC platform to ship Phone 7.

Keep in mind that PocketPC was not just used for "consumer" phones, but also all kinds of hand held terminals in warehouses and such.

This was a market that Microsoft basically abandoned when they announced that Phone 7 and onwards would not be compatible with PocketPC software, even though they tried to extend a fig leaf by rebranding PocketPC 6.5 as Phone 6.5 (I think HP tried to sell a couple of PDAs with that).

Didn't take long before we saw alternatives out there, though these days I think they are mostly Android-based.


I remember those days vividly.

I was just getting to know the world of computers. I think it was the year '97; I was 13 years old and had my first PC, a Pentium 200MHz with Windows 95, and I thought the games were fabulous! Quake 1, Duke Nukem, Carmageddon, Moto Racer.

But one day I bought the Diamond Monster 2 with a Voodoo 2 chip and 12MB RAM (a 3DFX accelerator card). I put that thing on my PC and it was like traveling to the future; the games looked amazing and I got a performance boost.

I think the game that impressed me most at the time was Need For Speed 2 Special Edition. I had it installed and played it before the card, but the "Special Edition" was about being compatible with 3DFX and adding extra features: in addition to the smoothed textures, on one track mosquitoes would stick to the screen. I don't remember what the other special features were.

Moto Racer, Descent Freespace and Tomb Raider 2 come to mind now as others where I felt a huge visual gap between the regular graphics and 3DFX.

Then came Quake 3 Arena in 1999 and it blew my mind. I think that in 2000/2001 I upgraded to a Pentium 4 and kept the legendary Diamond Monster 2 because it was still kicking ass. Awesome card; it has a place in my heart.


> I think the game that impressed me most at the time was Need For Speed 2 Special Edition. I had it installed and played it before the card, but the "Special Edition" was about being compatible with 3DFX and adding extra features: in addition to the smoothed textures, on one track mosquitoes would stick to the screen. I don't remember what the other special features were.

I think NFS2SE had transparent glass if you had a 3DFX card, or maybe that was NFS3, I remember wishing I had one either way!


I remember playing Need for Speed: High Stakes with a 4MB Permedia 2 card, and the game was clearly supposed to have alpha-transparent textures on some lights, but my card couldn't handle it and I got opaque textures with big black borders around all the streetlight lens flares etc.

The worst was the Dolphin Cove track where beautiful shafts of sunlight were meant to lazily filter through the tree canopy onto the road. On my screen, they appeared as fully opaque walls blocking any view of the track ahead.


I remember I bought a TNT card instead of a Voodoo for my first computer because it had a nicer looking box in the advertisement from the local electronics store. I didn't have internet at that time, so I used box art instead of user reviews for deciding on purchases. I think the box had a fighter jet.


> I put that thing on my PC and it was like traveling to the future,

Yes! It was 1998 or 1999. I remember coming home from Micro Center, installing the Diamond Voodoo2 card and trying Descent Freespace on it. I think the card came with it as a demo or even a full version. Then lots of hours playing NFS as well. It was the future.


It feels weird that the article uses the term "GPU". This term wasn't widely used back then. IIRC, the Voodoo cards were called "3D accelerators".


From what I remember, the term GPU was invented by NVIDIA's marketing team for the GeForce campaign. It sounded a bit weird in the beginning. In hindsight, it was a stroke of genius.


If I recall correctly, "GPU" was attached to the GeForce 3, which introduced programmable pixel and vertex shaders (planting the seeds for later, general-purpose processing developments like CUDA and eventually cryptocurrency mining).


Frankly, I don't remember with certainty. But Wikipedia seems to agree with my memory: https://en.wikipedia.org/wiki/Graphics_processing_unit

" The term GPU was popularized by Nvidia in 1999, who marketed the GeForce 256 as "the world's first GPU", or Graphics Processing Unit,[2] although the term had been in use since at least the 1980s.[3] It was presented as a "single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines".[4] Rival ATI Technologies coined the term "visual processing unit" or VPU with the release of the Radeon 9700 in 2002.[5] "

I had forgotten that ATI initially tried to push the term VPU instead of accepting NVIDIA's terminology. The goal of both companies' marketing teams was the same: to convince the market that GPUs were as indispensable as CPUs and not just optional accelerators.


No, blue11 is right. They made a big deal of the "GPU" in the original GeForce 256.


Yes.

I even remember the need for two cards: a GPU and a 3D accelerator.


The classic combo of Matrox for 2D and 3Dfx for 3D. Surprisingly Matrox still exists.


Yep, but their cards aren't really consumer. They only do expensive specialized cards for multi-monitor output (think serving up 8+ screens at a train station).


I remember seeing computers marketed as having both a "2D accelerator" and a "3D accelerator" -- separate cards. I never had the cash for a Voodoo card back in the day, but I certainly wanted one!


That's nothing, I remember the need for 4, a 2D card, two Voodoo 2s in SLI and a dedicated hardware DVD decoder card :-)

The picture quality was terrible with multiple pass through analog VGA cables...


I used a TV card and hooked my DVD player into it via the antenna cable.


Wasn't that only the Voodoo1/Voodoo2, with the pass-through cable? I don't think any other manufacturer did that.


I think my first graphics card was a Riva 128. If I'm remembering correctly, the box billed it as a "2D/3D Multimedia Accelerator".


It kinda feels weird that we use the term "GPU" to describe modern cards. Seems to me that there is much more demand for these components in the AI and cryptocoin mining markets than anything utilizing graphics.

I wonder if we're about to witness the second death of PC gaming?


Gaming is bigger than the movie industry at this point. I don't think even a sub-category like PC gaming is going anywhere soon.


While unlikely to disappear completely, it can certainly decline to a much smaller niche. If you recall the state of PC gaming before Steam took off, it was in a decline.


Why? Think about the root of the word "graphics" and why GPUs are well suited to processing ...graphs.

Why would high demand for GPUs signal the death of PC gaming? I would expect the opposite.


I'm not sure what you mean, but the root of graphics is graphikos, Greek for writing or drawing. Mathematical graphs are called that by analogy to their graphical forebears.


You have confused root and origin. The root of graphics is graph.

The comment I replied to said GPUs are more useful in cryptocoin mining and AI rather than graphics. Cryptocoin mining and AI are graph workloads. This is why GPUs are well suited to these tasks.


The root word of graphics is graphic.

The root word of graph is graphic; "graph" is a shortening of the phrase "graphic formula", apparently.


>Seems to me that there is much more demand for these components in the AI and cryptocoin mining markets than anything utilizing graphics.

Only outside home use.


Hardly matters if the components are too expensive for home users to feel comfortable buying them.


LGR on YouTube did a nice episode about this a while back:

LGR Tech Tales - 3Dfx & Voodoo's Self-Destruction

This episode covers the founding, voodoo-powered rise, and ridiculously powerful fall of 3Dfx Interactive. Join me in LGR Tech Tales, looking at stories of technological inspiration, failure, and everything in-between!

https://www.youtube.com/watch?v=rrn-QYdT4F8


You might also like watching the Computer History Museum's "3dfx Oral History Panel with Ross Smith, Scott Sellers, Gary Tarolli, and Gordon Campbell": https://www.youtube.com/watch?v=3MghYhf-GhU

Ross Smith, Scott Sellers and Gary Tarolli were 3dfx founders; they left Silicon Graphics to do this startup.


Access to information (e.g. the Internet) has been a game changer for buying "3D acceleration" as it was called back then.

I remember fondly in the late 90s just walking into a Best Buy (maybe Circuit City) and buying a GPU off the shelf.

I had no benchmark data like today. I bought based on whether or not I liked the packaging and how much video RAM it had, because I really didn't have much more data to go on.

I subsequently bought the Diamond Viper 770, linked below.

https://www.anandtech.com/show/307

Times have really changed.


There were plenty of magazines doing those benchmarks back then, you certainly didn't need the Internet to make good buying decisions.


There was also the internet. Consider this article from Tom's Hardware circa 1998 http://www.tomshardware.com/reviews/nvidia,87-8.html


A few things worth pointing out that are missing from the article.

1. DirectX, and arguably Microsoft, trying to dethrone Glide made a huge impact. Although that was later in the timeline, when 3Dfx was going downhill anyway.

2. Too late to the "graphics card" market. After Voodoo 2, it took them too long (long by the standards of the time, when everything was moving fast) to release a 2D/3D graphics card. The Voodoo Banshee wasn't good enough compared to the Riva TNT.

3. Prices: Voodoo was quite a bit more expensive than the others.

God, reading this article makes me feel old. Lots of memories coming back. Reading about benchmarks between all these graphics cards. The Internet was still in its early days back then; most of the information came from the monthly magazines I got from WH Smith, and PC Gamer used to do these tests. Lots of cool games came from those demo CDs. There was the S3 Savage 3D, ATI Rage, Matrox, and 3Dlabs, which was the semi-professional graphics card company back then and is now under Creative, aka Sound Blaster's company. Even Intel had the i740. And PowerVR. Nvidia went from being the underdog, and then GeForce changed everything. I remember in the early days how every new product from a different company would hype its specs, like "S3 is going to change the world", etc. Then we soon learned that all the paper specs didn't matter if the drivers weren't delivering the full potential. One of the reasons why PowerVR didn't do well on desktop. (Actually this is still the case in mobile phones....)

GPUs today aren't exciting anymore, at least in terms of graphics, as opposed to compute. We are limited by the technology node, plus lots of driver optimization to get AAA games to work better.

Gaming at the time was very geeky. Who would have thought that today we'd have something called e-sports, and that seemingly everyone would be a gamer. Somewhere along the line I ditched PC gaming and went to console. I miss PC gaming, mainly because I bought a Mac, and despite what Apple says, they really don't give a crap about gaming on the Mac. Sigh.

I miss 3Dfx, I miss Voodoo.

Good Old Times.

P.S - And somehow this song pops up in my head

https://www.youtube.com/watch?v=swVoXHVW-jI


The i740: I had it. It was a piece of crap that rendered with a lot of bugs. After that, I had a card with a Trident 3D chip. Later I grabbed a Creative GeForce 2 MX DDR. I managed to run Doom 3 on it at a barely playable 20 FPS using some patches meant to let Doom 3 run on a Voodoo 2 (mainly resizing textures...).


Still fondly remember my passively-cooled! Voodoo 3, their rendering stack was top quality. Nvidia was the feeble newcomer. Times change fast...


> Nvidia was the feeble newcomer.

Nvidia was never really feeble. Even when 3Dfx was shocking everyone with their raw throughput and had a bit of a moat (until DirectX stopped sucking) with the Glide API, Nvidia got their foot in the door with 32-bit rendering and decent OpenGL. 3Dfx made sense for a lot of us - anyone using Quake would get it if they could, since raw performance mattered a lot and you don't suffer too much for only having 16-bit rendering of all those shades of brown - but with anything later than the original Voodoo, you felt like you were making a bit of a trade-off.

I've still got a Voodoo 3, I think. No telling if it actually works, but it was the only add-on card I've ever had that warped with time. Poor heat sink design and/or not enough layers on the PCB, I guess.


IIRC, which is a poor bet, I probably chose NVidia over 3dfx because of the OpenGL drivers. I was doing CAD, VRML, and OpenGL stuff through the 90s.


If your card was anything with "Geforce" in the name that was certainly the best choice. Even before the Geforce era, 3dfx never had anything except problematic OpenGL "mini drivers" which worked okay with Quake-based games but not much else.


Hehe, all my friends had Voodoo GPUs back in the days.

A friend of mine even got a Voodoo 5500 and we tried to get Quake 3 Arena so tuned that we couldn't see any pixels anymore.

I was the first one with a GeForce 2MX, which worked quite well and long considering it was the budget version of the gf2.

If I look at the performance indexes, the Voodoo 5500 wasn't very much better than the GF2MX.


And there was a simple resistor resolder hack to make the GeForce 2MX think it was a Quadro. It was very useless and very cool at the same time. A few people got into hardware hacking (read: electronics engineering) thanks to this little and yet very positive soldering experience.


I remember in the same era, carefully shorting a blown fuse on the package of my Athlon using a graphite pencil, so that I could overclock it. I heated my apartment with that PC.


And the GF2 MX with DDR RAM could run Doom 3 at a barely playable FPS if you downgraded the game textures. I played the whole game on that GPU.


>all my friends had Voodoo GPUs back in the days.

Mostly because many games were better on Glide than OpenGL.


Still remember enjoying Descent. And how I argued with Commodore staff at conferences about why Amigas weren't moving into hardware 3D, as it was clearly the future.


I love Amigas but seems like their fate was sealed by then. By the release of Doom in 1993 it was over for sure. Their whole shtick was that they had special hardware that let them do things PCs couldn't. Doom proved them wrong.


Pretty much on the money. The hardware guys tried though. Management delayed and delayed until they'd blown all the money.

In '90-91 ish they had a prototype AAA architecture and the specs were great (3000+). PCs had caught up some way.

Instead, management pushed a cheap, horrible, slower machine with none of the features: no SCSI-II but CPU-driven IDE, and AGA in place of AAA. No DSP or voice recognition. No 16-bit Paula. That was the A4000.

Then in 93(?) there was the planned 3D PowerPC reboot as PCs had finally caught up with AAA! Then they were no more.

Was really depressing going to DevCons and getting all the NDA info in this period.


It doesn't look like AAA [1] has any 3D?

From what I remember it was more like my Retina Z3 card but better. Had some discussions at conferences, it seemed Amiga hardware engineers didn't believe in 3D and were frozen in their graphics mindset of blitter and copper (huge when I wrote 68k demos in the mid-80s, but meaningless when 3D happened).

[1] https://en.wikipedia.org/wiki/Amiga_Advanced_Architecture_ch...


No it didn't, but it was started in '88, when keeping planar modes and adding 1280x1024 still made some sense for compatibility. Adding the DSP would have opened a few new niches. In '93 it was all a bit too late - and they weren't even ready to ship then.

3D was in Hombre, which was the PowerPC one. Wildly different and no compatibility, so heaven knows how that would have played out. Only that was when we were all starting to play Doom at work :)

Edit: I searched and found this from Dave Haynie - who always seemed pretty straight-talking. Weirdly, some of the dates have 10 years added :D The 3D section and the following one on Gould and Ali seem to sum up the train wreck pretty well. E.g.:

"When he got to Engineering, he hired a human bus error called Bill Sydnes to take over. Sydnes, a PC guy, didn’t have the chops to run a computer, much less a computer design department. He was also an ex-IBMer, and spent much time trying to turn C= (a fairly slick, west-coast-style design operation), into the clunky mess that characterized the Dilbert Zones in most major east-coast-style companies"

http://www.landley.net/history/mirror/commodore/haynie.html


"keeping compatibility."

This killed the Amiga.


It doesn't have 3D, but what it does have is 8- and 16-bit chunky graphics modes, which make doing software renderers a lot easier than the planar modes of the old chipsets.

With chunky graphics, you can write out an entire pixel with a single write, which is essential for rasterisation in software 3d games.

It's not like PCs got 3D graphics cards until late '95; the AAA chipsets released in 1993 might have allowed the Amiga to hold on until '96, get proper Doom clones (and maybe even Quake clones) and then get its own 3D accelerators.
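To make the chunky-vs-planar point concrete, here's roughly what plotting one pixel looks like in each mode (generic C with made-up function names, not the actual AAA or AGA programming model). The chunky case is a single store; the planar case is a read-modify-write in every bitplane:

    #include <stdint.h>

    /* Chunky 8-bit: one byte per pixel, so plotting is a single store. */
    void put_pixel_chunky(uint8_t *fb, int pitch, int x, int y, uint8_t color)
    {
        fb[y * pitch + x] = color;
    }

    /* Planar: each bit of the color index lives in a separate bitplane,
     * so one pixel needs a read-modify-write in every plane. This is
     * what made software texture mapping on the classic chipsets slow. */
    void put_pixel_planar(uint8_t **planes, int nplanes, int pitch,
                          int x, int y, uint8_t color)
    {
        int offset = y * pitch + x / 8;
        uint8_t mask = (uint8_t)(0x80u >> (x & 7));   /* leftmost pixel = MSB */

        for (int p = 0; p < nplanes; p++) {
            if (color & (1u << p))
                planes[p][offset] |= mask;
            else
                planes[p][offset] &= (uint8_t)~mask;
        }
    }

With, say, five bitplanes, that inner loop is five read-modify-writes just to set one pixel, which is why Doom-style software renderers wanted chunky modes.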


"which make doing software renderers a lot easier than the planer modes of the old chipsets."

Which is the reason that 3D gaming really took off with chipsets that supported software rendering rather than with GPUs.

The PS1 and the CD32 are from around the same time and show how Commodore had no clue about 3D, but Sony delivered (also Nintendo, with the Super FX).


I remember those gpu wars. Also had a Voodoo 3000 card which was relatively cheap but still very fast. It took quite some time to get those games running optimally though. Playing Quake on it was fantastic.

Specs and scores were all they could talk about. I also remember that very nice parody: https://imgur.com/SLG6hp4


I remember my Voodoo3 quite fondly. There was quite a performance boost in games based on the Quake engine. I also recall there being poor Linux driver support until 2005 or so.

My Voodoo3 ran right up through CS 1.6 (albeit struggling at times). A new system I built in 2005 used 2x BFG 6800s in SLI (3dfx tech Nvidia inherited). While impressive, I seem to recall there being some variation in Nvidia's SLI spec/implementation. Perhaps 3dfx SLI performance was seemingly better because of Glide support, but my memory escapes me (obviously not an apples-to-apples comparison).

FWIW, there was also a huge aftermarket of Voodoo cards on eBay (1999-2004) and a decent community providing 3rd party driver support.


The article blames management failure, but I'm not convinced. There's a general bias toward blaming management any time a company fails. Of course management is always responsible, but I don't think anyone could have saved, say, Kodak.

In this case, Nvidia and ATI were already making 3D accelerators for the professional 3D rendering market. All they had to do was to make stripped down cards that wouldn't cannibalize their high margin business, and sell them in volume for a price gamers could bear.

Maybe 3Dfx could have made a moat out of Glide, but OpenGL already existed and Microsoft was working on Direct3D.


> Nvidia and ATI were already making 3D accelerators for the professional 3D rendering market.

Were they? I don’t see any professional cards in Nvidia’s early line-up, at least not the one shown on Wikipedia. The first Quadro doesn’t show up till 1999 and the verbiage suggests the pro cards were an offshoot of the gaming cards, not the other way round.

https://en.m.wikipedia.org/wiki/List_of_Nvidia_graphics_proc...


It looks like I was wrong. I was going off memory.


> but I don't think anyone could have saved, say, Kodak

Why not? What if Kodak management had gone hard on digital initially instead of clinging to film for so long?


Interesting... my first digital camera was a Kodak. It had only enough internal flash storage (nothing removable) to hold around 75 JPEG photos at 640x480, and no screen. However, nobody had a digital camera except me for a few years. I kept it.


Kodak was never an electronics company, or even a camera company. Nikon and Canon would have eaten their lunch.


Kodak literally invented the digital camera, and were significant in both consumer and professional digital photography in the early 00s.

They missed by not innovating fast enough compared to the competition, not by ignoring the market.

Unimaginative management was very definitely a problem - although to be fair, it's easy with hindsight to say they should have started developing digital much earlier, when in reality consumer digital photography only started to make commercial sense when computers and public networking became powerful enough to support it.

That didn't happen until the later 90s. Until then, most computers lacked the photo quality displays we take for granted now.


I can remember my oldest digital camera was a Kodak my parents bought me, with a CF card in it I believe.


What kind of company is 3M? Or IBM? Or GE?

Management is responsible for the strategy and long term success of a business. Pivots are necessary and a core part of successful strategies.


Then they would've softened the blow and perhaps delayed their relative demise until the rise of smartphones. But I do think it's fair to say that the transition was going to be a bloodbath of layoffs and declining revenue in any case.


Someone has to make the cameras in the phones. That plausibly could've been Kodak, in an alternate reality.


There still would've been a bloodbath because chemical engineers aren't EEs/SEs. All the Kodak scientists/engineers/factory workers who were designing and making film would not have immediately pivoted to digital camera design and software development.

The "camera design" portion of kodak isn't where the primary bloodbath happened. The bloodbath happened when film died.

In other words, Kodak the company may have thrived, but only by shedding huge portions of its workforce and retooling.

I guess if all you care about is stock value, then sure, Kodak could've pulled it off. But for the folks on the ground, the bloodbath would've happened either way.


I see it the other way. What if Kodak hadn't wasted money trying to enter businesses like digital cameras and printers, i.e. low-margin products where Kodak had no competitive advantage? Sure, Kodak would have still gone out of business, but it wouldn't have wasted so much shareholder capital in the process.


I'm glad the article goes into STB. I really think this was what killed 3Dfx. It was a big move, but if you were following 3Dfx back then, you saw the issues.

I think they tried to lay off duplicate staff. With the Voodoo3, you saw drivers that just didn't work half the time. Textures were rendered totally wrong, people would downgrade drivers to get things working, etc. I have a feeling they laid off or lost people critical to the hardware itself.

The website went to shit too. It got terrible. They either axed or lost a ton of web staff and focused more on the graphics used on all their box art.

Finally, the cards/chips just didn't scale well. I even remember a ton of articles at the time talking about how much efficiency was lost with the multi-chip design; how many things had to be duplicated in memory for the different GPUs. The Voodoo5 even required an external plugin power source.

nVidia to this day still sells its chips to everyone else. 3Dfx was trying to get into the ATi space, where you only had one ATi card and didn't have to worry about reference vs vendor drivers. It might have worked if they tried to keep both units separate for a while and not merge them into one entity.


Good leaders take responsibility


Wait, so what went wrong? They took too long to release new product? That's not surprising to anyone watching at the time. But… why? What went wrong internally?


I think it was the failed merger with STB. They should have kept with the nVidia route of making chips for other companies instead of trying to integrate the entire pipeline like ATI.


Yes, this. They tried to go from being a company that just licensed IP to OEMs (who would do the actual manufacturing and marketing of 3dfx-powered graphics cards) to being a company that did the manufacturing and marketing themselves, and in the end couldn't pull that off.

Acquiring one of the more prominent OEMs, STB, was supposed to get them the expertise they needed to do that rather than having to develop it in-house, but (as in many acquisitions) absorbing the new company turned out to be more complicated than originally expected. This resulted in the post-acquisition products (Voodoo 3/4/5) taking longer to get to market than planned and being somewhat underwhelming by the time they did arrive, which gave NVidia -- who kept on licensing their tech, now also to grumpy former 3dfx OEMs who had found themselves cut out by the STB acquisition -- the opening they needed to eat 3dfx's lunch.


None of this was in the article, what a shame!


I had the Wicked3D H3D Voodoo glasses; 3D on a 20-inch monitor gave me >100 FOV when I put my nose near the screen.


My Voodoo 5 5500 was, at the time, and for years afterward, the best consumer level card I'd seen for OpenGL rendering. I still remember playing Diablo 2 on it and never lagging, even when insane numbers of mobs were on the screen. RIP


Diablo 2 doesn't use OpenGL. It does use Glide, the native hardware API for Voodoo cards. And it looks amazing and silky smooth with Glide compared to the game's Direct3D mode.


I remember the day clearly when a case of Voodoo 2's showed up in our dorm room, 1998. Going from software rendered Quake 2, to accelerated was the beginning of the end for my grades that semester.


This making me reminisce so hard. I remember fondly the days where I convinced my mom that the Voodoo card was for some school thing and was so excited to slap that baby into the ISA slot of my Pentium 2.


TDFX shareholders got screwed when they were "acquired" by nVidia. I suppose that was a good thing; it made me skeptical of crummy companies during the dot-com days.


I have many fond memories configuring my voodoo card on Linux, reveling in my high glxgears score and super smooth Tux racer rendering. A simpler time.


Off-topic: the PS4's PSVR box is a bit reminiscent of the old Voodoo cards with their passthrough video.


Voodoo 3dfx 5500 baby ...


Could also be titled: when not to pivot



