Think back on how much consumer computing changed from 1996 to 2000. Four years back from the present, in contrast, is the Haswell-based MacBook Pro and iPhone 5s (a perfectly fine setup today).
The thing that bothers me is that if you want a PCI-E card on a consumer system, you only have nVidia/AMD. Intel only does embedded now, with none of their offerings on a card (like the old i740). There is no Ryzen 7 with an APU, so if you're building a small developer box, you have to buy a discrete graphics card and can't rely on embedded graphics (unless you want to drop to the Ryzen 5, which was just released with an APU).
There are rumors that Intel might be getting back into this space. If anyone could try to compete with the big two, it's probably them.
That suggests Intel isn't going to be competitive in the graphics space in the near future.
Intel seems to have made or licensed a high-speed, short-distance data bus that sits between the Intel CPU and the AMD GPU, which are mounted side by side on a small board.
Would be hilarious btw if this showed up on desktops as a dual socket motherboard, with one socket for the CPU and another for the GPU. Likely with one massive cooler covering them both.
PVR owns a lot of TBDR patent IP, since they mostly went it alone on tile-based deferred renderers when most of their competitors were IMRs. It looked like a smart decision as desktop GPUs were hitting a memory bandwidth constraint, until GDDR and ultra-wide buses started being used and GPUs got better at early-z rejection.
But on mobile, TBDR works great since mobile has different power and heat constraints than desktop.
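For context on why TBDR saves bandwidth, here's a minimal C sketch of the idea (the names, sizes, and the "shader" are all illustrative, not PowerVR's actual design):

    /* Tile-based deferred rendering, conceptually: depth testing happens in
       a small on-chip tile buffer, shading is deferred until only visible
       fragments remain, and each pixel is written to external memory once. */
    #include <stdint.h>

    #define TILE 32                       /* tile edge in pixels */
    #define FB_W 1024                     /* framebuffer width   */

    typedef struct { float z; uint32_t tri; } Frag;

    static Frag     tilebuf[TILE * TILE]; /* on-chip: cheap to touch */
    static uint32_t framebuf[FB_W * 768]; /* external memory: expensive */

    static uint32_t shade(uint32_t tri)   /* stand-in for a pixel shader; */
    { return tri * 2654435761u; }         /* only ever runs on visible pixels */

    static void clear_tile(void)
    {
        for (int i = 0; i < TILE * TILE; i++)
            tilebuf[i] = (Frag){ 1.0f, UINT32_MAX };  /* far z, no triangle */
    }

    /* The rasterizer calls this per covered pixel for every triangle binned
       to the current tile: depth-test on chip, remember the winner, but do
       NOT shade yet -- that's the "deferred" part. */
    static void emit(int x, int y, float z, uint32_t tri)
    {
        Frag *f = &tilebuf[y * TILE + x];
        if (z < f->z) { f->z = z; f->tri = tri; }
    }

    /* Once the whole tile is resolved, shade just the winners and write the
       tile out to external memory exactly once. */
    static void resolve_tile(int tx, int ty)
    {
        for (int y = 0; y < TILE; y++)
            for (int x = 0; x < TILE; x++) {
                Frag f = tilebuf[y * TILE + x];
                framebuf[(ty * TILE + y) * FB_W + tx * TILE + x] =
                    (f.tri == UINT32_MAX) ? 0 : shade(f.tri);
            }
    }

An IMR, by contrast, shades fragments in submission order and does its depth test against a full-screen z-buffer in external memory, which is where the bandwidth pressure (and the need for early-z rejection) comes from.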
Nobody noticed, but Apple started working on their own GPU as far back as the iPhone 5s (with their A7 SoC).
It wasn't a completely custom design, but it wasn't an off-the-shelf PowerVR either. Until that point, Apple always disclosed the exact model of GPU they were using in their SoC; then suddenly their documentation starts referring to "the Apple A7 GPU" and "the Apple A8 GPU". Wikipedia does give a model number for the GPU, but if you trace it back you find an AnandTech article going through the logic, speculating it must be a PowerVR Series 6. And all roads lead back to that speculation.
For later SoCs, AnandTech actually invented a new six-core PowerVR variant, because PowerVR wasn't listing one, yet die shots of the new iPad showed a six-core GPU.
With the release of the iPhone 5s, Apple also switched from the PowerVR-supplied driver to their own, written in house.
It appears that Apple incrementally swapped out parts of the PowerVR design until they arrived at the A11 SoC GPU, which they declared as their own, with none of the original PowerVR parts remaining. Though the result appears to be very close to the original design. (I suspect PowerVR accidentally gave Apple a perpetual patent license at some point before realizing what Apple was up to.)
They didn't redesign the GPU overnight, it was designed and released incrementally over many years under our very noses. Nobody noticed.
There was still a bit of a split in the market back then between those who went TNT2 (amazing, at the time, image quality) and those who went to the Voodoo3 (good raw speed, but limited to 16-bit color and thus noticeable banding artifacts)... 3dfx still held on to a bit of a niche there for a while longer, but then once the GeForce and GeForce 2 came out 3dfx was just dead in the water with nothing that could compete anywhere on the performance versus image quality axes.
Texture memory capacity, the texture size limit (256x256!), and the lack of S3TC/DXT1 compression were the real culprits. TNT shipped with 16MB; on TNT2, 32MB came standard. Texture quality played the biggest role in perceived visual difference. Not to mention TNT performance dropped by half in 32-bit mode; TNT2 fared only slightly better.
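To put rough, illustrative numbers on that (DXT1/S3TC packs each 4x4 texel block into 8 bytes, i.e. half a byte per texel):

    /* Back-of-the-envelope texture budget math -- illustrative only.
       Note the card's RAM also holds the frame and z buffers, so the real
       texture budget is smaller than these totals. */
    #include <stdio.h>

    int main(void)
    {
        const long texels = 256L * 256L;        /* one max-size texture */
        const long b16  = texels * 2;           /* 16bpp: 128 KiB */
        const long b32  = texels * 4;           /* 32bpp: 256 KiB */
        const long dxt1 = (texels / 16) * 8;    /* DXT1:   32 KiB */

        for (long mb = 16; mb <= 32; mb += 16) {
            long bytes = mb * 1024 * 1024;
            printf("%2ld MB card: %3ld textures @16bpp, %3ld @32bpp, %3ld DXT1\n",
                   mb, bytes / b16, bytes / b32, bytes / dxt1);
        }
        return 0;
    }

So compression alone (8x at 32bpp) buys far more headroom than the 2x memory bump from TNT to TNT2 did.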
Quality is great, until you learn both Nvidia and ATI repeatedly cheated in benchmarks and games, dropping quality below old 3dfx limits. ATI's Quake 3 case was quite dramatic: https://techreport.com/review/3089/how-ati-drivers-optimize-...
The GeForce didn't come out until Oct 1999, and even then it was only one chipset. The rest of the product range came the following quarter (2000).
I skipped that generation and jumped to the GeForce 2 (this time a first-party card rather than a 3rd-party card with an nVidia GPU), which was also an awesome card.
As for the fall of 3dfx, I don't think the rise in popularity of DirectX and OpenGL helped their case, as that really levelled the playing field. After that, specifically targeting Voodoo cards felt more like a chore than an enhancement.
Wait, nVidia made their own cards? I can't ever recall seeing one that wasn't made by a partner.
So amazing to get 30fps with fullscreen trilinear filtering. It really felt like the future... and then you jump back to the present and your graphics card is basically a renderfarm that you send geometry and shaders off to and it does the rest.
Apple couldn't budge Microsoft in the desktop market, so they went "around" them by going into the mobile market.
Keep in mind that PocketPC was not just used for "consumer" phones, but also all kinds of hand held terminals in warehouses and such.
This was a market that Microsoft basically abandoned when they announced that Phone 7 and onwards would not be compatible with PocketPC software, even though they tried to extend a fig leaf by rebranding PocketPC 6.5 as Phone 6.5 (I think HP tried to sell a couple of PDAs with that).
Didn't take long before we saw alternatives out there, though these days I think they are mostly Android based.
I was just getting to know the world of computers. I think it was 1997; I was 13 years old and had my first PC, a Pentium 200MHz with Windows 95. I thought the games were fabulous! Quake 1, Duke Nukem, Carmageddon, Moto Racer.
But one day I bought the Diamond Monster 2, with a Voodoo 2 chip and 12MB of RAM (a 3DFX accelerator card). I put that thing in my PC and it was like traveling to the future: the games looked amazing and I got a performance boost.
I think the game that impressed me most at the time was Need For Speed 2 Special Edition. I had it installed and played it before the card, but the "Special Edition" part meant it was compatible with 3DFX and added extra features: besides the smoothed textures, on one track mosquitoes would stick to the screen. I don't remember what the other special features were.
Moto Racer, Descent Freespace and Tomb Raider 2 come to mind as others where I felt a huge visual gap between regular graphics and 3DFX.
Then came Quake 3 Arena in 1999 and it blew my mind. I think that in 2000/2001 I upgraded to a Pentium 4 and kept the legendary Diamond Monster 2 because it was still kicking ass. Awesome card; it has a place in my heart.
I think NFS2SE had transparent glass if you had a 3DFX card, or maybe that was NFS3. I remember wishing I had one either way!
The worst was the Dolphin Cove track where beautiful shafts of sunlight were meant to lazily filter through the tree canopy onto the road. On my screen, they appeared as fully opaque walls blocking any view of the track ahead.
Yes! It was 1998 or 1999. I remember coming home from Micro Center, installing the Diamond Voodoo2 card and trying Descent Freespace on it. I think it came with it as a demo or even a full version. Then lots of hours playing NFS as well. It was the future.
The term GPU was popularized by Nvidia in 1999, who marketed the GeForce 256 as "the world's first GPU", or Graphics Processing Unit, although the term had been in use since at least the 1980s. It was presented as a "single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines". Rival ATI Technologies coined the term "visual processing unit" or VPU with the release of the Radeon 9700 in 2002.
I had forgotten that ATI initially tried to push the term VPU instead of accepting NVIDIA's terminology. The goal of both companies' marketing teams was the same: to convince the market that GPUs were as indispensable as CPUs and not just optional accelerators.
I even remember needing two cards: a regular 2D video card and a 3D accelerator.
The picture quality was terrible with multiple pass-through analog VGA cables...
I wonder if we're about to witness the second death of PC gaming?
Why would high demand for GPUs signal the death of PC gaming? I would expect the opposite.
The comment I replied to said GPUs are more useful for cryptocoin mining and AI than for graphics. Cryptocoin mining and AI are graph workloads. This is why GPUs are well suited to these tasks.
The root word of graph is graphic; "graph" is a shortening of the phrase "graphic formula", apparently.
Only outside home use.
LGR Tech Tales - 3Dfx & Voodoo's Self-Destruction
This episode covers the founding, voodoo-powered rise, and ridiculously powerful fall of 3Dfx Interactive. Join me in LGR Tech Tales, looking at stories of technological inspiration, failure, and everything in-between!
Ross Smith, Scott Sellers and Gary Tarolli were 3dfx's founders; they left Silicon Graphics to do this startup.
I remember fondly in the late 90s just walking into a Best Buy (maybe Circuit City) and buying a GPU off the shelf.
I had no benchmark data like today. I bought based on whether or not I liked the packaging and how much video RAM it had, because I really didn't have much more data to go on.
I subsequently bought the Diamond Viper 770, linked below.
Times have really changed.
1. DirectX, and arguably Microsoft, trying to dethrone Glide made a huge impact, although that came later in the timeline, when 3Dfx was going downhill anyway.
2. Too late to the "Graphics" Card market. After Voodoo 2 it took them too long (long for that time, when everything was moving fast) to release a 2D/3D graphics card, and the Voodoo Banshee wasn't good enough compared to the Riva TNT.
3. Prices: Voodoo cards were quite a bit more expensive than the others.
God, reading this article makes me feel old. Lots of memories coming back: reading about benchmarks between all these graphics cards. The Internet was still in its early days back then, so most of the information came from the monthly magazines I got from WH Smith; PC Gamer used to do these tests, and lots of cool games came from those cover CD demo discs.

There were the S3 Savage 3D, ATI Rage, Matrox, and 3Dlabs, which was the semi-professional graphics card company back then and is now under Creative, aka Sound Blaster's company. Even Intel had the i740. And PowerVR. Nvidia went from underdog to the GeForce changing everything.

I remember in the early days how every company would hype the specs of its new product, like S3 was going to change the world, etc. Then we soon learned that the paper specs didn't matter if the drivers weren't delivering the full potential. That was one of the reasons PowerVR didn't do well on the desktop. (Actually, this is still the case in mobile phones...)
GPUs today aren't exciting anymore, at least in terms of graphics (not compute). We are limited by the technology node, plus lots of driver optimization to get AAA games to work better.
Gaming at the time was very geeky. Who would have thought that today we'd have something called e-sports, and that seemingly everyone would be a gamer? Somewhere along the line I ditched PC gaming and went to console. I miss PC gaming, mainly because I bought a Mac, and despite what Apple says, they really don't give a crap about gaming on the Mac. Sigh.
I miss 3Dfx, I miss Voodoo.
Good Old Times.
P.S - And somehow this song pops up in my head
Nvidia was never really feeble. Even when 3Dfx was shocking everyone with their raw throughput and had a bit of a moat (until DirectX stopped sucking) with the Glide API, Nvidia got their foot in the door with 32-bit rendering and decent OpenGL. 3Dfx made sense for a lot of us - anyone using Quake would get it if they could, since raw performance mattered a lot and you don't suffer too much for only having 16-bit rendering of all those shades of brown - but with anything later than the original Voodoo, you felt like you were making a bit of a trade-off.
I've still got a Voodoo 3, I think. No telling if it actually works, but it was the only add-on card I've ever had that warped with time. Poor heat sink design and/or not enough layers on the PCB, I guess.
A friend of mine even got a Voodoo 5500 and we tried to get Quake 3 Arena so tuned that we couldn't see any pixels anymore.
I was the first one with a GeForce 2 MX, which worked quite well and for a long time, considering it was the budget version of the GF2.
If I look at the performance indexes, the Voodoo 5500 wasn't very much better than the GF2MX.
Mostly because many games were better on Glide than OpenGL.
In '90-'91-ish they had a prototype AAA architecture and the specs were great (3000+). PCs had caught up some way.
Instead, management pushed a cheap, horrible, slower machine with none of the features: no SCSI-II but CPU-driven IDE, and AGA in place of AAA. No DSP or voice recognition. No 16-bit Paula. That was the A4000.
Then in '93(?) there was the planned 3D PowerPC reboot, as PCs had finally caught up with AAA! Then they were no more.
Was really depressing going to DevCons and getting all the NDA info in this period.
From what I remember it was more like my Retina Z3 card but better. Had some discussions at conferences, it seemed Amiga hardware engineers didn't believe in 3D and were frozen in their graphics mindset of blitter and copper (huge when I wrote 68k demos in the mid-80s, but meaningless when 3D happened).
3D was in Hombre, which was the PowerPC one. Wildly different and no compatibility, so heaven knows how that would have played out. Only, that was when we were all starting to play Doom at work :)
Edit: I searched and found this from Dave Haynie - who always seemed pretty straight talking. Weirdly, some of the dates have 10 years added :D The 3D section and the following one on Gould and Ali seem to sum up the train wreck pretty well, e.g.:
"When he got to Engineering, he hired a human bus error called Bill Sydnes to take over. Sydnes, a PC guy, didn’t have the chops to run a computer, much less a computer design department. He was also an ex-IBMer, and spent much time trying to turn C= (a fairly slick, west-coast-style design operation), into the clunky mess that characterized the Dilbert Zones in most major east-coast-style companies"
This killed the Amiga.
With chunky graphics, you can write out an entire pixel with a single write, which is essential for rasterisation in software 3D games.
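For anyone who never wrote against planar hardware, a minimal C sketch of the difference (the resolution and plane count are just illustrative):

    /* Chunky vs planar pixel writes. Chunky: one store per pixel.
       Planar: a read-modify-write in every bitplane. */
    #include <stddef.h>
    #include <stdint.h>

    #define W 320
    #define H 200
    #define PLANES 8   /* 8 bitplanes = 256 colors, AGA style */

    static uint8_t chunky[W * H];                /* one byte per pixel */
    static uint8_t planar[PLANES][W * H / 8];    /* one bit per pixel per plane */

    /* Chunky (VGA mode 13h style): a texture mapper's inner loop is one store. */
    void put_chunky(int x, int y, uint8_t color)
    {
        chunky[y * W + x] = color;
    }

    /* Planar: the same pixel touches every plane, with shifts and masks the
       CPU must do itself -- roughly 8x the memory traffic per pixel. */
    void put_planar(int x, int y, uint8_t color)
    {
        size_t  byte = (size_t)(y * W + x) / 8;
        uint8_t mask = (uint8_t)(0x80 >> (x & 7));
        for (int p = 0; p < PLANES; p++) {
            if (color & (1 << p)) planar[p][byte] |=  mask;
            else                  planar[p][byte] &= ~mask;
        }
    }

Eight read-modify-writes plus masking versus one store is why Doom-style software renderers wanted chunky modes.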
It's not like PCs got 3D graphics cards until late '95. The AAA chipsets, released in 1993, might have allowed the Amiga to hold on until '96, get proper Doom clones (and maybe even Quake clones), and then get its own 3D accelerators.
Which is the reason that 3D gaming really took off with chipsets that supported software rendering, rather than with GPUs.
The PS1 and the CD32 are around the same time and show how Commodore had no clue about 3D, but Sony delivered (Also Nintendo with the Super FX).
Specs and scores were all they could talk about. I also remember that very nice parody: https://imgur.com/SLG6hp4
My Voodoo3 ran right up through CS 1.6 (albeit struggling at times). A new system I built in 2005 used 2x BFG 6800s in SLI (3dfx tech Nvidia inherited). While impressive, I seem to recall there being some variation in Nvidia's SLI spec/implementation. Perhaps 3dfx SLI performance was seemingly better because of Glide support, but my memory escapes me (obviously not an apples-to-apples comparison).
FWIW, there was also a huge aftermarket of Voodoo cards on eBay (1999-2004) and a decent community providing 3rd party driver support.
In this case, Nvidia and ATI were already making 3D accelerators for the professional 3D rendering market. All they had to do was make stripped-down cards that wouldn't cannibalize their high-margin business, and sell them in volume at a price gamers could bear.
Maybe 3Dfx could have made a moat out of Glide, but OpenGL already existed and Microsoft was working on Direct3D.
Were they? I don’t see any professional cards in Nvidia’s early line-up, at least not the one shown on Wikipedia. The first Quadro doesn’t show up till 1999 and the verbiage suggests the pro cards were an offshoot of the gaming cards, not the other way round.
Why not? What if Kodak management had gone hard on digital initially instead of clinging to film for so long?
They missed by not innovating fast enough compared to the competition, not by ignoring the market.
Unimaginative management was very definitely a problem - although to be fair, it's easy with hindsight to say they should have started developing digital much earlier, when in reality consumer digital photography only started to make commercial sense when computers and public networking became powerful enough to support it.
That didn't happen until the later 90s. Until then, most computers lacked the photo quality displays we take for granted now.
Management is responsible for the strategy and long term success of a business. Pivots are necessary and a core part of successful strategies.
The "camera design" portion of kodak isn't where the primary bloodbath happened. The bloodbath happened when film died.
In other words, Kodak the company may have thrived, but only by shedding huge portions of its workforce and retooling.
I guess if all you care about is stock value, then sure, Kodak could've pulled it off. But for the folks on the ground, the bloodbath would've happened either way.
I think they tried to lay off duplicate staff. With the Voodoo3, you saw drivers that just didn't work half the time: textures rendered totally wrong, people downgrading drivers to get things working, etc. I have a feeling they laid off or lost people critical to the hardware itself.
The website went to shit too. It got terrible. They either axed or lost a ton of web staff and focused more on the graphics used on all their box art.
Finally, the cards/chips just didn't scale well. I remember a ton of articles at the time talking about how much efficiency was lost with the multi-chip design: how many things had to be duplicated in memory for the different GPUs. The Voodoo5 even required an external plug-in power supply.
nVidia to this day still sells its chips to everyone else. 3Dfx was trying to get into the ATi space, where you only had one ATi card and didn't have to worry about reference vs vendor drivers. It might have worked if they tried to keep both units separate for a while and not merge them into one entity.
Acquiring one of the more prominent OEMs, STB, was supposed to get them the expertise they needed to do that rather than having to develop it in-house, but (as in many acquisitions) absorbing the new company turned out to be more complicated than originally expected. This resulted in the post-acquisition products (Voodoo 3/4/5) taking longer to get to market than planned and being somewhat underwhelming by the time they did arrive, which gave NVidia -- who kept on licensing their tech, now also to grumpy former 3dfx OEMs who had found themselves cut out by the STB acquisition -- the opening they needed to eat 3dfx's lunch.