I think it's maybe time for game developers to spend more time and effort on gameplay and a bit less on AAA+++ graphics that only top GPUs can handle. Maybe hard-target iGPUs/APUs as the mid-level support tier in games.
For that matter, plenty of room to remake/reskin older (fun) games... Bring the Black Mesa effect to lots of existing games.
Games now don’t even look good enough to justify the enormous computing power they require. They’re a stuttery and blurry mess, augmented by fake AI frames, LOD popping, weird screen space rendering defects, and so on. Graphics peaked 10 years ago.
After I read this article, I realized that I have a huge backlog of unplayed or partially played games in my Steam library. A significant chunk of those games are from indie devs. Probably enough games to last me a year or two. Plus, most of them seem like they will run just fine at high-to-max settings on my 3080 Ti at 1440p. Point is, I really need to stop looking toward the next hardware refresh just so I can keep playing newer big-budget titles. This, plus current memory prices, made it easy for me to forget about upgrading for the next 2-3 years.
Nvidia is chasing trends again; they did it for crypto mining, and I bet some new fad will come up soon. It looks like the gaming industry needs to move to something other than the current GPU ecosystem in the long term. Intel Arc and Moore Threads have tried, but we really need a new way of designing and rendering graphics.
before (what? when? covid?) nvidia annual revenues were circa us$10b and amd was competing with it to make GPUs. after (what? when? covid?).. or now.. nvidia is an accelerator manufacturer with a small legacy GPU business and revenues are closer to us$200b. and amd is still no closer to competing with nvidia at making GPUs. intel started something in the meantime, though. but everyone seems to be expecting them to try to compete with nvidia at making accelerators. nobody is interested in GPUs anymore. accelerators seem to be the much bigger and more interesting market.
I do feel there might be a day of reckoning where Nvidia bet the farm too hard on this AI bubble and it ends up blowing up in their face.
I hope gamers, systems integrators, and regular PC enthusiasts don't have memories of goldfish and go back to business as usual. It needs to hurt Nvidia in the pocketbook.
Will this happen? Unlikely, but hope springs eternal.
NVidia's share price will take a hit when consolidation starts in AI, because their business won't be growing as fast as their P/E ratio implies. The circular deals could also hurt them if one of the AI providers they've invested in goes bust.[1][2] They won't go out of business, but holders of their shares may lose a lot of money. But will this happen after Anthropic and OpenAI have their IPOs, possibly next year? NVidia stands to make a lot on paper if those IPOs do well.
If OpenAI has their IPO, it's likely going to result in retail getting fleeced, given how pitiful the return on their investments to date has been. They are seeing revenues of around $13 billion for 2025, with an alleged $100 billion or more by 2030, but the investments they are making are orders of magnitude greater. Who is ultimately going to pay for this?
Surely OpenAI has customers buying their pro packages for ChatGPT, but that can't really be it. And businesses are starting to realize that AI can't replace the workforce that easily either.
Hardly taking this personally. Just calling out how I see it most likely going. Also... Nvidia has done quite a bit unethically. Namely violating anti-monopoly laws (though with our current US administration, those may as well not be worth the paper they are printed on), screwing with product reviewers, pulling a '90s-era Microsoft to obliterate their competition at all costs, and screwing over their board partners, like EVGA. GamersNexus on YouTube has covered plenty of this.
That said, although AI has some uniquely good applications, this AI mania is feeding into some ridiculous corporate feedback loop that is having a negative impact on the consumer.
Having to pay several thousand dollars for a top-tier consumer GeForce, when the same was possible for only a few hundred dollars less than a decade ago, tells me the customer is being taken for a ride. It stinks.
I don't get this. Nvidia didn't "bet the farm" on AI. They are simply allocating limited resources (in this case memory) to their most profitable products. Yes, it sucks for gamers, but I see Nvidia more reacting to the current marketplace than driving that change.
If/when the AI bubble bursts, Nvidia will just readjust their resource allocation accordingly.
I also don't understand common sentiment that if/when the AI bubble pops and hardware manufacturers come crawling back, we consumers are going to make manufacturers regret their decision.
Isn't the whole problem that all the manufacturers are pivoting away from consumers and toward AI? How are we going to "hurt Nvidia in the pocketbook?" Buy from their competitors? But they are also making these pivots/"turning their backs on us." Just abstain from buying hardware out of protest? As soon as prices go down there's gonna be a buying frenzy from everyone who's been waiting this whole time.
If/when the bubble pops, manufacturers will find that they can't butter their bread like they could when the datacenter craze was booming. In a world that is paved by growth, companies aren't very good at shrinking.
It doesn't matter what consumers do or don't do -- we plebians are a tiny portion of their present market. We can buy the same GPUs from the same folks as before, or we can do something different, and it won't matter.
Whatever we do will be a rounding error in the jagged, gaping, infected hole where the AI market once was.
This is an even-handed take. I still think consumers in general should vote with their wallets, even if all of them put together won't hold a candle to their datacenter customers. If nothing else, it can grant the competition more market share, and maybe AMD and Intel can invest more into Radeon and Arc, respectively. That can only be a good thing, since I'd love to see more broad support for FSR and XeSS technologies on games, and ROCm and oneAPI for compute.
Oh, for sure. It's often good to bet on the underdog in a competitive market -- it helps ensure that competition continues to exist.
When I sold PC hardware, I'd try to find the right fit for a customer's needs and pricepoint. Way back then, that often meant selling systems with relatively-inexpensive Cyrix or AMD CPUs and more RAM instead of systems with more-expensive Intel CPUs that had less RAM at any given price -- because those were good tradeoffs to make. By extension, I did a very small part to help foster competition.
But gamers drive the bulk of non-datacenter GPU sales and they don't necessarily act that way.
Having observed their behavior for decades, I feel confident in saying that they broadly promote whatever the top dog is today (whether they can afford to be in that club or not), and aren't shy about punching down on those who suggest a less-performant option regardless of its fitness for a particular purpose.
Or at least: The ones who behave this way sure do manage to be loud about it. (And in propaganda, loudness counts.)
I suspect they'll be fawning over nVidia for as long as nVidia keeps producing what is perceived to be the fastest thing, even if it is made from pure unobtanium.
I had one of those for what seemed like an eternity.
At first, right out of the gate: I overclocked it from 300MHz to 350MHz just to see what would happen. It worked perfectly without further adjustment (and the next step did not), so I left it right there at 350MHz. For the price, at that time, it kept up great compared to what my peers had going on.
As the years ticked by and it was getting long in the tooth, it stayed around -- but it shifted roles.
I think the last thing it was doing for me was running a $25 SoundBlaster Live! 5.1's EMU10k1 DSP chip under Windows, using the kX audio drivers.
kX let a person use that DSP chip for what it was -- an audio-oriented DSP with some audio-centric IO. With kX, a person could drop basic DSP blocks into the GUI and wire them together arbitrarily, and also wire them into the real world.
I used it as a parametric EQ and active crossover for the stereo in my home office -- unless I was also using it as a bass preamp, in a different mode. Low-latency real-time software DSP was mostly a non-starter at that time, but these functions and routings were all done within the EMU10k1 and end-to-end latency was low enough to play a bass guitar through.
Of course: It still required a computer to run it, and I had a new family at that time and things like the electric bill were very important to me. So I underclocked and undervolted the K6-2 for passive cooling, booted Windows from a CompactFlash card (what spinning HDD?), and hacked the power supply fan to just-barely turn and rotate air over the heatsinks.
It went from a relatively high-cost past-performer to a rather silent low-power rig with only one moving part, which I'd remote into over the LAN to wiggle DSP settings.
Neat chips, the K6-2 and EMU10k1 were.
Fun times.
(And to bring it all back 'round: We'd be in a different place right now if things like the K6-2 had been more popular than they were. I don't know if it'd be better or worse, but it'd sure be different.)
Dude seriously this is such a nice story. I especially love how you used the EMU10k1 DSP in conjunction with your K6 system to its fullest potential. :D
Speaking of sound cards, I distinctly remember the Sound Blaster Audigy being the very last discrete sound card my dad obtained before we stuck with AC’97, and later the HDA codec audio solution on the motherboard.
I do vaguely recall the kX drivers you mentioned, but I’m pretty sure we stuck with whatever came stock from Creative Labs, for better or for worse. Also… that SB16 emulation under DOS for the Live! and Audigy series cards was not great, having been a carry over from the ENSONIQ days. The fact that I needed EMM386 to use it was a bit of a buzzkill.
On the K6-II+ system we had, we used an AWE64 Gold on the good ol’ ISA bus. Probably my favorite sound card of all time, followed by the Aureal Vortex 2.
Sound cards were cool. Kids these days with their approximately-perfect high-res DACs built into their $12 Apple headphone adapters don't know what it was like. ;)
My mom had a computer with a SoundBlaster 16. I carried that sound card across the room one day for whatever reason a kid does a thing like that, and it got zapped pretty bad with static. It still worked after that, but it learned the strangest new function: It became microphonic. You could shout into the sound card and hear it through the speakers.
But other than being microphonic, the noise wasn't unusual: Sound cards were noisy.
At one point around the turn of the century, I scored a YMF724-based card that featured an ADC stage that actually sounded good, and was quiet. I used this with a FreeBSD box along with a dedicated radio tuner to record some radio shows that I liked. That machine wasn't fast enough to encode decent MP3s in real-time, but it was quick enough to dump PCM audio through a FIFO and onto the hard drive without skipping a beat. MP3 encoding happened later -- asynchronously. It was all scheduled with cron jobs, and with NTP the start times were dead-nuts on. (Sometimes, there'd be 2 or 3 nice'd LAME processes stacked up and running at once. FreeBSD didn't care. It was also routing packets for the multi-link PPP dialup Internet connection at the house, rendering print jobs for a fickle Alps MD-1000 printer, and doing whatever else I tossed at it.)
I used 4front's OSS drivers to get there, which was amusing: IIRC, YMF724 support was an extra-cost item. And I was bothered by this because I'd already paid for it once, for Linux. I complained about that to nobody in particular on IRC, and some rando appeared, asked me what features I wanted for the FreeBSD driver, and they sent me a license file that just worked not two minutes later. "I know the hash they use," they said.
There are a few other memorable cards that I had at various points. One was a CT3670, an ISA SoundBlaster with an EMU 8k that had two 30-pin SIMM sockets on it for sample RAM.
There was the Zoltrix Nightingale, which was a CMI8738-based device that was $15 brand new (plus another $12 or something for the optional toslink breakout bracket). The analog bits sounded like crap and it had no bespoke synth or other wizardry, but it had bit-perfect digital IO and a pass-through mode that worked as an SCMS stripper. It was both a wonderful and very shitty sound card, notable mostly because of this contrast.
I've got an Audigy 2 ZS here. I think that may represent the pinnacle of the EMU10k1/10k2 era. (And I'm not an avid gear hoarder, so while I may elect to keep that around forever, it's also likely to be the very last sound card I'll ever own.)
And these days, of course, things are different -- but they're also the same. On my desk at home is a Biamp Tesira. It's a fairly serious rackmount DSP that's meant for conference rooms and convention centers and such, with a dozen balanced inputs and 8 balanced outputs, and this one also has Dante for networked audio. It's got a USB port on it that shows up in Linux as a 2-channel sound card. In practice, it just does the same things that I used the K6-2/EMU10k1/kX machine for: An active crossover, some EQ, and whatever weird DSP creations I feel like doodling up.
But it can do some neat stuff, like: This stereo doesn't have a loudness control, and I decided that it should have something like that. So I had the bot help write a Python script that watches the hardware volume control that I've attached and assigned, computes Fletcher-Munson/ISO 226 equal-loudness curves, and shoves the results into an EQ block in a fashion that is as real-time as the Tesira's rather slow IP control channel will allow.
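For what it's worth, the loudness-compensation idea could be sketched roughly like this. This is a minimal, hand-rolled approximation, not the actual script: the 0.25 slope, the reference level, and the boost cap are all assumed placeholders, and real ISO 226 contours are tabulated data rather than a formula this simple.

```python
import math

# Minimal sketch of loudness compensation: as the master volume drops
# below a reference level, boost low frequencies along a rough
# equal-loudness slope. The slope, reference level, and cap below are
# assumed placeholders -- real ISO 226 contours are tabulated data.

def loudness_boost_db(freq_hz, volume_db, reference_db=-10.0, max_boost=12.0):
    """Return the boost (in dB) to apply to one EQ band.

    freq_hz      -- band center frequency in Hz
    volume_db    -- current master volume in dB (<= 0)
    reference_db -- volume at which the EQ should sit flat (assumed)
    max_boost    -- cap on any single band's boost (assumed)
    """
    attenuation = max(0.0, reference_db - volume_db)  # dB below reference
    if freq_hz >= 1000.0:
        return 0.0  # equal-loudness contours roughly converge near 1 kHz
    octaves_below_1k = math.log2(1000.0 / freq_hz)
    # Crude rule: the farther below 1 kHz and the quieter the volume,
    # the bigger the bass shelf.
    return min(attenuation * 0.25 * octaves_below_1k, max_boost)
```

At the reference volume everything stays flat; turn the volume down and the low bands pick up a shelf while 1 kHz stays untouched. A real version would push these numbers into the EQ block as fast as the Tesira's control channel allows.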
Holy cow. Again, kudos for the details. This has been a fantastic digression so far lol.
So I do strongly remember Sound Blaster cards, specifically of the SB16 variety, being jokingly referred to as “Noise Blasters” for quite some time, due to the horrible noise floor they had as well as all the hiss. One of the reasons I loved the AWE64 Gold was because Creative did manage to get that well under control by that point, along with other fixes introduced with DSP 4.16. I still have an AWE64 Gold in my collection, complete with the SPDIF bracket, that I will never sell, due to sentimental reasons.
The YMF724 card you mentioned… did that happen to have coaxial SPDIF perchance? I heard that, unlike the SPDIF implementation found on the AWE series cards from Creative, the YMF724 SPDIF carried all audio over it, even under DOS. Not just 44.1 kHz specific sound, which I believe Creative sourced from the EMU8k. Plus, as an added bonus, if your motherboard offered SBLINK (also known as PC/PCI), you could interface with the PCI sound card interrupts directly in DOS without memory-hogging TSRs.
As for my final sound card I ever owned before abandoning them, mine was the rather unique ESI Juli@ back in the 2011/2012 timeframe. I loved how the audio ports had a zany breakout cable for MIDI and RCA features, as well as the board that could flip around for different style jacks.
One other remark that leads to a question. Linux users back in the day had a penchant for choosing one audio API over the other in Linux, like ALSA, OSS, or PulseAudio. Did you play around much with these in the dog days of Linux?
For the YMF724: I really don't remember that part of it, but I'd like to think that if it had SPDIF built out that I really would have paid attention to that detail. The only reason I went through the horrors of using the cheap-at-every-expense CMI8738 Zoltrix card was to get SPDIF to feed an external DAC (and finally live in silence), and if the YMF724 I had included it then my memories would be shaped differently. :)
And I'm usually pretty good with model numbers, but it's possible that this card really didn't have one. Back then, I got a lot of hardware from an amazing shop that sold things in literal white boxes -- stuff that they'd buy in bulk from Taiwan or wherever and stock on the shelves in simple white boxes with a card (in a static bag) inside. No book, no driver disk.
These boxes had a description literally pasted onto them; sometimes black-and-white, and sometimes copied on one of those fancy new color copiers, sometimes with jumper settings if appropriate -- and sometimes without. Some of the parts were name-brand (I bought a Diamond SpeedStar V330 from there with its minty nVidia Riva128 -- that one had a color label), but other times they were approximately as generic as anything could ever be.
Or, I'd pick up stuff even cheaper from the Dayton Hamvention. There were huge quantities of astoundingly-cheap computer parts of questionable origin moving through that show.
But no, no SPDIF on that device that I recall. It may have been on the board as a JST or something, but if it was then I absolutely never used it.
I do remember that bit about the EMU8k's SPDIF output -- my CT3670 had that, too. IIRC it was TTL-level and not galvanically-isolated or protected in any way, on a 2-pin 0.1" header. IIRC, it didn't even have the 75 Ohm terminating resistor that should have been there. I was disappointed by the fact that it only output audio data from the EMU8k, since that part didn't handle PCM audio.
But! There was a software project way back then that abused the EMU8k to do it anyway: Load up the sample RAM with some PCM, and play it. Repeat over and over again with just the right timing (loading samples in advance, and clearing the ones that have been used), give it a device name, and bingo-bango: A person can play a high-latency MP3 over SPDIF on their SoundBlaster AWE-equivalent.
I was never able to make it work, but I sure did admire the hack value. :)
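If anyone's curious, the "load, play, repeat with just the right timing" part is classic double buffering. Here's a rough sketch of the shape of it; `upload_samples` and `trigger_loop` are hypothetical stand-ins for whatever the actual driver exposed, not a real EMU8k API:

```python
# Double-buffering sketch of the EMU8k PCM hack described above.
# `upload_samples` and `trigger_loop` are hypothetical callbacks standing
# in for the real driver interface.

def stream_pcm(pcm_chunks, upload_samples, trigger_loop, buffers=2):
    """Ping-pong PCM chunks through a small fixed pool of sample buffers.

    While one buffer plays, the next chunk is uploaded into the other --
    the same idea behind any streaming audio path, abused onto a synth chip.
    """
    slot = 0
    for chunk in pcm_chunks:
        upload_samples(slot, chunk)   # refill the idle buffer
        trigger_loop(slot)            # start/queue playback of that buffer
        slot = (slot + 1) % buffers   # alternate to the other buffer
```

The hard part in practice (and presumably why it was high-latency) is the timing: each upload has to finish before its buffer comes around again.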
That ESI Juli@ is a very clever bit of kit and I've not ever seen one before. I'm a bit in awe of the flexibility of it; the flip-card business is brilliant. There have got to be applications where that kind of thing could be used in the resurgent analog synth world.
It's very different, but for some reason it reminds me of the Lexicon Core 2 we used in a studio from 1999 until 2002 or so. This had its (sadly unbalanced) 4 inputs and 8 outputs on an external breakout box, and we gave it another 8 channels in each direction by plugging it into an ADAT machine. That was an odd configuration, and bouncing through the hardware reverb on the card was even odder.
The Core 2 did not work with the then-cutting-edge Athlon box we built for it and that was a real bummer -- we spent a lot of money on that rig, and I spent a ton of time troubleshooting it before giving up. (We then spent a lot more money replacing that board with a slotted Pentium 3.)
ALSA, OSS, PulseAudio: Yeah, all of those. I paid for OSS fairly early on, and that was also always very simple to make work -- and it did work great as long as a person only did one thing at a time. I really enjoyed the flexibility of ALSA -- it let me plug in things like software mixers, so I could hear a "ding" while I was playing an MP3. And I liked the network transparency of PulseAudio ("it's kind of like X, but for sound!") but nobody else really seemed interested in that aspect around that time.
If I had to pick just one as a favorite, it would definitely be OSS: The concept of one sound card with exactly one program that completely owned the hardware until it was done with it allowed for some very precise dealings, just like with MS-DOS. It felt familiar, plain, and robust.
So with regards to the YMF724... what you describe about how spartan the offering was to you doesn't surprise me in the least. That specific chip was pretty much offered to OEMs very cheaply by Yamaha, and a ton of Chinese card makers (mostly of the generic variety) snapped them up and implemented the chip however they saw fit. As for Yamaha, they apparently did produce their own branded soundcard based on a later revision of the YMF724 chip called the WaveForce WF-192XG.
There is some guy on YouTube who reviewed his specific generic YMF724-based sound card many, many years ago; it did have SPDIF on board and he seemed quite fond of it. Though later on, he retracted his initial recommendation due to how hard it is to find a YMF724 card built exactly like his. So to be honest, it's likely you didn't miss out after all.
Regarding the SPDIF implementation on the Creative AWE-series cards, those SPDIF brackets turned out to be very important due to their proper support for the TTL-level signal, yet back in the day, most users discarded it or lost them in the shuffle, making them exceedingly rare in the long run. If you are out shopping for any AWE32 or AWE64, good luck finding one with the bracket! Frankly, I wish Creative just slapped the coaxial SPDIF port on the card itself rather than on a separate bracket, though I suppose Creative wanted the world to know how they half-assed the implementation anyway. I digress. :)
That ESI Juli@ was the most interesting sound card I ever used, and I've not seen one like it since. I even recall ensuring that it had a dedicated PCI slot for it that did not connect over a PCI-to-PCIe bridge, in case that would introduce latency. Hilariously though, my needs were never particularly stringent. I was just being OCD.
For me? As far as Linux audio subsystems go, I preferred the ease of PulseAudio, even if it was rather buggy in its earlier days. I even played around with JACK many years later, but would go right back to PulseAudio, since it's truly set-it-and-forget-it in the current era... or I guess we default to pipewire now? I kinda stopped paying attention since audio is so seamless now.
That's a ton of good information about the YMF724.
You know, I don't think I ever played with the synth on that card at all. I was even a bit surprised earlier: I was trying to jog my memory about what the card looked like, and found the XG branding again (a quarter of a century after ignoring it the first time) and said to myself "Oh! That. I probably should have played around with that."
But MIDI music was never really my thing. I'm not much of a musician, and I found myself enjoying downloaded mod/669/s3m a lot more than any of the various notation-only formats. The games with revered synth work just never really crossed my radar. By the time I got into shooters, CD-ROM was definitely a well-entrenched concept -- along with PCM soundtracks. I still have the shareware Quake CD that I bought from a local record shop (which they only stocked because Trent Reznor did the soundtrack -- software wasn't their thing at all).
In the 1990s, I really hoped that SPDIF would become a common audio interface, with preamps and receivers and source devices (like computer sound cards) using it for IO. One cable. Perfect signal integrity using digital audio and affordable fiber optics -- at home! In a marketplace that was driven by buzzwords, it could have been a huge hit.
Instead: Even though it was common on things like MD, DCC, and unobtanium DAT gear, it was barely known amongst regular folks -- and receivers with digital IO didn't really become common until DVD.
But CRTs ruled during the peak DVD era, and many of those TVs had perfectly-adequate speakers for casual use that folks were content with. So the likelihood of them just happening to have two bits of kit in the same pile that could talk together with SPDIF was always very low, even then.
It seemed very much like a Catch-22: People didn't know about it because it was uncommon, and manufacturers didn't take it seriously because people didn't know enough about it to select gear that used it. It thus defaulted to remaining uncommon.
Its greatest market success seems to have been its utility in plugging a sound bar into a flat TV with terrible built-in speakers. Which is great, and all: It's a perfectly-cromulent use. But that started a decade or two too late.
I myself didn't own a CD player with an SPDIF output until 2012 or so, which is just bizarre in retrospect. What's even weirder is that it was an $8,000 Krell (that someone gave to me), and I found that I consistently preferred the sound of its internal DAC over that of anything else that I could connect it to digitally.... so I wound up never using the SPDIF outputs anyway. (But as experiences go, that's definitely in the realm of an outlier.)
Thanks. Of course, a quarter of a century or so after these went out of production, this isn’t exactly useful information, but fun nonetheless.
That’s the biggest issue with MIDI. No matter the equipment you had, you were never sure what the musician intended the composition to sound like, unless they explicitly mentioned the exact synth used in the metadata, like a Yamaha XG synth or a Roland SoundCanvas. I really appreciated how compact the file sizes were, but I can definitely understand sticking with PCM formats off audio CD or even WAV/AIFF/MP3 back then, depending on the application.
So, a possible fun tidbit about SPDIF. Coaxial SPDIF, despite seeming more old-school than its optical TOSLINK counterpart, could achieve higher bit depths and sample rates (sometimes up to 24-bit/192 kHz!!), whereas TOSLINK was officially limited to 16-bit/48 kHz, with manufacturers pushing as high as 24-bit/96 kHz off-spec. Perfectly fine for your average music enjoyer of the time, but still an interesting limitation.
On mention of DAT and MD, those were two formats I would have loved to get into, if they weren’t so compromised due to RIAA shenanigans or too pricey. Such is life I suppose.
Yeah I’d say overall, I haven’t touched SPDIF in a long while myself. My current TV uses an eARC over HDMI soundbar setup and my PC connects using good old fashioned 3.5mm audio jacks.
One neat thing about specifications like toslink is how flexible they are -- or perhaps, how arbitrary they are.
At the core, both coaxial and toslink were just transport mediums for the same SPDIF bitstream. One used copper, and the other used bendy plastic fiber optics.
And yeah: Toslink was more-limited on bandwidth, by specification.
And one would think that this would be because the optics are not so good (they're definitely not so great), or something.
But then: Alesis showed up with ADAT, and ADAT's Lightpipe could send 8 channels of 24-bit/48 kHz audio over one bog-standard Toslink.
They used different encoding, of course. Even at a very low level, rather than detecting a rising edge as 1 and a falling edge as 0, it detected any edge as 1 and a lack of an edge as 0. This did let them pack a lot more bits in.
But in doing that (and whatever else they did), they multiplied the functional bandwidth of a lowly Toslink cable by a factor of about 6 -- using the same optical components at each end that Toshiba sold, and the same Toslink cable from the big box store.
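That edge-based scheme (NRZI, roughly) is easy to demo. A toy sketch of just the coding idea, ignoring ADAT's real framing and sync bits:

```python
# Toy sketch of the "any edge is a 1, no edge is a 0" coding described
# above -- simplified NRZI, with none of ADAT's actual framing.

def nrzi_encode(bits, start_level=0):
    """Turn a bit sequence into line levels: toggle on 1, hold on 0."""
    level = start_level
    levels = []
    for b in bits:
        if b:
            level ^= 1    # a 1 is any transition
        levels.append(level)  # a 0 is the absence of one
    return levels

def nrzi_decode(levels, start_level=0):
    """Recover bits: an edge (level change) is 1, no edge is 0."""
    bits = []
    prev = start_level
    for level in levels:
        bits.append(1 if level != prev else 0)
        prev = level
    return bits

data = [1, 0, 1, 1, 0, 0, 1]
assert nrzi_decode(nrzi_encode(data)) == data
```

Part of why it packs more in: SPDIF's biphase-mark coding spends up to two line transitions per data bit, while a scheme like this spends at most one, so the same optical bandwidth carries more payload.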
I think we've beaten sound cards and SPDIF to death here. :)
It's been fun. Perhaps we can do this again some day.
I certainly have no delusions of Nvidia going bankrupt. In fact, they will certainly make it to the other side without much issue. That said, I do foresee Nvidia taking a reputational hit, with AMD and (possibly) Intel gaining more mindshare among consumers.