You can buy a used Nvidia Tesla P40 with 24GB from 2016, and unlike AMD cards, it still has CUDA support. The only thing it is missing is support for newer data types like FP8.
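For anyone wondering how to check this kind of thing: support for a data type generally tracks the GPU's CUDA compute capability. A minimal sketch (the `supports_fp8` helper is hypothetical; the capability numbers are real: the P40 is a Pascal part at 6.1, while FP8 tensor ops need Ada/Hopper-class hardware, 8.9 or newer):

```python
def supports_fp8(major: int, minor: int) -> bool:
    # FP8 tensor-core ops arrived with Ada/Hopper GPUs,
    # i.e. CUDA compute capability 8.9 and up.
    return (major, minor) >= (8, 9)

# Tesla P40 is Pascal, compute capability 6.1 -> no FP8
print(supports_fp8(6, 1))   # False
# RTX 4090 is Ada, compute capability 8.9 -> FP8 available
print(supports_fp8(8, 9))   # True
```

On a live system you'd get the tuple from `torch.cuda.get_device_capability(0)` instead of hard-coding it.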
I don't know if I'm in a parallel universe or hallucinating, but these are graphics cards, designed and created for people to install in their desktop PCs and run games, just as they have been doing for decades now.
What's with the P40, CUDA, and FP8? Seriously, people, chill. AI has been using graphics cards because that was the best available route at the time, not the other way around.
Otherwise I have to ask: why don't you talk about DisplayPort 2.1, DirectX 12, FidelityFX, and a dozen other graphics-related features?
Because unless you are a big corporation, you can't afford to buy H100s. Being able to run AI on consumer hardware is essential for many AI hackers out there. And no, renting hardware on "AI clouds" is not a reliable alternative: it is still outrageously expensive, and the rug can be pulled out from under you at any moment.
My rig is 4x4090, and it already cost me a fortune (still less than a single H100). I would have happily used cheaper AMD cards, but they are not a real option, for reasons like this.
Last time I checked, this site was called "Hacker News", not "Bigtech Corporate Employee News".
> I don't know if I'm in a parallel universe or hallucinating, but these are graphics cards, designed and created for people to install in their desktop PCs and run games.
Graphics cards have been used for more than running games since well before they became the general-purpose compute engines they have been ever since Nvidia radically reshaped the market, decades ago.
Heck, Nvidia has maintained multiple driver series for the same cards because of this, even before AI was a major driver of demand.