Nvidia GeForce GTX 980 Through GeForce RTX 5080/5090 GPU Compute Performance (phoronix.com)
27 points by mfiguiere 2 days ago | 15 comments





People always talk about GPUs being overpriced.

The 980 was $549 in 2014, which seems to be about $730 in today's dollars.

That means the 5080 at $999 is about 1.3x the price of the 980, yet the geometric mean score is 8.5x.

If you compare the 980 vs. the 5090, you get 2.7x the price but 14.3x perf.

I would imagine this must be pretty good compared to many other components. Well, CPUs maybe; not sure about memory or HDDs/SSDs.
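
A quick back-of-the-envelope sketch of that math (Python; the ~$730 inflation-adjusted figure and the geometric-mean ratios are just the numbers above, not freshly sourced):

    # Rough price-vs-performance comparison against the GTX 980.
    baseline_price = 730.0  # GTX 980's $549 (2014) in today's dollars, approx.

    cards = {
        # name: (current MSRP, perf vs. 980 per the Phoronix geometric mean)
        "RTX 5080": (999.0, 8.5),
        "RTX 5090": (1999.0, 14.3),
    }

    for name, (msrp, perf) in cards.items():
        price_ratio = msrp / baseline_price
        print(f"{name}: {price_ratio:.1f}x the price, {perf}x the perf, "
              f"~{perf / price_ratio:.1f}x more perf per dollar")

That works out to roughly 6.2x (5080) and 5.2x (5090) more compute per inflation-adjusted dollar, going by these benchmarks.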


> but 14.3x perf.

Back then GPUs were designed to maximize gaming performance (no CUDA cores etc.). Now they are general-purpose devices that can also be used for gaming. So it's a bit of an apples-to-oranges comparison.

Rasterization performance seems to be only about 3x higher (980 vs. 5090):

https://www.videocardbenchmark.net/high_end_gpus.html

Of course you still get additional features like ray tracing, DLSS, etc., which make the real gap much wider (but probably not 14x).


The 980 did come with CUDA cores; it was not only a raster card. CUDA's popularity back then pales in comparison to what it is today, but even the 9XX-series cards offered clear GPGPU capability.

> People always talk about GPUs being overpriced.

This is true, but only if you're exclusively considering the lower end or comparing with consoles (which, let's be honest, most people are). The 3060/4060 are $400 GPUs that promise the performance of a GTX 1080 Ti for half the MSRP (inflation-adjusted), but released 5-7 years later.

The problem starts with just how absurdly good the 3080 and 4090 objectively are. Normally, you only get a 30% generational uplift with the newest-gen hardware (and the RTX 5090 has indeed regressed to that 30% uplift mean, and that's partially why they had to add the extra memory); but the 3080 is a 100% uplift over the 2080 Ti (and 3070) and the 4090 is another 100% uplift on top of the 3080 (and 4070). And you will recall that the 3080 was not twice the price of the 3070, and the 4090 was not twice the (inflation-adjusted) price of the 3080.

Which is precisely the reason Nvidia is taking its sweet time coming out with what the gaming market actually wants, which is a 3080 for $400. You can expect such a card to appear in 2029; we already know for certain the 5060 Ti 8GB is just a 3070 (for the same non-inflation-adjusted MSRP, by the way), and following that pattern means the 6060 will be a 3070 for $400 (and the 6060 Ti a 3080 for $550), then the 7060 will finally match the 3080 at that price.
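
As a rough sanity check on that timeline, here's a toy compounding sketch (Python; the 30% per-generation figure and "3080 ≈ 2x a 3070" are taken from the comment above, and the two-year cadence is my own assumption):

    import math

    # If the x60-tier card sits at 3070 level today (the 5060 Ti claim above)
    # and each generation adds ~30% at the same price tier, how long until
    # that tier reaches 3080 level (~2x a 3070)?
    uplift_per_gen = 1.30
    target = 2.0
    gens = math.log(target) / math.log(uplift_per_gen)
    print(f"~{gens:.1f} generations of 30% uplift, "
          f"roughly {gens * 2:.0f} years at a two-year cadence")
    # -> ~2.6 generations, roughly 5 years -- the same ballpark as 2029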


This calculation does not account for advancements in technology. I guess you have seen those ads from the '90s showing high-performance computers costing $10k. Well, current computers have several orders of magnitude higher performance, yet we are not expecting the prices to be in the millions. Instead, we expect the technology to advance at its own pace while the prices keep up with inflation. The problem here is that Nvidia is effectively a monopoly in the high-performance GPU segment, so the prices are out of hand and supply doesn't match demand.

It still feels like more because of how the cost has risen for the person looking to buy that “median system.”

If I spent in the $200s, inflation-adjusted, on my processor 10 years ago, I can spend about the same today and expect to be in the same place on the totem pole as before. It would be a mid-range to low-mid-range CPU suitable for those sorts of tasks.

But if I spent $1,362 in today's money in 2013, I got the best graphics card available from Nvidia, the Titan X. Today the best card they sell costs $2,000.


I only heard this during one of the brief cryptocurrency crazes when GPUs became very valuable, and it seemed true, seeing how I made a profit selling an RX 580 I had bought new years before.

Wow, I hadn't realised the 4090 is twice as fast as the 3090 at Llama. I guess the VRAM focus had led me to believe they're closer.

When I was in grad school doing research on the still-brand-new "deep learning", having a GTX TITAN X was considered bragging rights; one lab even had a cluster of 4 of them! I remember complaining to my advisor about my paltry GTX 970.

Crazy that even the last-generation cards are 15x faster for ML tasks.


It's neat to see the 4080 Super doing well compared to the 5080 and 5090. It's clearly behind them in many cases, but it seems to have a better performance-per-watt ratio while also having some of the lowest temps. It's also interesting that the regular 4080 isn't too far behind the Super.

This was an enjoyable and clean dataset to read through, thanks Michael.


It's a shame they didn't test the Titan V; it was released in 2017 yet still has the best FP64 performance of all the Nvidia GPUs by a long shot.

Titan V wasn't included since I never ended up receiving any Titan V review sample back in the day. :/

I'd never seen this before, but it is interesting. For the uninitiated, its performance is weighted much more heavily toward that metric than the other cards'. However, it only beats the consumer GPUs; it looks like the datacenter and professional cards completely outclass it. Seems to be similar FLOPS per joule, though.

I'd bet that, like AVX-512 on Intel, they decided the money was in AI, not more traditional scientific computing.
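
If anyone wants to see that FP64 cliff on their own card, here's a minimal sketch (Python with PyTorch; the matrix size and the crude timing loop are my own choices, not how Phoronix benchmarks it):

    import time
    import torch

    def matmul_tflops(dtype, n=4096, iters=20):
        """Rough matmul throughput in TFLOPS for the given dtype."""
        a = torch.randn(n, n, device="cuda", dtype=dtype)
        b = torch.randn(n, n, device="cuda", dtype=dtype)
        torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(iters):
            _ = a @ b  # result discarded; we only care about throughput
        torch.cuda.synchronize()
        return (2 * n**3 * iters) / (time.perf_counter() - start) / 1e12

    print(f"FP32: {matmul_tflops(torch.float32):.1f} TFLOPS")
    print(f"FP64: {matmul_tflops(torch.float64):.1f} TFLOPS")

On a GeForce card you'd expect a huge FP32/FP64 gap; on a Titan V or the datacenter parts the two stay much closer, which is why it holds up so well on FP64-heavy tests.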


I assume this is for architectural reasons?

Did anyone ever investigate if newer drivers cause older GPUs to perform worse?


