The first yearly drop in average CPU performance in its 20 years of benchmarks (tomshardware.com)
149 points by LorenDB 41 days ago | 63 comments



I don’t believe these numbers mean what they think. Their sample size has dropped dramatically since the previous year, from 100k to 25k for laptops and from 186k to 48k for desktops. Given that all the data comes from people choosing to run the benchmark, I wonder what population has suddenly left the data, and if that is significant.

Also consider that the CPU is only one component of this benchmark. The article itself says that Windows 11 performance is worse than Windows 10. This might be another instance of “What Andy [Grove] giveth, Bill [Gates] taketh away.”


> Their sample size has dropped dramatically since the previous year, from 100k to 25k for laptops and from 186k to 48k for desktops.

I suspect that's just an effect of 2025 data being limited to just ~January, rather than a full 12 months.

If people run a benchmark only once every 4 months on average, that would certainly explain the sample size.


Yeah, good point. Or maybe it means people got slightly less-performant systems for Christmas than last year. Gates’ Law finally catching up with us all.


I think there are several factors at play here: “my current machine is good enough”, lower PC hardware sales, and indeed a bit of a shift to lower-power, quieter, smaller machines, plus CPU characteristics changing (with more e-cores).

Example: The mini-PC market has exploded, and people buying those are neither benchmark enthusiasts nor an active part of their readership.


> “What Andy [Grove] giveth, Bill [Gates] taketh away.”

Also known as https://en.wikipedia.org/wiki/Wirth%27s_law


Still, those sample sizes are huge


Sample size won't save you if you're sampling from a different population.


Sure, but we don't know.


If there's an artifact here it's probably seasonality.

E.g. a proper comparison would look at what the results were for just January over the past few years.

It's easy to imagine that benchmarks in January might sample more from cheaper Christmas present laptops, while the rest of the year has more powerful professional purchases spread throughout.

Or after Christmas upgrades, people get hand-me-down computers and disproportionately benchmark those to figure out what they're good for.

You can invent a million scenarios, but it's extremely plausible that January results could be an outlier every year.


Sample size cannot compensate for sampling bias.
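
A minimal sketch of why (in Python, with made-up score distributions): even a very large sample only tells you about whoever happens to submit results, not about PCs in general.

  import random

  random.seed(0)

  # Invented score distributions, purely illustrative.
  all_pcs      = [random.gauss(2500, 400) for _ in range(1_000_000)]  # the population we care about
  budget_heavy = [random.gauss(2200, 350) for _ in range(1_000_000)]  # e.g. skewed toward cheap new laptops

  def sample_mean(pop, n):
      return sum(random.sample(pop, n)) / n

  print(sample_mean(all_pcs, 100_000))      # ~2500
  print(sample_mean(budget_heavy, 25_000))  # ~2200 -- still biased, despite n = 25,000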


Could be several things, ranging from mundane to very concerning.

- People are downgrading their computers.

- Windows 11 is more bloated and slower, decreasing the test scores.

- All the security mitigations are making everything slower, but that has been masked by hardware improvements in the past. Now there isn't much of that left, so we start the slow descent to death.


It could also mean that consumers are buying more budget models and fewer pro ones. Given the cost-of-living crisis (at least in Europe), it would not be surprising.


It doesn’t even have to be a budgetary concern, it could mean that more people are choosing power efficiency over computing performance. Thin and light laptops are good enough for a lot these days, and they are still pricey.


It comes down to what the extra flops get you nowadays. Just as a video game from 5 years ago doesn't look very dated anymore, we need massive processing-power gains to notice anything, and even then only for very few uses.


What cost of living crisis is going on in Europe?


The combined effects of the pandemic and the war in Ukraine have led to a more severe inflationary shock than in the US, but with much worse economic growth and lower wage increases. The majority of households have cut their spending in recent years.

https://d3nkl3psvxxpe9.cloudfront.net/documents/Eurotrack_Co...


Interestingly, page 2 of the file doesn't seem to indicate a crisis. All the numbers are roughly stable, and some are actually improving (for example, fewer people struggling with housing costs in Germany and generally fewer people struggling in France). What sticks out is that more Germans expect a recession around the corner.


It would make sense if the benchmark were about smartphones, but laptops and desktop computers are already non-essential 'luxury items'.

If the poorer half of people can't afford a new computer, I would expect the benchmark to go up, because the rich people will still buy high-end computers while the lower half would drop out of the benchmark completely.

FWIW, I haven't upgraded my gaming PC since 2018, not because I can't afford an upgrade, but because the hardware has been "good enough" (still sporting an RTX 2070).

Also, the 'living cost crisis' in Europe has been quite dramatically overblown by Russian propaganda; at least we haven't frozen to death or eaten our pet hamsters yet ;)


> If the poorer half of people can't afford a new computer, I would expect the benchmark to go up, because the rich people will still buy high-end computers while the lower half would drop out of the benchmark completely.

Surely you can twist this 2-3 times more, can't you?

.. people have computers, because otherwise they die. If people have less money, they buy cheaper computers. Simple as that.


What I'm seeing in my wider family and circle of friends (outside dedicated 'computer nerds', of course) is that owning a (private) computer has become very unusual in the last 10 years. My nephew built himself a PC for gaming - but that's definitely a 'luxury item' because a game console would be much cheaper. And that's about it.

Everybody owns at least one smartphone though.


Why would consumers downgrade their systems?


Hardware failure. Desire for less power to extend battery life.


Don’t need top of the line. Just need to get things done. Save money, buy lower performance processor. Done and done.


- People buying new high-end computers are no longer benchmarking them (at least with PassMark), while people buying lower-end or second-hand computers still do so

I know I haven’t bothered on my last two computers, partly because CPU performance is so far past what I need for most workloads, and partly because for the rest, I care about actual workload rather than synthetic benchmarks.


That's because many new Windows PCs have Snapdragon ARM processors which are slower than x86 processors (but have much better battery life).


I got a Surface laptop with the Snapdragon. 32GB, fastest model.

In general use it feels way faster than my 1 year old Lenovo, which was 2x the price.

It's that the Lenovo one is specifically programmed to only clock up when the load is sustained for longer, whereas the Surface one clocks up much faster.

And still, it lasts 10 hours, while my Lenovo does 1.5 hours at most.


I feel like something's got to be weird with your Lenovo... I have a Lenovo Thinkpad T490 that's a little over 5 years old, use it on battery every day, never replaced the battery, and it still lasts around 3-4 hours.


I don't think so. I have a P16 Gen2 and this thing sucks power like nobody's business. 13950hk and 128gb of RAM. I barely get 1 hour of light programming work. I mostly use it as a desktop though, and I have a separate M2 Mac for traveling.


I believe H and HK (and HS?) are considered portable workstation chips. They draw about 55 watts compared to 15 or 28 watts at max load, and generally run much hotter as they have higher power ceilings. The U (and even lower than that) series are the “typical” laptop chips that give 5-10 hours or so of battery.

Caveat: apparently AMD workstation chips can give good battery life. Plus, a downclock/undervolt/power limit will give much nicer battery life on a workstation laptop if you can do it.


How much did you pay for the Surface? I just got a Lenovo P14s w/ 64GB and Ryzen 8840HS for around $1100. I haven't used the Surface Pro Snapdragon yet but this laptop screams. I think it mostly comes down to disk performance for me though.


Except that the Snapdragon X Elite is actually faster than both AMD and Intel chips on average.


It is faster only for the things that casual users do, which are not computation-intensive.

It is slower for scientific/technical computing or anything else that involves large amounts of operations on either arrays or big numbers.

Even for the things where the Qualcomm CPUs may be faster, their performance per dollar is inferior to the Intel/AMD CPUs.


I wonder if there are a couple of trends that skew this data:

- 3D V-Cache: Are the X3D processors overrepresented in this benchmark?

- Focus on Battery Life: the latest mobile processors nearly double battery life with minimal increase in multicore performance.

Overall, CPUs are specializing a bit more than in the past, and that may be impacting the scores.


They must be excluding Apple Mx series as that has had a very clear year-over-year increase recently.


Performance per watt continues to increase tho


I wonder how widespread the adoption of steam deck + clones has been amongst benchmark participants (very very good perf per watt, relatively middling absolute performance), that could explain a lot. Not sure where they would end up on the desktop vs laptop categorization.


For some time there was stagnation on the performance per watt metric. But ever since Apple dropped the M1 there has been a huge change.


For a little while before that, CPU performance was stagnant with Intel on top until AMD released Ryzen and Zen, and Intel got stuck on 14nm for half a decade. Suddenly AMD is posting substantial performance improvements every cycle and Intel is cranking up TDP to compete. Now we have competitive x86 processors from two different sources AND competitive ARM processors from two others.


Apple has unlimited money cheat codes. Buy newest/best node, move memory on package, ignore compatibility/longevity.


The companies that don’t buy out the first year-or-whatever of capacity on new nodes should still get the same year-over-year advancement, though. Just, with a small but constant delay.


Microsoft had/has the same, yet no results.


It's because developing countries are using more computers.


What also happened in the same time frame according to this website:

  - [1] 1366 x 768 was the strongest growing monitor resolution
  - [2] Dual- and Quad core CPUs went up, 6-16 Cores are down
  - [3] 4 GB and 8 GB of RAM went up, 16/32 GB fell
So it comes down to: more old (ancient?) machines in the dataset. Why? Unknown, but it probably doesn't indicate that the hardware people use in the real world (TM) has actually changed.

[1] https://www.pcbenchmarks.net/displays.html

[2] https://www.pcbenchmarks.net/number-of-cpu-cores.html

[3] https://www.memorybenchmark.net/amount-of-ram-installed.html

[from 3dcenter.org : https://www.3dcenter.org/news/news-des-12-februar-2025-0 [German]]
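
To illustrate the composition-shift point with a back-of-envelope sketch (all numbers below are invented): the overall average can fall even if every segment's hardware is exactly as fast as before.

  # Invented per-segment average CPU marks, held constant across both years.
  segment_score = {"old dual/quad core": 1500, "modern 6-16 core": 3500}

  # Invented share of submissions per segment, shifting toward older machines.
  mix_2024 = {"old dual/quad core": 0.30, "modern 6-16 core": 0.70}
  mix_2025 = {"old dual/quad core": 0.45, "modern 6-16 core": 0.55}

  def overall_average(mix):
      return sum(segment_score[seg] * share for seg, share in mix.items())

  print(overall_average(mix_2024))  # 2900.0
  print(overall_average(mix_2025))  # 2600.0 -- lower average, identical per-segment speeds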


So other than Windows constantly and actively slowing down machines, we have dust collecting which then causes CPUs to throttle sooner... I'd be interested to see what the trend in average scores looks like for machines that don't otherwise change over time, although I can't imagine anyone would run Passmark every day or every week for a few years.


Not only Windows but all the software that counters hardware-level vulnerabilities. I bet those tests don’t disable them.


That has got to be a data issue?

I can't see many, let alone a majority, downgrading.


Does this reflect multicore benchmark results or single-thread? For pondering the relevance to real-world app performance this seems central, as the vast majority of non-game apps can't make effective use of more than one or a few cores.
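
Amdahl's law puts a rough number on that; a quick sketch (the 80% parallel fraction is just an illustrative assumption):

  def amdahl_speedup(parallel_fraction, cores):
      """Upper bound on speedup when only part of the work parallelizes."""
      return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

  # Even an app that is 80% parallelizable tops out well below its core count.
  for cores in (2, 4, 8, 16, 64):
      print(cores, round(amdahl_speedup(0.8, cores), 2))
  # 2 -> 1.67, 4 -> 2.5, 8 -> 3.33, 16 -> 4.0, 64 -> 4.71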


Is anyone sad that power efficiency is now important? Apple M chips have wiped the floor with everyone else and it's about time we got good performance without heating the room up.


This is a naive take and a sensationalized, clickbait article. You cannot logically look at a graph and jump to conclusions without understanding the why, and without eliminating other factors like a compiler regression, compilers not generating optimal code for the latest CPUs, or power-related issues in the latest machines.

I am not making any claims, but the reasoning in the article takes a huge leap of faith to reach its conclusion.


Why wasn't that a factor in the last 20 years though? If anything, it's surprising that it happens so late considering that the "Free lunch is over" paper is 20 years old by now: https://www.cs.utexas.edu/~lin/cs380p/Free_Lunch.pdf


> Or maybe Windows 11 is depressing performance scores versus Windows 10, especially as people transition to it with the upcoming demise of the latter.

Surely the stats track that internally?

This "mystery" seems like it should be easily solvable with some further category breakdowns -- how is the mix of hardware changing, how is the mix of OSes changing, is the change happening in some countries but not others?
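
Something like this (a rough pandas sketch; the file and column names are hypothetical, not an actual PassMark export) would show where the drop actually lives:

  import pandas as pd

  # Hypothetical per-submission export; none of these names come from PassMark.
  df = pd.read_csv("passmark_submissions.csv")  # columns: year, os, form_factor, country, cpu_mark

  # Break the average down by OS, form factor, and country, year over year.
  print(df.groupby(["year", "os"])["cpu_mark"].median().unstack())
  print(df.groupby(["year", "form_factor"])["cpu_mark"].median().unstack())
  print(df.groupby(["year", "country"])["cpu_mark"].median().unstack())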


I wonder if we need an inflation-adjusted measure? Maybe this is CPU shrinkflation?


No, it's Wirth's law eventually overcoming Moore's law.


A good time to popularize the impact of Wirth's law on CPU performance, aside from Moore's law - shifting focus to lean software alongside faster xLP hacks.


Some questions that might help get at the reasons...

What are the subcategories?

What are the outliers?

What is the mix of the same machines from past benchmarking versus ones newly added to the pool?

Etc.


Are laptops really less than half the performance of desktops on average? I had assumed they were slower, but not by more than half.


People have weird beliefs about performance.

A laptop can sustain desktop-like performance for short periods (tens of seconds) before it must throttle back because of heat limits. For sustained maximum performance you need the ability to dissipate lots of power for extended periods.

A laptop that can dissipate 50 W is naturally going to have close to half the performance of a desktop expelling double that. For the most part, there are no technical differences between a laptop and a desktop processor other than the TDP target.

For most applications, though, this difference is irrelevant. You need to be doing something computationally expensive for an extended (10+ seconds) period for a desktop to be a sensible choice for performance reasons.
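
A toy model of that burst-then-throttle behaviour (not how PassMark scores anything; the wattages and the fixed "boost budget" standing in for heatsink thermal mass are invented):

  def toy_benchmark_score(sustained_watts, boost_watts=120.0, boost_budget_joules=2000.0, seconds=300):
      """Assume useful work is proportional to package power; the chip may boost
      above its sustained limit until a fixed energy budget is exhausted."""
      work, budget = 0.0, boost_budget_joules
      for _ in range(seconds):
          extra = min(max(boost_watts - sustained_watts, 0.0), budget)  # joules of boost this second
          budget -= extra
          work += sustained_watts + extra
      return work

  laptop = toy_benchmark_score(50)    # boosts briefly, then settles at 50 W
  desktop = toy_benchmark_score(100)  # barely has to back off at all
  print(laptop, desktop, laptop / desktop)  # roughly half, in line with the estimate above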


Most laptops are severely limited by heat dissipation, so it's normal that performance is much worse. The CPU cannot stay in turbo as long and must drop to lower frequencies sooner. On longer benchmarks the CPU starts throttling due to heat and becomes even slower.


Unless it's a MacBook, in which case it will demolish the vast majority of desktop PC processors. My 16-core AMD 9950X is put to shame by M4 mobile processors.


Then you do mostly or only things that do not fully use a 9950X.

For things that fully use a 9950X (i.e. anything that includes a lot of array operations or operations on big numbers), the throughput of a 9950X is many times higher than that of any M4.

A 9950X is not really optimal for a personal computer. The ideal configuration is a mini-PC or laptop as your personal computer (even for professional use, most of the time you only do things that require very little CPU power, e.g. reading or editing documents), together with a desktop computer with a 9950X that you use as a server, launching any compute-intensive tasks on it.


For my use case I need Linux, many threads, high memory bandwidth, and an Nvidia GPU.

The M4 has almost 5x the memory bandwidth of top-of-the-line DDR5 paired with a 9950X.

If Apple were selling standalone processors compatible with Linux, they would sell like hotcakes.


By this point, it feels like we ought to be benchmarking CPU+GPU together, the same way it already seems to be measuring multi-core.

Maybe the drop is some kind of artifact. But it would also be interesting if it started making more sense to invest more in improving GPUs, even at the cost of CPUs, for a large part of the market.


They need to look at GPUs instead of CPUs now.



