
Apple is the undisputed king of performance per watt, but they need to keep up in single-core performance. Sure, Intel draws a ton of power, but most tasks we do day to day are single-core, and the Intel i9-12900K is already 15-20% faster than the M1. The M2 will have a modest single-core boost (10%), but the next i9-13900K will be 15-20% faster than the 12900K and extend Intel's lead even further.



I think that's pretty poor speculation. We know nothing about the M2 or the i9-13900K yet. On top of that, responsible use of energy is exactly where we should be heading, not absolute single-threaded performance.

Also, if you've ever owned a top-end Intel MBP, you'll know how much pain Intel can inflict on you.


The 13900k will still be stuck on the current process node, so don't get your hopes up.

https://www.xda-developers.com/intel-13th-gen-raptor-lake/#i...


> On top of that, responsible use of energy is exactly where we should be heading,

I never really got this argument, especially when you're talking about a laptop that will happily draw 40W+ if you crank the display brightness. Arguing about the merits of a 7W CPU vs a 15W one sounds like people are missing the forest for the trees.


If you sell half a million CPUs, that's a lot of megawatts.
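
Back-of-envelope, with an assumed (not measured) average saving of ~20W per chip under load:

  # rough sketch; the 20 W per-chip saving is an assumption, not a benchmark
  chips = 500_000
  watts_saved_per_chip = 20
  megawatts = chips * watts_saved_per_chip / 1_000_000
  print(megawatts)  # 10.0 MW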


40W is way too high. https://www.notebookcheck.net/Dell-XPS-15-7590-OLED-Power-Co... shows 20W for max brightness on a laptop OLED display (or 6W at minimum brightness). For 13-inch laptops it's around 25% lower (less area). Also, laptop CPUs can pretty easily use 30W (the i7-1185G7 is an ultrabook-class CPU and can be configured up to 28W).

Edit: was your 40W figure for total power consumption? If so, going from a 20W CPU to a 10W CPU is still a 25% reduction in energy consumption (as well as a cooler lap and better battery life).
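
For what it's worth, the arithmetic behind that 25%, taking a 40W total at face value:

  # hypothetical totals, just to show the arithmetic
  total_before = 40                        # W, whole laptop
  total_after = 40 - (20 - 10)             # CPU drops from 20 W to 10 W -> 30 W total
  print(1 - total_after / total_before)    # 0.25, i.e. a 25% reduction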


Macs have long reserved ~50W of power for the display; whether all of it actually gets used is a different question. Giving them the benefit of the doubt (and accounting for screens that can max out at 1500 nits), I think an upper bound of 40W is pretty close to the actual figure. OLED will always pull less power since it doesn't need a backlight, in contrast to the thousands of mini-LEDs backlighting newer Macs.


The MacBook Pro has a 70 watt-hour battery and a 10-hour battery life (video playback). Also, if I'm reading correctly, the 1000-nit brightness is only achievable when plugged in, and Apple automatically reduces brightness on battery to keep power consumption lower.
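
Quick sanity check, taking the 70Wh / 10h figures above at face value:

  battery_wh = 70
  hours = 10
  print(battery_wh / hours)   # 7.0 W average draw

So in that scenario the entire machine, display included, averages around 7W.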


The so-called breakdown of Dennard scaling happened a little over 10 years ago. Since then, CPUs have been forced to keep portions of the die powered off at any given time (the 'dark silicon' problem) to avoid melting themselves. Each new generation requires a larger and larger area to be shut off.

Responsible use of energy directly leads to higher performance. You can restate this as: energy-inefficient software is the performance bottleneck on a modern CPU.
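
The textbook relation behind this (the standard CMOS dynamic-power rule of thumb, not something from the parent comment; leakage ignored):

  def dynamic_power(switched_capacitance, voltage, frequency):
      # P_dynamic ~ C * V^2 * f; once V stopped scaling with feature size,
      # power density rose each generation, hence the growing 'dark' area.
      return switched_capacitance * voltage ** 2 * frequency

With V more or less stuck, the remaining levers are frequency and how much of the die is switching at once.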


Well, unless you get throttled down and run at 20% of your capacity because the temperature is around 100 Celsius ;)


Or it throttles down because you unplug your laptop from the wall.


We still haven't seen a desktop Apple M-series CPU. And you know, maybe the 12900K is 15-20% faster in single core, but c'mon: it's a 250W-TDP CPU vs. something that doesn't even need active cooling.


> something that doesn't even need active cooling

Only the M1 Ultra seems to be comparable to the 12900K, so that's not a completely fair comparison. It still seems to be ~2-3 times more power efficient, though.


M1 Ultra is a desktop CPU, and it has a huge cooler on top of it.


> Sure, Intel draws a ton of power, but most tasks we do day to day are single-core, and the Intel i9-12900K is already 15-20% faster than the M1.

I have a separate heating system for my house, thank you very much :)


My i7-12700K idles at ~21C and, much like the M1 series, struggles to break 50C unless you're running a Cinebench loop. YMMV, but I think x86 has life in it yet if Intel can get these results with silicon that's less than half as dense.


> My i7-12700K idles at ~21C and, much like the M1 series

Yeah, but unlike the M1 series, you probably have a big hunk of copper on top of your CPU. It really isn't a fair comparison. Intel needs a miracle. Actually, we all need a miracle, because Apple just changed the whole industry, and I really hope they get some competition.


Intel just needs to stop using the podunk 10nm lithography they've been touting for the past 5 years or so. They plan to beat Apple's density by 2024, which will raise some interesting questions about how much efficiency they can recoup at this point. I don't think it's fair to declare a victor yet; we'll simply have to wait and see whether the reports of x86's death were grossly overstated.


You have a custom cooling system? Because I've seen that CPU measured at 150W... just the CPU.


... and the fact that the cooling works well enough to keep your CPU at 21C doesn't mean the heat isn't being dumped into your room, which drives up A/C costs in warm weather.

[Wish HN would let me edit for longer.]



