
Your laptop consumes 200 W? Maybe I'm not reading this right, but I have a modern 3.5 GHz/8-core server in the closet which runs a small army of VMs, and its from-the-wall draw never exceeds 80 W (about 35 W at idle).



My laptop is far from modern. It also has to power an old screen.


Wild guess - the disconnect here is that you're interpreting the maximum power rating of your laptop's power supply as an indication of what it typically draws.

It's more an indication of what your laptop would draw if it were doing GPU-accelerated deep neural net training while recharging a completely drained battery, plus a healthy margin of error just to be on the safe side.


200 W still seems really high. A 10+ year old MacBook Pro used an 85 W charger, and it rarely if ever reached that maximum. I find it hard to believe that even an old gaming laptop would require over twice that.


Can we please stop the guesswork in this thread, comparing gaming laptops to 10-year-old MacBooks, and look at the data? Just shy of those 10 years ago, in 2009/2010, mobile Nehalem processors were introduced, and more importantly, in early 2011, mobile Sandy Bridge arrived. Sandy Bridge was super powerful. How much more powerful?

2 years, 2 generations before Sandy Bridge: Core 2 Quad Q9000 (Q1 2009) -> i7 2820QM (Q1 2011) = +100%

Sandy Bridge to 6 generations later, 6.5 years later: i7 2820QM (Q1 2011) -> i7 8550U (Q3 2017) = +37%

Yes, I'm mixing tick-tock cycles and low-voltage parts with "standard" versions, so let's look at a more appropriate competitor:

Sandy Bridge to 5 generations later, 6 years later: i7 2820QM (Q1 2011) -> i7 7820HQ (Q1 2017) = +50%

So we had processors that doubled in performance in 2 years (and more than doubled if we disregard the not-so-common Core 2 Quads and compare against the then-common dual-core mobile processors). Yet over the following 6 years, there was only a 50% increase. Until Ryzen arrived recently and pushed Intel to quickly respond with Skylake-X, i9s and FINALLY, OH FINALLY, 6 (powerful) cores in laptops, there was no real performance reason to upgrade. Heat and thickness went down, but performance didn't go up by much.
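
To put those jumps on a common footing, here's a quick Python sketch converting each cumulative gain above into an annualized rate (the percentages are the ones quoted; real benchmark deltas vary by suite):

    # Annualize the cumulative performance gains quoted above.
    # The +100%, +37% and +50% figures come from this comment itself.
    def annualized(gain_pct, years):
        total = 1 + gain_pct / 100
        return (total ** (1 / years) - 1) * 100

    print(annualized(100, 2.0))  # Q9000 -> 2820QM: ~41% per year
    print(annualized(37, 6.5))   # 2820QM -> 8550U: ~5% per year
    print(annualized(50, 6.0))   # 2820QM -> 7820HQ: ~7% per year

Roughly 41% per year of compounding improvement collapsed to 5-7% per year after Sandy Bridge.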

So, around 2011, those were crazy times, when you could have:

45 W or 55 W CPU (i7 2820QM or 2920XM)

55 W GPU (Quadro 2000m)

two SATA drives

four DIMM modules

In a non-gaming, back-then-"standard"-size laptop (ThinkPad W520). And under heavy load, anything under Tjunction (say, 95 °C) was good enough. Some laptops with only a single fan rated to dissipate about 55 W (such as the ThinkPad W520) often throttled under combined CPU and GPU load and couldn't sustain Turbo while the GPU was fully loaded; others (such as the Dell Precision M6600) had two separate fans, and some even came with a Quadro 4000m GPU with a TDP of... 100 watts!

So, back then, those laptops ordinarily came with 170+ W power supplies (some even over 200 W), and the CPU+GPU alone could draw well over 100 W. Add the rest of the system (a hard drive here, a hard drive there, all four DIMM slots populated, the battery charging) and the total would easily exceed twice the draw of that 10-year-old laptop with the 85 W charger.
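
For a rough sanity check, here's a back-of-the-envelope tally in Python of that W520-class build under full load while charging. The CPU and GPU TDPs are the ones cited above; the rest-of-system, battery-charging, and adapter-efficiency figures are assumptions, not measurements:

    # Hypothetical wall-draw estimate for a 2011 workstation laptop.
    budget_watts = {
        "CPU (i7 2920XM, TDP)": 55,
        "GPU (Quadro 2000m, TDP)": 55,
        "drives, DIMMs, screen, rest (assumed)": 25,
        "battery charging (assumed)": 40,
    }
    dc_draw = sum(budget_watts.values())  # 175 W on the DC side
    wall_draw = dc_draw / 0.87            # assume ~87% adapter efficiency
    print(f"{dc_draw} W DC, ~{wall_draw:.0f} W at the wall")  # ~201 W

Which lands right around the 200 W mark those concrete-brick power supplies were rated for.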

And those are workstation models. As for gaming laptops, those sometimes had desktop CPUs in them and required 300 W PSUs. Nowadays they also use desktop GPUs and require TWO 300 W PSUs...

Please don't base observations on a 10-year-old Tick (probably Penryn?) with an 85 W charger, when 7-8 years ago a Tock (Sandy Bridge) came with literal concrete bricks for power supplies and really did pull 200 watts. Thankfully, things got dialed back after that, but at the cost of performance, until the current mobile 6-core i9s and Xeons arrived.

And as for the lap comments - you really didn't want to have those laptops in your lap when they were plugged in and turbo-ing...




