
The power consumption of some of these old workstations is obscene. There's probably some optimal intersection between cost of the computer and the cost of power consumption.

An older MacBook Pro (2012-ish) gives 12,000 on the MT benchmark and only draws what, 85W?

The author briefly touches on this, but unless you get free electricity, this is a bigger issue than presented.



Yes! I am shocked that almost no one seems to comment on power consumption for these units. Sure, they cost $200 while being just as fast as $700 machines, but they can easily pull $50 a month in power, assuming 25 cents per kWh.

Not only is that a waste of money for power, it is horrible for the environment. These machines are obscenely inefficient power-wise. It's like folks who buy an R720 for $150 that comes with 32GB of ECC RAM and dual CPUs, but it pulls $75 per month in power. Even if you don't pay for electricity directly (you do in rent then), it's still terrible for the environment.


While I agree with the premise, you're exaggerating far too much.

If an R720 pulls anywhere near $75 a month [1], you've got some sort of major problem. In reality, these things pull closer to 120 watts, up to around 180 on the higher end.

1. https://www.reddit.com/r/homelab/comments/7r90l8/average_pow...


He said dual CPUs, using the worst case of 27.5 cents per kWh in Hawaii: 2 * 0.18 kW * $0.275/kWh * 24 * 30.4 (days per month) = $72 per month, plus whatever the rest of the machine uses.

Though few people’s electric prices are that high, and it’s rare to sit at 100% CPU 24/7.
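For reference, here's that arithmetic as a tiny Python sketch. It's a rough estimate assuming a flat power draw; the 180W per CPU and 27.5 cents/kWh figures are the worst-case assumptions above, not measurements, and the second line's 150W idle at 12 cents/kWh is just my own illustrative guess.

    # Rough monthly electricity cost for a machine with a steady power draw.
    def monthly_cost(watts, cents_per_kwh, hours_per_day=24):
        kwh_per_month = watts / 1000 * hours_per_day * 30.4  # average days per month
        return kwh_per_month * cents_per_kwh / 100

    print(monthly_cost(2 * 180, 27.5))  # dual CPUs flat out at Hawaii rates: ~$72/month
    print(monthly_cost(150, 12))        # assumed typical idle draw and US rate: ~$13/month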


It's more like €25/month for our R720 at work (with 128 GB of DDR3 memory and dual Xeon E5-2690 v1) - if you run with the powersave governor in Linux and enable lower C-states in the BIOS. I'm not joking, the included iDRAC7 reliably shows 110-130W. Of course, if you're number crunching at 100% CPU you're more likely using 300W.


> Though few people’s electric prices are that high, and it’s rare to sit at 100% CPU 24/7.

While I agree, a rate similar to that isn't all that uncommon in, for instance, Europe, so from that perspective it's a relevant calculation. On the other hand, the prices of old hardware tend to vary between markets as well, so that would still impact the break-even point.


It costs a lot, yes, but is it bad for the environment?

How much energy and pollution goes into producing new hardware? I'm not so sure buying new is at all better for the environment.

That, and we'd have even more obsolete hardware to take care of in the future.


In a competitive industry, dollar price is a good approximation of energy usage, which is an approximation of pollution.


Too bad pollution is externalized to future generations or foreign countries, so not included in the economic equation.


OP is saying that there is pollution for both options, and the total cost of that pollution is proportional to the total cost of ownership of the product.

He doesn't provide evidence, but it seems at least plausible.


A lot of it depends what's happening to that Xeon machine if you don't buy it. If it's otherwise going to the landfill then you using it a little longer to defer the creation of brand new hardware is absolutely the greener option.

Same logic that applies to driving an older car— it may not be the most efficient, but especially if you're low usage, extending the life of something that already exists is almost certainly greener than using something new.


Yes, you'd hope so - but unfortunately that's not true. We're terrible at pricing externalities (by design, it makes it easier to profit; for example free use of public commons like fjords for farming salmon - creating ecological disasters).


I wish the people downvoting this guy would actually research this claim. Seems reasonable to me. Seeing a scatterplot of consumer-goods MSRPs vs. energy of production would be enlightening.


I would assume that they're being downvoted not for their claim (which I would not agree with, though I would love to see that graph), but for missing the point of the comment they responded to, which points out that even if older hardware consumes more electricity, it may still be less environmentally harmful than purchasing new hardware, which requires many resources to manufacture.


That would miss gowld's point, which seems to be that a good first-order approximation of the energy expended in the manufacture of a new machine is probably captured by a fraction of its retail price.

You would definitely not expect that a machine selling for $1000 would have used more than $1000 of energy in its manufacture. Even accounting for energy costs all the way down the chain (materials arguably don't have any cost other than the energy and attention involved in collecting/isolating/refining them), my uninformed guess would be that for something like a computer it's a bit less than half, with the rest being amortized human attention at various stages plus some profit margin, though I'm sure there are more refined and accurate models available.

So, throw out $480 as an energy number for our $1000. That means old hardware that's less energy efficient to the tune of $40 a month will out-impact the manufacturing cost of the new machine in a year.

As an exercise, contrast this with an older automobile. If you've got a well-functioning 10-20 year old vehicle, it's probably somewhere between 80% and 50% as energy efficient as the higher-efficiency choices you can buy new off a lot today. But the sticker price of a vehicle will tell you that it probably took $10-20k of energy to produce. Because that number is high, from today it will probably take your used vehicle longer than its remaining lifetime to exceed the energy use involved in making the new car (and most new vehicles won't get you where you're going any faster either).
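Put as a back-of-the-envelope sketch - every input here is one of my rough guesses above (the ~48% energy share of the sticker price, the $40/month efficiency gap, a $15k car price), not a measured figure:

    # Months until an old machine's extra running cost exceeds the (guessed)
    # embodied-energy share of a new machine's sticker price.
    def breakeven_months(new_price, energy_share, extra_cost_per_month):
        return new_price * energy_share / extra_cost_per_month

    print(breakeven_months(1000, 0.48, 40))   # computer guess: ~12 months
    print(breakeven_months(15000, 0.48, 40))  # car guess: ~180 months (~15 years)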


> You would definitely not expect that a machine selling for $1000 would have used more than $1000 of energy in its manufacture.

This is a misconception. Energy is not a transportable, fungible quantity the way dollars are. It is entirely possible that the device one buys for $1000 would require more than $1000 of energy to make, if manufactured in a modern economy with high environmental standards.

A major force in the global economy over the last three decades has been this imbalance in labor, energy, and environmental compliance costs. The $1000 retail price in the US does not contain the largely externalized costs that its place of manufacture may have permitted.


Buying items in highly competitive industries from far away places is the most efficient way to turn western dollars into pollution.

What the parent didn't enumerate is that the energy cost has to be calculated at the point of use. You can do this for, say, solar panels made in China. If you assume that 100% of the purchase price is translated into energy costs - zero physical resources used, etc. - and that 100% of what you pay went into energy from the nastiest sources, you can get an upper bound on the energy cost to create an item. In China, electricity is 2.5-5 cents US per kWh. Probably lower if you are in some direct-use-of-coal scenario. Using the lower bound, say a $100 solar panel made in China consumes energy costing 2.5 cents/kWh. It could have used at most 4 MWh of electricity.
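A quick sketch of that upper bound, assuming (unrealistically) that the entire purchase price went to electricity at the cheapest point-of-manufacture rate; the $1000 line is my own extrapolation of the same assumption to the machine discussed above:

    # Upper bound on the electricity that could be embodied in a product if
    # 100% of its price had gone to energy at the cheapest local rate.
    def max_energy_kwh(price_dollars, cents_per_kwh):
        return price_dollars / (cents_per_kwh / 100)

    print(max_energy_kwh(100, 2.5))   # $100 solar panel at 2.5 cents/kWh: 4000 kWh (4 MWh)
    print(max_energy_kwh(1000, 2.5))  # a $1000 machine under the same assumption: 40 MWh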


> Energy is not a transportable, fungible quantity the way dollars are.

You mean energy and dollars aren't both transmitted up/down wires or moved around with mass that represents stored potential?

> It is entirely possible that the device one buys for $1000 would require more than $1000 of energy to make

This.... doesn't sound like a business model that will last long. Can you give a concrete example?


Electrical power does not cross oceans. Dollars do. A factory in China running on a local coal plant's energy does not pay the same cost that a factory in Texas would pay.


If I go out and burn enough coal to release 1000 Joules of energy into the atmosphere, will that cost me the same as 1000 Joules of electricity consumed by my electric oven? Of course not, it would be orders of magnitude cheaper. You can’t make simplistic assumptions about the cost of energy in a product based on its final price because the costs of energy in different forms vary enormously - by many, many orders of magnitude.


The costs involved in the manufacture of a product at any single point in time are exactly what I can make assumptions about from the price of the product, nothing simplistic about it, given that prices are a result of the negotiation of a lot of details, including whether the optimal industrial input is some raw mass of coal that's burned in some managed way or Watt-seconds of directly supplied electricity. It's not going to be a perfect signal (demand matters, and any single estimated or registered price reflects a certain degree of imperfect judgment/optimization), but energy inputs are going to be a bounded factor.

If what you're saying is the actual energy expenditure may be what we're concerned about if we're speaking about environmental impacts and might be more properly modeled by something more complex, that's a worthwhile point. But it's going to be much less a matter of 1000J from a given mass of coal vs 1000 J of directly supplied electricity -- this factor will disappear behind whatever market allocations/optimizations are available -- and much more a matter of general industrial energy costs circa 1995 vs 2015, combined with the relative efficiency of manufacturing processes at both points in time.

What would we expect on those two fronts? Personally, I'd expect energy prices to rise with economic growth and occasionally fall with recessions, absent some large new source coming online or state-imposed costs for use. I'd also expect process efficiency to increase as well. Which would lead me to, again, see energy expended as reasonably estimated by some bounded factor of a final product price.


> So, throw out $480 as an energy number for our $1000. That means old hardware that's less energy efficient to the tune of $40 a month will out-impact the manufacturing cost of the new machine in a year.

That seems like an obscenely high difference in monthly energy cost (if we're going for an apples-to-apples comparison, in contrast with the article's posed comparison of a desktop workstation v. a consumer laptop). For reference, I run multiple desktops, multiple laptops, a full-size fridge, lights, fans, and an Echo, all mostly 24/7, and per PG&E my total monthly power bill (near SF) is less than that (and most of my hardware is on the older side).

We're more realistically talking (from my experience, running a lot of the sorts of older desktops the article mentions) a difference closer to $4 than $40. Even $10 (which would still be a pretty high estimate) would extend your estimate to 4 years until break-even.


Of the several objections to my comment people have registered, this seems like the best one: on reflection, an 85W laptop's daily use is likely to be around 1 kWh (maybe 2 kWh if driven near capacity 24 hrs), which is on the order of $5-$10/mo. So for things to come out something like I'd speculated, either older workstations would need to use much more power (5-10 kWh) or the manufacture of something new would have to involve much less energy than I'd guessed.


In addition to the sibling comment's point that externalities of energy production are unevenly distributed, the difference between an old machine and a new machine is that in the manufacture of the old machine the energy/pollution/labor have already long since been expended, and the amortization of that expenditure across more years of use is likely to compete well against even a very efficient newly manufactured piece of equipment. This is most dramatic in e.g. the purchase of cars, where it's typically more environmentally friendly to drive an older, even significantly less efficient car, than to purchase a new car.


It depends where you are.

In New York, about 60-65% of your electricity is emission-free, with most of the rest being gas. In Ohio or Kentucky, it's all coal and gas. You're probably emitting more in Kentucky with a laptop than you are in NY with a workstation.

The whole argument is tedious and obnoxious anyway, as the marginal negative impact of squeezing a couple of years out of an older device is overstated and minimal -- you'd be better off assuaging your guilt by taking a five-minute shower.


What about Perry, Davis-Besse, and Beaver Valley nuclear plants along with https://en.wikipedia.org/wiki/Wind_power_in_Ohio and hydro on Ohio river?


That claim assumes no externalities and no shenanigans with business models. I too would like to see a scatterplot, but I already expect GP's claim not to hold for phones, IoT, and anything bought used.

And that's also only the manufacturing part. The energy used when operating a device is not usually incorporated in purchase price.


Which means by buying these workstations used and on-the-cheap, I'm polluting less than if I were to buy brand new hardware.

Then there's the electricity cost. I own quite a few old workstations like this, and the power consumption ain't that much higher. Yeah, maybe compared to a laptop, but that's like comparing the feeding habits of a hummingbird v. an emu (and you can buy old laptops for relatively cheap, too).


I don't think that's true; we have serious issues pricing externalities correctly, and making sure the correct party pays for them. Way too often the bill for risk and cleanup lands on the (local) government/public - sometimes long after the profit has been made.

Common and extreme cases are pollution of the type we see in Nigeria, where global conglomerates are "allowed" to ignore safety standards by a corrupt government. E.g.: https://www.bbc.com/news/10313107

See also:

Michael Woodiwiss Gangster Capitalism: The United States and the Globalization of Organized Crime

https://www.amazon.com/Gangster-Capitalism-United-Globalizat...

Edit: and another example of problematic incentives is allowing power production for profit. We generally agree we need to use less energy, and yet have businesses that make more money when they can sell more energy... Sure, it might be a benefit to sell relatively more "green" energy - but really, the ideal energy company would make the most money when it got its customers to buy less energy...


All the electricity supplied to normal households in my city is from solar/wind/hydro power, so it's definitely better for the environment for me to use old hardware over buying new. It hurts the wallet, though.


Hydropower uses massive amounts of cement, and the cement industry is responsible for about 5% of all carbon emissions. Similarly, if you're counting the environmental impact of manufacturing the more energy-efficient computer, you must also count the environmental impact of the solar panels or wind turbines that do not have to be manufactured because of the increased energy efficiency.


Surely the one-time emissions in cement required to build a dam that lasts for decades amortizes down to nothing compared with the ongoing outputs associated with coal or gas fired electricity production? (not to mention the upfront emissions associated with constructing those facilities as well...)

Most of those global cement emissions are surely in building sidewalks, highways, bridges, and skyscrapers? Seems weird to blame an otherwise pretty green electricity source for this carbon.


But the argument for buying used hardware holds for existing infrastructure too: the hydro plant is already built.

I certainly agree that we need to factor in the impact of construction, though!


Instead you are hurting the global environment (not your local environment).

If you use an extra kW, the city doesn't get to sell the extra kW into the national grid. So the national grid needs an extra kW, which is most likely produced by gas!

In a nationally connected electricity grid, each extra kW you use increases the usage of the next marginal kW of power on the network, which is most likely gas (unless you use the kW during off-peak, when it might be nuclear depending on your country).


No. This is a fallacious misunderstanding of how markets work.

The hypothetical user of the older hardware is not damaging anyone else by consuming "green" electrons. Their demand provides a market for green projects, and the fact that there is non-green supply still available is simply an opportunity for new green supply to supplant it.

If the demand for green electricity is there, supply will appear, as long as it is economic to do so.


So you are suggesting there are two types of electricity markets: one for green electricity and another for, say, black electricity. Let's say their prices are in equilibrium.

So you increase your green electricity demand, and green power generation capacity is increased and more green electricity is made.

However the demand for black electricity hasn't decreased.

But you have created an arbitrage opportunity e.g. someone decreases their green electricity usage, and increases their black electricity usage.

Your fallacy is that you think it is possible to create two separate electricity markets (maybe separate grids, or strong regulation) for a good that is quite fungible.


For at least 10 years I have had a contract for "green energy" from my municipal supplier. They, in turn, contract to buy power from solar and wind suppliers to fulfill the consumption of those who are part of that program.

It does indeed work as I describe. And, because the wind and solar providers are among the lowest-cost providers at this point, "black" energy is losing market share quite quickly.

I understand electricity markets quite well and live and work in one of the most dynamic ones in N. America. This is how it works.


> If you use an extra kW, the city doesn't get to sell the extra kW into the national grid

It ain't guaranteed that the city would even be selling into the national grid in the first place.


I suppose the question is old desktop, old laptop, or old server. Depends on your compute needs.


And loud as well - servers aren't as optimized for noise as desktops are.

I use ThinkPads from Ebay - I get the high end model from 4-5 years ago. Coming off lease they can be in great condition often loaded with RAM and just need an SSD. The build quality is great, and they're cheap.


I looked into buying a server to harvest the CPU and RAM, then building a desktop machine using regular cooling components to keep it quiet.

I found the real problem of re-purposing a server is not the noise, but rather that the motherboards are completely unsuitable for desktop work. And buying a new workstation motherboard that would take the Xeon(s) and ECC RAM would make the build more expensive than just buying a new desktop machine with consumer-grade hardware.


Couldn't you just use the server MB in an (e)ATX case, and use a different heatsink/fan?


Those motherboards have nothing to do with ATX. The rackmountable "blades" are 19" wide and ~24" long. They use redundant custom "jet engine" PSUs with custom connectors. Even their PCI-E isn't standard. They are custom built for each generation of servers and mass produced.

Even higher end workstations like the various HP "Z" series and Proliant are custom.

(disclaimer: I have my (large, ~120 units) homelab/self_employed_datacenter in ATX rackmount 4U cases with watercooling. I am used to buying servers for the CPU and RAM as others do in this thread. I use consumer gear too, but it dies young under full C++/C embedded CI jobs 24/7. I use 4U cases because of the unbearable noise of the blades)


Can I ask what you're doing with that home lab, what your electric bill is, and how loud your racks are?


> what your electric bill is,

I am in Quebec, power is cheap to the point of this being irrelevant. Heating is also required a large part of the year (including until yesterday because the weather has been horrible so far this "summer"...). Plus, well, tax credit for business expenses make that "less than free".

> and how loud your racks are?

Watercooling and passive PSUs makes it silent. I hate noise.

> Can I ask what you're doing with that home lab

Beside all the usual services for a small business (phone, email, storage, backup, routing, etc.), it is mostly an oVirt (Red Hat Enterprise Virtualization) private cloud running Docker VMs used by the CI to compile C/C++, build embedded device firmware images and run tests. The extra-horsepower (and least power efficient) nodes are woken up using Wake-on-LAN and boot a template using PXE from the GlusterFS distributed NAS. They are shut down when the load goes down. Only the big 7U case with the double 140mm watercooling fans is on all the time. It is the "head" of the cloud cluster (but in theory is configured to auto-migrate in case of hardware failure to the dual 120mm 4U case, though honestly I never tried the automatic migration).

Not very pretty, but good enough. https://imgur.com/a/hnlInSz (yes, there is some overlap between those 2 pictures because I moved some units between taking them, some units have been traded for others too).


Like Elv13 already said: the form factor is totally different, so it won't work in any desktop enclosure or with a standard power supply.

But apart from that, server boards sometimes don't have a storage controller, no audio, and only a few USB 2 ports. Most have onboard VGA graphics that you can't turn off, or the BIOS won't support outputting video with a different graphics card.

And then there are usually few Windows drivers and you could get into heaps of issues with (the lack of) UEFI if you want to run a desktop version of Windows.

Server boards are custom built for datacenter applications and just aren't suitable for workstation use.


My 26-core Lenovo/Xeon workstation is quieter than my T450s. The former is at exactly the same noise level 100% of the time, regardless of whether it's idle or compiling code. My ThinkPad spins its fans up to a nice hum when compiling code.

Some rackmount server hardware is fairly quiet these days too. Variable-speed fan profiles and a cool room do wonders to keep the fan speeds fairly low. Probably the noisiest thing in my rack at home these days is the Ethernet switch.


I bought a house a few years back and quickly realized that leaving my computers on 24/7 was terrible for my electricity bill. Now I feel bad for doing this while living with my parents all those years.


Heh, as a kid I always wondered about this. My parents hated it if someone left the lights on, but my computer had a 600W power supply... how many bulbs is that!


That 600W is the maximum power it can supply.

You'll only come close to this figure when the PC is under full load (like gaming). But while idle, a (modern) PC draws about 30-50 watts.


Unless you were cranking everything to the max 24/7 (like maxing out your GPU while reading/writing to a database while doing a bunch of floating point calcs for something?), it probably idled at around 100 watts. Not great, but not something I'd worry about.


Like running SETI@home? https://setiathome.berkeley.edu/

I certainly had that running 24/7 when I was in high school.


If it makes you feel better, those older machines probably didn't have much in the way of power management to throttle back at idle.


What?

Mine has a ‘Turbo’ button!


About 100x 6-watt LED bulbs.

I now get upset when someone goes back inside to turn off a 6W LED switched bulb. It's not worth taking up a minute of 2 people's time.

At $6 per year, it may not be worth turning off at all. Those seconds to turn it off add up.
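Quick sanity check of that $6-per-year figure (the roughly 11 cents/kWh rate here is my assumption, not a quoted price):

    # A 6W LED left on 24/7 for a year, at an assumed 11 cents/kWh.
    watts, cents_per_kwh = 6, 11
    print(watts / 1000 * 24 * 365 * cents_per_kwh / 100)  # ~$5.80 per year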

But good luck arguing with someone that argues this specific wastefulness should trump all other wastes.


What matters more is whether flicking the switch wears out the components more than letting it run. The replacement cost matters more than the running electrical cost.


Opening the door probably costs more in energy (from AC or Heating losses), than the savings on turning off LEDs.


Often those switches are connected to multiple led bulbs, which can be 65w for a room.


I have been telling my family that for a while. It's going to be $3 to $6 a year if you NEVER turn off an LED bulb lamp.

If you forget, it is ok.


Though this is why I bought a 0.5W LED for a room that only the cat uses. Sure, it cost $4 instead of $1.50 for a 6W bulb, but it'll pay for itself.


The fact that LED bulbs are that cheap nowadays is insane to me.


The $1.50 may have been subsidized, but that’s just smart. Average electricity prices may be cheap, but marginal prices might be several times.

Saving has big systemic benefits.


Agreed. I have a couple of 9W LED lamps that are never switched off. It costs about $7 per lamp per year at $0.09/kWh.


Power consumption is somewhat irrelevant if you live in a place where there is hydro or solar power - eWaste recycling is a huge environmental problem [1], and manufacturing has its own environmental footprint [2].

Buying used hardware is great, but really we should also be probably optimizing for performance-per-watt even with older hardware.

IMO the real issue is that most people just don't need more grunt than a 5-year-old machine is giving them at this point. What they really need is a solid state disk in the old hardware, and for Office and Chromium you'd never notice a difference. I still use a SATA SSD daily and while I've had NVMe in my work-supplied machines, for 99% of my workload it's just not even necessary.

[1] https://eridirect.com/blog/2015/06/how-does-e-waste-affect-t...

[2] http://www.electronicstakeback.com/toxics-in-electronics/whe...


I have multiple R720s that idle at between 130 and 155 watts according to iDRAC. The system has 8+ SSDs, 10-gig fiber, InfiniBand, and 160GB of RAM. Under load it goes up, but typically it uses less than 200 watts.


I ran a Core 2 Duo for years as a server; it was more efficient than any of the newer stuff I could find. I just upgraded to an i7, which costs more in energy annually.

Granted it's faster and maybe less likely to break. http://cpuboss.com/cpus/Intel-Core2-Duo-E8400-vs-Intel-Core-...


It's probably also more energy-proportional. Newer CPUs and hardware in general do a much better job of scaling power use with demand. So if you're not running flat out 24x7, you're probably using less power in practice, even though your max may be higher now.


Good point, it certainly seems to run cooler.


And heat: I have some computers I just don't run for the hot three months of the year, and I live in a temperate climate. In the winter, though, they are nice room warmers.


You are probably seriously overestimating the actual power requirements. I have several servers running, some are 5+ year old Xeon towers, and they aren't costing anywhere near that.


How about you buy these machines and only run them in the winter? Heck, mine coins if you want! It's just going to heat the house!


Several years ago my friend did that to learn about cryptocurrency. He bought a bunch of GPUs off CL that were being dumped by people in favor of FPGAs (or whatever was the new hotness at the time for bitcoin). He had a bunch of server racks that he got for free when AMEX dumped 'em here locally. Also bought the mobos and ram off CL.

Then ran extension cords from all over his house into his garage to power the things, networked everything together, and played with it for 6 months, mostly over the winter.

He said he was able to break even (that is, mined enough coins - mainly other alt-coins, not bitcoin) to offset his costs in both electricity and what he spent on the hardware. Kept his garage closed and the door to his house open, and it heated things fairly well from what I recall.

In the end, though, he shut it down, because it was starting to go upside-down for him; I don't know what he did with the hardware (he ended up giving me one of the GPUs - so maybe he parted it out for friends to upgrade their systems?)...


What about the time investment? How can you take that into account in an unbiased (or least biased) kind of way?


Well, theoretically he did it because he genuinely wanted to spend that time learning, so the time investment shouldn't be factored in.

Same way that if I go to a movie, I don't typically think about it as costing me the price of a ticket and 2 hours of wages (although technically I could, I guess). I calculate time cost for things that I don't want to do.

If anything, I would use the time spent in the value calculation (I paid $13 for 2 hours of entertainment).


There are plenty of movies where I've come out thinking that I'll never get those 2 hours of my life back. I do apply this logic to commute time though. $salary - $commute = $takeHome. Some jobs are not worth it.

I also agree 100% that the time cost of something you're wanting to learn is a sunk cost. After all, it's an investment in yourself. Even if it fails, you now have that experience of what not to do if presented the chance again.


In the Netherlands you get a large part of your travel costs reimbursed by your employer. However large it is, there are legal upper limits to avoid "untaxed extra payment".

If you can travel by public transport, part of the time of your commute is akin to leisure time. But I have big issues with failing to focus, especially if I gotta switch transport multiple times or run to make it or it's crowded or...

My limit of commute is basically an hour, and if I don't get the costs reimbursed I simply do not take the job. It is bad enough that I lose that time as it is (as I argued above, it is hardly akin to leisure time).

As for the topic at hand, I've spent a good amount of time and money on e.g. old UNIX hardware (such as SGI Indy/Indigo 2/Octane, Sun Ultra 10, and some DEC Alpha machines) back at the start of this millennium. It was costly and bad for my electricity bills, and it took me a lot of time to play around with old platforms. I had a lot of fun though. And I'm not sure you can benchmark "fun". Nowadays, with solar energy on the rise, it might actually matter less to have these machines running. Except for in the summer. The additional heat would kill me.


Highly agreed on the commute time formula.

Learning to think about time spent on getting ready for work and commuting to and from work as a direct extension of my working hours changed the way I looked at jobs and approach salary negotiation. It's so easy for someone to ignore that cost if they haven't thought about it, and so hard once they have thought about it :)


"I learned how alt-coin mining works, first-hand" could easily be seen as a worthwhile experience.


Because if you normally heat your house through cheaper means (wood, natural gas, etc.) like most Americans, then it's still going to add a lot to your power bill.


But in the US, the time of year for heating your house with wood, natural gas, etc. is the cheapest time for electricity.


Do electricity prices vary throughout the year? I don't remember that being the case when I lived in the US, and it's not the case in Ontario. We have time-of-use pricing that swaps the mid-peak and on-peak rates, but that doesn't affect the total charge very much.


Utility companies offer a product of supply and demand. During the middle of the day in the summer in Texas, demand is at its peak as everyone runs their A/C full tilt. That's the most expensive unit of electricity, so the pricing reflects that. Everyone knows that you don't run your laundry/dishes during that time. In winter, most places are heating with gas, so electricity demand is just never as high and the prices are cheaper. Yes, some pricing options claim they are giving "free nights and weekends", or average billing that "lowers" the summer rates while "raising" the winter months to keep it on average the same per month. That doesn't actually change the rate per kWh at the time. It's like buying a car and the squeezing-the-balloon analogy: squeeze the price on one end and the numbers bulge somewhere else.


Only makes sense if you don't have a heat pump.


Not everyone has to use electricity for heating.


However, electricity at least has the potential to be generated by solar, wind, hydro or nuclear; most people in North America are heating with natural gas or even heating oil, which by definition can't get to carbon-neutral.


What if the electricity is generated via hydroelectric or solar?


> The power consumption of some of these old workstations is obscene

Absolutely this.

I work from home, and for around six years ran a company-provided refurbished Dell Precision (a T5400 I think) workstation with dual Xeon CPUs, SAS disks, etc. - this was around 2008. I think when purchased new this machine cost around GBP3K; we got it for around GBP1K. The thing was switched on for around ten to twelve hours a day and the power consumption was (I eventually realised) eye-watering.

The machine fortunately developed a fatal motherboard fault and I replaced it with a self build Core i5-4690K machine with SSDs and a modern graphics card. My electric bill halved and I got a way more powerful and energy efficient machine. It was quite a revelation.

These old machines may seem like bargains, but energy-wise they're just not economical to run (especially at UK domestic energy prices).


A few things in their defense though - newer workstations like the Z230 from HP generate much less heat and noise and have lower power consumption. They are not $200 cheap, but they cost a lot less than a loaded MBP for sure - even more so when bought refurbished or on eBay.

These workstations can be put in sleep mode with Windows or Linux and they wake up pretty quick - that saves you a lot of power by not needing to keep them always on. Wake-on-LAN is your friend.
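If you want to script the wake-up, sending the WoL magic packet is trivial; here's a minimal sketch (it assumes WoL is enabled in the BIOS/NIC, and the MAC address is a placeholder):

    # Send a Wake-on-LAN magic packet: 6 bytes of 0xFF followed by the
    # target MAC address repeated 16 times, broadcast over UDP.
    import socket

    def wake_on_lan(mac, broadcast="255.255.255.255", port=9):
        mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
        packet = b"\xff" * 6 + mac_bytes * 16
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            sock.sendto(packet, (broadcast, port))

    wake_on_lan("aa:bb:cc:dd:ee:ff")  # placeholder MAC for the workstation's NIC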

Next everything is easily replaceable which is great for the environment as these have much longer lifespans than a glued-in laptop.

Lastly the performance is just much better than a thermally constrained laptop.


Yeah. There are workstations and workstations. Why would the original poster's single Haswell- or Skylake-based Core i7 CPU in his workstation waste more energy than a single Skylake-based Core i7 CPU on a new gaming machine? Also, his storage and GPU seemed quite modest, so I don't think his HP is more wasteful than a new gaming PC.

Of course, 28 cores and eight spinning hard drives is a different story.


Reddit people with Kill-a-watt units indicate loads well below 200 watts for these workstations.

Heck, beasty Dell R710 2U servers appear to pull (in some cases, well) below 200 watts [1].

1. https://www.reddit.com/r/homelab/comments/7r90l8/average_pow...


Also consider that many Xeon workstation/server boards have limited sleep state support. Any newer i3/i5/i7 will enter S5 sleep and barely sip 1W, with instant-on wake. The time you save sitting waiting for the machine to start up (workstation/server boards are also insanely slow to POST) will be worth the investment in a newer setup.


Both HP and Dell workstations fully support S5.


In a desperate attempt to get me to take Xeon Phi seriously, Intel once sent me a free Xeon Phi server. I still have the thing, but because the boards were passively cooled, the server fans sounded like an F-14 (edited) trying to take off from an aircraft carrier when I booted the thing up. I ended up extracting the Xeon Phis and using them in consumer cases with a much quieter fan blowing on them. It also weighs a ton, but I digress...


To be pedantic, an F-15 is an Air Force plane and not designed for a carrier. An F-14 is what you are looking for (think Top Gun).


To be more pedantic, he said "trying to take off". I think an F-15 pilot who somehow found himself on a carrier deck and had to (try to) take off wouldn't hold anything back in the power/noise department. :)


To be extra, extra pedantic, how did the F-15 without arresting gear get on the carrier in the first place? Okay, enough fun.


Theoretically it could be carried by an aerial crane heli, like a Mil V-12 (20-25T), an empty F15 being in the 15T range.


Noted, edited... Time to buzz the server tower?


I had a Dell R815 quad Opteron server for a short while. Kept a pair of ear defenders on hand for power up time!


May I ask why Intel would send you a server for free, or what occupation comes with perks like this?


I worked in oil and gas (Halliburton) for 12 years.

The amount of stuff we got in "for review", "for test", "preview", etc. was simply amazing. Even pre-production gear a lot of the times. I found a pair of Tesla cards just sitting in a box in an office I cleaned out one day... and I know we got a system with some Phi cards in it when they came out.

The most interesting thing I ran into was when cleaning out a facility after a move, we found a Dell Itanium-1 box that not only did Dell not want back, they wouldn't even admit to making it in the first place... It ended up going home with one of our devs...

The nice thing about being a sysadmin was that we would get video cards and such from our developers who had just upgraded to the latest and greatest - and the stuff they were throwing out was only one or two years old... so our own desktop workstations built with cast-off parts were pretty nice.


That's pretty cool! I knew tech reviewers/writers get free stuff all the time but didn't know sysadmins do too. Thanks for sharing.


It wasn't really the sysadmins that got free stuff - it was department managers / tech leads, etc, that would get gear in for review to see if it fit with our workflow, processes, etc.

Us sysadmins just had to install/maintain it, and occasionally would "profit" when it was retired and the company/vendor didn't want it back.

Managed to build an entire multi-node NetApp cluster out of spare and retired parts one day when we were bored. Our NetApp rep said "I didn't see this, I don't know it's here, I don't know it exists, as far as I care it's a bunch of spare parts you just happened to put in a rack..." :D


It really depends. My home server uses about 120w idle (older series i7 with many disks). It comes out to about $25 a month in electricity.

That’s enough to where you should make sure it’s worth it, and ask yourself if you’re better off with a higher end NUC or even raspberry pi, which use orders of magnitude less power.

(We have solar so I don’t really worry about the power consumption anymore)


Wow, what a difference! My laptop shows about 2-3 watts at idle (brightness low, of course). What a huge difference in power consumption over the years. Do you think that's mostly the hard disks using the power?


My home server is basically my old laptop. It is a Dell Latitude E6540 and just sips power when not under heavy load.

With a 4-core i7 and 16 GB of RAM, it is more than enough for home use.

For storage, it has an M.2 SATA SSD for the root partition, and 2x 2TB hard drives in a ZFS RAID for the home and network share folders.


Pretty much. I had to stop running my VAXen 24/7 because they use a lot of power and my lab would get too warm.

That said, I have picked up "junk" computers with my neighbor's kids, got them running, and installed different operating systems on them. Fun stuff, and you don't worry about breaking it.


Yes, and it can be a double-edged sword. My home office isn't heated (long story), so if I want some cheap heat in winter I'll fire up the old server and just batch convert some video. That's a pretty efficient room heater for 300 watts of power! Conversely I never convert video in summer...


It's funny how there is nothing really wrong with this. Maybe you could put it to work doing some valuable tasks, like protein folding etc? That'd be really cool!


In the winter I let my PC search for Mersenne Primes when I'm not using it.


I remember seeing a few years ago a startup that made a panel heater that was really a computer that would mine Bitcoin or something.


Do you know how it would compare to something like a space-heater in terms of something like $ per degree heated per cubic meter?


Um, exactly the same? (By thermodynamics)


Yes, this 100%. It was fun (and kind of hilarious) to have an old Dell PowerEdge Server in your dorm room back in college, when power was free. (And, in my case, mostly generated by dams, so not that bad for the environment.)

Keeping one in your house would be ridiculous and loud, though. Mine actually required two power cables! I ended up pawning it off to a friend for $20, who pawned it off to one of his friends for $20, and so forth. I wonder where it is now.


Mine went to Goodwill after I graduated from my university. It was a beast in terms of how loud it was.

I was able to strike a deal with IT - they'd give me 2U of rack space in the Uni's datacenter and some subdomains in the Uni's .edu. In return, they got a forever grateful undergrad with too much time on his hands.


Why is everyone here focused on the desktop workstations?

I switched to laptops over a decade ago. Used 3-5 year old Zbooks are 1/10th of their original price, and still perform flawlessly (mind-blowing fact: Haswells can keep up with Skylakes, and even beat them with some overclocking+undervolting).


I agree; one of the reasons to even upgrade personal computers is to take advantage of the power efficiency. It's also a must-do for anyone who is planet-conscious.

Perhaps if the person who buys these old enterprise server workstations also has a renewable power source, then it would seem to be an overall win.



