Hacker News
I heat my home by mining crypto currencies (blog.haschek.at)
519 points by geek_at 2 days ago | 397 comments





1. At an individual/hobbyist level this is a fine thing to do, if one is already a cryptocurrency enthusiast.

2. At a social/planning level it is important to remember that opportunity costs matter, and this should not be recommended as public policy. One kWh of electricity could either be used for mining bitcoin (creating at most one kWh of indoor heat), or it could be used for running a heat pump[*], yielding multiple kWh of indoor heat due to the magic of heat pumps and their >100% "efficiency"[**].

[*] Yes, I realize the OP is already using a heat pump, but assisting it with preheat is inherently less efficient than plowing the same energy into running the heat pump itself.

[**] Yes, heat pumps aren't >100% efficient per a physics textbook, but what most people care about is that you can use them to get >1 joule of indoor heat from 1 joule of energy input, which feels magical when compared to electric heaters, gas furnaces etc.
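The opportunity-cost point can be sketched with arithmetic; the COP of 3 below is an assumed typical value for an air-source heat pump, not a figure from the thread:

```python
# Indoor heat delivered per 1 kWh of electricity, by heating method.
# COP (coefficient of performance) of 3 is an assumed typical value.
electricity_kwh = 1.0

resistive_heat = electricity_kwh * 1.0  # resistive heater: all input becomes heat
miner_heat = electricity_kwh * 1.0      # a crypto miner is a resistive heater with a side hustle
heat_pump_heat = electricity_kwh * 3.0  # heat pump with COP 3 moves extra heat indoors

print(resistive_heat, miner_heat, heat_pump_heat)  # 1.0 1.0 3.0
```

So spending the kWh on mining forfeits the extra ~2 kWh of heat the pump could have moved indoors.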


Definition of efficiency (from Google): the ratio of the useful work performed by a machine or in a process to the total energy expended or heat taken in.

Heat pumps are >100% efficient.

Usually, we cannot convert all input energy into useful work, so efficiency is always <100% because of energy losses, which produce waste heat. However, in the case of a heater, heat is the "useful work", so electric resistive heaters have an "impossible" 100% efficiency, while heat pumps have an even more "impossible" >100% efficiency.


I consider personal enrichment a sort of useful work, as it stands to replace my own 'useful work'.

Perhaps the parent meant useful with the caveat that "it also won't fuck up the climate, and assist in huge destruction and the killing of millions in the process of making me rich".

Which is so selfish of him. Why should the safety of billions matter over me becoming rich? /s


> or heat taken in.

Ambient heat is not "heat taken in", because you don't need to supply it. Usually, heat is taken out of the system and dissipates as ambient heat, so a heat pump can use the same heat again and again, without exhausting the ambient heat.

Of course you need to supply it; you wouldn't use one in Antarctica. (I'm no refrigerant expert, but I assume there's nothing that would work well enough to be worthwhile.)

The only reason I can think of for 'or heat taken in' in that definition is exactly to stop people claiming >100% efficiency for heat pumps, which is a corollary of perpetual motion, and equal nonsense.

> so a heat pump can use the same heat again and again, without exhausting of ambient heat.

That's just not true. They are heat interfaces: thermal energy is transferred from outside to a refrigerant, then to the inside (or to inside water, or whatever).

If this were true, they would have a closed tank of hot air as well as a closed refrigerant loop (as the vast majority do), so it would always be operating optimally in any environment, never running out of hot air.

The clue's in the name - pump! - they just move heat about, you wouldn't claim pumping water is >100% efficient because you're creating water out of nothing and you'll never run out of water in the infinitely massive lake you're pumping from, would you?

Put a box over it. Look at what goes into the box and what comes out, on all sides. Efficiency = Outs / Ins. It's a yield. A ratio of how much of what you put in you get back out in the form you want.


English is not my native language. Did I use the word "ambient" properly?

Assuming I understood your argument correctly, yes. :) (ambient foo = the foo all around; usually used w.r.t. temperature, e.g. food cooling to ambient temperature in the room) The air outside that is the input to your heat pump is already there, already at whatever temperature, that's your argument right?

So, yes, it is, but my point is that you can't ignore it as an input just because you don't have to order (and pay for) some air to supply. In calculating efficiency, you have to put an imaginary box around the heat pump, and look at what's going in, and what (that is the desired useful thing) is coming out.

If inputs don't cost you anything, that means you might not care about lower efficiency, but it doesn't mean that the efficiency isn't lower.


1) If we put a very small box around the heat pump, then we will see that the temperature in the box increases at a constant rate, because of energy losses at the pump. I.e., it will work as a resistive heater, converting 100% of input electricity into heat, which is the goal. Eventually, the heat pump will stop, because the goal is reached, thus we will have 0 input energy and 0 useful work.

2) To keep the pump running and keep the temperature at a steady level, we will need to leak heat at a constant rate to a second box. Thus, the pump will work at "impossible" 100% efficiency again: it will convert all input electricity into heat.

If we leak heat at a high rate, then the pump will not be able to keep up, but it will still work at 100% efficiency.

If we redirect some leaked heat from the second box back to the pump as input, so it will pump it back into the first box, then it will work at 100% efficiency + additional heat pumped from the second box. Moreover, the temperature in the second box will continue to increase at a constant rate, so it will reach equilibrium at some point again.

3) For the pump to work indefinitely at a steady rate, we need something to dump energy into, with infinite capacity. Also, we can use a fraction of this infinite capacity as infinite input into the system, thus the pump will work at 100% + additional heat.

IMHO, the first two systems are worthless, because they are doomed to reach equilibrium and stop at 0/0.


Totally agreed, but I wonder whether this applies in extremely cold conditions. Possibly the outside unit produces heat, but not all of that heat is supplied to the room.

A heat pump in a −273 °C space will convert electricity to heat at a 1:1 ratio.

If you count the heat taken from the space around the condenser, which you should based on Google's definition, heat pumps are only 100% efficient.

Calling heat pumps >100% efficient is like calling my solar water heater >100% efficient. True if you only count the electrical input, and not other sources of energy.


You will be pleased to know that the majority of heat pumps are installed in homes whose exteriors stay above 0 kelvin year round. Also, in what I assume is an efficiency feature, most heat pumps do not allow operation in a zero Kelvin environment.

Also, to nitpick: kelvins are not degrees and do not take a ° sign.


Damn, I've been looking for a zero Kelvin heat pump that works in a vacuum to heat my spherical cows.

Are you trying to liquefy vacuum? It's challenging. Try a heat pump based on the Casimir effect; fluctuating virtual particles violate energy conservation for short periods.

Maybe you wouldn't need to heat them so much if they weren't frictionless.

Gotta watch out for that Maxwell brand heat pump.

And, as a bonus efficiency feature, most heat pumps are installed with the condenser external to the home!

Yes, if you draw an imaginary box around the home and the outdoor/underground condenser, they're exactly 100% efficient, but if you only consider the inside of the home and ignore that the condenser is taking in heat energy from the outside, they're more than 100% efficient.


Anecdotally, I have come across hotel bar fridges running in entirely enclosed cupboards. Even the power cord exited the box with a rubber seal.

Nice attention to detail in physical craftsmanship though!


The power station and natural gas plants are outside my house too, so my resistive floor heaters and furnace are >100% efficient.

Just draw an imaginary line around my pocket book then. I pay for the energy I'm getting from the power station. Beyond capex, I don't pay for the energy I'm getting from the heat exchanger.

Did not realize kelvin is not a degree. I fixed my comment.

Good thing they wrote −273 °C and not "0 °K".

Pretty sure they edited it.

I was under the impression that HN indicated when a comment was edited. Maybe not and I got mixed up with elsewhere.

Edit: Apparently I'm completely wrong. Carry on and I'll eat my downvotes with aplomb.


This is incorrect. We measure the "efficiency" of a heat pump by the amount of heat moved into a space divided by the work energy provided to the pump (electricity in this case). This is called the "coefficient of performance". Typical heat pumps have a CoP of 3 or more. Meaning, you can spend 100W of electricity to pump 300W of heat into your home. A purely resistive heater only has a CoP of 1.
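As a minimal sketch, using the 100 W / 300 W numbers from the comment above:

```python
def cop(heat_delivered_w: float, electrical_input_w: float) -> float:
    """Coefficient of performance: heat moved into the space per unit of work supplied."""
    return heat_delivered_w / electrical_input_w

print(cop(300, 100))  # typical heat pump from the example: 3.0
print(cop(100, 100))  # purely resistive heater: 1.0
```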

OP is in Austria where it often gets below freezing.

Air/air heat pumps and ventilation heat exchangers tend to freeze up when it gets too cold. So heat pumps often need to run in reverse (to melt the ice on the outdoor coil) and heat exchangers need to pre-heat air with resistive heating to prevent ice build-up.

If he can pre-heat the air with a crypto miner that would be better than either of the above.


Consumer-grade heat pumps do not approach the theoretical thermodynamic efficiency limit, especially when the outside temperature is significantly below freezing, as that complicates the design. Plus, they do not last forever, and it takes a lot of energy to make them. So a hybrid design with an electrical pre-heater for really cold weather can be the least consuming in the total energy balance. But then one can just as well mine cryptocurrencies in the preheater, just as the article described.

Looks like even the OP, with their fancy own home, didn't actually install a heat pump in a way that makes it more than 100% efficient, so what about the large horde of people who rent and can't do a thing about how their house is heated? It's trivial to find out whether your heating system is actually efficient, and if it's not, why shouldn't you run crypto mining?

> didn't actually install a heat pump in a way that makes it more than 100% efficient

Or, more likely, they installed a heat pump of a suitable capacity for 80+% of the expected conditions, with a pragmatic "old school" pre-heater to make up when conditions are out near the end of the bell curve.

> why shouldn't you run crypto mining?

I think the key takeaway here is that any time you're using resistive electrical heating for air, crypto mining is a suitable alternative to just making resistive nichrome wire glow red. Whether the cost of 1000W worth of GPU cards compared to the cost of a cheap 1000W fan/column heater makes sense for you is a different question. Whether the payout in whatever crypto you're mining ever comes close to breaking even on the cost of the mining rig and its depreciation as it becomes more and more out of date is another question. If, like the article's author, you can build an effectively zero-cost mining rig from parts on hand, then it's almost certainly worthwhile (modulo what you value the time spent building/configuring/maintaining it at...)

(And there's the deeper ethical question of whether participating in a Proof Of Work cryptocurrency at all is just outrageously leveraging your personal heating requirements into a planet burning speculative ponzi scheme "asset" trashfire...)


I am renting a house and can't do a thing about how my house is heated. How do I find out if my heating system is efficient?

An alternative to using heat pumps is to use gas heating. Gas heating lets you achieve >100% "efficiency" relative to electric heating because you don't have to pay the price of converting the gas into electricity.

This section of this book:

https://www.withouthotair.com/c21/page_150.shtml

Has some calculations showing that using gas to generate electricity (in a modern gas-fired power station), which you then use to run heat pumps for heating, is overall more efficient than just burning the gas for heat, given real-world numbers. (It goes on to show the limitations of how much ground-source heat pumping you can do without freezing the ground - his ballpark numbers suggest maybe only 25% of the heating requirements of a typical British suburb could be met with ground-source heat pumps before you'd freeze the available ground...)
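The book's argument can be sketched numerically; the plant efficiency, grid losses, and COP below are illustrative assumptions roughly in line with MacKay's chapter, not exact figures from it:

```python
# Heat delivered per 1 kWh of gas, via two routes (all values assumed/illustrative).
gas_boiler_eff = 0.90  # condensing boiler: ~90% of the gas energy becomes indoor heat
plant_eff = 0.53       # modern combined-cycle gas power station
grid_eff = 0.93        # ~7% transmission/distribution loss
heat_pump_cop = 3.0    # assumed coefficient of performance

heat_via_boiler = 1.0 * gas_boiler_eff
heat_via_heat_pump = 1.0 * plant_eff * grid_eff * heat_pump_cop

print(round(heat_via_boiler, 2))     # 0.9 kWh of heat
print(round(heat_via_heat_pump, 2))  # 1.48 kWh of heat
```

Even after paying the conversion and transmission losses, the heat pump route comes out ahead with these numbers.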

(That whole chapter starting back on page 140, and the whole book - are well worth reading in my opinion.)


There is another non-trivial advantage of natural gas. Energy losses from sending it over pipes are smaller than electricity losses in transmission lines: https://www.eng-tips.com/viewthread.cfm?qid=76881

I did this in an apartment I lived in years ago, and do it now in my house.

My computer room in the house is the lowest point in the house, so naturally, it tends to be a couple degrees cooler than the upstairs. This means that if I set the thermostat so that the computer room is comfortable, the bedroom is too warm. If I run the house circulation fan constantly, it's not an issue, but that consumes a decent amount of power.

If I'm going to consume that much power, I might as well mine crypto and make a few dollars. Mining for NiceHash, a service that lets people rent hashing power from miners, I'll average $100-400/month worth of Bitcoin on my RTX 3080, depending on current hashing prices, and how much of my time I spend gaming, since mining is effectively paused while gaming. The mining happens while I use my computer with no noticeable loss in performance, and it puts a few watts of energy into the room to heat it up a bit, evening it out with the rest of the house.


I would really like to try NiceHash, but the fact that I have to add it as an exception to Windows Defender antivirus, and that one of the founders has previously been convicted [0] and served time for writing botnet malware, really gives me pause.

[0]: https://en.m.wikipedia.org/wiki/NiceHash


Windows Defender flags all crypto miners. It is not unique to NiceHash. You don't have to add an exception for the whole NiceHash directory for it to work, but it makes it easier. NiceHash updates the miners occasionally, and if you don't add an exception for the whole directory, then every time a miner gets an update, it's disabled by Windows Defender until you manually allow it.

Seems like there should be a way to have NiceHash sign their binaries or otherwise vet with Microsoft so Defender can automatically allow it. Setting an exception seems like it makes it a target for other malware to hijack or piggyback on.

A signed binary won't help.

Crypto miners fall into the "Potentially Unwanted Program" category, along with remote access software like a VNC server. They're not malware, but can be installed maliciously. If Windows Defender finds a crypto miner, it has no way to determine whether it was deliberately installed, so it flags it. Manually excepting the directory NiceHash installs the miners into is the only way around it.


I wonder if other actual malware now actively looks for that directory to opportunistically install itself into if available?

Yes this would be my concern!

I'd bet that writing botnet software was pretty good training for writing distributed processing software like NiceHash, though I think your concern is still valid.

Yes absolutely, I'm sure writing a botnet is not trivial so I respect the technical ability!

Nicehash (the app) is a convenience tool to manage miner executables -- it's mostly those that are flagged by AVs. You can connect most miner executables to Nicehash (the service) or other pools directly, and some are open-source.

For instance I GPU mine with https://github.com/ethereum-mining/ethminer and CPU mine with https://github.com/xmrig/xmrig, both built from source and I don't use Nicehash (the app). But I haven't looked at the source, so at some point I'm still trusting someone.


This is interesting, so I could run the NiceHash app (which doesn't require an AV exception) and have it mine using open source miners which I have built myself?

Not really - each miner (and versions of them) may have different command line arguments or monitoring APIs Nicehash wouldn't know about, so it's going to be more trouble than it's worth. I meant just starting them from the command line:

ethminer.exe -P stratum2+tcp://<your nicehash wallet address>@daggerhashimoto.usa.nicehash.com:3353


Possibly.

Depends on whether or not the heuristics scanning in Windows Defender would still flag the miners if you compiled it yourself.


Self-ad: check mine (which is also conveniently called Mine). https://losttech.software/Downloads/Mine/

I am considering adding a feature that would pause mining when an external sensor reports a high temperature. But you can already hack around that by creating a background window titled "DON'T MINE" and setting the tool to stop when it sees it.


I'll take a bit of your NiceHash and add some water [0] to that and make you something tasty. A bit more turnkey and a middleman if you don't mind.

[0]: https://www.753.plus/


NiceHash is on GitHub and you could always build it yourself if you’re worried. I think the reason it flags as malware is because virus scanners flag anything crypto mining-related as malware by default.

You're not actually mining Bitcoin with your GPU. I think nicehash automatically picks the most profitable coin to mine at the time and just pays out in bitcoin.

It hasn't been profitable to mine bitcoin with a GPU since 2013 or 2014.


No, NiceHash pays you in bitcoin for the specific hash rate that you sell. It's a hash marketplace denominated in BTC. NiceHash doesn't mine; its customers do.

Sure, but nobody is buying time on your GPU to mine Bitcoin. They are buying time on your GPU to mine some other cryptocurrency. One that uses a different hashing algorithm.

I'm well aware of all this.

Cool

Aren't you concerned with hardware degradation with your GPU running at 100% most of the time?

This is usually a non-concern. It's likely that the GPU will become obsolete before it fails from being run at 100% all the time.

The other question to ask, if you compile programs, write webpages, and edit Photoshop images: are you concerned about hardware degradation of your CPU if you run it at 100% all the time, for example, compiling Chrome (which takes 3+ hours), applying Photoshop filters to an image to get production-quality output, or rendering a 3-hour-long animated movie?

Do you worry about the GPU when you are retraining a GPT-2 or GPT-3 scale AI framework?

These questions are kind of ... dependent on each person. Presumably you buy a computer in order to perform computations. That is its purpose.

Crunching bitcoins/cryptocurrencies doesn't degrade it any more than, say, running an AI framework on a GPU for 3 months continuously to generate a self-driving model...


The flipside, if you have a CPU and you can't keep it pegged 100% of the time, are you wasting resources?

No, because a CPU isn't a consumable resource, like electricity or labor. You should occupy the CPU with all valuable work (for whatever definition of value) but keeping it pegged for no reason makes no sense, and just wastes electricity.

In a way, by not running useful work 100% of the time, you're wasting the resource of time.

If my phone or laptop battery dies on me I could lose a lot more time/money than by simply not wasting precious resources like battery charge. People buy devices for specific needs. If they can't meet those needs the purchase is a net negative.

I would argue that there is a lot less useful work available to run than there is spare CPU time available.

Otherwise, there would be a market for people to sell their spare CPU cycles.

The only entities selling CPU cycles are big cloud providers who have much higher utilization in general.


Its value depreciates over time though, as its market efficiency drops relative to newer cards.

Your return on the opportunity cost is maximized if you can do work closest to the time the card was acquired.

If you buy a card and leave it in the box for 10 years you have not 'consumed' the card, but you have wasted a few hundred dollars.

From an electricity and financial perspective if you have valuable work to do, your costs are minimized if you do that work closest to the purchase date.

This is obvious if you have a significant workload, but maybe not as obvious if you are just running CAD. Is the cost of a new card worth the time it saves? It's significantly more efficient if you can keep it busy.


No.

My GPU hovers at around 55-60 C while mining with the fans at 77%.

Years ago, I was running an AMD R9 290. THAT ran HOT. Even with fans at 100%, it would hover around 95 C and constantly fight with thermal throttling.

EDIT: And as someone else said, at these mining rates, it will pay for itself in 3-4 months.


At those rates, the GPU will likely pay for itself before it fails.

The main causes of degradation in electronics are usually material failure due to thermal stress (cycles of expansion and contraction weaken the electrical pathways, resistance goes up, that causes more stress, etc.) and electromigration due to over-voltage (the flux of electrons induces a shift in the position of the atoms it passes through).

Considering mining is a constant load (not much thermal stress) and improving efficiency requires lowering the voltage those aren't really concerns. There's far less risk than overclocking a GPU for gaming. The only electrical part that may be of concern would be the voltage regulation, but that's still an outlier as long as cooling is adequate.

Fans are known to fail because they're run at high speeds all the time, but they're a commodity.


Electromigration can happen because of constant high temperatures as well!

I was quite surprised by this, but when running NiceHash on my 2070S with no overclocking etc. it stays at ~60C, fans aren’t even audible, etc. And it generates the equivalent of $5 of BTC every day. Can’t complain.

The only thing that really degrades is fans.

I've been running a GPU for ~3 years at 75-82 °C.


Just limit power to maximum efficiency. My card mines ETH at 52 MH/s at 100% power and 49 MH/s at 55% power.

I have a 3090 and cheap-ish electricity. Is it actually worth doing GPU mining again? I thought that died out long ago.

The 3090 runs super hot. You probably want to look into a heatsink/fan for your vram.

https://www.reddit.com/r/EtherMining/comments/loxagu/improve...

https://www.tomshardware.com/how-to/optimize-your-gpu-for-et...


Not for Bitcoin.

Other crypto currencies can work better though.

https://www.nicehash.com/profitability-calculator/nvidia-rtx...

Claims you can earn ~$17 per day if you can get electricity at 10¢/kWh running a single RTX 3090, but that's all from mining Ethereum (DaggerHashimoto), not BTC.


Nvidia even offers dedicated mining GPUs: https://www.nvidia.com/en-us/cmp/

In related news: https://news.ycombinator.com/item?id=26192201


Mining bitcoin no, but other cryptocurrencies yes.

Sounds like with the run crypto is on right now, you can make some pretty decent money.


> If I'm going to consume that much power, I might as well mine crypto and make a few dollars. Mining for NiceHash, a service that lets people rent hashing power from miners

What's the benefit of this vs. just participating in a mining pool?


I'm not sure I get it either. Maybe just simplicity / path of least resistance in that NiceHash provides a quick way to get up and running potentially without needing to install mining software and decide on a pool to participate in etc?

https://www.nicehash.com/ got the HN hug of death!

What is the $ cost of the power?

About $20-25/month.

But it's not as if the cost of power is wasted. It's generating heat which I desire. If my GPU is consuming 300W, then that's 300W of heat that my heater doesn't have to generate.


Electric resistive heating is the worst form of heating, though - heat pumps deliver something like 3-4x the heat per kWh used. So not totally wasted, but a much worse way to get heat into your house, if that's your goal.

Fair enough, but my goal isn't to heat my whole house, just add a little bit to this one room.

My whole house is heated with gas anyways, which AFAIK is more efficient than a heat pump, or at least, is more cost-effective.


Cost effective yes, efficient no.

The efficiency of a heat pump heating a home depends on the temperature outside of the home. I imagine a gas furnace is far more reliable and efficient during times when the outside temperatures are near or below freezing.

I'm still struggling to understand this concept. Can someone explain it further?

Think of an inside-out refrigerator.

A fridge moves a certain amount of heat energy from the inside to the outside, but that doesn't come for free. You have to put extra energy into the system to make that happen. The hot side of the fridge gets the "moved" heat plus the heat from the work you had to do to move the heat. The work heat is waste.

If what you're trying to do is make something hot, you're moving heat energy from the outside to where you want it to be hot. That makes it hotter, but it requires work. In this case, since you want to make things hot anyway, that work heat is extra bonus heat.


Say you have a 1 kWh budget of energy to spend to heat a room and access to a variety of heating methods, such as a heat pump and an electric resistor (space heater or computer). Which one should you choose to maximize the increase in warmth? While the electric resistor will convert the energy it uses to heat with very high efficiency, the heat pump will usually heat the room even more, because it doesn't just locally convert electricity to heat; it moves heat from outside the room to inside the room. Further info in [1] and [2].

EDIT: great explanation/details in sibling comment by ’prutschman’.

[1] https://www.homecomfortexpertsinc.com/why-are-heat-pumps-mor...

[2] https://www.quora.com/Why-are-heat-pumps-more-efficient-than...


Think of the way your fridge works, but your house is the outside of the fridge and the ground is the inside of the fridge. That is, the heat pump uses electrical energy to move heat from the ground (more specifically a deep hole that is drilled) into your house. It takes less than 1J of electric energy to move 1J of heat, therefore one can claim an efficiency >100%.
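For intuition on why the ratio can exceed 1: the theoretical ceiling is the Carnot COP, which depends only on the two temperatures (a textbook formula, not something from the thread; real heat pumps reach only a fraction of it):

```python
def carnot_cop_heating(t_hot_c: float, t_cold_c: float) -> float:
    """Theoretical maximum heating COP: T_hot / (T_hot - T_cold), in kelvins."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return t_hot_k / (t_hot_k - t_cold_k)

print(round(carnot_cop_heating(21, 7), 1))    # mild day: ~21
print(round(carnot_cop_heating(21, -15), 1))  # deep freeze: ~8.2
```

Note the ceiling falls as the heat source gets colder, which is why heat pumps struggle in deep cold.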

You're still maximizing power generation demand. That makes environmental issues worse even if it does make fiscal sense. You aren't "using power that would go to waste anyway".

All power is JIT. It makes a difference.


> You aren't "using power that would go to waste anyway".

That's not what I said.

I'm not "using power that would go to waste anyways", I'm generating heat that I would have needed anyways.

In other words, I need to add a little bit of heat to this room. I can either run a small space heater on Low and consume about 300W and get nothing but 300W of heat, or I can mine crypto and consume around 300W, generate the heat I wanted, and earn about $200/month on top of it.

In either case, I'm consuming 300W. I might as well make money in the process.


Is there any cost analysis for using solar panel power for crypto?

I am installing solar from Tesla, and wonder if I should just use the excess power generated for crypto rather than sending it back to PG&E.


Since you can make money mining crypto even while paying for power, and the power company pays you less than they charge, of course using it for crypto makes financial sense.

$0.10/kWh @ 300W = $0.1/kWh * 0.3 kW * 720 hrs/mo = $21.60/month
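Generalizing that arithmetic (the 300 W and $0.10/kWh are the comment's own numbers; the $200 payout is an assumed figure for illustration):

```python
def monthly_power_cost(watts: float, price_per_kwh: float, hours: float = 720) -> float:
    """Electricity cost of running a rig continuously for a ~30-day month."""
    return price_per_kwh * (watts / 1000) * hours

cost = round(monthly_power_cost(300, 0.10), 2)
print(cost)  # 21.6

revenue = 200.0  # assumed monthly mining payout
print(round(revenue - cost, 2))  # 178.4, and the 300 W of heat came "free"
```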

Years ago when mining with a single gaming card was more common and viable, I lived in a rented room in a house owned by a live-in landlord who controlled the heating, and wouldn't turn it on for very long in the winter. Since electricity was included in my (fixed) rent, I was able to keep my room a few degrees higher than the rest of the house.

However I must point out that you shouldn't do this in most circumstances, there are environmentally friendly ways of heating your house.


In countries like France [1], electric heating is the most environmentally friendly, at 40 g CO2/kWh, while gas (the cheapest and most common) is >250.

[1] https://www.electricitymap.org/map
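Using the comment's own figures (40 g CO2/kWh for French electricity, >250 for gas), plus an assumed 90% boiler efficiency and an assumed COP of 3, the grams of CO2 per kWh of delivered heat work out roughly as:

```python
elec_intensity = 40.0  # g CO2 per kWh of electricity (comment's figure for France)
gas_intensity = 250.0  # g CO2 per kWh of gas burned (comment's lower bound)
boiler_eff = 0.9       # assumed condensing-boiler efficiency
heat_pump_cop = 3.0    # assumed coefficient of performance

resistive_g = elec_intensity / 1.0            # 40 g per kWh of heat
gas_g = gas_intensity / boiler_eff            # ~278 g per kWh of heat
heat_pump_g = elec_intensity / heat_pump_cop  # ~13 g per kWh of heat

print(resistive_g, round(gas_g), round(heat_pump_g, 1))
```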


The average carbon intensity of the French grid may be in the 40 g range, but if you take into account when heating is actually used, it is more like 80 g CO2/kWh. In any case, a heat pump is always better.

http://www.carbone4.com/wp-content/uploads/2020/06/Publicati...


France's electricity is largely generated by nuclear power which has minimal CO2 emissions.

True, but very misleading.

When you add a marginal kWh of electricity usage, how is that extra 1 kWh generated?

If France runs its nuclear power at nearly 100% utilisation, then if you add 1 kWh usage then that power comes from another source, which could be a high CO2 emitter.

The same goes for hydroelectricity: if all the hydroelectric power is already being utilised (no spillway), then you cannot claim that your electric car is being charged by hydro even if your country has a large percentage of it. There are complications when lakes are involved because whether your power usage is green or not often depends upon future inflows (lots of future rain = green; no future rain = dirty generation in future when lakes get low).


This is an interesting topic - I agree with your interpretation, but to add another wrinkle, I live in the UK which has a decent but not amazing proportion of renewable energy now. I pay a little extra to my supplier to provide me with "100% renewable electricity" - meaning that for every customer on that tariff, they total up the power usage and buy at least that much renewable energy from the grid. I assume that this puts some upwards price pressure on renewable energy compared to non renewable, but how much? Presumably every extra watt I use doesn't result in a whole extra watt of renewable capacity being added, but what's the conversion - 5%? 50%? I don't even know where to start answering that question.

I always figured those programs would just lead to people not on the green plans buying less green energy. Though I guess if you got a critical mass of users on the plan, it could have that upward pressure. I wonder if any studies have been done? It seems that green energy is used as much as it can be, since solar/wind/geo all have low marginal costs once built, and gas/coal/oil are the ones manually turned on/off based on total demand. That would lead me to believe buying green power has no effect, but this is all conjecture.

"100% renewable electricity" obviously doesn't change where the electricity you actually use comes from. We certainly don't want the lights to go out and our oven to stop working just because it's a cold still night and all the hydro was used up. You would need to check the supplier's fine print very carefully if you actually decide you care what this means. The consumer magazine "Which" have attempted that for you if you subscribe, but basically buy from "Ecotricity" or maybe "Good Energy".

The most cynical thing your supplier could do in the UK is buy REGOs (Renewable Energy Guarantee of Origin). At that point the 100% Renewable claim would be legal, without any other effort. If that's all your supplier does it isn't worth a penny, we'll get to why below, but if they charge extra for the tariff based on REGOs it's pure profit for them I assure you.

A better thing they could do is arrange to buy in bulk from a renewable generator or (more practical for larger outfits) just own the renewable generator. This means their interests would at least somewhat line up with yours - if their renewable energy generators are cheaper they get to keep more of your money while offering attractive prices.

But back to REGOs. So, when you make renewable electricity in the UK you get REGOs, and you're allowed to sell them. You could sell your electricity with the REGOs, here you go, 100% renewable electricity. Or, you could snip the REGOs off, sell those to anybody who wants them, and sell the electricity as just electricity, which people wanted anyway.

If "100% renewable" was in very high demand this wouldn't be a problem, a handful of more expensive suppliers would bid for the REGOs, and this would create pressure to deploy more renewable power.

But in the UK the vast majority of households use the incumbent supplier. What does "incumbent" mean here? Well, historically the UK had regional monopoly suppliers who both billed consumers and handled the last mile distribution infrastructure that actually means electricity works in your house. But the Conservative party believes strongly in Free Market principles, even where there's no evidence they would help. So, it privatised the industry, giving away national infrastructure for a song and creating dozens of private companies that notionally can compete to supply electricity. Except of course your actual supply is the same as ever, there is still a monopoly last mile supplier, it just doesn't deal with consumers. What had been the regional suppliers were now private companies that had "grandfathered in" millions of residential customers in their region, but were free to compete to "supply" customers anywhere.

All these companies immediately offered somewhat lower prices to anybody who'd switch, and began raising the prices for those who didn't switch. Switching is annoying (despite efforts to make it simpler it cannot be entirely painless) and so most people never switch. Sounds like you have, so immediately you're not the usual case. They also began aggressive (indeed sometimes outright illegal) campaigns to keep "their" customers and prevent defection to rivals.

So today the situation is that essentially every company advertising their prices to you offers 100% Renewable Energy, because they're buying REGOs for customers like you (and me). They can get REGOs very cheaply because they don't need very many because there are so few customers like us.

And the vast majority of households aren't on any of those supply contracts, they have a "legacy" contract that's more expensive. And so in practice when it's cold and dark the coal and gas power stations are cranked up exactly the same but on paper we can blame those millions of people not all of us with our cheaper 100% Renewable contracts...

The climate is not so easily fooled.


Thanks for the in-depth answer - pretty much what I suspected. I think this sort of thing only even starts to work when you have a critical mass of people on this kind of tariff (and more than that, willing to pay more and/or switch supplier for it). I wonder what proportion of the UK market is currently on a "100% renewable" plan.

Your entire assumption is that French nuclear is at 100% utilization. While utilization percentages aren't publicly available, from this graph on Wikipedia about TWh produced [1] you can see that nuclear has upside capacity and was probably never 100% utilized (it's the main baseload, thus the biggest buffer option).

So the assumption that any marginal utilization is non-nuclear is flawed.

[1] https://en.wikipedia.org/wiki/Energy_in_France#/media/File:F...


Rather than an unrelated graph on Wikipedia, try looking at for example electricityMap::

https://www.electricitymap.org/zone/FR

What you'll see there is as you'd expect if you understand how fission power plants work and how economics works for electricity supply. The nuclear plants may not be running at what is notionally 100% nameplate power output but they are not in fact being used as a "buffer option" very much.

That big yellow-green splodge on the "Origin 24 hours" chart? That's nuclear power. Varying, maybe +/- 2 GW, but nowhere close to enough to offset France's varying power requirements over the course of a day.

France's (far fewer) gas turbines are much more able to spin up and down quickly to benefit from transient utilization, and so they, together with the interconnects possible due to France's relatively central location (the UK to the west, Germany to the east, Spain to the south), allow it to manage well on most days without tinkering with the power output of the fission generators.


I am glad I will be able to put to rest this common misconception about nuclear power's ramp-up speed. While it's a common rebuttal argument against nuclear, it's incorrect: nuclear power is actually able to ramp up fast enough. Moreover, French nuclear stations are typically older than they should be, and newer ones could ramp even faster. But still better than Germany's.

Anyway, enough empty talk; here is a real-world use case from Western Europe on 19 March 2019, a Sunday with lots of wind. I should translate the whole thing for everyone one day.

https://mobile.twitter.com/tristankamin/status/1102620969808...


All I see there is consumption %s. I think your thoughts about marginal power usage seem reductive.

Of course, immediately your marginal power will come from gas turbines. However, if power usage increases overall, wouldn't sustained usage force an increase in the nuclear baseload? The overall GW output of nuclear has varied substantially YoY.


> All I see there is consumption %s

The section you apparently didn't scroll down to, "Origin of electricity in the last 24 hours" shows you exactly what I described.

The chart you've offered is denominated in TWh, thus energy not power, because it is cumulative over an entire year rather than showing marginal power. So you end up concluding that if a nuclear power station is closed for two months for repairs, or a new one is brought online those somehow constitute energy "flexibility".


I thought it was obvious I was using 100% as an unrealistic example.

1. Have a quick look at https://www.electricitymap.org/zone/FR and scroll down to “Electricity production in the last 24 hours” on the left. Natural gas was used from about 7pm to 5am. If you heat your house in France using crypto during those hours, you are using up to 100% non-renewable peaker gas (ignoring hydro because it is even more complicated as per the previous comment).

2. France exports a lot of electricity, so it is quite possible for your marginal increase in usage within France to cause a decrease in exported electricity, which leads to a marginal increase in non-renewables in another country... i.e. your extra load causes an increase in world CO2 production.

I think you are making the same mistake that I am trying to illustrate... understanding marginal usage is difficult and most people jump to conclusions that are not factual.


Even in France a heat pump will give you at least twice the heat per gram of CO2.

Heat pumps need energy to operate. Hence at low temperatures the best combo is electricity+heat-pump. So cryptomining+heat-pump is exactly the same, minus the e-waste.

Don't heat pumps work like a fridge/AC but backwards? That is, compressing the air inside to generate heat, then decompressing it outside where it gets extremely cold and is warmed up by the relatively hotter outside air?

If so I fail to see how cryptomining features in there.


You're right, the compression step is mechanical. Heat pumps are the most carbon-efficient above ~0°C, so in most parts of the globe electrical heat pumps are the most environment-friendly. Thanks, I edited my first comment.

Yes they do. They usually have classic resistive heating elements for when the outside temperatures drop too low for the heatpump to be efficient.

Obviously, he meant that the heatpump mined the cryo out of outside cold air… cryomining, cryptomining, same thing, right ?

Heat pump is 3x more efficient at heating than any electric resistive heat.

At least in Germany you can also buy electricity that is 100% renewable at 0 g CO2/kWh. It also does not really cost much more (though we generally have pretty high electricity prices because of taxes on electricity).

They don't run an extra cable though, they just buy credits and if you're lucky the profit doesn't go to the parent fossil energy company. It's still better than old energy firms but it's not as if the coal plants run any less hard when you turn on your bitcoin farm with a green contract instead of a fossil one. It's just a matter of where profits flow, and also some public perception. (E.g. Climeworks says that individual customers help show investors that that's a market for it, which helps them do more good than purely the CO2 you pay them to remove.)

Regarding this "fossil parent" thing btw, list of checked companies: https://www.robinwood.de/oekostromreport (I'm with Green City Power because they don't only go the easy hydro route but also build out solar -- iirc, it has been a while). Switching is a matter of signing up with the new provider. Nobody needs to come by, power doesn't go out at all, you just tell them what the meter says on date X and all is good.


I know that's what you're paying for, but I wonder what the marginal effects are. I.e. am I paying for a 100% renewable energy that otherwise would have been sold to a customer not on a renewable plan? If I use more of the 100% renewable energy, does that increase the amount of non-renewable energy that must be generated for other people's consumption? I have no idea how grids work.

I'm in a similar situation, and I'm just using an electric heater... No fancy mining setup.

But I'm also thinking, maybe actually this is for the better? (Not only for the stingy landlord, but for the environment).

I mean, even if a heat pump is much better than an electric heater, heating only the rooms that need it, instead of the kitchen, bathrooms, living room (plus the rooms of flatmates who might not be bothered), might actually use less energy in total than a more efficient mechanism that heats the whole home.


I'm not sure if they're common where you are, but this is what Thermostatic Radiator Valves are used for. You can set an approximate room temperature, and the valve will open/close based on that. There are also smart ones now which can have a schedule applied to them, e.g. don't heat my bedroom during the day.

Despite my precarious housing situation at the time, I also own a big house in the country, over 150 years old. The main form of heating it originally was 2 m tall ceramic furnaces, always placed so that they could heat two rooms at a time. You put material to burn in from either side, and after getting hot the furnace would keep heating the rooms for hours, and all it takes is a few logs of wood (which you later have to regrow anyway). I wonder how many bitcoins that is in terms of environmental impact.

0.5 kg of CO2 per kWh is the average for electric grids around the world.

1 kg of wood produces around 2 kg of CO2, and every 400 g of firewood gives you 1 kWh, so 0.8 kg CO2/kWh.

That’s slightly worse than mining. Bitcoin is wasteful in its total energy consumption which is not reused, but 99% of the power you put into that GPU will be released as heat.
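The arithmetic above can be checked in a few lines; a back-of-envelope sketch, where all the emission factors are the figures claimed in this thread, not authoritative numbers:

```python
# Back-of-envelope check of the wood-vs-grid comparison above.
# All constants are the thread's claimed figures, not official data.
grid_co2_per_kwh = 0.5       # kg CO2/kWh, claimed world grid average
co2_per_kg_wood = 2.0        # kg CO2 released per kg of wood burned (claimed)
kwh_per_kg_wood = 1 / 0.4    # 400 g of firewood ~ 1 kWh of heat (claimed)

wood_co2_per_kwh = co2_per_kg_wood / kwh_per_kg_wood
print(f"wood: {wood_co2_per_kwh:.2f} kg CO2/kWh")   # 0.80
print(f"grid: {grid_co2_per_kwh:.2f} kg CO2/kWh")   # 0.50
```

So on these assumed numbers, firewood comes out at 0.8 kg CO2/kWh against a 0.5 kg/kWh grid average.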


Difference being, if he's using wood he's recycling that carbon indefinitely, as long as it isn't from old-growth forests.

The wood I use was going to a landfill, so does using aircon actually equate to more CO2 in the atmosphere if I just let it get dumped and turn on the aircon?

The concept makes sense, but I'm not sure the numbers work out. Heat pumps are pretty efficient, something like 300 to 400%. Are you really going to use less than 25 to 33% of your living area and also not provide any heat at all to the other areas?

Also, with forced air heat pumps, you can usually adjust the registers (air vents) to reduce the flow of air to unused rooms.


> However I must point out that you shouldn't do this in most circumstances, there are environmentally friendly ways of heating your house.

What aspect do you think could be friendlier to the environment? The rare metals and production emissions in high-tech electronics? Or the fact that it's using electricity (as opposed to more insulation)?


Your electricity likely comes from burning coal. Coal pollutes quite badly.

Other ways to heat a dwelling are more efficient. For example, natural gas heaters burn cleaner, and the energy losses are much smaller than electricity's conversion losses (by the time you produce a BTU of heat from electricity, several BTUs of energy have been lost along the way, while burning gas is much closer to perfect efficiency, though obviously not perfect). Or if you can use co-generation, you essentially produce no noticeable additional environmental impact at all. Or you could go the other route and invest in insulation or a molten-salt wall for passive heating.


But electricity isn't inherently dirty, right? It's probably better to have the network shift to greener sources than have every consumer do that individually.

Right. If you use solar panels you are definitely better off. The question is whether you are or not and for the time being most places still don’t use renewable energy sources.

Apart from what others mentioned, proper insulation is the best way to keep your house warm, and it is missing from many houses.

I mean the OP found an environmentally friendly and economic way of heating his house.

I'm not sure I understand the principle.

A heat pump achieves a COP (coefficient of performance) of approx. 3-4 (e.g. by investing 1 kWh of electricity, a heat pump generates 3-4 kWh of heat by extracting 2-3 kWh from the surroundings, air or brine). In this example, by pre-heating the air, you supply the heat pump with ~0.9 kWh of thermal energy (the miner will convert 900 watts directly to heat, I would assume). So instead of 1 kWh of electricity consumption for the heat pump, you have 1 kWh for the heat pump plus 0.9 kWh for the mining, and you end up with a bit more than 3-4 kWh (since the COP of a heat pump increases if the source temperature is higher).

So in a nutshell: before: 1 kWh in, 3-4 kWh out; after: 1.9 kWh in, 3.5-4.5 kWh out.

so you lose 0.4 kWh?
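A quick sketch of that before/after comparison; the COP figures are illustrative assumptions from this thread, not measurements:

```python
# Rough sketch of the preheating comparison above (illustrative COPs).
def heat_output(electric_kwh: float, cop: float) -> float:
    """Heat delivered for a given electrical input at a given COP."""
    return electric_kwh * cop

before_in = 1.0
before_out = heat_output(before_in, 3.5)     # heat pump alone, COP ~3.5

miner_kwh = 0.9                              # miner draw, ~all released as heat
after_in = 1.0 + miner_kwh
after_out = heat_output(1.0, 4.0) + miner_kwh  # warmer intake bumps COP a bit

print(before_out / before_in)   # 3.5 kWh of heat per kWh of electricity
print(after_out / after_in)     # ~2.6 -- less heat per kWh overall
```

So the combined setup delivers a bit more total heat, but noticeably less heat per kWh of electricity, which is the "you lose ~0.4 kWh" point.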


In principle, yes. There are quirks - the COP is different for lower temperatures, and becomes 1:1 at around 5F (-15C). So preheating the air could improve the effectiveness at low temperatures, and I am guessing it might look like this:

- before: 1 kWh in, 1 kWh out
- after: 1.9 kWh in, >1.9 kWh out

However, if it were that simple, I suppose heat pump manufacturers would include pre-heating as a built-in feature.

It's very likely that he reduced the energy consumption of the heat pump by 50%, but at the same time he uses more than those 50% for mining and has a negative total result that is offset by the profit from the mining itself. Which is probably nice for him, but not really for the environment :)


> There are quirks - the COP is different for lower temperatures, and becomes 1:1 at around 5F (-15C).

It depends on the unit. Some (Mitsubishi FE12NA) have a COP of 1.75 even at -10F / -20C. See Table 6:

* https://www.nrel.gov/docs/fy11osti/52175.pdf


The refrigerant used mainly influences these numbers.

check chart on: https://www.researchgate.net/figure/Coefficient-of-performan...


If I recall my thermodynamics classes correctly, the heater would optimally be placed on the hot effluent out of the heat exchanger (HE) going into the house. This is because the COP is improved (similarly to heating the cold side): the hot side of the HE doesn't need to be as hot to reach the same temperature, and the HE doesn't need to move that heat through it, increasing efficiency. (COP decreases with increasing heat flux [Q] in practice.) For well-mixed air in a house (a poor assumption), this is the same as throwing the miners in a closet. I would suggest the author move the miners to the hot side of the HE going into the house's rooms. Simulation or measurements (over the course of a week, not just instantaneous) would be helpful here.

If I were a HVAC company with WiFi thermostats, I would look into including miners in heating solutions.


Heat pumps have thermostats so they do not run at max all the time.

So if the heat pump without the miner uses 1 kW to heat to room temperature, it would need less than 1 kW with the miner to heat to the same room temperature. Plus, as you say, the COP increases.

So it could be that 5 degC to 22 degC requires 3 kW but the heat pump can do it with 1 kW (COP=3).

The miner uses 0.9 kW to heat the outside air to 11 degC. It would take 2.1 kW to heat from that to 22 degC, but the heat pump now has a COP of 4, so it can do it with about 0.5 kW.

So you lose 0.4 kW but gain bitcoins.

Or you could say that you get 2.25 times as many BTC for the same electricity cost.
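Spelling that calculation out (all numbers are the illustrative ones above; the 2.25 figure comes from rounding 0.525 kW down to 0.5 kW):

```python
# The worked example above, spelled out. All values are illustrative.
demand_kw = 3.0       # heat needed to hold 22 degC
cop_cold = 3.0        # heat pump COP with 5 degC intake air
cop_warm = 4.0        # COP with miner-preheated 11 degC intake
miner_kw = 0.9        # mining draw, released as heat

pump_alone = demand_kw / cop_cold                        # 1.0 kW of electricity
combined = miner_kw + (demand_kw - miner_kw) / cop_warm  # 0.9 + 2.1/4 = 1.425 kW
extra = combined - pump_alone                            # ~0.425 kW buys the mining

# Spend the same total the other way (heat pump alone, plus separate mining
# with the leftover 'extra' kW) and you only get ~0.425 kW of mining instead
# of 0.9, so combining yields roughly 2.1x the mining for the same spend
# (2.25x with the comment's rounding).
ratio = miner_kw / extra
print(round(combined, 3), round(ratio, 2))
```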


I wonder if we could make it even better and build smart heaters that, instead of wasting calculations, actually do something useful, like folding proteins or some other computing load that doesn't require huge network bandwidth and can be easily distributed (something like SETI@home but actually useful).

In densely populated cities we could bury huge datacenters underground and use that energy for heating directly, and use excess to produce electricity.


Another one: Yandex (disclosure: the company I work for) is using one of its datacenters to heat the entire town it is in.

Proof: https://helsinkismart.fi/case/waste-not-want-not-data-center...


Interesting. Never ran across someone who worked for Yandex. Overall are you happy with your employer?

Absolutely! By any definition, the pinnacle place to work for in Russia.

Currently a Facebook datacenter in Denmark is supplying around 10,000 homes with heating.

You can argue that Facebook isn't actually doing anything useful, but still, it's better than wasting the generated heat.


I don’t think any rational person argues that Facebook isn’t doing anything useful at all, lots of people value the connectivity and ability to share photos during lockdown.

Facebook has a good and bad side just like humans do. For example, if Facebook got rid of the ability to post news, it would be a much better place.


The questions are twofold: does Facebook do more good than bad? Could the good be accomplished without much of the bad? The answers to both questions are arguably bad for Facebook.

But then why is there no serious competitor?

>> no serious competitor?

There are plenty. There is no mirror corporation doing exactly the same thing under a different name. But there is no mirror to Microsoft, Google, Apple or any other large tech corp either. These are corporations backed by ironclad IP laws, meaning nobody can ever play on exactly the same field. For something like facebook, the competitor is all things not facebook. Every time you share a news story via SMS, you are competing with facebook. Every time you send a message via email rather than via facebook, you are competing with facebook. And every time you visit a store's own website rather than their facebook page, you deny facebook a tiny bit of the world. That is the serious competitor.


The competition for Facebook is owning the social graph as a means to fuel advertising profits. Google+ failed, sure. Instagram would have been eating Mark's lunch right now if he didn't buy them, and he knows it.

Imagine a world where Facebook didn’t buy WhatsApp and Instagram. So I think the better hypothetical is “imagine a world where Facebook didn’t leverage its existing dominance to preempt competitors”.

Oh, since I do not use Facebook or Instagram and only occasionally WhatsApp, I can very well imagine a world without all of that.

Still, if enough people were fed up with FB, their dominance would fade away. Well, afaik Telegram (and Signal) gained a lot of market share lately, so let's see.


https://friendi.ca is a serious competitor doing mostly the same things as Facebook. (Except it's a social media site first, instead of an ad network first.)

But you probably don't need social media.


Competition to what? Facebook's main product is ads.

A social network is a natural monopoly. You are kept at Facebook because all your friends are on Facebook, and they are kept there because you are.

Or completely remove posts and keep only Messenger and Whatsapp. I've been using almost only Whatsapp and Telegram to share stuff with friends and groups of friends for a few years. If I want to read news I look for them either on Google News or on the very web sites that publish them, the ones I trust.

If Google and Facebook blocked news in my country, I wouldn't notice much.


Also, Facebook created APIs that other developers utilize for many App Store apps. So it's really not something you can control.

I think Amazon does the same with a datacenter and powers a refuge

Fascinating! Is this true? Do you have a link? Thanks!

Most articles are in Danish, the best I could find quickly is: https://www.datacentremagazine.com/data-centres/facebook-exp...

Edit: It should be noted that most Danish cities already have a remote heating infrastructure in place. Aside from the regulatory issues, it's mostly a question of hooking up datacenters and other heat-producing industries to that infrastructure. In most places using the remote heating isn't voluntary; if it's available where you live, your home has to be connected.

Things like datacenters are slowly replacing coal-fired heating plants, because most of those plants were made to generate electricity, but that's now supplied by more and more renewable energy. So cities need to find other sources of heat to replace the volume no longer coming from the power plants. Where I live that's datacenters, waste incinerators and heavy industry.

As a new thing, remote cooling is now also being attempted, using cold water from lime pits.


How are the houses centrally heated? I live in the Southern US and we do not have harsh winters. Most houses are heated by gas or electric furnaces (central air).

It’s interesting you can also use remote cooling from lime pits.


It's simply insulated pipes with hot water running underground. Homes in more remote places often use gas or electric heating.

The hot water pipes can actually run surprisingly long stretches, but it's only economical for densely populated areas.

There have also been plans to do this in Finland, but most of what I can find is either marketing material or news articles discussing plans [1, 2] rather than actual achievements, so I'm not sure what actually became of it.

There's a brochure from the national innovation fund that mentions a town actually covering about half of its heating needs with heat from a data center, though. [3]

Swedish telco Telia also has had similar plans for a data center in Helsinki, Finland, and their website says their "goal is to recover and reuse all the heat produced" [4], but I'm not sure how much weight to give that since proclaiming a goal only costs a few words. It would be nicer if they said what they're actually doing at the moment even if it were much less than "all of it".

[1] https://www.zdnet.com/article/from-deep-underground-data-cen...

[2] https://www.theguardian.com/environment/2010/jul/20/helsinki...

[3] https://www.sitra.fi/en/cases/district-heating-from-data-cen...

[4] https://www.telia.fi/business/telia-helsinki-data-center


Here's a Finland example that was posted earlier. It makes it seem like it's already working.

https://helsinkismart.fi/case/waste-not-want-not-data-center...


Thanks. That would seem to be the same case as in my third link.

I don't doubt it, they'd be silly not to do it: good PR from a waste product? Perhaps you can even charge for it? Amazing deal, especially the former and especially for Facebook. It's also very common; a school I went to was heated by the data center across the road.


> The computing-heater warms buildings ecologically and for free, thanks to the waste heat released by embedded microprocessors. By performing complex IT operations...

... which are what, exactly?


Going by the vagueness of the site, I wouldn't be surprised if they're mining bitcoin for themselves while you pay for the electricity.

Edit: Dug around some more, looks like they're building a BOINC-like service? https://computing.qarnot.com/en/


If I remember correctly, they pay for the network and electricity use of the heater. They rent the platform for computation to other third parties.

So you store and cool the computer for free. They probably sell directly to buildings and municipalities during construction, so it's installed and left there.

So you either just rent it for a nominal fee or pay nothing.


Sounds like what Nerdalize tried to do in the Netherlands, but IIRC they went bankrupt not long ago.

Looking through their tech stack and the hello-world example and such, I think you can put your own stuff in it, but probably the most common workload is 3D rendering.

on edit: I got a downvote, so someone must think I'm wrong? The basis of my idea was in the FAQ:

https://computing.qarnot.com/en/FAQ

3D:

Blender, Maya, V-Ray, Guerilla

IA / ML / Big data or simulation:

Code Saturne, FreeFem, OpenFOAM, PyTorch, TensorFlow, Scikit-Learn, Spark

If there is a Docker image, we support the software! You can either bring your own or choose an existing one on Docker Hub. You can also ask our experts for help!

I assumed the 3D stuff being listed up top meant it was the most used, but I could be wrong about that. Obviously also some data crunching and ML tasks. But at any rate, if my answer was wrong and so off base as to get a downvote, maybe you could also just say why I'm wrong and what it's generally used for?

on second edit: the developer documentation https://computing.qarnot.com/en/developers/overview/qarnot-c... made me think that maybe if you have one you can get your own API token and put your own stuff on it; obviously you would have to pay them for that, so I'm not sure how it would work.


You have been here for a couple of years, surely you have noticed downvotes don't always make sense.

Seriously. I've actually gotten into the habit of compulsively upvoting anything that has been downvoted unless someone is really out of line, even if I disagree with an opinion they're expressing.

Same here. Unless it's something intentionally offensive or trollish, I don't downvote anyone. Including opinions I don't agree with.

If it's something I disagree, and worth discussing, I set aside five minutes to write a good reply instead. I think it's much more constructive.


sure, but I suppose it makes sense to the person making the downvote. And when it makes especially poor sense to me I start to think - maybe they see something I don't?

on edit: I do sometimes also get paranoid and think, man there is just someone who doesn't like me and automatically downvote when they run across my name!


>I do sometimes also get paranoid and think, man there is just someone who doesn't like me and automatically downvote when they run across my name!

It certainly can feel like that sometimes. I rarely enjoy posting here anymore as a result. The fact a single downvote can inhibit your comment's visibility and negatively bias its progression is silly.

The unsettling part is it feels like there's very little stopping individuals and organizations from weaponizing that dynamic. Anything from targeted sustained psychological distress to censorship is possible with the current scheme.


Particularly when there's a case of a single downvote, it's worth remembering that it's easy to click the wrong button here - especially on mobile. I make a habit of checking the command has changed to "unvote" or "undown" correctly to verify I voted how I wanted to.

I've also hit voting arrows a lot whilst scrolling (on mobile) and must not have caught that every single time.


I'm not sure exactly what it's computing, but Folding@home does biology simulations as useful distributed computing, and that can probably generate some warmth.

https://en.m.wikipedia.org/wiki/Folding@home


Folding@Home has an unofficial guide for it too [0].

At the end of the day, Qarnot rents this infrastructure to other companies for computation and uses the heat to warm stuff (air, water, warehouses, etc.). Not a bad idea.

[0]: https://greenfoldingathome.com/2020/05/25/how-to-make-a-fold...


I don't know much about this company but I love the name Qarnot which is suggestive of thermodynamically optimal computing. Carnot (https://en.wikipedia.org/wiki/Nicolas_L%C3%A9onard_Sadi_Carn...) was the father of thermodynamic efficiency. He came up with thermodynamics to optimize steam engines, maybe making him the most steampunk of scientists. Thermodynamically optimizing bits is an echo of that for the age of computing.

Happy to know they're still going.

For years in an apartment I generated heat using surplus servers folding proteins. Some servers that are still quite fast and only a few years old are pretty cheap on eBay. Dockerize the whole thing and it's easy to start and stop. One time, however, I woke up and it was especially cold; I realized I had a network issue and my servers could not get any more work units...

'Proof of Useful Work', as opposed to proof of work, is a thing!

https://eprint.iacr.org/2017/203.pdf


Any coins / tokens that implement this or something similar?

The proof of work in tokens has to be a net waste of energy; the idea is that the proof of work is costly enough to prevent a 51% attack.

Making the proof of work do “useful” things would lower the cost of said work thus lowering the barrier of entry to an attack.


> Making the proof of work do “useful” things would lower the cost of said work thus lowering the barrier of entry to an attack.

How does this follow? If the work is so universally useful that it lowers the cost of the work, it lowers the cost for everyone. Not just "attackers", but "defenders" as well.

As it happens, Bitcoin miners mine not out of the goodness of their hearts but for financial profit. Bitcoin POW is "useful" to them: It gives them more money than they put in. They do it precisely because the cost of said work is lower than the returns.


The problem is that the usefulness of said work might not be the same to everyone. Let's imagine a potential coin where mining involves cracking hashes.

This work is useless to you and me (thus the only extracted value would be the reward from mining), but might be useful to someone who's got hashes to crack (so they get extra value out of the same process). In this case, the latter party can enjoy a 51% attack at a fraction of the cost of the former.


That's only true if the work is profitable for you, and not for something like folding@home.

How can you chain folding@home puzzles?

PoW requires that a block's solution proves that the solver had access to the previous block.


The linked paper says the opposite.

> This results in PoWs whose completion does not waste energy but instead is useful for the solution of computational problems of practical interest.


do you believe every paper that comes across without critical review?

This 31 page paper most definitely has not been fully evaluated by anyone commenting on it in this thread.


In fact, nobody seems to have read the first page, which has a note that the definition of "proof of useful work" in that paper is trivial. An updated version is available here:

https://eprint.iacr.org/2018/559


Thanks for finding a better paper, I just went for the first paper with the relevant title as I was on my phone.

Proof of waste.

Gridcoin.

I think that could be interesting. As others have said, whatever the computations are, if they are being done anyway, might as well use the generated heat for something useful instead of letting it go to waste.

But I wonder what would happen with such a system in the summer? For example, in most of France it gets cold enough for long enough every year that having a proper heating system and good insulation makes financial sense. But during the summer it's pretty hot, especially in cities. While the heating can be turned off between March and October (give or take), Facebook & co would probably like their DC to keep working, and so keep heating, year round.


How hot is hot? A few days ago it was -18 here and now it's 27. In a few months we will hit 38. I'm sure I don't want to be doing computations then.

District heating is used year round to get warm water.

There are several useful BOINC projects; for protein folding you have https://foldingathome.org/, and the projects my computers spend most of their time on are at https://www.worldcommunitygrid.org. You can find lists of projects at https://boinc.berkeley.edu/projects.php and https://www.boincstats.com/page/projectPopularity.

Interesting idea to combine it with larger-scale heating rather than just heating a room or two. Many will probably argue that it is an ineffective way to create heat without also looking at the benefits from the work done.


> Many will probably argue that it is an ineffective way to create heat.

Electrical heating is as close to 100% efficient as you can get. Every watt your computer uses ends up as heat.

Generating those watts from non-renewable sources is much less efficient though.

I wonder if it's possible to calculate when the benefit of contributing to BOINC projects outweighs the CO2 generated.


Residential sized heat pumps can do '400-500%' efficiency (i.e. 4-5kWh of heat for every kWh of electricity), so electrical heating at 100% efficiency is indeed inefficient. If the calculations are really valuable it could still be worth it though.
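In other words, the comparison is just multiplication by the coefficient of performance (COP); a trivial sketch with assumed COP values:

```python
def heat_delivered_kwh(electricity_kwh: float, cop: float) -> float:
    """Heat output for a given electrical input and coefficient of performance."""
    return electricity_kwh * cop

# Resistive heating (and a mining rig) is COP 1; a typical residential
# heat pump might reach COP 4-5:
print(heat_delivered_kwh(1.0, 1.0))  # 1.0 kWh of heat
print(heat_delivered_kwh(1.0, 4.0))  # 4.0 kWh of heat
```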

Also, what would you do in the summer?


A minuscule % of people have heat pumps. In the UK most people use either a gas boiler or a dumb electric heater. And you can't even install a heat pump in an apartment building without reconstructing half of it.

Today that is true, but I think the trend will move towards a larger share in the future. Relevant anecdotes:

- All newly built housing in the Netherlands must be without natural gas, thus either low-temperature heat-grid or heat pump heating
- People who use air conditioning for heating have a heat pump without being aware of it (if configured that way)

Finally, for any technology early in the adoption curve, the market share - or even the growth rate (%) - today shouldn't be taken as good indicators for future development. E.g. McKinsey famously underestimated the mobile phone market by 100x that way [1] and the energy predictions on the adoption of solar manage to underestimate installed solar power _every_ year. Instead, also consider growth-of-growth and network effects as adoption grows.

[1]: https://skeptics.stackexchange.com/questions/38716/did-mckin...


They are slowly becoming more common here in Germany. I'd say about half of the houses in the newly developed part of town have one and a few older ones are retrofitting them as well here.

Heat pumps have >100% efficiency. It's better to use AC to heat your house as long as outside temperature is not too cold.

No, heat pumps don't have over 100% efficiency, the same way hot water pumps that pump from a municipal source don't have >100% efficiency in heating a home (they use hot water available from somewhere else).

Heat pumps engage some other source of energy so if you want to measure efficiency you now need to include that other source into account.

Since you are engaging a natural source of energy, you can measure how effective (not efficient in thermodynamic terms) your heat pump system is by calculating how much energy it can transfer per unit of energy put into the pump. But this has nothing to do with efficiency, which, in the case of devices used to convert one type of energy into another or to move energy from place to place, is typically meant in its strict thermodynamic sense.


The efficiency calculation here measures only electrical power in vs heat energy out. Heat pumps are always listed with greater than 100% efficiency. There is no problem with the laws of thermodynamics of doing this and it’s even listed in text books with the explanation of how it is possible.

The work of the heat pump is to move heat from one location to another, it does so with the byproduct of producing more heat, therefore it produces more heat energy than the electrical energy put in.

You’re right that conservation of energy says that the heat in being moved did come from somewhere but that’s outside the system, and you will always find heat anywhere but absolute zero. Calculations for turbines or engines don’t make any efficiency allotments for heat already in the air, which is also necessary for them to run.


I guess you are mistaken about what a heat pump is.

A heat pump is a closed system in which you store energy when it is hot and recover it when it is cold.

https://en.wikipedia.org/wiki/Heat_pump

A heat pump is not just the pump mechanism but the entire system, which includes the mass of rock that serves as a heat reservoir.

What typically happens is you drill deep in the ground or rock and circulate air, water or some other refrigerant underground. During summer you pump hot refrigerant to heat up the mass of rock. This can be for example water that has been made hot by the sun. During winter you push water through that warm rock to recover the heat to warm your home.

No, it is not thermodynamically possible to recover more than 100% of stored energy.


We’re talking of home heating where the ground source heat pumps are rare. Here’s an article explaining how they work. Efficiency is 200-300%

https://www.finehomebuilding.com/2020/04/08/how-efficient-ar...


Can climate change be solved by crunching more numbers?

Produce electricity? I assume electricity was used to do the computation in the first place, turning into let's say 80% work and 20% heat. The idea was to use that 20% to heat our homes. The original electricity still needs to come from elsewhere.

Doing computations hardly stores any energy anywhere (the energy stored in, say, magnetization in a magnetic disk is pretty much negligible), so almost 100% of electricity is turned into heat.

The problem with this ~100% efficiency is that, if your goal is heating, you can move way more than 100% heat with 100% electrical energy if you use, say, a heat pump.


Heat pumps are not viable in a lot of cases; in an urban setting, it's probably the vast majority. So for the many cases where the alternative for heating would be burning fossil fuels, using electricity, ideally produced via renewable energy/nuclear, could be a superior alternative.

Wait, why are heat pumps not viable in an urban setting?

My urban apartment has a heat pump and it's nothing special. I even pay the heat bill and they still gave me a heat pump. I have no idea why someone would say they aren't viable. Not only are they viable, they're typical for new construction in warmer climates.

For many reasons.

First, transporting heat, as opposed to electricity, is very wasteful, so you only want to transport it over very short distances.

Second, typically even a small single-family home requires quite a large volume to store the heat effectively for many months. It isn't that complex only because in a single-family home setting you already have a bunch of uncontested land available, so you can use a volume that is relatively flat and not too deep.

Building this on the scale of a city would be an insurmountable challenge. You would have to dig deeper than the buildings are high, and any kind of works like that are difficult in urban areas.


A heat pump is just a backwards air conditioner. Not whatever you are thinking it is.

I think you are thinking of something very different than what "heat pumps" actually are? They don't involve storing heat, they're just an inverse fridge.

I think you guys are all wrong.

https://en.wikipedia.org/wiki/Heat_pump

The first sentence:

"A heat pump is a device that transfers heat energy from a source of heat to what is called a thermal reservoir."

So yes, it involves energy storage.

The way this works is you store heat in the summer (warm up a lot of rock or ground underneath your house) and recover that energy in the winter by pumping a liquid through warm rock back to your house and use it as a heat source.


The air outside your house counts as a thermal reservoir as well. That it doesn't store anything to the next season is annoying and makes it not work well in cold temperatures.

I think Wikipedia is wrong or confusing here. A heat pump requires a thermal reservoir (something that doesn't change temperature much when you move heat to or from it) of some sort, usually the atmosphere, or in ground-source heat pumps the pipes running through the ground, but it can move heat in either direction.

> While air conditioners and freezers are familiar examples of heat pumps, the term "heat pump" is more general and applies to many heating, ventilating, and air conditioning (HVAC) devices used for space heating or space cooling.

Basically what I would understand from the term heat pump in ordinary conversation would be an air conditioner intended for use in a heating dominated climate rather than a cooling dominated one, but there might be some regional differences in usage.


Thermal reservoir is a technical term in thermodynamics and as others have pointed out it doesn’t mean what you think it means. Anything remotely resembling a (reversed) Carnot cycle would involve thermal reservoirs. You can read its own Wikipedia page.

Given that the neighbouring house has one on their wall, I think I know what they are.

What are a few of these "lot of cases" where they are not viable?

Anytime it is "cold", air-source heat pumps don't work. "Cold" varies a bit, somewhere between -5C and -25C depending on design factors. Even in the best case, the closer you get to the minimum temperature, the worse they work (i.e. when you need them most!), and once you hit the cut-off you'd better have a backup source of heat.

You can use geothermal (ground source) to work around this. I'd recommend it, but the one-time installation costs mean it is questionable whether it is cost-effective.


OK, so that's one, very well understood case. Yes, a heat pump will not work all the time. I have one and it stops working around 20F and the furnace takes over. So what? I live at the US/Canada border and my furnace runs maybe ~2 months during the year. This is still tremendous savings.

Big savings, but is it big enough to be worth the extra expense of a heat pump vs using the furnace year round? The times when the heat pump works are the times when you least need it, since other activities of life are adding heat to the house too.

When I said a data center can produce electricity, I did not mean that it can produce as much as or more than it takes in. I thought this obvious enough that it did not have to be said.

But alas, it has to be said.

No, you can't create a perpetuum mobile, i.e. build a machine that, given a supply of energy, produces as much or even more energy than it consumes.

https://en.wikipedia.org/wiki/Perpetuum_mobile


It is still not viable.

Assume your datacenter runs at 50C (122F) and the temperature outside is -40C (-40F); using the datacenter heat to generate electricity has a theoretical maximum efficiency of 27.85%. If your datacenter is at 23C (73F) and the outside temperature is 0C (32F), then the theoretical maximum is 7.77%. Note the words 'theoretical maximum'; in reality it is probably at least 10 times worse than that.
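Those percentages are the Carnot limit, 1 - T_cold/T_hot with temperatures in kelvin; a quick check of the figures above:

```python
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Theoretical maximum fraction of heat convertible to work (Carnot limit)."""
    t_hot_k = t_hot_c + 273.15   # convert Celsius to kelvin
    t_cold_k = t_cold_c + 273.15
    return 1 - t_cold_k / t_hot_k

print(f"{carnot_efficiency(50, -40):.2%}")  # 27.85%
print(f"{carnot_efficiency(23, 0):.2%}")    # 7.77%
```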


You assume DCs are cooled through the whole volume of air inside, when in fact you can put heat pipes on components and get closer to a 70-90 degree heat source. Even years ago when I worked in datacenters, the air was circulated from outside to the server rack and then back outside, never crossing from the rack into the server room.

Newer generations of CPUs are going to be exchanged less frequently but will produce more energy as they offer denser computing. This makes a case for investing more in the server hardware.

Even then, understand that 7% of a huge amount of energy is still a huge amount of energy.

Edit: Apparently "Facebook datacenter in Denmark is supplying around 10.000 homes with heating" -- source, another poster.

So... you need to rethink your expertise on defining what is and what is not viable.


You can supply heat no problem. What I was talking about was recovering energy from heat, which is physically limited by Carnot engine. The percentage was Carnot engine efficiency. Note that it is impossible to create Carnot engine in real life, so actual efficiency is much, much, much lower.

No, essentially 100% of energy used for computation will turn into 'waste' heat.

If you calculate 1000 digits of pi, those digits will not embody any energy.


Just turning chaos into order takes thermodynamic work, but I was wildly off with my percentages. I was thinking in terms of an idealized computer that doesn't create heat, but didn't realize how far we are from that.
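For scale, the thermodynamic floor is the Landauer limit (k_B * T * ln 2 joules per bit erased), vanishingly small next to what real hardware dissipates per operation; an illustrative calculation:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def landauer_joules_per_bit(temp_k: float = 300.0) -> float:
    """Minimum energy to erase one bit of information at temperature temp_k."""
    return K_B * temp_k * math.log(2)

# Roughly 2.9e-21 J per bit at room temperature; real CPUs dissipate many
# orders of magnitude more per operation, so essentially all of the input
# power ends up as heat.
print(landauer_joules_per_bit())
```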

On one side, you want to recoup the miner investment and run it as much as possible, ideally 24/7.

But on the other side, you want the heater to regulate temperature (not overheat the room) by switching itself off as needed.

You can have both only if you either waste energy (vent it outside) or have thermal storage. This technology already exists, e.g. electric storage heaters that accumulate heat using cheap night electricity and slowly release it during the day, or water boilers. But they are bulky and can only store several hours' worth of heat.
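The "several hours' worth of heat" claim is easy to sanity-check with the heat capacity of water (tank size and temperatures below are assumed for illustration):

```python
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K); 1 L of water is ~1 kg

def tank_storage_kwh(liters: float, delta_t_c: float) -> float:
    """Heat stored by raising a tank of water by delta_t_c kelvin."""
    joules = liters * SPECIFIC_HEAT_WATER * delta_t_c
    return joules / 3.6e6  # J -> kWh

# A 300 L tank heated from 40C to 80C holds about 14 kWh, i.e. a few
# hours of output for a several-kW heater:
print(round(tank_storage_kwh(300, 40), 1))  # 14.0
```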


I've wondered the same thing every time I hear of "bitcoin will change the climate".

I think we could even design processors differently if heat was not part of the equation. Right now it's all about perf-per-watt, which gives up absolute speed.

Also it would be fun to say "It was so cold that night, <x>" like "I could raytrace in realtime" or "I mined 1 bitcoin" or "crysis ran 10,000fps"


Not quite what you mean, but a city in Australia is trialing "Smart Appliance Response Automation". The gist is that some appliances can utilize excess green power during the day in order to not need that energy later at night when it would need to be provided by gas turbines.

The current trial is actually residential water heaters, by heating the water using excess green power during the day you effectively store that energy as heat and that water then gets used later that night or early morning for showers/baths/washing etc.

Another example is a more obvious one, which is electric cars and other large battery appliances. The ultimate goal is to get these appliances talking to the grid directly, so that they know when to draw power and when to idle.

https://www.energyrating.gov.au/sites/default/files/2020-01/...


My parents have had a smart water heater since 1988. It was interesting a few years back hearing a politician talk about how bad smart meters were - he had to make it clear that the smart meters 80% of the room had had for 20 years were nothing like the smart meters he was talking about. (Which is to say, the ones he was against gave minute-by-minute reports with the attendant privacy concerns; the ones everyone had just turned the water heater and AC units off/on - I'll let you decide if the issue was real or not.)

I think the likelihood of it being done in the "New Tech" fashion of bi-directional real-time data flows with high-level languages is high, and the security of that will be low. So I'm not super pumped about the inevitable grid software exploits. Privacy aside, it's got other potential risks that I think are more important to address.

I understand your point but it does gloss over the fact that the calculations are useful to the bitcoin network in terms of security.

This was discussed before on HN.

The issue is building consensus protocol (in this case consensus that a transaction happened or did not happen).

There exists no law of physics that says achieving this requires burning extraordinary amounts of energy.


> This was discussed before on HN.

You say that like the conversation is resolved?


It is resolved.

Everyone except bitcoin bagholders can see that the BTC proof-of-work protocol is obscenely wasteful. There are already superior cryptocurrencies that use different consensus algorithms and have similar or better security/anonymity properties.


I think you might be underestimating the amount of work required to achieve real consensus among humans.

Look, the actual problem solved by BTC (digital cash with no protections like chargebacks) requires only a single trusted entity (can be more if we want) maintaining a very simple ledger of transactions, operating in the open, with auditing done by interested parties.

This is plenty achievable given that banking, which is far more complicated and messy, works. It does not require a small country's energy usage to achieve human consensus.

BTC is super cool, having created a pseudo-anonymous digital voting system that's resistant to ballot stuffing, but we're allowed to make stronger assumptions for our financial systems.


What happens when banks break the consensus and what is needed to prevent that?

I think the banking regulator and or justice system in the relevant country step in. Being in a modern society with laws seems to be the main, partly effective preventative measure. If we're going through the work to have laws and courts and regulatory bodies anyway, and if we have the FDIC etc, why shouldn't we get a partly-trustworthy financial system out of it? I mean, of course, not that the people are trustworthy, but that records of what you deposited, withdrew, transferred etc will be respected.

Even if you own crypto, I'm guessing you have a bunch of money in other assets in accounts managed by financial institutions. How often has the bank just decided that you don't own that?


it is not about physics, it's about security.

In other words, when the last Bitcoin is mined, Bitcoin is well and truly fucked.

Bitcoin and other non-inflationary proof-of-work coins need to switch to proof-of-stake if they want any hope of longevity.


It is assumed that transaction fees will be enough to incentivize miners.

Only if you assume the conclusion.

So, absolutely useless for 99.99% of the world.

I was thinking we could build something like this for refining aluminum.

Every week you pick up your 10kg block of bauxite and exchange it for aluminum.

Not sure how much byproduct heat is actually involved though...


Apparently nobody here has mentioned Gridcoin yet: https://gridcoin.us/

The key here is that mining pays enough to offset electricity used for mining.

SETI@home doesn't pay.


So we need a cryptocurrency based on useful calculations instead of useless hashing.

Part of me wonders if we aren’t getting there now. The crypto craze is driving more heavy compute setups around the globe than anything else.

When it finally slows down (if ever) what will all of that hardware be used for?


The problem here is that densely populated cities generally aren't in need of heating: keeping them cooled is a bigger issue!

From a heating perspective, there is basically zero demand for the kind of year-round low-quality heating a data center can produce. From a computing perspective, rare and uncontrollable bursts of computing power aren't desirable either and are a waste of hardware.

The article should be considered an edge case. The author already had hardware lying around for free and required zero usable computation. An in-ground heat buffer wasn't an option. Longevity of the hardware was irrelevant. Heat demand was quite small.

Does it work for a single person? Sure, why not! Will it work on a city-wide scale? Highly unlikely.


So where do you live exactly?

I live in Europe and I would say that heating IS life or death problem whereas cooling isn't.

See, there is this thing called hypothermia, and if you take a look at a map of where it is possible to die of hypothermia vs where it is possible to die of overheating, the number of people living in places that require heating is much greater than the number living in places that require cooling.

That may change in the future.


Where does this claim that densely populated cities don't need heating come from?

Wth is this "low-quality heating from datacenters"?

Cryptocurrency mining doesn't actually require huge network load

I don't consider cryptocurrency mining as overall beneficial for humanity.

You know, humanity has this huge issue of CO2 in atmosphere, maybe you have heard of it?

We are building more and more renewable sources, but cryptocurrency mining is countering these benefits to a considerable extent.

Additionally, even if we are able to produce 100% of our energy from renewable sources, it still takes energy to scrub carbon from our atmosphere, so any energy put into bitcoin could instead be used for saving our planet.


> You know, humanity has this huge issue of CO2 in atmosphere, maybe you have heard of it?

yes, it is exhausting



Network load here refers to bandwidth on the internet, not power consumption on the electricity grid, I believe.

What you're looking for is called the Berkeley Open Infrastructure for Network Computing (BOINC). The work units for BOINC can be crunched to earn GridCoin.

SETI coin should be a thing. It’s nice for people to fold without prompting or compensation, but the added incentive would most likely draw a significantly larger market.

It is not wasted. This is the most common misunderstanding on this forum.

Miners are required to provably expend effort in order to become eligible to produce a block. The 2nd law of thermodynamics serves as the core security mechanism, as it cannot be reversed.

If you were to improve efficiency by routing excess heat for other purposes, it will gradually spread across the entire mining industry, and eventually you end up right where we started.


Which is OPs point. The energy has to be wasted, or else everyone will do it to reduce cost and the difficulty just goes up until the net mining reward is similar to what it is now.

And that for something that’s almost exclusively used for speculation, because for anything else transactions costs are too high...


Sure, but if the rebates result in wasted energy instead being used for good, then we’re getting an energy benefit while getting better power.

Hell, don't pay people; if I was mining and it cost about the same to donate the wasted energy, I'd do it.


I don’t think GP misunderstood what miners do. What I believe he means by wasted cycles is that the final product in crypto doesn’t have any practical purpose aside from what was agreed on by convention. For any other purpose it’s a wasted effort. This calculation power could instead be used to try to find new practical knowledge, like protein folding to use on new medicine etc.

It's a fallacy that high hashrate means high security.

What matters is the % of miners that are honest. (See bitcoin.pdf)

High hashrate just means higher difficulty. (The difficulty controls the average time between blocks, so it always targets 10 minutes on average.)
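That feedback loop can be sketched roughly like so (a simplification; real Bitcoin retargets every 2016 blocks and clamps the adjustment factor to 4x):

```python
def retarget(old_difficulty: float, actual_secs: float,
             blocks: int = 2016, target_block_secs: float = 600.0) -> float:
    """Scale difficulty so blocks return to the ~10-minute target interval."""
    expected_secs = blocks * target_block_secs
    factor = max(0.25, min(4.0, expected_secs / actual_secs))
    return old_difficulty * factor

# Blocks arrived twice as fast as targeted -> difficulty doubles:
print(retarget(1000.0, 2016 * 300))  # 2000.0
```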

Also, for heating your home, mining bitcoin could leave you with a loss. For example, your heater needs an internet connection, and your heater will become out of date very quickly as new, more advanced bitcoin mining gear becomes available. You would also need to have your heater on 24/7 to break even, thus it will cost you to turn the mining heater off in warm weather, or to cool it.

Not to mention the noise.
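The break-even point above is a straightforward comparison of electricity cost against mining revenue; a sketch with made-up numbers (real figures depend entirely on hardware, coin price, and your tariff):

```python
def mining_heat_net_cost(power_kw: float, hours: float,
                         price_per_kwh: float, revenue_per_kwh: float) -> float:
    """Net cost of heating with a miner: electricity bought minus coins earned."""
    energy_kwh = power_kw * hours
    return energy_kwh * price_per_kwh - energy_kwh * revenue_per_kwh

# A hypothetical 1 kW miner-heater running all day at EUR 0.30/kWh while
# earning the equivalent of EUR 0.10/kWh in coins:
print(round(mining_heat_net_cost(1.0, 24, 0.30, 0.10), 2))  # 4.8
```

If the revenue per kWh ever exceeds the price per kWh, the "heating" is net-profitable, which is roughly the blog author's situation.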


What matters is the % of miners who don’t collude.

Honesty isn’t the issue, collusion is. The entire concept of Bitcoin assumes dishonest players.

The only real attack is for miners to combine hash power to get over 50% of the network hash rate so they can execute double spends - and even that is self defeating as doing so degrades confidence and by extension price.


Only once it gets publicized. You have a window to sell and make an enormous profit before people notice and then the entire bit-conomy gets wrecked. I hope this happens as it is a much better outcome to this mess we've created than wrecking the biosphere.

Yeah, it's a really really tiny window though. chain reorganizations and 50% attacks are really easy to see on the chain and people do watch for them.

They've happened on smaller chains that don't have much hash power attached to them, but even there, they're caught very quickly.


A) 100% honest miners, 1 hash per century. B) 55% honest miners, 1 googol hashes per second.

What is more secure in your view?

hashrate is absolutely necessary for high security.


Read bitcoin.pdf

do you want to cite a specific paragraph?

You can search for the word "honest" and read everything around there…

Section 6, about incentives, discusses why nodes may be honest rather than collude to attack the system. This is the main pillar of bitcoin's security.


> 2nd law of thermodynamics as core security mechanism

LOL... this is the most ridiculous claim I've heard about crypto-mining yet. Might as well buy a truck-full of wine glasses, break them and use the shards of broken glass to prove that I've "expended effort". Now we're using the asymmetry of time itself as a security mechanism. Come to think of it, that may be actually less wasteful than burning the electricity to mine Bitcoin.


What?
