2. At a social/planning level it is important to remember that opportunity costs matter, and this should not be recommended as public policy. One kWh of electricity could either be used for mining bitcoin (creating at most one kWh of indoor heat), or it could be used for running a heat pump[*], yielding multiple kWh of indoor heat due to the magic of heat pumps and their >100% "efficiency"[**].
[*] Yes, I realize the OP is already using a heat pump, but assisting it with preheat is inherently less efficient than plowing the same energy into running the heat pump itself.
[**] Yes, heat pumps aren't >100% efficient per a physics textbook, but what most people care about is that you can use them to get >1 joule of indoor heat from 1 joule of energy input, which feels magical when compared to electric heaters, gas furnaces etc.
Heat pumps are >100% efficient.
Usually, we cannot convert all input energy into useful work, so efficiency is always <100% because of energy losses, which produce waste heat. However, in the case of a heater, heat *is* the "useful work", so electric heaters have an "impossible" 100% efficiency, while heat pumps have an even more "impossible" >100% efficiency.
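As a toy illustration of that framing (the COP values here are illustrative assumptions, not measured figures):

```python
# Heat delivered per unit of electricity, resistive heater vs heat pump.
# The COP (coefficient of performance) values are assumed for illustration.
def indoor_heat_kwh(electric_kwh, cop):
    """Heat delivered = electrical input times coefficient of performance."""
    return electric_kwh * cop

resistive = indoor_heat_kwh(1.0, 1.0)   # resistive heater: all input becomes heat
heat_pump = indoor_heat_kwh(1.0, 3.5)   # assumed air-source heat pump COP
```

The ">100%" feeling comes entirely from the second line delivering more heat than the electricity put in.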
Which is so selfish of him. Why should the safety of billions matter over me becoming rich? /s
The only reason I can think of for 'or heat taken in' in that definition is exactly to stop people claiming >100% efficiency for heat pumps, which is a corollary for perpetual motion, and equal nonsense.
> so a heat pump can use the same heat again and again, without exhausting of ambient heat.
That's just not true; they are heat interfaces: thermal energy is transferred from outside to a refrigerant, then to the inside (or inside water, or whatever).
If this were true, they would have a closed tank of hot air as well as a closed refrigerant loop (as the vast majority do), so it would always be operating optimally in any environment, never running out of hot air.
The clue's in the name - pump! - they just move heat about, you wouldn't claim pumping water is >100% efficient because you're creating water out of nothing and you'll never run out of water in the infinitely massive lake you're pumping from, would you?
Put a box over it. Look at what goes into the box and what comes out, on all sides. Efficiency = Outs / Ins. It's a yield. A ratio of how much of what you put in you get back out in the form you want.
So, yes, it is, but my point is that you can't ignore it as an input just because you don't have to order (and pay for) some air to supply. In calculating efficiency, you have to put an imaginary box around the heat pump, and look at what's going in, and what (that is the desired useful thing) is coming out.
If inputs don't cost you anything, that means you might not care about lower efficiency, but it doesn't mean that the efficiency isn't lower.
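A minimal sketch of that box-drawing bookkeeping (the ambient-heat figure is an assumption):

```python
# "Imaginary box" accounting: every energy flow crossing the boundary counts,
# whether or not you pay for it.
def efficiency(useful_out_kwh, input_kwhs):
    return useful_out_kwh / sum(input_kwhs)

electric_in = 1.0
ambient_in = 2.5                     # heat drawn from outdoor air: free, but still an input
heat_out = electric_in + ambient_in  # energy conservation inside the box

cop_view = efficiency(heat_out, [electric_in])                  # 3.5, the ">100%" figure
physics_view = efficiency(heat_out, [electric_in, ambient_in])  # 1.0 once all inputs count
```

Both numbers are legitimate; they just answer different questions (heat per kWh bought vs heat per kWh crossing the boundary).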
2) To keep the pump running and the temperature at a steady level, we will need to leak heat at a constant rate into the second box. Thus the pump will work at the "impossible" 100% efficiency again: it will convert all input electricity into heat.
If we leak heat at a high rate, then the pump will not be able to keep up, but it will still work at 100% efficiency.
If we redirect some leaked heat from the second box back to the pump as input, so that it pumps it back into the box, then it will work at 100% efficiency plus the additional heat pumped from the second box. Moreover, the temperature in the second box will continue to increase at a constant rate, so it will reach equilibrium again at some point in time.
3) For the pump to work indefinitely at a steady rate, we need something with infinite capacity to dump energy into. We can also use a fraction of this infinite capacity as an infinite input into the system, so the pump will work at 100% plus additional heat.
IMHO, the first two systems are worthless, because they are doomed to reach equilibrium and stop at 0/0.
If you count the heat taken from the space around the condenser, which you should based on Google's definition, heat pumps are only 100% efficient.
Calling heat pumps >100% efficient is like calling my solar water heater >100% efficient. True if you only count the electrical input, and not other sources of energy.
Also, to nit pick: Kelvins are not degrees and do not use a ° sign.
Yes, if you draw an imaginary box around the home and the outdoor/underground condenser, they're exactly 100% efficient, but if you only consider the inside of the home and ignore that the condenser is taking in heat energy from the outside, they're more than 100% efficient.
Nice attention to detail in the physical craftsmanship though!
Edit: Apparently I'm completely wrong. Carry on and I'll eat my downvotes with aplomb.
Air/air heat pumps and ventilation heat exchangers tend to freeze up when it gets too cold. So heat pumps often need to run in reverse (to melt the ice on the condenser) and heat exchangers need to pre-heat air with resistive heating to prevent ice build up.
If he can pre-heat the air with a crypto miner that would be better than either of the above.
Or, more likely, they installed a heat pump of a suitable capacity for 80+% of the expected conditions, with a pragmatic "old school" pre-heater to make up when conditions are out near the end of the bell curve.
> why shouldn't you run crypto mining?
I think the key takeaway here is that any time you're using resistive electrical heating for air, crypto mining is a suitable alternative to just making resistive nichrome wire glow red. Whether the cost of 1000W worth of GPU cards compared to the cost of a cheap 1000W fan/column heater makes sense for you is a different question. Whether the payout in whatever crypto you're mining ever comes close to break-even on the cost of the mining rig and its depreciation as it becomes more and more out of date is another question. If, like the article's author, you can build an effectively zero-cost mining rig from parts on hand, then it's almost certainly worthwhile (modulo what you value your time building/configuring/maintaining it at...)
(And there's the deeper ethical question of whether participating in a Proof Of Work cryptocurrency at all is just outrageously leveraging your personal heating requirements into a planet burning speculative ponzi scheme "asset" trashfire...)
Has some calculations that show that using gas to generate electricity (in a modern gas fired power station) which you then use to run heat pumps for heating - is overall more efficient than just using the gas for heat given real world numbers. (It does go on to show the limitations of how much ground-source heat pumping you can do without freezing the ground - his ballpark numbers suggest maybe only 25% of the heating requirements for a typical British suburb could be met with just ground-sourced heat pumps before you'd freeze the available ground...)
(That whole chapter starting back on page 140, and the whole book - are well worth reading in my opinion.)
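The gas-to-heat-pump comparison can be sketched with ballpark numbers (all assumed, in the spirit of MacKay's estimates, not taken from his tables):

```python
# Two routes from 1 kWh of natural gas to indoor heat; all figures are
# rough assumptions for illustration.
gas_plant_eff = 0.50    # modern combined-cycle gas power station
heat_pump_cop = 3.0     # assumed heat pump coefficient of performance
gas_boiler_eff = 0.90   # burning the gas directly in a condensing boiler

heat_via_heat_pump = gas_plant_eff * heat_pump_cop  # 1.5 kWh of heat per kWh of gas
heat_via_boiler = gas_boiler_eff                    # 0.9 kWh of heat per kWh of gas
```

Under these assumptions the power-station-plus-heat-pump route wins even after the plant throws away half the gas's energy, which is the counterintuitive point the chapter makes.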
My computer room in the house is the lowest point in the house, so naturally, it tends to be a couple degrees cooler than the upstairs. This means that if I set the thermostat so that the computer room is comfortable, the bedroom is too warm. If I run the house circulation fan constantly, it's not an issue, but that consumes a decent amount of power.
If I'm going to consume that much power, I might as well mine crypto and make a few dollars. Mining for NiceHash, a service that lets people rent hashing power from miners, I'll average $100-400/month worth of Bitcoin on my RTX 3080, depending on current hashing prices, and how much of my time I spend gaming, since mining is effectively paused while gaming. The mining happens while I use my computer with no noticeable loss in performance, and it puts a few watts of energy into the room to heat it up a bit, evening it out with the rest of the house.
Cryptominers fall into the "Potentially Unwanted Program" category, along with remote access software like a VNC server. They're not malware, but they can be installed maliciously. If Windows Defender finds a cryptominer, it has no way to determine whether it was deliberately installed, so it flags it. Manually excepting the directory NiceHash installs the miners into is the only way around it.
For instance I GPU mine with https://github.com/ethereum-mining/ethminer and CPU mine with https://github.com/xmrig/xmrig, both built from source and I don't use Nicehash (the app). But I haven't looked at the source, so at some point I'm still trusting someone.
ethminer.exe -P stratum2+tcp://<your nicehash wallet address>@daggerhashimoto.usa.nicehash.com:3353
Depends on whether or not the heuristics scanning in Windows Defender would still flag the miners if you compiled it yourself.
I am considering adding a feature that would pause mining when an external sensor reports a high temperature. But you can already hack around that by creating a background window titled "DON'T MINE" and setting the tool to stop when it sees it.
It hasn't been profitable to mine bitcoin with a GPU since 2013 or 2014.
The other question to ask, if you compile programs, write webpages, and edit Photoshop images: are you concerned about hardware degradation of your CPU if you run it at 100% all the time, for example compiling Chrome for 3+ hours, applying Photoshop filters to an image to get production-quality output, or rendering a 3-hour animated movie?
Do you worry about the GPU when you are retraining a GPT-2 or GPT-3 scale AI framework?
These questions are kind of ... dependent on each person. Presumably you buy a computer in order to perform computations. That is its purpose.
Crunching bitcoins/cryptocurrencies doesn't degrade it any more than, say, running an AI framework on a GPU for 3 months continuously to generate a self-driving model...
Otherwise, there would be a market for people to sell their spare CPU cycles.
The only entities selling CPU cycles are big cloud providers who have much higher utilization in general.
Your return on the opportunity cost is maximized if you can do work closest to the time the card was acquired.
If you buy a card and leave it in the box for 10 years you have not 'consumed' the card, but you have wasted a few hundred dollars.
From an electricity and financial perspective if you have valuable work to do, your costs are minimized if you do that work closest to the purchase date.
This is obvious if you have a significant workload, but maybe not as obvious if you are running CAD.
Is the cost of a new card worth the time saved by the new card?
Significantly more efficient if you can keep it busy.
My GPU hovers at around 55-60 C while mining with the fans at 77%.
Years ago, I was running an AMD R9 290. THAT ran HOT. Even with fans at 100%, it would hover around 95 C and constantly fight with thermal throttling.
EDIT: And as someone else said, at these mining rates, it will pay for itself in 3-4 months.
Running a GPU for ~3 years at 75-82 °C
Considering mining is a constant load (not much thermal stress) and improving efficiency requires lowering the voltage those aren't really concerns. There's far less risk than overclocking a GPU for gaming. The only electrical part that may be of concern would be the voltage regulation, but that's still an outlier as long as cooling is adequate.
Fans are known to fail because they're run at high speeds all the time, but they're a commodity.
Other crypto currencies can work better though.
Claims you can earn ~$17 per day if you can get electricity at 10c/kWh running a single RTX 3090, but that's all mining Ethereum (DaggerHashimoto), not BTC.
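A back-of-envelope margin check on that claim (the power draw is an assumption; the price and payout are the figures quoted above):

```python
# Daily electricity cost vs claimed revenue for a single mining GPU.
power_kw = 0.30             # assumed ~300 W draw for the card while mining
price_per_kwh = 0.10        # the 10c/kWh electricity price from the claim
revenue_per_day = 17.0      # the claimed daily payout

cost_per_day = power_kw * 24 * price_per_kwh    # electricity cost per day, USD
margin_per_day = revenue_per_day - cost_per_day # what's left before hardware costs
```

Under these assumptions electricity is well under a dollar a day, so almost all of the claimed payout is gross margin; hardware cost and depreciation are the real question.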
In related news: https://news.ycombinator.com/item?id=26192201
Sounds like with the run crypto is on right now, you can make some pretty decent money.
What's the benefit of this vs. just participating in a mining pool?
But it's not as if the cost of power is wasted. It's generating heat which I desire. If my GPU is consuming 300W, then that's 300W of heat that my heater doesn't have to generate.
My whole house is heated with gas anyways, which AFAIK is more efficient than a heat pump, or at least, is more cost-effective.
A fridge moves a certain amount of heat energy from the inside to the outside, but that doesn't come for free. You have to put extra energy into the system to make that happen. The hot side of the fridge gets the "moved" heat plus the heat from the work you had to do to move the heat. The work heat is waste.
If what you're trying to do is make something hot, you're moving heat energy from the outside to where you want it to be hot. That makes it hotter, but it requires work. In this case, since you want to make things hot anyway, that work heat is extra bonus heat.
EDIT: great explanation/details in the sibling comment by 'prutschman'.
All power is JIT. It makes a difference.
That's not what I said.
I'm not "using power that would go to waste anyways", I'm generating heat that I would have needed anyways.
In other words, I need to add a little bit of heat to this room. I can either run a small space heater on Low and consume about 300W and get nothing but 300W of heat, or I can mine crypto and consume around 300W, generate the heat I wanted, and earn about $200/month on top of it.
In either case, I'm consuming 300W. I might as well make money in the process.
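The arithmetic of that trade-off, using the figures above (the monthly revenue is the claimed number; continuous operation is assumed):

```python
# Either device turns 300 W of electricity into 300 W of room heat;
# mining just attaches revenue to the same energy.
watts = 300
hours_per_month = 24 * 30
kwh_per_month = watts / 1000 * hours_per_month   # energy used either way
mining_revenue = 200.0                           # claimed monthly payout, USD
# Heat delivered is identical in both cases, so the marginal benefit of
# mining over the space heater is simply the revenue.
```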
I am installing solar from Tesla; I wonder if I should just use the excess power generated for crypto rather than sending it back to PG&E.
However I must point out that you shouldn't do this in most circumstances, there are environmentally friendly ways of heating your house.
When you add a marginal kWh of electricity usage, how is that extra 1 kWh generated?
If France runs its nuclear power at nearly 100% utilisation, then if you add 1 kWh usage then that power comes from another source, which could be a high CO2 emitter.
The same goes for hydroelectricity: if all the hydroelectric power is already being utilised (no spillway), then you cannot claim that your electric car is being charged by hydro even if your country has a large percentage of it. There are complications when lakes are involved because whether your power usage is green or not often depends upon future inflows (lots of future rain = green; no future rain = dirty generation in future when lakes get low).
The most cynical thing your supplier could do in the UK is buy REGOs (Renewable Energy Guarantee of Origin). At that point the 100% Renewable claim would be legal, without any other effort. If that's all your supplier does it isn't worth a penny, we'll get to why below, but if they charge extra for the tariff based on REGOs it's pure profit for them I assure you.
A better thing they could do is arrange to buy in bulk from a renewable generator or (more practical for larger outfits) just own the renewable generator. This means their interests would at least somewhat line up with yours - if their renewable energy generators are cheaper they get to keep more of your money while offering attractive prices.
But back to REGOs. So, when you make renewable electricity in the UK you get REGOs, and you're allowed to sell them. You could sell your electricity with the REGOs, here you go, 100% renewable electricity. Or, you could snip the REGOs off, sell those to anybody who wants them, and sell the electricity as just electricity, which people wanted anyway.
If "100% renewable" was in very high demand this wouldn't be a problem, a handful of more expensive suppliers would bid for the REGOs, and this would create pressure to deploy more renewable power.
But in the UK the vast majority of households use the incumbent supplier. What does "incumbent" mean here? Well, historically the UK had regional monopoly suppliers who both billed consumers and handled the last mile distribution infrastructure that actually means electricity works in your house. But the Conservative party believes strongly in Free Market principles, even where there's no evidence they would help. So, it privatised the industry, giving away national infrastructure for a song and creating dozens of private companies that notionally can compete to supply electricity. Except of course your actual supply is the same as ever, there is still a monopoly last mile supplier, it just doesn't deal with consumers. What had been the regional suppliers were now private companies that had "grandfathered in" millions of residential customers in their region, but were free to compete to "supply" customers anywhere.
All these companies immediately offered somewhat lower prices to anybody who'd switch, and began raising the prices for those who didn't switch. Switching is annoying (despite efforts to make it simpler it cannot be entirely painless) and so most people never switch. Sounds like you have, so immediately you're not the usual case. They also began aggressive (indeed sometimes outright illegal) campaigns to keep "their" customers and prevent defection to rivals.
So today the situation is that essentially every company advertising their prices to you offers 100% Renewable Energy, because they're buying REGOs for customers like you (and me). They can get REGOs very cheaply because they don't need very many because there are so few customers like us.
And the vast majority of households aren't on any of those supply contracts, they have a "legacy" contract that's more expensive. And so in practice when it's cold and dark the coal and gas power stations are cranked up exactly the same but on paper we can blame those millions of people not all of us with our cheaper 100% Renewable contracts...
The climate is not so easily fooled.
So the assumption that any marginal utilization is non-nuclear is flawed.
What you'll see there is as you'd expect if you understand how fission power plants work and how economics works for electricity supply. The nuclear plants may not be running at what is notionally 100% nameplate power output but they are not in fact being used as a "buffer option" very much.
That big yellow-green splodge on the "Origin 24 hours" chart? That's nuclear power. Varying, maybe +/- 2GW , but nowhere close to enough to offset France's varying power requirements over the course of a day.
France's (much fewer) gas turbines are much more able to spin up and down quickly to benefit from transient utilization and so they, together with the interconnects possible due to France's relatively central location (the UK to the West, Germany to the East, Spain to the South) allow it to manage well on most days without tinkering with the power efficiency of the fission generators.
Anyway, enough empty talk; here is a real-world use case from Western Europe on the 19th of March 2019, a Sunday with lots of wind. I should translate the whole thing for everyone one day.
Of course, your marginal power will immediately come from a gas turbine. However, if power usage increases overall, wouldn't sustained usage force an increase in nuclear baseload? The overall GW output of nuclear has varied substantially YoY.
The section you apparently didn't scroll down to, "Origin of electricity in the last 24 hours" shows you exactly what I described.
The chart you've offered is denominated in TWh, thus energy not power, because it is cumulative over an entire year rather than showing marginal power. So you end up concluding that if a nuclear power station is closed for two months for repairs, or a new one is brought online those somehow constitute energy "flexibility".
1. Have a quick look at https://www.electricitymap.org/zone/FR and scroll down to “Electricity production in the last 24 hours” on the left. Natural gas was used from about 7pm to 5am. If you heat your house in France using crypto during those hours, you are using up to 100% non-renewable peaker gas (ignoring hydro because it is even more complicated, as per the previous comment).
2. France exports a lot of electricity, so it is quite possible for your marginal increase in usage within France to cause a decrease in exported electricity, which leads to a marginal increase in non-renewables in another country... i.e. your extra load causes an increase in world CO2 production.
I think you are making the same mistake that I am trying to illustrate... understanding marginal usage is difficult and most people jump to conclusions that are not factual.
If so I fail to see how cryptomining features in there.
Regarding this "fossil parent" thing btw, list of checked companies: https://www.robinwood.de/oekostromreport (I'm with Green City Power because they don't only go the easy hydro route but also build out solar -- iirc, it has been a while). Switching is a matter of signing up with the new provider. Nobody needs to come by, power doesn't go out at all, you just tell them what the meter says on date X and all is good.
But I'm also thinking, maybe actually this is for the better? (Not only for the stingy landlord, but for the environment).
I mean, even if a heat pump is much better than an electric heater... Having to heat up only the rooms that need it, instead of kitchen, bathrooms, living room (plus the rooms of other flatmates who might not be bothered) might actually use up less energy total, than using a more efficient mechanism, which otoh would be used for the whole home.
1kg of wood produces around 2kg of CO2, every 400g of firewood gives you 1 kWh, so 0.8kg CO2/kWh.
That’s slightly worse than mining. Bitcoin is wasteful in its total energy consumption, which is not reused, but ~100% of the power you put into that GPU will be released as heat.
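Reproducing the firewood arithmetic above (figures as stated in the comment, not independently verified):

```python
# CO2 intensity of heating with firewood, per the comment's figures.
co2_per_kg_wood = 2.0   # kg of CO2 per kg of wood burned, as stated
wood_per_kwh = 0.4      # kg of firewood per kWh of heat (400 g), as stated
co2_per_kwh = co2_per_kg_wood * wood_per_kwh   # kg CO2 per kWh of heat
```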
Also, with forced air heat pumps, you can usually adjust the registers (air vents) to reduce the flow of air to unused rooms.
What aspect do you think could be friendlier to the environment? Rare metals and production emissions in high-tech electronics? Or the fact that it's using electricity (as opposed to more insulation)?
Other ways to heat a dwelling are more efficient. For example natural gas heaters burn cleaner and energy losses are much smaller compared to all the conversion losses (by the time you produce a BTU of heat from electricity, several BTUs of energy have been lost due to conversion losses, while burning gas is much closer to perfect efficiency though obviously not perfectly efficient). Or if you can use co-generation, you essentially produce no noticeable environmental impact at all. Or you could go the other route and invest in insulation or a molten salt wall for passive heating.
A heat pump achieves a COP (coefficient of performance) of approx. 3-4 (e.g. by investing 1 kWh of electricity, a heat pump generates 3-4 kWh of heat by extracting 2-3 kWh from the surroundings, air or brine). In this example, by pre-heating the air, you supply the heat pump with ~0.9 kWh of thermal energy (the miner will convert 900 watts directly to heat, I would assume). So instead of 1 kWh of electricity consumption from the heat pump, you have 1 kWh of electricity for the heat pump plus 0.9 kWh for the mining, and you end up with a bit more than 3-4 kWh (since the COP of a heat pump increases if the source temperature is higher).
So in a nutshell:
before: in 1 kWh, out 3-4 kWh
after: in 1.9 kWh, out 3.5 - 4.5 kWh
so you lose 0.4 kWh?
- before: 1kWh in, 1kWh out
- after: 1.9kWh in, > 1.9kWh out
However, if it were that simple, I suppose heat pump manufacturers would include pre-heating as a built-in feature.
It's very likely that he reduced the energy consumption of the heat pump by 50%, but at the same time he uses more than those 50% for mining and has a negative total result that is offset by the profit from the mining itself. Which is probably nice for him, but not really for the environment :)
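The whole-system energy balance being debated here, sketched with the thread's figures (both COP values are assumptions):

```python
# Heat pump alone vs heat pump plus a 900 W miner preheating its intake.
before_in = 1.0                 # kWh of electricity, heat pump alone
before_out = 1.0 * 3.5          # assumed COP of 3.5 on cold intake air

after_in = 1.0 + 0.9            # heat pump plus the miner
after_out = 1.0 * 4.0 + 0.9     # warmer intake lifts COP to ~4 (assumed);
                                # the miner's 0.9 kWh also ends up as heat

before_ratio = before_out / before_in   # heat per kWh of electricity bought
after_ratio = after_out / after_in      # more total heat, but worse per kWh
```

Under these assumptions the combined system delivers more heat overall, yet fewer kWh of heat per kWh of electricity, which is exactly the "nice for him, not for the environment" point.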
It depends on the unit. Some (Mitsubishi FE12NA) have a COP of 1.75 even at -10F / -20C. See Table 6:
check chart on:
If I were a HVAC company with WiFi thermostats, I would look into including miners in heating solutions.
So if the heat pump without the miner uses 1 kW to heat to room temperature, it would need less than 1 kW with the miner to heat to the same room temperature. Plus, as you say, the COP increases.
So it could be that 5 degC to 22 degC requires 3 kW but the heat pump can do it with 1 kW (COP=3).
The miner uses 0.9 kW to heat the outside air to 11 degC. It would take 2.1 kW to heat from that to 22 degC. But the heat pump now has a COP of 4, so it can do it with ~0.5 kW.
So you lose 0.4 kW but gain bitcoins.
Or you could say that you get 2.25 times as many BTC for the same electricity cost.
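Re-running the numbers from the example above (the COPs and the 11 degC intermediate temperature are the comment's assumptions):

```python
# Heating from 5 degC to 22 degC, with and without a 900 W miner preheating.
need_kw = 3.0                    # heat required to reach 22 degC
hp_alone = need_kw / 3.0         # COP 3 on cold intake: electricity needed

miner_kw = 0.9                   # miner output, preheats intake to ~11 degC
remaining = need_kw - miner_kw   # heat still needed after preheating
hp_assisted = remaining / 4.0    # COP 4 on the warmer intake
total_assisted = miner_kw + hp_assisted  # total electricity with the miner
```

This reproduces the conclusion: roughly 1.4 kW with the miner vs 1.0 kW without, i.e. about 0.4 kW extra electricity traded for the mining payout.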
In densely populated cities we could bury huge datacenters underground and use that energy for heating directly, and use excess to produce electricity.
You can argue that Facebook isn't actually doing anything useful, but still, it's better than to waste the generated heat.
Facebook has a good and bad side just like humans do. For example, if Facebook got rid of the ability to post news, it would be a much better place.
There are plenty. There is no mirror corporation doing exactly the same thing under a different name, but there is no mirror to Microsoft, Google, Apple or any other large tech corp either. These are corporations backed by ironclad IP laws, meaning nobody can ever play on exactly the same field. For something like Facebook, the competitor is all things not-Facebook. Every time you share a news story via SMS, you are competing with Facebook. Every time you send a message via email rather than via Facebook, you are competing with Facebook. And every time you visit a store's own website rather than their Facebook page, you deny Facebook a tiny bit of the world. That is the serious competitor.
Still, if enough people were fed up with FB, their dominance would fade away. Well, AFAIK Telegram (and Signal) gained lots of market share lately, so let's see.
But you probably don't need social media.
If Google and Facebook would block news on my country I wouldn't notice much.
Edit: It should be noted that most Danish cities already have a district heating infrastructure in place. Aside from the regulatory issues, it's mostly a question of hooking up datacenters and other heat-producing industries to that infrastructure. In most places, using district heating isn't voluntary: if it's available where you live, your home has to be connected.
Things like datacenters are slowly replacing coal-fired heating plants, because most of those plants were built to generate electricity, but that's now supplied by more and more renewable energy. So cities need to find other sources of heat to replace the volume no longer coming from the power plants. Where I live, that's datacenters, waste incinerators and heavy industry.
As a new thing, district cooling is now also being attempted, using cold water from lime pits.
It’s interesting that you can also use district cooling from lime pits.
There's a brochure from the national innovation fund that mentions a town actually covering about half of its heating needs with heat from a data center, though. 
Swedish telco Telia also has had similar plans for a data center in Helsinki, Finland, and their website says their "goal is to recover and reuse all the heat produced" , but I'm not sure how much weight to give that since proclaiming a goal only costs a few words. It would be nicer if they said what they're actually doing at the moment even if it were much less than "all of it".
... which are what, exactly?
Edit: Dug around some more, looks like they're building a BOINC-like service? https://computing.qarnot.com/en/
So you store and cool the computer for free. They probably sell directly to buildings and municipalities during construction, so it's installed and left there.
So you either just rent it for a nominal fee or pay nothing.
on edit: I got a downvote so someone must think I'm wrong?
the basis of my idea was - in the FAQ
Blender, Maya, V-Ray, Guerilla
IA / ML / Big data or simulation:
If there is a Docker image, we support the software! You can either bring your own or choose an existing one on Docker Hub. You can also ask our experts for help!
I expected the 3D stuff, put up top, to be the most used, but I could be wrong about that. Obviously also some data crunching and ML tasks, but at any rate, if my answer was wrong and so off base as to get a downvote, maybe you could also just say why I'm wrong and what it's generally used for?
on second edit: developer documentation https://computing.qarnot.com/en/developers/overview/qarnot-c... made me think that maybe if you have one you can get your own api token and put your stuff on it, obviously you would have to pay them for that so not sure how it would work.
If it's something I disagree with, and worth discussing, I set aside five minutes to write a good reply instead. I think that's much more constructive.
on edit: I do sometimes also get paranoid and think, man there is just someone who doesn't like me and automatically downvote when they run across my name!
It certainly can feel like that sometimes. I rarely enjoy posting here anymore as a result. The fact a single downvote can inhibit your comment's visibility and negatively bias its progression is silly.
The unsettling part is it feels like there's very little stopping individuals and organizations from weaponizing that dynamic. Anything from targeted sustained psychological distress to censorship is possible with the current scheme.
I've also hit voting arrows a lot whilst scrolling (on mobile) and must not have caught that every single time.
At the end of the day, Qarnot rents this infrastructure to other companies for computation and use the heat energy to heat stuff (air, water, warehouses, etc.). Not a bad idea.
Making the proof of work do “useful” things would lower the cost of said work thus lowering the barrier of entry to an attack.
How does this follow? If the work is so universally useful that it lowers the cost of the work, it lowers the cost for everyone. Not just "attackers", but "defenders" as well.
As it happens, Bitcoin miners mine not out of the goodness of their hearts but for financial profit. Bitcoin POW is "useful" to them: It gives them more money than they put in. They do it precisely because the cost of said work is lower than the returns.
This work is useless to you and me (thus the only extracted value would be the reward from mining), but might be useful to someone who's got hashes to crack (so they get extra value out of the same process). In this case, the latter party can enjoy a 51% attack at a fraction of the cost of the former.
PoW requires that a block's solution proves that the solver had access to the previous block.
> This results in PoWs whose completion does not waste energy but instead is useful for the solution of computational problems of practical interest.
This 31 page paper most definitely has not been fully evaluated by anyone commenting on it in this thread.
But I wonder what would happen with such a system in the summer? For example, in most of France it gets cold enough for long enough every year that having a proper heating system and good insulation makes financial sense. But during the summer it's pretty hot, especially in cities. While the heating can be turned off between March and October (give or take), Facebook & co would probably like their DC to keep on working, and so to keep on heating, year round.
Interesting idea to combine it with larger-scale heating rather than just heating a room or two. Many will probably argue that it is an ineffective way to create heat without also looking at the benefits of the work done.
Electrical heating is as close to 100% efficient as you can get. Every watt your computer uses ends up as heat.
Generating those watts from non-renewable sources is much less efficient though.
I wonder if it's possible to calculate when the benefit of contributing to BOINC projects outweighs the CO2 generated.
Also, what would you do in the summer?
- All newly built housing in the Netherlands must be without natural gas, thus either low-temperature heat-grid or heat pump heating
- People that use airconditioning for heating have a heat pump without being aware of it (if configured that way)
Finally, for any technology early in the adoption curve, the market share - or even the growth rate (%) - today shouldn't be taken as a good indicator of future development. E.g. McKinsey famously underestimated the mobile phone market by 100x that way, and the energy predictions on the adoption of solar manage to underestimate installed solar power _every_ year. Instead, also consider growth-of-growth and network effects as adoption grows.
Heat pumps engage some other source of energy, so if you want to measure efficiency you now need to take that other source into account.
Since you are engaging a natural source of energy, you can measure how effective (not efficient in thermodynamic terms) your heat pump system is by calculating how much energy it transfers per unit of energy put into the pump. But this has nothing to do with efficiency, which, for devices that convert one type of energy into another or move energy from place to place, is typically meant in its strict thermodynamic sense.
The work of the heat pump is to move heat from one location to another, it does so with the byproduct of producing more heat, therefore it produces more heat energy than the electrical energy put in.
You’re right that conservation of energy says that the heat in being moved did come from somewhere but that’s outside the system, and you will always find heat anywhere but absolute zero. Calculations for turbines or engines don’t make any efficiency allotments for heat already in the air, which is also necessary for them to run.
A heat pump is a closed system in which you store energy when it is hot and recover it when it is cold.
A heat pump is not just the pump mechanism, but the entire system, which includes the mass of rock that serves as a heat reservoir.
What typically happens is you drill deep in the ground or rock and circulate air, water or some other refrigerant underground. During summer you pump hot refrigerant to heat up the mass of rock. This can be for example water that has been made hot by the sun. During winter you push water through that warm rock to recover the heat to warm your home.
No, it is not thermodynamically possible to recover more than 100% of stored energy.
The problem with this ~100% efficiency is that, if your goal is heating, you can move way more than 100% heat with 100% electrical energy if you use, say, a heat pump.
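To make the resistive-vs-heat-pump comparison concrete, here is a minimal sketch; the COP of 3 is an assumed, typical value, not a measured figure:

```python
def heat_delivered_kwh(electric_kwh: float, cop: float) -> float:
    """Heat delivered for a given electrical input.

    cop=1.0 models a resistive heater (all electricity becomes heat);
    a heat pump with COP > 1 additionally moves heat in from outside.
    """
    return electric_kwh * cop

resistive = heat_delivered_kwh(1.0, cop=1.0)   # 1 kWh of heat
heat_pump = heat_delivered_kwh(1.0, cop=3.0)   # 3 kWh of heat (assumed COP)
print(resistive, heat_pump)
```

The same 1 kWh of electricity yields three times the indoor heat through the (assumed) heat pump, which is the whole point of the opportunity-cost argument.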
First, transporting heat as opposed to electricity is very wasteful, so you only want to transport it in very short distances.
Second, typically even a small single-family home requires quite a large volume to store heat effectively for many months. It only avoids being that complex because in a single-family-home setting you already have a bunch of uncontested land available, so you can use a volume that is relatively flat and not too deep.
Building this on the scale of a city would be an insurmountable challenge. You would have to dig deeper than the buildings are high, and any kind of works like that are difficult in urban areas.
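To put rough numbers on the "quite large volume" claim, a back-of-the-envelope sketch; the heat demand, temperature swing, and rock properties are all assumptions, but typical for granite:

```python
# Assumed figures for a small single-family home with seasonal heat storage.
winter_demand_kwh = 10_000     # heat demand over a heating season (assumed)
delta_t = 30.0                 # usable temperature swing of the store, K (assumed)
rock_specific_heat = 800.0     # J/(kg*K), typical for granite
rock_density = 2600.0          # kg/m^3, typical for granite

energy_j = winter_demand_kwh * 3.6e6                 # kWh -> J
mass_kg = energy_j / (rock_specific_heat * delta_t)  # Q = m * c * dT
volume_m3 = mass_kg / rock_density
print(f"~{volume_m3:.0f} m^3 of rock")               # roughly an 8 m cube
```

Under these assumptions you need on the order of 600 m^3 of warm rock per house, which is manageable under a garden but hard to scale under a city block.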
The first sentence:
"A heat pump is a device that transfers heat energy from a source of heat to what is called a thermal reservoir."
So yes, it involves energy storage.
The way this works is you store heat in the summer (warm up a lot of rock or ground underneath your house) and recover that energy in the winter by pumping a liquid through warm rock back to your house and use it as a heat source.
> While air conditioners and freezers are familiar examples of heat pumps, the term "heat pump" is more general and applies to many heating, ventilating, and air conditioning (HVAC) devices used for space heating or space cooling.
Basically what I would understand from the term heat pump in ordinary conversation would be an air conditioner intended for use in a heating dominated climate rather than a cooling dominated one, but there might be some regional differences in usage.
You can use geothermal (ground-source) to work around this. I'd recommend it, but the one-time install cost makes it questionable whether it is cost-effective.
But alas, it has to be said.
No, you can't create a perpetuum mobile, i.e. build a machine that, given a supply of energy, produces as much or even more energy.
Assume your datacenter runs at 50C (122F) and the temperature outside is -40C (-40F); using the datacenter heat to generate electricity has a theoretical maximum efficiency of 27.85%. If your datacenter is at 23C (73F) and the outside is at 0C (32F), then the theoretical maximum is 7.77%. Note the words 'theoretical maximum'; in reality it is probably at least 10 times worse than that.
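Those theoretical maxima are Carnot efficiencies, which you can check directly from the quoted temperatures:

```python
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum fraction of heat convertible to work between two
    temperatures (Carnot limit), temperatures given in Celsius."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

print(f"{carnot_efficiency(50, -40):.2%}")  # ~27.85%
print(f"{carnot_efficiency(23, 0):.2%}")    # ~7.77%
```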
Newer generations of CPUs are going to be exchanged less frequently but will produce more heat as they offer denser computing. This makes a case for investing more in the server hardware.
Even then, understand that 7% of a huge amount of energy is still a huge amount of energy.
Edit: Apparently "Facebook datacenter in Denmark is supplying around 10.000 homes with heating" -- source, another poster.
So... you need to rethink your expertise on defining what is and what is not viable.
If you calculate 1000 digits of pi, those digits will not embody any energy.
But on the other side, you want the heater to regulate temperature (not overheat the room) by switching itself off as needed.
You can have both only if you either waste energy (vent it outside) or have thermal storage. This is already existing technology, e.g. electric storage heaters that accumulate heat using cheap night electricity and slowly release it during the day. Or water boilers. But they are bulky and can only store several hours' worth of heat.
I think we could even design processors differently if heat was not part of the equation. Right now it's all about perf-per-watt, which gives up absolute speed.
Also it would be fun to say "It was so cold that night, <x>" like "I could raytrace in realtime" or "I mined 1 bitcoin" or "crysis ran 10,000fps"
The current trial is actually residential water heaters, by heating the water using excess green power during the day you effectively store that energy as heat and that water then gets used later that night or early morning for showers/baths/washing etc.
Another example is a more obvious one, which is electric cars and other large battery appliances. The ultimate goal is to get these appliances talking to the grid directly, so that they know when to draw power and when to idle.
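A grid-aware appliance's decision logic can be as simple as a threshold rule; everything below (the rule, thresholds, and parameter names) is a hypothetical sketch, not any real grid protocol:

```python
def should_draw_power(price_per_kwh: float, battery_soc: float,
                      price_threshold: float = 0.15,
                      min_soc: float = 0.20) -> bool:
    """Hypothetical charging rule for a grid-aware appliance.

    Charge unconditionally below a minimum state of charge; otherwise
    charge only when electricity is cheap, using price as a rough
    proxy for surplus renewable generation.
    """
    if battery_soc < min_soc:
        return True  # must charge regardless of price
    return price_per_kwh < price_threshold

print(should_draw_power(0.30, 0.10))  # True: below minimum charge
print(should_draw_power(0.30, 0.80))  # False: expensive power, battery fine
print(should_draw_power(0.05, 0.80))  # True: cheap (likely surplus) power
```

Real systems use richer signals (carbon intensity, frequency, utility demand-response events), but the idle-vs-draw decision has the same shape.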
The issue is building consensus protocol (in this case consensus that a transaction happened or did not happen).
There exists no law of physics that says achieving this requires burning extraordinary amounts of energy.
You say that like the conversation is resolved?
Everyone except bitcoin bagholders can see that the BTC proof-of-work protocol is obscenely wasteful. There are already superior cryptocurrencies that use different consensus algorithms and have similar or better security/anonymity properties.
This is plenty achievable given that banking, which is far more complicated and messy, works. It does not require a small country's energy usage to achieve human consensus.
BTC is super cool, having created a pseudonymous digital voting system that's resistant to ballot stuffing, but we're allowed to make stronger assumptions for our financial systems.
Even if you own crypto, I'm guessing you have a bunch of money in other assets in accounts managed by financial institutions. How often has the bank just decided that you don't own that?
Bitcoin and other non-inflationary proof-of-work coins need to switch to proof-of-stake if they want any hope of longevity.
Every week you pick up your 10kg block of bauxite and exchange it for aluminum.
Not sure how much byproduct heat is actually involved though...
SETI@home doesn't pay.
When it finally slows down (if ever) what will all of that hardware be used for?
From a heating perspective, there is basically zero demand for the kind of year-round low-quality heat a data center produces. From a computing perspective, rare and uncontrollable bursts of computing power aren't desirable either and are a waste of hardware.
The article should be considered an edge case. The author already had hardware lying around for free and required zero usable computation. An in-ground heat buffer wasn't an option. Longevity of the hardware was irrelevant. Heat demand was quite small.
Does it work for a single person? Sure, why not! Will it work on a city-wide scale? Highly unlikely.
I live in Europe and I would say that heating IS life or death problem whereas cooling isn't.
See, there is this thing called hypothermia, and if you look at a map and compare where it is possible to die of hypothermia with where it is possible to die of overheating, far more people live in places that require heating than in places that require cooling.
That may change in the future.
You know, humanity has this huge issue of CO2 in atmosphere, maybe you have heard of it?
We are building more and more renewable sources, but cryptocurrency mining is countering these benefits to a considerable extent.
Additionally, even if we are able to produce 100% of our energy from renewable sources, it still requires energy to scrub carbon from the atmosphere, so any energy put into bitcoin could instead be used for, say, saving our planet.
yes, it is exhausting
Miners are required to provably expend effort in order to become eligible to produce a block. The 2nd law of thermodynamics as the core security mechanism, since it cannot be reversed.
If you were to improve efficiency by routing excess heat for other purposes, it will gradually spread across the entire mining industry, and eventually you end up right where we started.
And that for something that’s almost exclusively used for speculation, because for anything else transactions costs are too high...
Hell, dont pay people, if I was mining and it was similar costs to donate the wasted energy I’d do it.
What matters is the % of miners that are honest. (See bitcoin.pdf)
High hashrate just means higher difficulty. (The difficulty controls the average time between blocks, so it's always targeting to be 10 min on average).
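The retarget rule behind "high hashrate just means higher difficulty" can be sketched as follows; Bitcoin really does retarget every 2016 blocks at a 10-minute spacing with the adjustment clamped to a factor of 4, though this is a simplified sketch of the rule, not the consensus code:

```python
RETARGET_BLOCKS = 2016   # Bitcoin recalculates difficulty every 2016 blocks
TARGET_SPACING_S = 600   # 10 minutes per block
EXPECTED_SPAN_S = RETARGET_BLOCKS * TARGET_SPACING_S  # two weeks

def next_difficulty(current_difficulty: float, actual_span_s: float) -> float:
    """Sketch of the retarget rule: scale difficulty so block spacing
    returns to 10 minutes, with the change clamped to a factor of 4."""
    factor = EXPECTED_SPAN_S / actual_span_s
    factor = max(0.25, min(4.0, factor))
    return current_difficulty * factor

# Hashrate doubled: the last 2016 blocks arrived in one week instead of two,
# so difficulty doubles and spacing returns to ~10 minutes.
print(next_difficulty(1.0, EXPECTED_SPAN_S / 2))
```

So more hashrate buys no extra throughput; it only raises the cost of producing each block, which is the security, not the product.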
Also, heating your home by mining bitcoin could leave you with a loss. For example, your heater needs an internet connection, and it will become out of date very quickly as newer, more advanced bitcoin mining gear becomes available. You would also need to have the heater on 24/7 to break even, thus it will cost you to turn the mining heater off in warm weather or to cool it.
Not to mention the noise.
Honesty isn’t the issue, collusion is. The entire concept of Bitcoin assumes dishonest players.
The only real attack is for miners to combine hash power to get over 50% of the network hash rate so they can execute double spends - and even that is self defeating as doing so degrades confidence and by extension price.
They've happened on smaller chains that don't have much hash power attached to them, but even there, they're caught very quickly.
What is more secure in your view?
hashrate is absolutely necessary for high security.
Section 6, about incentives, discusses why nodes may be honest rather than collude to attack the system. This is the main pillar of bitcoin's security.
LOL... this is the most ridiculous claim I've heard about crypto-mining yet. Might as well buy a truck-full of wine glasses, break them and use the shards of broken glass to prove that I've "expended effort". Now we're using the asymmetry of time itself as a security mechanism. Come to think of it, that may be actually less wasteful than burning the electricity to mine Bitcoin.