Our data centers now work harder when the sun shines and wind blows (blog.google)
507 points by martincollignon on April 22, 2020 | hide | past | favorite | 189 comments

This is the future. Using algorithms to optimize usage of renewable energy. Not only will it be lower carbon, it will be cheaper. What's interesting is they describe it working on forecasts (for wind and sun) instead of instantaneous renewable production. I wonder what the rationale for that was? Basing the algorithm on instantaneous information should be more accurate and thus give better savings, but maybe it varied too much to reliably run the loads they want.

Imagine when your fridge can do this: freeze extra cold when the sun is shining (or wind is blowing), don't run the compressor when it's not, only run the blower after you open the door to move that extra cold from the freezer, allow a slightly larger temperature range, and of course run as necessary to avoid spoilage. It's not a simple algorithm: it has to handle multiple timeframes, since solar follows a daily cycle but also weakens in winter and can nearly vanish for a week or more during storms or overcast spells. Maybe it could also use a bit of "learning", like the Nest thermostats, to optimize for predicted usage.

I know of one commercial product that sort of does this: the Zappi electric car charger. If you have grid-tied solar, it measures the current being fed back to the grid and adjusts the charging current to match. So if a cloud goes over your house, or you turn on a big appliance, the charger reduces the power to the car by the same amount. This maximizes the use of your own solar energy and minimizes the use of grid energy.
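A minimal sketch of that export-matching idea, purely for illustration. The constants and function name are invented here, not the Zappi's actual firmware; the core is just "measure what's flowing back to the grid, convert to amps, and adjust the charge current by that much":

```python
# Hedged sketch of the export-matching control described above.
VOLTAGE = 230          # nominal mains voltage, volts
MIN_EV_CURRENT = 6     # most EVs won't charge below roughly 6 A
MAX_EV_CURRENT = 32    # typical single-phase limit

def choose_charge_current(grid_export_watts, charging_amps):
    """Pick a new charge current that soaks up the measured surplus.

    grid_export_watts > 0 means power is flowing back to the grid
    (headroom to charge faster); < 0 means we're importing (back off).
    """
    surplus_amps = grid_export_watts / VOLTAGE
    target = min(MAX_EV_CURRENT, charging_amps + surplus_amps)
    return target if target >= MIN_EV_CURRENT else 0  # pause below minimum
```

So a passing cloud that cuts 2.3 kW of export would drop a 20 A charge to 10 A, and if the surplus falls below the ~6 A minimum, charging pauses rather than drawing from the grid.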


> Imagine when your fridge can do this

I've been posting for years that an effective grid "battery" is internet connected refrigerators, water heaters, A/C, car chargers, etc., that only run when power is cheap, i.e. when solar/wind is providing excess power.

A great deal of our demand for electricity is elastic and shiftable, which will eliminate a huge chunk of the need for grid batteries.

Glad to see this finally gaining some traction!

How much of total energy consumption even comes from households, though?

I think people have a strong bias toward thinking the things they see and touch during the day are environmentally important. But most "pollution"/energy use happens out of sight of our everyday lives.

Or so I think. Happy to be disproven!

~20% is residential, which is still significant. Plus ~30% is transportation, which becomes more and more relevant to residential energy use as we transition to EVs.

But even commercial and industrial could be optimized in the same manner. I've been thinking: If electricity is much cheaper during certain periods, would factories be built to only run during those times?


This is already a thing, but it's only done at “enterprise scale”: account managers meet with energy purchasers at large manufacturing plants to negotiate “interruptible supply” contracts. The notice periods, levels and tariffs are all negotiated individually, and “pricing managers” on both sides of the negotiation spend a long time modelling the deals in Excel. If algorithms could replace the pricing managers, that would save both sides time and money, not to mention making this available to the residential and SME markets.

If you're in the UK, then Octopus Agile tariff gives you a variable price and an API with IFTTT integration so that you can switch devices on and off depending upon the current price: https://octopus.energy/agile/

That's really cool, for all the spam/ads I have seen about Octopus I've never seen this before, and this is the first time I'm interested. (Ordinarily I'm only interested in price, viewing it as a straightforward commodity, and Octopus has never been appealing on that ground.)

This is already a thing in Australia

Just because something isn't "environmentally important" (whatever that means) doesn't mean you can't optimize it. We need to shift into the mentality of thinking about the environment. Don't slap the hand that's interested in making a difference.

You can optimize small things, but it will only have a small effect.

Better than nothing, sure, but the danger is that you only deal with the small things because you're not even aware of the big ones.

This will only work with fine-grained energy pricing (on the scale of minutes) and smart meters.

Does this exist anywhere in the world?

Octopus Energy in the UK, with its beta Agile tariff, uses 30-minute pricing, and prices have at times been so low that the unit price has gone negative. On Sunday and Monday, for example, I was paid 4.2 pence per unit to take electricity off the network.

Their web site advertises support for IFTTT. Has anyone here hacked on Octopus APIs? Looks very cool.

Yeah, I have a bit. I'm on their Agile tariff with half-hour pricing intervals (this is what the smart meter standards provide).

It's really cool. Sometimes the price goes below zero - "plunge pricing" - if there is too much production and too little use. They give you the pricing info for today and tomorrow ahead of time, not sure how they predict future pricing - probably based on weather forecasts, since they only use solar and wind?

I have WiFi smart sockets on things like electric heaters to turn them on when the electricity price is below my threshold level. It's a nice feeling being paid for using electricity. If you have an electric car, you could also programmatically charge it only when the price is below a certain level. Octopus also has an EV leasing company; I'm thinking of selling my current car and leasing one from them once covid19 has passed enough that I need a car again.
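That automation can be sketched against Octopus's public Agile unit-rate endpoint. The product and tariff codes below are examples (substitute your own region's), and `turn_socket` is a stand-in for whatever API your WiFi socket actually exposes:

```python
import json
from datetime import datetime, timezone
from urllib.request import urlopen

# Example product/tariff codes; look up the right ones for your account.
RATES_URL = ("https://api.octopus.energy/v1/products/AGILE-18-02-21/"
             "electricity-tariffs/E-1R-AGILE-18-02-21-C/standard-unit-rates/")
THRESHOLD_P_PER_KWH = 5.0  # switch the socket on below this price

def parse(ts):
    # The API returns ISO-8601 timestamps with a trailing "Z"
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def current_rate(results, now):
    """Find the half-hourly rate covering `now`, if any."""
    for r in results:
        if parse(r["valid_from"]) <= now < parse(r["valid_to"]):
            return r["value_inc_vat"]
    return None

def tick(turn_socket):
    """Poll the tariff and drive the (hypothetical) socket API."""
    with urlopen(RATES_URL) as resp:
        results = json.load(resp)["results"]
    price = current_rate(results, datetime.now(timezone.utc))
    # Plunge (negative) prices pass the threshold too: you get paid to heat.
    turn_socket(price is not None and price < THRESHOLD_P_PER_KWH)
```

Run `tick` on a timer (cron, etc.); anything fancier, like the IFTTT route, is just a hosted version of this loop.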

Here are their dev docs:


They even have an open GraphQL API and a Storybook. How many utility companies do you know that do this? Looks like Octopus hired the right people!



Providers are starting to pop up in Australia that support this, and people have been quick to start hacking: https://twitter.com/_______kim/status/1226606021558194176

Griddy in Texas is a wholesale energy provider for consumers that passes on the wholesale price set by the Texas grid operator every 5 minutes.

I think Griddy is a cool idea but only if you have something to automatically cut your usage when the price spikes to $9 per kWh like it did last summer.

It spiked to the regulatory max of $5, which hit me while I was at work. I still saved money compared to most other energy plans in my area.

Though yes, it would have been better if I could have adjusted.

It does in Australia: https://www.amberelectric.com.au/ (disclaimer: I work there)

I can see you're hiring; I'm on the market. What are your thoughts on the workplace? Super interesting industry!

Not true, it would work with hourly pricing. Still requires smart meters but those are something like 60-70% of homes in the US now. Even without real time or time of use based pricing, load shifting can still be valued and paid for as a service. Look up Demand Response: >$1Bn market in the US.

Edit: https://en.wikipedia.org/wiki/Demand_response

Since solar energy doesn't happen after sunset, the idea can still work even just assuming power is cheaper during daylight.

There are even open protocols used in the wild. SunnyHome has it working for example for using your local solar electricity.

It works well if you can aggregate the distributed resources so they're large enough to bid at the wholesale level.

The market is a little more dynamic than that, wholesale bidding into the ISOs is still the biggest option but many utilities run load shifting programs of various sizes as well.

Still, the conclusion is as you said: there's basically no real-world scenario where it makes sense for a residential customer to go it alone, because they can make at most a couple hundred bucks a year. So it makes sense for device companies or someone else to work it out with the utility and pay thousands of homeowners to participate. Ohmconnect has a cool service in California.

Demand response programs are great because of their scale and ability to provide incentives and command and control without directly interacting with the markets.

The real big thing is aggregating the aggregations. Distributed Energy Resource Management Systems (DERMS) / virtual power plant management systems are an active research topic.

One could imagine a company that gives fridges away for FREE and sells the aggregated “balancing energy” to recoup the cost of the fridge.

I like the business model. But it'd surely be advertising at you too. The pressure for the owner to pick up those free dollars would be just too strong.


Sense is great, but way overkill for demand response. Also, the measurement needs to be billing-grade accuracy, and I don't think Sense is going for that.

A fridge is typically placed inside a house, unless we're talking about some alternate kitchen style. In that case, the ambient temperature inside an already heated or cooled apartment/house is roughly constant, I assume. If so, where are the savings?

Also, freezing it more during the day doesn't let me freeze less at night. If I don't open the door at all, the insulation is already taking care of this. What am I missing?

There are a few things going on here.

First we need to address the basic premise that could allow this to make sense at all in some house or environment -- that one can afford to freeze less during the day if they freeze it more at night (or vice versa).

For a fridge this is hard because you're usually targeting a narrow temperature band (don't want to accidentally freeze your veggies), but for a freezer it usually doesn't matter if you get it 10-20 degrees too hot or too cold as long as everything remains frozen. In that case, your losses (whether they're through the insulation or an open lid) will be higher with a colder freezer, so even if you could get your freezer down to e.g. -200 it might not be cost advantageous to do so, but colder temperatures definitely buy you some extra time. E.g. if normally your freezer is at -5 and you instead cool it down to -10, there will be a nonzero amount of time before it heats back up to -5 again (and the amount of time can be large if your freezer is well insulated and has a high thermal mass).
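To put a rough number on that "nonzero amount of time", here's a back-of-envelope estimate using Newton's law of cooling. The time constant is an assumed figure for a well-insulated freezer with decent thermal mass, not a measurement:

```python
# How long does a freezer pre-cooled to t_start take to drift back up to
# t_limit with the compressor off, in a room at t_ambient? Newton's law
# of cooling: T(t) = T_amb + (T_0 - T_amb) * exp(-t / tau).
import math

def coast_time_hours(t_start, t_limit, t_ambient, tau_hours):
    """Hours for the temperature to rise from t_start to t_limit."""
    return tau_hours * math.log((t_start - t_ambient) / (t_limit - t_ambient))

# Assumed: pre-cool to -10 C, allow drift to -5 C, 20 C room, tau = 40 h.
# coast_time_hours(-10, -5, 20, 40) comes out around 7.3 hours of
# compressor-off time bought by the deeper pre-cool.
```

The exponential shape also means the first degrees of extra pre-cooling buy the most coast time; chasing very low temperatures has diminishing returns, which matches the "might not be cost advantageous" caveat above.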

The "closed system" aspect of this is also interesting.

The cheap answer is that not all houses are heated or cooled (mine isn't), so you can sometimes ignore that part of the equation.

A better answer is that everything mostly works out the same even for closed systems. Suppose you have A/C that cools your home to a target temperature. When your freezer is running it moves heat into the living space, which then needs to be moved outside. As a system, you can pretend those two units are working together to move that same quantity of heat outside, and when the freezer is off you're just not operating that entire heat pumping system (assuming linear power demands and a host of other things to make the example tractable in a comment). While the freezer is off it will also assist in cooling the room down, further alleviating A/C power demands during peak hours (not alleviating them much, but helping nonetheless).

If you're instead heating the room to a target temperature, everything works a bit in reverse. The freezer alleviates heating costs while it's running and exacerbates them while it's off (and critically, it exacerbates them more than if you assumed a constant low power draw, because we're intentionally decreasing its temperature below a typical freezer level and thus slightly increasing the rate at which heat enters it from the apartment). Assuming you were trying to run the freezer during low-cost hours, that effect might be large enough to offset any gains. I doubt it, but that's just a hunch, and I haven't run any hard numbers yet.

It might not be a good idea to temperature cycle your freezer contents, as it speeds up degradation.

Dang, I can't find it right now, but there's a service out there that pays you to reduce your consumption on demand. Use their app to link up with your smart appliances and stuff, and the rest does itself.

Basically they've plugged into the ISO as a "generation" resource, and when the price of power goes high enough, they say "okay we can produce that much power", and they have all their users reduce that much power, which has the same effect. They get paid for the power, and pass some of it along to their users.

The problem is that with modern fridges energy consumption is already low enough that steering that through smart meters probably isn't worth the investment. Large companies could use that, esp with electric heating. But in residential areas, I think energy efficiency can still yield greater results.

There should just be a common protocol for all electrical devices - heaters, washing machines, vacuum cleaners, computers, TVs, toasters - everything. Some devices might ignore it, but they should still have the data port and a microcontroller responding with "power regulation not supported". The state should regulate this the way it regulates normal power sockets; it's a net benefit for everybody and the cost is minimal.

We could just add a USB port to regular wall sockets and make a new power plug to match.

Of course the socket should be backward compatible, but any new devices would require the full plug and responding to the protocol.

Over 10 years we could have an effective grid-wide gigawatt-hour battery. And everybody would only pay for it like 5 USD every time they buy a new device :)

The cost of a cheap microcontroller, power regulator and a USB plug is under 5 USD. And the potential energy savings are huge. There are also big usability improvements (want lights that learn your habits and pretend you're home while you're on vacation? someone will make that product).

Want to make sure you switched off the iron before you went to a party? No problem - connect with a smartphone and check (and switch it off if needed).

There could be several categories of devices with different priorities and default power settings that you could change if you want to:

- background, critical

- background, optional

- background, opportunistic

- interactive, critical

- interactive, optional

- interactive, dangerous if left alone

- bidirectional (like Tesla wall, or a car charger if you want to power your house from your car battery when there is a blackout)

And the house could detect if you're home, and disable the iron if you're not and throttle the freezer and all heaters to optimize the power usage.

Automated vacuum cleaners and chargers could start by themselves when energy is effectively free and pause when there's a shortage.

This will get especially important when electric cars are more common - you don't want everybody to start charging them when they get back home.
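Purely as an illustration, the device categories proposed above could map to something like this on the controller side. The class names follow the list, but the scarcity scale and shed order are invented for the sketch:

```python
# Sketch: each appliance advertises a class, and a home controller
# decides which loads may run given a grid scarcity signal.
from enum import Enum

class DeviceClass(Enum):
    BACKGROUND_CRITICAL = 0      # fridge compressor near its safety limit
    BACKGROUND_OPTIONAL = 1      # pre-cooling, water pre-heating
    BACKGROUND_OPPORTUNISTIC = 2 # robot vacuum, battery top-ups
    INTERACTIVE_CRITICAL = 3     # medical equipment
    INTERACTIVE_OPTIONAL = 4     # dishwasher mid-cycle
    INTERACTIVE_DANGEROUS = 5    # iron left unattended
    BIDIRECTIONAL = 6            # home battery, V2G car charger

# Loads shed in order: opportunistic and unattended-dangerous first,
# optional next, critical and bidirectional last.
SHED_THRESHOLD = {
    DeviceClass.BACKGROUND_OPPORTUNISTIC: 1,
    DeviceClass.INTERACTIVE_DANGEROUS: 1,
    DeviceClass.BACKGROUND_OPTIONAL: 2,
    DeviceClass.INTERACTIVE_OPTIONAL: 2,
    DeviceClass.BACKGROUND_CRITICAL: 4,
    DeviceClass.INTERACTIVE_CRITICAL: 4,
    DeviceClass.BIDIRECTIONAL: 4,  # may even export during a shortage
}

def allowed_to_run(device_class, scarcity_level):
    """scarcity_level: 0 = plenty of power ... 3 = severe shortage."""
    return scarcity_level < SHED_THRESHOLD[device_class]
```

The point of standardizing only the classes and the signal, rather than per-device behavior, is that a dumb toaster and a smart EV charger can both answer the same protocol, just with different policies.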

I do agree with your overall suggestion, but changing sockets would be mission impossible. Indeed, technology for communication over power lines is nothing new and doesn't require a new socket/wiring: https://en.wikipedia.org/wiki/Power-line_communication

A grid supply status signal could empower next-gen energy-saving home appliances. It could be implemented as simply as pricing, the ultimate incentive: the price would fluctuate on the wire, just as it does in the energy market.

> it would be a mission impossible to change sockets

Why? I lived through one such change in last 30 years (switching from 2-pole europlugs to CEE 7/7). They are backward-compatible, but so would be the usb+power plugs.

We also switched from 60 Hz 240 V to 50 Hz 230 V some time in the 00s (don't remember exactly; it was a non-event).

As for incentives, new electronic devices should simply be required to have these plugs and microcontrollers, just as we now require them to have fuses, certain wire gauges, etc.

Why depend on a hack when you can do it right?

Adding a data bus to all lines would be a much bigger deal than changing sockets for existing lines or running a slightly different voltage (often within tolerance of the power supplies anyway!) over the same wires.

And USB isn't meant to operate over those long distances. So you'd either need some active component in each socket or devise a new standard.

It doesn't have to be usb, maybe ethernet would work? Or something new if needed.

Rewiring might be a problem (but it was the same with adding a ground wire).

If we have to we can do the data-over-power between the sockets, and use data lines from sockets to devices.

This would ensure devices don't have to adapt to that hack and eventually we'll have a clean solution.

People are hacking together some interesting things in this space: https://fridge0.branchable.com

> A great deal

How much, in %?

>> What's interesting is they describe it working on forecasts (for wind and sun) instead of instantaneous renewable production. I wonder what the rationale for that was? Basing the algorithm on instantaneous information should be more accurate and thus give better savings

The article says they are “shifting the timing of our compute tasks”, so if they think that there will be cheap electricity later in the day (because it’s going to be especially windy or something) it would make sense for them to schedule some of their heavy compute tasks at that time, rather than right now.

AFAIK, the speed at which a cold (or hot) body converges to room temperature is exponential. Maybe that limits the smart fridge idea?

It limits the rate at which you can dump energy without excessive temperature excursions.

If you have the space, you can put thermal mass (e.g. water) between the cooler and produce, acting as a cheap thermal battery dampening temperature oscillations. It's often done in off-grid situations.

And stop the train when the wind is not blowing or it is cloudy.

The train can carry batteries that could also be charged through regenerative braking.

The future is nuclear power, and when companies buy hardware they use it at max performance around the clock because energy is cheap and does not depend on weather.

There’s a huge lie (by omission) about renewables: nobody has explained how to convert the world to 100% renewable energy without coal backup.

Coal and nuclear are both inflexible power sources: they are either on or off, and they are hard to turn on and off. Nuclear does have the benefit of being cheap after very high initial capital costs. Natural gas is a much better backup for renewables, however, since it can be turned on and off at will. Dam hydroelectric also has that nice quality (send water through the generator when you need it; otherwise let it stay in the reservoir).

Given nuclear’s inflexibility, shifting non-urgent work to periods when less electricity is needed for time-sensitive demand is also a win.

I don’t see nuclear making it big any time soon. There’s too much up-front cost, but in theory one could make use of any excess heat during off-peak hours.

For example desalination, hydrogen production, indoor growing with the light cycle at night or whenever the low demand period is, etc...

For the near future I’d only expect small modular reactors to see much use in areas with unreliable sunlight for chunks of the year. Especially since they could use the waste heat for heating/growing.

Nuclear with hydro/pumped storage is an interesting combo: use the extra nuclear power to pump water up into a reservoir and then when extra energy is needed, move the water down through a turbine. The reservoir is basically a battery in this context.
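For a sense of scale, the "reservoir as battery" arithmetic is just m·g·h times a round-trip efficiency (typically around 70-80% for pumped hydro). The reservoir dimensions below are example figures, not any real site:

```python
# Rough pumped-storage capacity: potential energy of the water, derated
# by round-trip (pump + generate) efficiency.
G = 9.81  # gravitational acceleration, m/s^2

def storage_mwh(volume_m3, head_m, round_trip_eff=0.75):
    mass_kg = volume_m3 * 1000           # water, 1000 kg/m^3
    joules = mass_kg * G * head_m * round_trip_eff
    return joules / 3.6e9                # joules per MWh

# Example: a 1 km^2 reservoir 10 m deep (1e7 m^3) with a 300 m head
# stores roughly 6,100 MWh - several hours of output for a
# gigawatt-class nuclear plant.
```

That's why this pairing only works where geography cooperates: the energy density of raised water is low, so you need a lot of it, high up.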

Waste heat isn’t really recyclable, or it wouldn’t be waste heat :).

To generate electricity you need a hot side and a cool side.

The huge towers on power plants are for cooling the warm (waste) output. If you can get someone to cool your warm waste water even further than the cooling towers would, it would both increase efficiency of electricity generation, and use the boatloads of low grade heat for something useful.

You still need independent cooling capability either way.

Nuclear's inflexibility is not a barrier to countries like France generating over 70% of its electricity from it. Energy demand fluctuations throughout the day are significant but not huge - usually ~20% different from peak to trough.

“they are either on or off”

This is not true of either nuclear or coal.

A nuclear reaction can be slowed down, and a coal furnace can be run with less coal at a lower temperature.

Nuclear does not need to be turned on or off spontaneously. A nuclear power plant can increase or decrease power smoothly, and that’s practically enough because consumption patterns are easily predicted.

The inflexibility you are describing is a nonissue.

High capital cost is. But the largest issue is irrational fears of voters.

Edit: nuclear and coal are not simply on or off

They can be slowed down. True.

But that takes minutes or hours to do. And at anything but 100% power, the fuel inefficiency is bad. The latter is mostly a problem with coal, but even nuclear is burning fuel and accruing wear.

Plants can be optimized to react really fast, or to run efficiently at lower capacity, or to run efficiently at max capacity. Choose one.

There’s no need to change power output rapidly. Power demand can be predicted very well. Sorry, I didn’t understand your argument.

You're right that power consumption is predictable. But looking at some great-parent posts I'd assume we're still talking about using nuclear as backup for renewable sources. In this case, often the renewable power production is the bigger variable factor in my opinion, and it's less predictable than usage patterns.

> There’s huge lie (by omission) about renewables

I live in Scotland. We now get around 90% of our electricity from renewables. We closed our last coal power station in 2016.

Perhaps you should consider checking your theories against reality before accusing others of lying?

Don't cut off quotes mid sentence:

> There’s huge lie (by omission) about renewables: nobody explained how to convert the world to 100% renewable energy without coal backup.

Because not everywhere in the world has access to hydroelectric power. The point remains: there is no plan to power the world with renewables without a fossil fuel backup. At least no feasible plan - batteries don't scale, and the Sabatier process has well below 50% round-trip efficiency.

> Don't cut off quotes mid

You managed to get the gist.

You don't need a plan to power the world. Try a plan to power your own state and let the rest of the world sort itself out. When you're at 80% renewables and then get stuck, then I'll believe it's an issue. Not when you're at 20% and handwaving theoretical excuses.

Not every country has such a small population and access to hydropower. Sure, some small countries can go 100% renewable, but it doesn’t scale.

Yes it does. Countries with bigger populations can do more, not less. That should be obvious.

Coal backup is incompatible with renewables. The coal plants clog the grid and prevent the usage of renewables. If you had said gas then those same gas plants could be used to convert hydrogen back into electricity.

It is the opposite: coal plants are turned on only when renewables do not produce.

And I didn’t get the point about hydrogen. Who would produce the hydrogen?

Optimizing for this is a perfect task to throw at a simple market. Especially because actually reworking the software to take advantage of resources at different times is often going to require a decent amount of work by engineers.

One way to do it would be to assign various jobs a value (which could be dynamic - e.g. a job might become more important as its information becomes stale) and have them bid on compute power. The value could be virtual.

Or you could use real money. This is the premise behind EC2's spot instances. So when power is abundant, your prices drop and the relevant jobs kick off.
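A toy version of that internal market might look like the following. Job names, bids, and the single-round clearing rule are all made up for illustration:

```python
# Each scheduling round, jobs bid a value per CPU-hour and the cheap
# (renewable-heavy) capacity clears to the highest bidders first.
def clear_market(jobs, capacity_cpu_hours):
    """jobs: list of (name, bid_per_cpu_hour, cpu_hours_wanted).
    Returns [(name, cpu_hours_granted), ...] in clearing order."""
    scheduled, remaining = [], capacity_cpu_hours
    for name, bid, wanted in sorted(jobs, key=lambda j: -j[1]):
        grant = min(wanted, remaining)
        if grant > 0:
            scheduled.append((name, grant))
            remaining -= grant
    return scheduled

jobs = [("video-transcode", 0.02, 500),
        ("search-index", 0.10, 300),
        ("ml-training", 0.05, 400)]
# With 600 cheap CPU-hours available, search-index fills first, then
# ml-training takes the remaining 300; video-transcode waits for the
# next cheap window, exactly the delay-tolerant behavior you'd want.
```

Spot instances are this same mechanism with real money, which is why the analogy in the comment above works.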

Using real market prices makes sense especially if you're renting out computing power, most customers will be happy to adjust workloads to save money.

Even if it's entirely internal, it's good to have a facility to "optimize for cost" and then report the savings. That's helpful to get the engineering resources devoted towards it, because "I saved $X" is a great bullet point to put in anyone's promotion packet or to base a bonus on.

> One way to do it would be assign various jobs a value. Or you could use real money.

It's not the value of the outcome of that job that you're interested in, but rather its sensitivity to delay in execution.

For example, preemptively converting Youtube videos to a lower resolution with optimum compression to avoid having to do it in real-time (when video is played) at a crappy compression (to be fast), is valuable for sure. It's just that it can be postponed for 24 hours without real impact. Executing a search for a single user is less valuable in terms of overall impact but much more latency-sensitive.

(You can think of value and latency-sensitivity as two independent dimensions.)

This idea helps save the planet for sure, but it requires cloud-providers to build APIs that enable devs to switch from the "here's the SSH to the server, do what you want with it" to a model where it's the devs that say instead "here's a lambda function and its desired latency execution, please schedule to run it for me and let me know when the result is ready" ( https://en.wikipedia.org/wiki/Inversion_of_control )

Google was able to do that because it owns a large part of the jobs executed in their datacenters. Hence they could build this adaptive scheduling for their own jobs quickly without necessarily passing through a cloud-based API that inverts the control of job scheduling.

I think this is much more useful to a cloud provider than to a customer.

As a customer, I think you could configure something like you said using spot instances on AWS, but that’s it: you’re going to save a small amount of money in a year, and if you account for the engineering hours needed to set this up, maybe it’s not really worth it.

As a cloud provider, you could juggle your clients between datacenters depending on the load and the price of energy there. A flat rate for a cloud region means there’s an opportunity for arbitrage between datacenters that could add thousands in profit on their side.

If spot instance price is the only communication channel between you and the cloud provider to achieve this, it's hard to do a good job at it. For example, if the spot price was $0.60 one hour ago, and $0.55 right now, and you still have 13 hours of latency left for your job's execution, should you start triggering it or not? (how do you figure out your bid level?) You could have statistical data to have an intuition what's the lowest price that's been hit historically on similar days of the week in the past etc, but it's inexact and overly complicated.

If the cloud provider becomes aware of the remaining latency you have at your disposal for the job's execution, they can do a much better job. They would be able to look across the entire job execution queue in that datacenter, each job having a specific remaining duration for its execution, they would know the predicted pattern of carbon/solar/wind split for the next hours, and the implementation of the system would sit just on the cloud provider side (making the life of the customers easier and simple).
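The API shape being argued for could be sketched roughly like this: the customer submits a job with a latency budget, and the provider picks the cheapest forecast window that still meets it. The hourly forecast numbers are invented:

```python
# Deadline-aware slot picking: find the cheapest contiguous run of hours
# that finishes before the job's deadline.
def pick_slot(price_forecast, hours_needed, deadline_hour):
    """price_forecast: expected price per hour, indexed from now.
    Returns (start_hour, total_cost) of the cheapest feasible window."""
    best_start, best_cost = None, float("inf")
    for start in range(0, deadline_hour - hours_needed + 1):
        cost = sum(price_forecast[start:start + hours_needed])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost

# Invented forecast, $/CPU-hour; a 2-hour job due within 6 hours lands
# in the cheap (presumably windy) hours 3-4 instead of running now.
forecast = [0.60, 0.55, 0.40, 0.20, 0.25, 0.50, 0.70]
```

Note how this resolves the "$0.55 now, 13 hours of slack left" dilemma above: with the deadline disclosed, the provider compares whole windows instead of forcing the customer to guess a bid against an unknown future.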

Of course, in the end, the benefit is towards our planet, but this is much more likely to succeed if the proper cloud API exists and the implementation initiatives are properly aligned to avoid redundant implementations on the customer side.

Thing is, Google requires an absolutely stupid amount of computing resources for running their core business. YouTube transcoding is a great example and a big one for sure, but I bet they have even bigger ones in there somewhere. I have no real data to base this on (and I'm sure nobody does), but I'd bet 5:1 odds that if Google were an AWS customer, they'd be bigger than all the others combined.

So in that case, optimizing for a single customer makes perfect sense if it's the right customer.

Something sort of similar exists for iOS. BGTaskScheduler (https://developer.apple.com/documentation/backgroundtasks/bg...) is the public interface, but the internal interface allows for more subtle calculations about how long you can wait until this job is complete and what the optimal physical condition of the phone is (e.g. battery charging, user not using phone, wireless network connection). One of the important features is that when you run a job you have to check in fairly often (every second or so) to see if your job can still run, and if it can't, stop the job. This stops your two hour ML training session from burning battery for a while if the user wakes up at 2am and decides to go on a walk.

On a smaller scale, you could do what Low Tech Magazine [0] does and actually have downtime when sunlight is low. Since this doesn't happen too often and users can just save articles (with RSS, email newsletters, etc.), websites like this can just be powered by a single computer, solar cell, and small battery in the owner's Barcelona apartment. Thanks to small static pages and tiny dithered images, the site is almost always up.

The future doesn't always need to be as "webscale" as Google; sometimes, scaling down is the smart thing to do. The minimal approach of LTM is the technology equivalent of riding a bicycle (or electric velomobile [1]) to work instead of driving.

[0]: https://solar.lowtechmagazine.com

[1]: https://solar.lowtechmagazine.com/2012/10/electric-velomobil...

Actually, I'd say LTM's is the car, and Google's approach is public transport.

LTM's approach required producing, assembling and shipping a full 2GHz/1GB computer, plus PV, PV-controller and router, all to serve a single site. And it's even turned off some of the time!

Google, on the other hand, is more like a fleet of trains; sure, each one is a honkin' beast, but it also transports thousands of passengers/sites at once, possibly millions in its lifetime.

The bicycle analogy doesn't really work, because a bicycle is just a performance attachment to the real vehicle: the human.

This is actually great: imagine a CDN in various geographical locations all of which work off sunlight in their own time zone (and turn off with no sunlight).

This way you can have 24/7 fully green content delivery to consumers.

Although that being said we could just try to cover the earth with as many generators everywhere and then fully connect the grids.

> Although that being said we could just try to cover the earth with as many generators everywhere and then fully connect the grids.

This is the easier route: you do this by having as many businesses as possible purchase renewables PPAs, where they're specifically contracting for renewable energy.


I respect lowtechmagazine's experiment: it's designed to make you think, and it succeeds.

But CPUs have incredibly high embodied energy, so we should aim for full utilization of servers, regardless of the source of that energy.

If it's three CPUs with carbon-neutral electricity, or one CPU with electricity from coal, the latter is the responsible choice.

Did you mean the reverse? I.e. "If it's one CPU with carbon-neutral electricity, or three CPUs with electricity from coal, the latter is the responsible choice."

If no, then why. I can explain my confusion if needed.

I don't mean the reverse, I mean that running one CPU 24 hours a day instead of 3 CPUs 8 hours a day is better, even if the 3 CPUs have carbon-neutral energy and the 1 CPU does not.

To be fair, these are artificial choices, and that's not even what lowtech was doing (they have a battery). I was responding to the post about the CDNs that stop working at night.

I mean, technically, that's what Google's solution ends up doing.

Their DCs are globally distributed, so their timings on when they'd actually be doing the heavy lifting is shuffled around.

It's good to an extent, but if optimization for cost gets too intense then it will seek out the flaws in market rules. This will be true whether machines or humans are doing the optimizing.

I guess it's okay as long as the people making the rules have good monitoring and are watching out for weird exploits and fixing them. The flexibility to change the rules tends to be more common internally than externally where customers want more guarantees.

As we've seen, there also needs to be a balance between cost-optimization and preparedness. If the wind patterns don't match the prediction then you need to be ready for that.

Also, as we've seen with cryptocurrency, real money attracts theft. A human-adjusted credit system is better. In the real world, this looks like support having the discretion to forgive big bills. But to do that they need to know their customers. It's hard to automate.

Tangent: this discussion is an interesting microcosm of the libertarian/social-democratic dichotomy of economic theory. GP says a market will take care of itself; parent says not without significant regulation, or else the perverse incentives will eventually be exposed and exploited.

I think it's more a continuum than a dichotomy, since there can be more or less regulation. A regulated market is usually considered to be capitalism. For example, the US stock market is regulated by the SEC. But it does get nebulous as you increase the scope and goals of regulation.

Borg has concepts of quota and priority which function as the internal market you are talking about: Verma, et al. "Large-scale cluster management at Google with Borg".

Tangentially related, but in Australia we run a household utility company that operates on that same assumption: https://www.amberelectric.com.au/

Our hypothesis is that market signals combined with the right tools (friendly app and home automation) can help households shift demand into less carbon intensive periods.

So far it's working pretty well.

Yes! Especially when you have multiple data centers participating in different locations. It might be cloudy in one place but not another, so jobs get re-routed accordingly.

I really hoped this is what EC2 spot instances would be, but it doesn't seem to work that way. My spot instances usually get terminated due to "no available capacity" without any major price movement.

It would also be pretty neat to integrate processing power markets with the wholesale energy markets. Energy prices are quite volatile and making load responsive to that would actually be quite helpful to stabilize them.

In the UK, households used to be able to get Economy 7 electricity, which meant cheap electricity at night (for 7 hours, I'd guess).

I've wanted to have realtime pricing like that for a while, it seems to be becoming available again.

I honestly thought that was what the advanced electricity meter roll-out was going to do; but it seems not.

More direct energy cost to service price charged seems like a good thing in general.

TOU pricing is pretty widespread at this point; I have it in a medium-sized city in Canada, and intentionally run laundry (electric dryer) in the evening when power is half the cost of daytime use. With EVs coming online and being set up to charge at night, it would definitely be nice to have a minute by minute spot pricing scheme, though, then you'd basically have a mechanism for using the chargers and other intelligent devices on the consumption side as an intelligent buffer.

It's goofy, but another one is situations where you have a lot of stored heat energy, thinking like pools, hot tubs, hot water heaters, etc— all those things could be activated in response to spot pricing with pretty simple policies (I want a shower of at least X degrees at 7am, I want the hot tub at at least Y degrees by 9pm, etc).
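A toy version of the "shower of at least X degrees at 7am" policy: heat when power is cheap, but heat unconditionally once waiting any longer would miss the deadline. The linear heat-rate assumption and all the thresholds are mine, not from any real product:

```python
def should_heat(temp_c, target_c, price, cheap_price,
                hours_to_deadline, heat_rate_c_per_hour):
    """Return True if the water heater should run this interval.

    Heats opportunistically whenever the spot price is at or below the
    'cheap' threshold, and unconditionally once the remaining time is
    only just enough to reach the target temperature by the deadline.
    """
    if temp_c >= target_c:
        return False  # already hot enough, nothing to do
    deficit = target_c - temp_c
    must_heat = deficit >= hours_to_deadline * heat_rate_c_per_hour
    return must_heat or price <= cheap_price
```

The same shape works for the hot tub, the pool, or pre-cooling a freezer: a comfort deadline plus a price threshold.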

I think when widespread overnight charging of EVs is a thing, cheap evening electricity will be phased out.

Even in that world, you'd still want a way to manage the load through the night so that you don't have all the chargers clicking on at 11pm and then, a few hours later when the cars are full, demand cratering until morning.

With renewables the incentives for load management become even higher (per this article). The real next step after second-by-second billing would be setting up chargers with backfeeding capabilities/policies, so that you have an arrangement with your employer to charge your car at work on cheap daytime solar power, sell it back into the grid during the evening rush, then charge up again on overnight base load. Most EV batteries are way overspecced for what people need in daily use, so as long as you have a special "charge me to full and stay there" mode you can switch into, there'd be no reason (other than a bit of wear and tear) not to cycle your battery like this.

I think your household electricity prices just need to reflect real market prices. For example, there were times in Germany this year with negative power prices, because the wind was blowing strongly and all the wind plants were generating more than anyone needed.

But for me as a customer, the price was as high as always.

But if customers could react to that (automatically), you could have all sorts of jobs waiting for it: Bitcoin mining, the dryer, charging batteries, running the freezer at full power... It would be good for the grid, too, to balance it.
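For instance, a deferrable-load queue that only fires when the wholesale price dips to or below some threshold (zero here, mirroring those negative German prices) might look like this sketch:

```python
def dispatch_cheap_loads(queue, spot_price, threshold=0.0):
    """Release every deferred load (dryer, battery charging, ...) when
    the spot price is at or below the threshold; otherwise keep waiting.
    Returns the list of loads started this interval."""
    if spot_price <= threshold:
        released, queue[:] = queue[:], []
        return released
    return []
```

Real tariffs would add per-load deadlines and priorities, but even this naive version soaks up negative-price hours.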

With Octopus you can go to an Agile tariff, where you're paying exactly what the market rate is, which means that frequently at night you are being paid for using electricity (the rate per kWh goes negative). I know a few people who have it with an electric car, and it means that at certain times you are being paid to charge your car. It's crazy. But it also means that at peak times the cost can be as high as 25-30p/kWh.

There are apps that can automate this whole process for you, monitoring the charge your car needs each night and the dynamic energy prices on either an Economy 7 or Agile tariff, then controlling the charging to optimise the price. If your car needs less charging time than your off-peak window offers, it will charge at the lowest-carbon times. https://ev.energy

OHME charging cables and wallboxes do this as well

Smart Meters and Advanced Metering Infrastructure (AMI) still have about half of all meters to go in the US. Real time pricing for the customer requires cutting through a few intermediaries designed to keep the grid and utility bills stable, much at the cost of the environment.

Texas energy markets are where all the fun on this front is really happening.

This is a really good argument for carbon taxation to appropriately increase the cost of dirty energy. Send the correct price signal everywhere rather than making your own software do the equivalent of looking out the window at the weather and trying to decide if it’s a sunny or rainy or windy or calm day, and thus if solar or wind generation is making the grid cleaner. Or if instead those are likely offline and the grid is dirtier today.

Best thing. Then you incentivize a cleaner grid overall and you don’t even have to worry eventually about this kind of thing.

In particular a revenue neutral carbon tax with dividend should be politically as uncontroversial as it gets because it is also economically equitable. It's totally perplexing to me why these relatively low-hanging fruit solutions are not being pursued.

Exactly. It then also starts to make local storage (like powerwall) more interesting. When your PV generates a lot, prices will go down so better store the excess for times with higher prices. And when you don't generate PV power but can still store, you buy cheap also to use when prices go up.

If there is also a dynamic price for using the grid, that usage will also spread.

It seems that it must be a really difficult problem to work out the optimal solution for having spare capacity to allow time/location shifting of workloads to minimize carbon per unit of compute.

This Dell paper[0] suggests that 16% of the carbon over a typical server lifecycle is from the manufacture, so you probably don't want a server sitting there unused for 23 hours per day, since the overall carbon/compute ratio would be worse overall.

The post doesn't mention this metric, but it would be really nice to see something more detailed in time - especially with this overall efficiency of the server/datacentre lifecycle in mind, rather than just energy consumed from use.

[0]: https://i.dell.com/sites/csdocuments/CorpComm_Docs/en/carbon...

Carbon consumed in building a server is sunk cost and would be paid independent of whether the server does any kind of carbon-footprint-aware load shifting.

Assuming the server is "sitting unused for 23 hours a day" is the wrong model for what this work changed. You're assuming the server could be running at 50% duty cycle vs. 100% duty cycle. It isn't; since we're talking about the batch load, there's a roughly fixed amount of low-priority work to be done, and doubling the amount of CPU active-duty time allotted to the work doesn't get the work done faster (the details on that are complicated, but that's the right model for what Google's describing here). One should model the duty cycle as fixed relative to the processor (i.e. "This global datacenter architecture, over the course of its life, will do a fixed N units of computronium work on these batch tasks") and then ask whether that work should be done using coal to power the electrons or wind.

Suppose I'm building a new datacenter that I want to do some constant amount of work each day. It doesn't matter the time of day. I can either power it with solar power, in which case it will run for 1/Y of the day, or with coal power, in which case it will run 100% of the day. If it only runs for 1/Y of the day, then I will need to buy Y times as many computers in the solar scenario than in the coal scenario.

If Y = 2 and only 16% of the carbon in a typical coal-powered computer's lifetime is from the manufacture, then solar makes sense - solar is 2*16% = 32% of the carbon of coal. But if Y = 10 - so it's running 10% of the time, meaning there need to be 10x as many computers built - and 16% of the carbon is from the manufacture, then solar power is actually worse for the environment than coal power: solar takes 60% more carbon than coal power.

Of course, this is a vastly simplified situation, but it points to the idea that we need to at least consider the carbon cost of manufacturing.
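The simplified comparison above, in code, under the same assumptions: manufacture is a fixed share of the coal-powered machine's lifecycle carbon, and the all-solar build needs Y machines with zero operational carbon:

```python
def solar_vs_coal(manufacture_share, y):
    """Carbon of the Y-machine solar build relative to the single
    coal-powered machine, whose total lifecycle carbon is 1.0.
    Values below 1.0 mean the solar build emits less overall."""
    return y * manufacture_share

print(solar_vs_coal(0.16, 2))   # ~0.32: solar emits about a third of coal's carbon
print(solar_vs_coal(0.16, 10))  # ~1.6:  solar now emits 60% MORE than coal
```

The break-even point is Y = 1 / manufacture_share, i.e. about 6.25 machines at the Dell paper's 16% figure.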

But again, that's the thing. There is only 1/Y work to do in the day; it's batch work. The work in this case is constrained on the input side, not the CPU resources side; building 1 or 2 or 1,000 nodes to do the work won't decrease how expensive it is to do the work (in fact, building more computers than you need will make it cost more!).

... so why does Google build more computers than they need? Keep in mind that at Google scale, they're always and forever building "As many computers as we can possibly afford to" under the assumption that there will always be work for those machines to do. You and I may need to consider cost of manufacturing; Google doesn't. They always have the "Build datacenter infrastructure" cranked to an 11 (more accurately, they are following an N-year plan of construction that is extremely expensive to modify).

That's the breakdown between Google's way of thinking and the way of thinking that you've presented: Google's cost of manufacture is fixed. Those computers will be built, whether or not they're going to also then run green-streamlined batch jobs. The limiting factor on the batch work is only so much work is generated in a day, and the rate the work is completed is already good enough that completing it in half the time yields no marginal value. So may as well complete it using sun instead of coal.

> Google's cost of manufacture is fixed.

That may well be the way you (and most likely Google) currently look at it, but the argument presented seems to be that everything other than where you run some batch tasks is all fixed.

Assuming everything else is fixed does make the optimisation easier, though surely nearly everything is up for debate if we're actually talking about minimising the overall footprint of a compute task?

Using fossil fuels rather than allowing these fixed costs you mention to sit idle - there is a point where your carbon/CPU cycle would actually go up by not using fossil fuels some of the time (though I am assuming less carbon emitted for a unit of power than manufacture cost).

I appreciate they are ever-expanding and predicting where non-movable workloads are going to need to be run etc etc, and I'm not suggesting there are easy answers.

Is the question on the table whether, all other things being equal, this change would decrease carbon output per unit work completed, or is the question whether Google is greener after this change?

For the latter question, we have insufficient data. Google's datacenter infrastructure is huge and complicated, and if one factors in all the interdependencies and purchased carbon offsets, one needs way more data than this announcement blurb gives out to answer that question. Maybe they have an additional process to determine whether their most carbon-negative datacenters can be switched off during peak coal-use and this work unblocked them from enabling that feature? We don't know.

For the former question, yes.

If I have 2 hours of work to do a day, I could sell 90% of my servers, the racks, the networking hardware, one of the air conditioning units, backup generators, and battery backups, lay off some of my IT staff, and just let the thing run 23 hours a day.

What people are trying to tell you is you can’t make having a glut of equipment work out for any reason, let alone environmental. It costs you a dozen times over to have most of you hardware not doing anything at all most of the day. You are overprovisioned.

You are so overprovisioned in fact that someone will steal your promotion by pointing this out to your boss, which is also a pretty big cost.

> What people are trying to tell you is you can’t make having a glut of equipment work out for any reason, let alone environmental.

Google has a glut of equipment for two reasons: variable loads and experimental.

The batch loads are predictable and constrained by input rate. What the coal ends up used for is spikes in demand for compute power (i.e. "Michael Jackson died and now everyone needs to watch Thriller right now") and ceiling to develop and experiment with new projects.

This optimization means that while, yes, the fossil fuel resources would be feeding those use cases, it's a net gain because those use cases are less common than the predictable batch-job work.

> You are so overprovisioned in fact that someone will steal your promotion by pointing this out to your boss, which is also a pretty big cost.

Remember the context. Nobody ever got fired at Google for adding more compute power to the behemoth, because it's all fungible and will all be used eventually. If you're trying to argue "Google is a net polluter because they continue to build machines in a fashion bounded only by land and hardware costs, under the assumption that more compute power is always better than less", it's an argument you could make, but (a) you should factor in the carbon offsets they buy, and (b) you're tipping close to arguing "All human activity is pollution; kill the species."

Doesn’t not buying servers create pressure to not build them in the first place?

I shouldn't have used "sunk cost;" I should have said "fixed cost."

Google builds datacenters to N-year-long plans that are expensive to modify. Whether they're running their batch jobs on solar energy or coal energy, the carbon footprint of the datacenter build plan is not going to change. They want those datacenters anyway (mostly for the non-batch work and whatever the next big thing is that hasn't been invented yet but can only be done by putting a beach's worth of thinking sand on the problem).

It is ultimately, in a certain sense, a mathematical optimization problem: determine the optimal configuration of the entire infrastructure of additional power sources, much like finding the optimal placement of a set of cell phone towers, perhaps using k-means clustering. On top of that, additional constraints must be satisfied, like compliance with legal regulations; the decision maker or engineering agent has preferences and requirements to satisfy.

> for having spare capacity to allow time/location shifting

Part of that calculation should be the amount of compute capacity headroom you'd choose to have anyway even if you didn't care about carbon.

Compute demands can vary from one day to the next. Maybe tomorrow people uploaded 3 times as many YouTube videos as they did today. Maybe load varies based on day of the week or day of the month. To some extent, you can smooth that out by delaying jobs, but there are practical limits.

You also want some spare capacity just for safety. Efficient utilization is important, but things like performance regressions or spikes in demand can happen.

Two possible examples:

- Spawn nightly regressions when wind power starts to pick up, instead of at some arbitrary wall clock time

- Dispatch compute-heavy jobs during low energy cost times; dispatch IO-heavy or memory-limited jobs during high cost times.
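The first idea could be sketched as a day-ahead planner that picks the start hour minimizing average forecast carbon intensity over a job's duration (the forecast numbers here are invented, and real schedulers would also weigh deadlines and capacity):

```python
def best_start_hour(forecast_gco2_per_kwh, job_hours):
    """Pick the start hour whose window has the lowest total forecast
    carbon intensity -- roughly what a carbon-aware batch scheduler
    does with a day-ahead wind/solar forecast."""
    windows = range(len(forecast_gco2_per_kwh) - job_hours + 1)
    return min(windows,
               key=lambda h: sum(forecast_gco2_per_kwh[h:h + job_hours]))

# Wind picks up mid-list in this made-up forecast, so a 2-hour
# regression run lands on the cleanest window (starting at hour 4).
forecast = [300, 280, 250, 120, 100, 110, 260, 300]
print(best_start_hour(forecast, 2))
```

Replacing the carbon forecast with a spot-price forecast gives the second idea with no code changes.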

FWIW, for the build2 project we host our own CI servers on premises, and the power supply is supplemented by a solar array. We have configured our daily package rebuild window to coincide with maximum solar output, so that on a sunny day it is all done using renewable energy.

(Off-topic, but regarding this site)

Am I crazy or is this website capturing down-button clicks and ignoring them? I typically use down and up to slowly scroll as I read an article. This page is driving me nuts.

Yes, down-arrow key is disabled. I'd thought it was just me and some badly behaving extension!

Surprised to see that get through QA. (Up arrow works just fine.)

Funny enough, on Mac, fn + down works, although the jumps are larger.

This makes sense, fn + down sends Page Down, which is a different keystroke.

bizarrely it's only the down button (not the up button)

It appears to have something to do with this code (paste the URL into DevTools when you're on the site). webpack:///./frontend/js/trackers/keyboard-tracker.js

There's one other keyboard event listener, but it doesn't appear to do that.

Yeah, was able to replicate this on Chrome 81.0.4044.113 on Windows. Weird.

PageDown still works.

Seems like an intuitively good idea to me! It'd be great to see how effective this change was.

Regardless of this change, I wonder if they share their forecasted non-renewable energy needs with their energy supplier so that the energy supplier can prepare for changes to the expected base load.

Do any factories or other energy intensive operations do this?

It's probably not at the same level of granularity that Google is trying to accomplish here, but I believe that power-hungry commercial systems have tried to move to when power is cheapest for many years now.

Aluminum Foundries in particular are extremely power intensive and have been run during off-peak times (or are built in areas with cheap plentiful electricity like nearby hydro-electric dams).

Still, i'd love to see this concept made a lot easier for the average consumer. Many people already have smart thermostats, why can't that talk to my power generation company and allow me to over heat/cool when the impact is lowest? Why can't my dish washer run automatically when it would impact the world the least? Why can't my EV automatically charge when power is most available?

I know most of those things are possible, but they sure as hell aren't easy, and IMO they won't truly have an impact until they're on by default and don't require the user to do much of anything.

These things seem like they are easily doable, but we just need the different industries to work together to come up with ways to have all of this stuff interoperate.

> Aluminum Foundries in particular are extremely power intensive and have been run during off-peak times (or are built in areas with cheap plentiful electricity like nearby hydro-electric dams).

Fun fact: this is a big reason why aerospace congregated in the pacific northwest during WWII (eg Boeing in Seattle).

Aluminum is key to aircraft because it's lightweight, and at the turn of the century the US went on a dam-building spree with a lot of hydro (ie large consistent baseload) being located in the pacific northwest.

Another fun fact: in 2018 the US produced 890,000 tons of Aluminium. Iceland (population ~350,000) produced 870,000 tons.

Is it Iceland’s insatiable desire to produce aluminum, or an abundance of geothermal energy?

It's hydropower ;)

Geothermal energy is mostly for district heating and hot water

Some power companies are integrating with your smart thermostat and may pre-cool in some cases. For example, from APS in Arizona's FAQ: "During an event, how will my thermostat be adjusted? At the start of an event, your thermostat temperature will be automatically adjusted up a few degrees above the current temperature. Each event will typically last an average of 2 hours, and will typically occur between 3 p.m. and 8 p.m. Events typically will not occur on holidays. In some cases, your thermostat temperature may be adjusted down a few degrees prior to the event to pre-cool the home and ensure your comfort during an event. Once the event is over, your thermostat will return to its normal set point and/or schedule."

ComEd in Chicago can do this with Nest thermostats as well. At the time I had a Nest thermostat, they actually paid people to enable this. And I basically made back most of the cost of my Nest that way. (Before selling it off for a less cloud-connected thermostat.)

My electricity provider does the same. It's opt-in and you get a $200 credit at the end of the summer if you stay in the program. Pretty good deal, I think.

I believe some aluminum smelters can even bid into electrical markets, either directly as dispatchable load(/negative generation) or as ancillary services for grid stabilization.

That's awesome. Do you have a source for that?

At least in Germany, some huge consumers (cement mills, huge refrigerated warehouses) are classified as "Lastabwurfkunden" ("load shedding customers"), where the grid operation center can remotely shut them down. Information is scarce, as Google results are swamped by copies of the Wikipedia article, but at least I found an insurance company that asks whether the applicant is classified as such (http://www.energyprotect.eu/wp-content/uploads/2014/07/Risik...).

There are already some solutions around that try to address this on a consumer level. I recently watched a teardown for a piece of test gear that utilizes frequencies injected directly to the mains supply in order to control home hot-water heaters in Australia[0].

That's the great thing about standards -- there are so many to choose from!

[0]: https://www.youtube.com/watch?v=Po4b7JhpxKQ

I imagine this could result in a feedback loop between the power plants scheduling and your devices scheduling if enough people use it.

Wind, solar, and run-of-river hydro are unscheduled, so if load ramps up due to renewable generation ramping up, there is no change in load to be accommodated by dispatchable scheduled units.

If the load responding to increased renewable generation exceeded the generation then the excess load should have just run whenever it was convenient since it isn’t using renewable power anyway.

I’ve omitted nuclear and large hydro from my discussion even though their fuel doesn’t emit carbon.

It tends to work the other way around - at least in the markets I'm familiar with.

If you're a large consumer of energy and can turn that consumption on or off at short notice (on the order of seconds) then the grid operator will pay you to allow them to scale your consumption up or down.

The classic example of this is cold storage. If you have a warehouse full of freezers which need to be kept within a certain temperature threshold then it doesn't really matter when you run the freezers and you could switch off at several points during the day.
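A sketch of that cold-storage logic: follow the grid operator's request while the temperature stays inside a safe band, and override it only at the band edges (the band values are made up):

```python
def freezer_action(temp_c, lo=-22.0, hi=-16.0, grid_request=None):
    """Demand-response logic for a cold-storage freezer: honour grid
    on/off requests ('run'/'idle') while the temperature is inside the
    safe band, but override them at the band edges."""
    if temp_c >= hi:
        return "run"   # too warm: must cool regardless of the grid
    if temp_c <= lo:
        return "idle"  # cold enough: no point running
    return grid_request or "run"
```

The band width is the "battery capacity" here: a wider tolerable range means more hours of demand the grid can shift.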

Having worked in a factory before, I can tell you that the factory would call the electricity company before shutting down or starting up. It consumed a noticeable chunk of what the whole surrounding city was consuming.

Given how much electricity a datacenter consumes, Google surely must have a direct support contact at the electricity provider, and it had better work both ways if it doesn't already.

Quick math. A datacenter is 60 000 servers, so 6 MW consumption at 100 W per server (moderate load). That is 1% of the peak output of a nuclear reactor. You bet the electric company wants to know when they need to adjust their reactors.
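That back-of-the-envelope calculation, with an assumed ~1 GW reactor (reactor sizes vary a lot, which is why the comment's "1%" and the 0.6% below are both in the right ballpark):

```python
servers = 60_000
watts_per_server = 100          # moderate load, per the comment

datacenter_mw = servers * watts_per_server / 1e6
reactor_mw = 1_000              # assumed large (~1 GW electric) reactor

print(datacenter_mw)                      # 6.0 MW for the whole site
print(100 * datacenter_mw / reactor_mw)   # ~0.6% of the reactor's output
```

Note this counts only server load; cooling and power-conversion overhead (PUE) would push the site total higher.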

And reactors aren’t generally easy or desirable to adjust.

Seems like an intuitively good idea to me!

Seems like getting rid of the middleman at a basic, physical level. Power is available for very low cost at certain times. So let's time-shift computation that doesn't have to be done at a specific time! Really, it's just the same trick as time-shifting your EV charging and other power draws. It costs money to run battery banks and inverters. Let's just take them out of the process, where we can.

"The best part is no part." -- Elon Musk

COO from Tomorrow here (who provides the CO2 forecast data to Google). Happy to answer any questions!

Great collab with Google! They've been probably the most serious corporate wrt getting all-renewable offsets for their operations... this is helping them reach the next milestone where the renewable offsets are time-matched. Great example of being serious about this stuff rather than just greenwashing.

It sounds like this project is for their own operations. Have you thought about how to offer this as something closer to a turnkey cloud-ops SaaS / API? What kinds of abstractions would you present to developers building non-time-critical compute loads?

Could be a great differentiator for GCP vs. AWS (I have heard of some companies choosing GCP over AWS due to Google's green energy cloud). And for you guys, the only thing better than Google being a customer is all of Google's customers being your customers.

Also, how can we avoid the potential for unintended consequences where this tech makes Google "greener" while GCP users become less green?

If a data center has (roughly, to a first-order approximation) fixed compute capacity at any point in time, and we assume that any capacity not being used by Google themselves is made available to GCP, then wouldn't Google reserving the "green" hours for themselves drive the remaining "dirty" hours onto the GCP spot markets?

Is there a cloud market design that addresses this tension between maximizing utilization and having desirable or 'premium' compute hours?

Thanks a lot for the support and encouragement! More info coming soon about a turnkey solution ;-)

That looks like a good opportunity for an (at least tangentially related) big shout-out to your great electricityMap (https://www.electricitymap.org/map) website!

I wish there would be more countries covered (in particular Switzerland), but I guess you depend on the live data being provided in these countries.

Disclaimer: Not affiliated with Tomorrow.

Unfortunately, the crux is data availability and reliability from system/transmission operators. For example, there is no online data available for the Northern Territory of Australia, so you can't build a parser for it. Some data providers have frequent data outages for different regions (ENTSOE), as there is no SLA or contractual obligation for providing data reliably.

If you're aware of a region that has live data available and is not yet live on electricitymap.org, please consider contributing a parser [1]! If you live in a region without live data, please consider politely requesting such data be made available through utility and system operator contacts, or explore requiring such data be made available by law (if public policy is your thing).

[1] https://github.com/tmrowco/electricitymap-contrib#adding-a-n...

I think the problem is that something is wrong with the ENTSO-E feed for solar and wind power in Switzerland. Data is given as "N/A" on their webpage, too, and it states that: "Due to changes to a different primary data provider, no data is published for Production Type Solar and Wind Onshore. The missing data will be published retrospectively by 28th Mar 2019."

Given that the 28th Mar 2019 has long passed, I am not very optimistic that the feed will be restored anytime soon... and unfortunately I don't know of any other (e.g. primary) public source, either.

You might consider inquiring with ENTSO-E regarding the status: transparency@entsoe.eu. Squeaky wheel gets the grease.

Hey Martin, this is super cool! Congrats on the collab!

Would you mind commenting on what your tech stack is like? Looking at your github repo it seems like you're combining a lot of data sources. Can you comment on your approach? Also, considering this collaboration, are you running on GCP?

Hello, Olivier here (CEO of Tomorrow). Indeed, we're running on GCP, using a mixture of Python and Node. In terms of the approach, I suggest checking out our blog (https://www.tmrow.com/blog) and this talk (https://www.youtube.com/watch?v=PAelZb2ZYwI), which might provide a bit of clarity.

Also, feel free to join our Slack at https://slack.tmrow.com to ask your questions!

That's a very interesting presentation (the youtube video), I'm glad someone is working in this space.

Somewhat related: the presentation has a lot of math, which illustrates some of the points made in the recent HN discussion of the impact and effects of math instruction in Russia (https://news.ycombinator.com/item?id=22941144). Some comments pointed out how French schools rely on math for entrance exams, and this video is a clear demonstration of a software startup built on a math background. In fact, the startup probably wouldn't exist, or even have been conceived, without the math and modeling behind the data they provide.

What's your opinion about Michael Moore's new documentary about the green economy? https://m.youtube.com/watch?v=Zk11vI-7czE

The main point of the documentary is that the EROEI of the "green economy" turns out to be worse than that of using fossil fuels directly. In short, the green economy didn't give common people an alternative; it simply gave business people a new tool to mask their intentions while idealistic politicians genuflect to the lie. The lies trickle down through the career pipeline, and people go along with unquestioned assumptions out of fear of losing their jobs.

I personally think the documentary doesn't go far enough on the cultural aspect. I'm speaking specifically of the social norms of conformity, sociopathy and narcissism in the decision-making classes. We have fucked future generations for short-term gains, and the only hope the tech community can come up with is to pretend that we will build rockets to go to Mars. When the kids of the billionaires pushing this horseshit get preventable cancers or tangled up in a class war, maybe then things will change (albeit I doubt it). Until then, it's all bean bags, idiocy, shiny tech toys and fluff. Enjoy it while it lasts. The magic bullet of nuclear fusion doesn't seem likely, and neither does the social organization to leverage what we have in a responsible manner. We will keep doing what we are doing until nature bitch-slaps us and we have to change. Hopefully I'm wrong. I want to be.

While the green angle on this story is definitely good to highlight, I wonder if they're seeing any cost savings too?

Solar and wind on the grid increase supply, which should drive down the price per kWh (of course, the equation isn't quite that simple, since demand near human population centers in most of the world is also highest during the day).

I don't think the electric grid follows a traditional supply-and-demand curve. Every kilowatt used must be generated at the same time. Generation comes in tiers: cheap baseload (coal or nuclear, each with their issues), expensive peaker plants (nat. gas), and cheap but unpredictably intermittent renewables. Prices are highly regulated and agreed to in long-term contracts that take into account peak usage and minimal generation capacity. If demand increases, it is really expensive to supply until the level is enough to build a new baseload plant, but even those are expensive now.

Big users such as Google with its datacenters will of course negotiate their own electricity contracts. I think renewables are the cheapest to buy right now, so by moving load around to maximize use of cheap renewable electricity, they will definitely save money.

The pessimist in me says that they're only doing this in case a carbon tax passes, they'll be able to keep their pricing competitive.

I think this has very little to do with a carbon tax and everything about external and internal marketing. They are committed to using renewable power.

They bought enough carbon credits to offset their emissions but it still left a bitter taste because data centers run 24 hours a day.

I've had an idea for the longest time that we should get rid of all the processes we've put in place to deliver everything "on-demand" and instead work with nature to get what we can.

What I mean by this is that instead of deciding "I want to drive 200 miles to the beach" and buying a tank of petrol, you would instead wait for favourable wind/solar conditions in order to "save up" the energy you need such that you can afford to drive to the beach. If you are unfortunate one year you might only end up with half of what you need, but you'll still be able to do something.

This goes for things like food too. Stop demanding the same food year round. Instead work with the seasons and eat what is available locally at that time of year.

This would be such a huge boost to happiness. You can't see light if it's light all the time. We just don't know how great our lives are because we simply expect it to all be available at all the time. Expectations are simply assimilated and become invisible very quickly. Not only that but it turns out that meeting these expectations comes at a huge price. Let's instead take what nature gives us, but no more.

Interesting! Although there aren't any metrics on how much load is actually balanced this way. The only plot doesn't have y-axis labels.

I frequently criticize Google harshly for everything from search becoming more and more useless to pushing Chrome way too hard.

Seems some people at Google still haven't gotten the memo that the "not evil" days are now a thing of the past. This looks amazing and more like something I would expect from old Google.

Not sure I’m very impressed by the plot they show here. The results during the day look OK, but they only translate two nightly peaks (low carbon) into one slightly larger one... couldn’t even more of the work be done at night? It's also strange that there is a dip at both ends of the plot (maybe they just plot one 24h period, ignoring the previous day’s load and the next day’s load). I think it would be more appropriate to show the surrounding days as well - a 24h snapshot within a multiday view.

A more interesting measure would be the actual reduction in CO2 emissions.

Does anyone know of a good estimate for how much energy and emissions a typical Google search requires?

Everyone wanting to really understand what is going on with the new green economy and these platitudes should watch Michael Moore's new documentary he just released free on YouTube, called Planet of the Humans. https://m.youtube.com/watch?v=Zk11vI-7czE

This documentary is without nuance and without pragmatism. And it criticizes without proposing a path forward -- it's a bunch of cheap pot-shots, and it demands perfection instead of proposing progress.

Yes, there are valid criticisms. Wind turbines are made of unrecyclable fiberglass. It takes energy to build them (truck rolls to the site, concrete for the foundations), and it's important to make sure the energy return on energy invested is net positive. We use fossil fuels to produce these renewables technologies. That's all true, but not insurmountable.

They say battery storage makes up only a tiny percent of the needed capacity to overcome renewable intermittency. Sure, but it also omits how solar has dropped two orders of magnitude in price over the last few decades as we've built more of it and gotten better at making them (the "learning curve").

It follows a group of Vermont hikers hiking to a wind turbine site and then being NIMBY about it, but none of them talk about where their energy SHOULD come from.

Look, it raises a lot of critical questions. But it also seems to expect a single magic pill that just doesn't exist. Two-thirds of the way through they talk about the misrepresentations in biomass and point out how many organizations seem to be both for it and against it. "Which side are they really on?" says the classic accusatory documentary voiceover with scary music. Well, it's complicated! Clearly you don't want to burn all the forests all at once. And yeah, if you burn pressure-treated wood, those chemicals go into the local community. At the same time, wood does grow back. The nuance that's missing in this documentary is questions like "how many acres of rotationally harvested woodlands are required to power a 1MW biomass plant sustainably in perpetuity? And can such projects exist in practice?"

Biomass isn't a panacea, and the HN startup mindset of "can I scale up a technology to dominate everything" doesn't apply because biomass has limits to its scalability. It's just one of many tools, and the problem with this documentary is it can't envision a future where many tools are used together. When a Sierra Club exec is questioned about biomass, they kept the part where she says their "position is nuanced", but then they cut to something else without explaining that nuance. That's lazy documentary filmmaking.

The complicated thing about energy is there is no silver bullet. This documentary finds the bad in each technology without considering how all the pieces could fit together. It presents the bad sides of each technology as if that should disqualify the tech instead of asking how can we improve each over time. There aren't easy answers to these questions, but this documentary just wallows in how bad everything is without asking the hard questions about how things can be made to work or what the alternative of doing nothing is.

> Yes, there are valid criticisms. Wind turbines are made of unrecyclable fiberglass. It takes energy to build them (truck rolls to the site, concrete for the foundations), and it's important to make sure the energy return on energy invested is net positive. We use fossil fuels to produce these renewables technologies. That's all true, but not insurmountable.

These turbine blades can be broken down into pellet insulation or used as feedstock for cement kilns. It's a supply chain and economic incentive issue, not an unsolved technology issue.

I can't speak to Moore's beliefs, but his documentaries (IMHO) are designed to inflame, not to have an intelligent discussion about complex problems that require complex solutions. They are "clickbait" disguised as objective information.

Thank you.

This is beyond depressing. All cope and hope peddling.

Our data centers now run slower when it's cloudy and there's no wind.

Imagine climate modeling super computers that are carbon neutral.

Just a marketing device. Carbon neutral since 2007? Let me laugh in CO2, "green energies" are nowhere near carbon neutral. See planet of the humans by Moore.

I like the animated illustration; very pleasant.

That headline reads more like a bug report than a press release.

"Here's a barge full of coal. Maybe you can fix it with that."

This is beautiful.

what a pity to know that they will be mining more of our personal data when we are chilling with the sunshine.

They're doing exactly the same amount of work, just time-shifted.

So Skynet will now prefer very sunny weather with strong consistent winds?

Hope they crash some day too so you can stop tracking people :). How long until this gets downvotes?

Good idea, but I can't help but see this as marketing spin over the alternative title of the same story

"Google: Data centers now perform LESS when the sun is not shining or the wind is not blowing"

I don't see your point. That alternative title is exactly as accurate, so it seems like a good idea to pick the version that sounds a bit better and is easier to understand.

Of course it's a better idea to pick the version that sounds better. That's what I said; I know both are accurate.

My point is that by tying performance to environmental factors, you get a boost when things are great but then can have troubles when things are not great. Anyone familiar with solar panels already knows this, but if the correlation is obscured, it could be surprising. The article didn't mention a specific performance gain, but if we say you get an X% performance gain when the sun is out, it also means you get a similar X% performance loss when the sun is not out. Users of the system will get used to the improvement, which becomes the new standard, and then a particularly dreary season comes in with weeks of cloud cover, and suddenly there is concern about the degradation of service.

(Like I said, it's still a good idea, it's efficient use of resources, but the PR is funny, that's all.)

Encouraging and incentivizing compute/electricity demand to be time flexible provides the opportunity for cost savings and emissions reductions.

The greater the flexibility, the greater the savings when demand can be smoothed out to better allocate resources and allow easier forecasting.

If compute jobs that would otherwise run on demand can be deferred a few hours and run during a window of cheap, clean supply, resources can be better utilized. Like charging an EV overnight, but better.
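The deferral idea can be sketched in a few lines: given an hourly carbon-intensity forecast and a job that needs N hours of compute before a deadline, just run it during the cleanest hours in the window. This is a toy illustration, not Google's actual scheduler, and the forecast numbers are made up:

```python
def schedule_job(forecast, hours_needed, deadline_hour):
    """forecast: hourly grid carbon intensity (gCO2/kWh).
    Returns the lowest-carbon hours in [0, deadline_hour)
    to run a deferrable job in."""
    window = list(enumerate(forecast[:deadline_hour]))
    window.sort(key=lambda hc: hc[1])  # cleanest hours first
    return sorted(h for h, _ in window[:hours_needed])

# Made-up forecast: carbon intensity dips mid-window as
# renewable output peaks.
forecast = [450, 430, 400, 380, 300, 220, 180, 200, 260, 350, 420, 440]

# A 3-hour job due by hour 10 lands in the cleanest stretch:
print(schedule_job(forecast, 3, 10))  # -> [5, 6, 7]
```

A real system would also have to respect job dependencies and capacity limits, but the core trade is just this: flexibility in *when* to run buys a lower-carbon mix.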

I hope this doesn't add too much cost to their cloud customers' bills.

They've worked so hard to sell their AI solutions to the fossil fuel industry, lately, so they can help them extract and burn more oil and gas[0].

[0] https://www.vox.com/recode/2020/1/3/21030688/google-amazon-a...

Why would AI solutions help the fossil fuel industry burn more oil and gas? The fossil fuel industry sells oil and gas to others to burn, it doesn't light the stuff up itself for fun.

>The fossil fuel industry sells oil and gas to others to burn, it doesn't light the stuff up itself for fun.

Who said they'd burn it for fun? Necessity suffices.

>In Russia, one of the world’s top producers, the industry is considering resorting to burning its oil to take it off the market, sources told Reuters

Source: https://www.reuters.com/article/us-global-oil-turmoil/when-o...

Almost all the oil we extract must be burned. There's far too little need for oil as a non-energy source, and there's a very good reason why the price of WTI is so low these days. The industry simply cannot find enough customers to burn it.

How can Google, while helping others extract as much oil as possible, still pretend it may not end up being burned?

It's like a drug dealer saying "I'm definitely not helping anyone take drugs. I'm only selling it; they can stash it somewhere if they please. Don't blame me".

You can make this argument more general (and IMO, even more ridiculous):

Fossil fuels are global commodities. Anyone who reduces their oil consumption is reducing demand for those global commodities, which makes them cheaper. Making them cheaper makes renewable energy less competitive, which discourages investment, which means as a society we keep using fossil fuels for longer, which means in the end we produce more carbon.

Ergo, anyone who tries to reduce their fossil fuel consumption by not flying, not driving, or turning off the A/C, is actually more responsible for destroying the planet.

You're making the argument that reducing your demand for something does not reduce the demand for that thing, because demanding less of something makes others demand it more.

Is all your consumption from renewable sources ?

While the story is very positive and encouraging --

Unfortunately, an unintended consequence of these kinds of efforts (unless you're very conscientious about maintaining the correct incentives, generally through pricing) is that the gains in energy efficiency are sometimes clawed back by an increase in overall energy consumption, because it's gotten effectively cheaper to run the same number of compute cycles.

Just like with energy efficient LED light bulbs, although the overall energy use goes down, often it doesn't go down as much as it could have ideally, because people start lighting places that didn't have light before, because it's gotten so much more affordable to do so!

Or like when you add highway lane capacity, traffic gets worse...

Or in this case, the Google video engineers come up with new useless filters and resolutions to occupy the newly freed-up compute capacity.

Just something to be aware of. The people who do this have to monitor and put in place controls so that the outcome is what they intended. Otherwise people are more clever than you think.

If it is still an improvement in both end usage and utility isn't that letting the perfect be the enemy of the good?

LEDs have to be one of the worst possible examples for claims of induced demand as a bad thing, given that their efficiency gains outstripped the proliferation of additional always-on devices and a cellphone per person.

While induced demand may exist, it too saturates and runs into diminishing returns.
