Nuclear power costs about as much to run 24/7 as it does to run intermittently, because the vast majority of its cost is in construction. So building enough nuclear power to satisfy demand when intermittent sources aren't producing makes those intermittent sources totally redundant.
Suppose you have renewable sources that on average generate 100GW. Well, some weeks they're going to average 90GW -- and that will be quite common -- in which case you'd need 10GW of something else. The long-term average (e.g. over 48 hours) is what matters because you'd always need to smooth out the short-term load with batteries.
Some weeks you'll need to make up 20GW, but that will be less often than a 10GW shortfall. Even less often, 30GW. Sometimes, but very rarely, the whole 100GW, or at least say 90GW.
This means that the first 10GW of peaker plants will be quite economical because they get used e.g. 15% of the time, but the last 10GW will only get used <1% of the time and not be competitive with nuclear running ~100% of the time.
So what you end up with is, say, 60GW of renewables, 40GW of nuclear and 50GW of peaker plants that run the ~5% of the time when you have low renewable generation. Instead of needing 100GW of renewables and 90GW of peaker plants, of which 40GW is almost never used.
> Suppose you have renewable sources that on average generate 100GW. Well, some weeks they're going to average 90GW
Renewables' fluctuations are way, way bigger than 10%. Solar, by definition, is going to be producing 0% of its output 12 hours a day on average. Wind has fluctuations of over 10x as well.
It's more like: you have renewable sources that produce 100 GW at their times of peak output, and for a couple of months of the year they're producing 10GW. If you built 90 GW of nuclear power, then you could just scrap 90% of your renewables and run your nuclear plants all year round. Again, nuclear power plants are just as cheap to run constantly as intermittently.
I reiterate that we're talking about the daily average and not the instantaneous generation.
You have solar panels that generate 100GW average, i.e. 200GW during sunlight and nothing at night. You have batteries to match this up with the daily load.
But then it's more cloudy than usual for a couple weeks, so your average generation from solar is 50GW, or 10GW, or what have you, depending on how cloudy it is. You need something to make up the difference that week or the batteries run down and the power goes out.
But because averaging 90GW over a day (when the typical average is 100GW) is a lot more common than averaging 10GW, the first 10GW of peaker plants is economical against nuclear and the last 10GW is not. And the total of nuclear + peaker plants has to add up to ~90GW because there will be times when you only average ~10GW from renewables.
> You have solar panels that generate 100GW average, i.e. 200GW during sunlight and nothing at night. You have batteries to match this up with the daily load.
And that amount of batteries is absolutely massive. To put this in perspective, the US uses about 500 GW of electricity on average, or about 12 TWh per day. Covering just the ~12 nighttime hours means roughly 6,000 GWh of storage for the United States alone. This is an incredible amount of storage, over 10x the annual global battery production.
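Spelling out that arithmetic, using the 500 GW average demand and ~500 GWh/yr of global battery production figures assumed in this thread:

```python
# Storage napkin math: US average demand and global battery production
# figures are the assumptions used in the discussion above.
us_avg_demand_gw = 500
daily_use_gwh = us_avg_demand_gw * 24          # 12,000 GWh = 12 TWh per day
night_storage_gwh = daily_use_gwh / 2          # cover ~12 dark hours
global_battery_production_gwh_per_year = 500

print(night_storage_gwh)                                            # 6000.0
print(night_storage_gwh / global_battery_production_gwh_per_year)   # 12.0
```

So even dedicating all global production to one country's diurnal storage takes on the order of a decade at current rates.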
This is why talking about "average production" of intermittent sources is simplistic. You need to dig into the degree of overproduction and storage required to even out that intermittency. But including those factors makes renewables way more expensive, which is why people don't like to talk about these factors.
> This is an incredible amount of storage, over 10x the annual global battery production.
It's a lot. Battery production would have to increase. But it is increasing, and it's not an impossible amount.
You also cut it to less than half by doing the thing mentioned above, because the daytime load is higher anyway. If 40% of the generation comes from nuclear, that's 80% of the nighttime load and now you only need 20% as many batteries.
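Sketching that claim with made-up numbers, assuming (illustratively) that nighttime load averages half the overall average on a 100GW-average grid:

```python
# Assumed: 100 GW average demand overall, 50 GW average at night.
avg_demand_gw = 100
night_demand_gw = 50        # assumption: nights are lighter than days
night_hours = 12

nuclear_gw = 0.40 * avg_demand_gw   # 40 GW running around the clock

batteries_no_nuclear = night_demand_gw * night_hours                    # 600 GWh
batteries_with_nuclear = (night_demand_gw - nuclear_gw) * night_hours   # 120 GWh
print(batteries_with_nuclear / batteries_no_nuclear)                    # 0.2
```

Under those assumptions, 40% nuclear covers 80% of the nighttime load, so the battery requirement drops to 20% of the all-renewables case.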
> It's a lot. Battery production would have to increase. But it is increasing, and it's not an impossible amount.
Is it? Global battery production is about 500 GWh per year. A single digit percentage of that is going towards grid storage, it's mostly used for EVs and electronics. Even if we completely halted EV production, and dedicated 100% of global battery production to grid storage it'd take over a decade to just satisfy the United States' grid storage requirements. And that's just for diurnal storage, not seasonal storage. And again, that's dedicating 100% of global battery production to just one country.
While battery production is increasing, so is electricity use. As transportation and other industries decarbonize, and as poor countries grow richer, electricity use is going to rise and so will the storage requirements.
> because the daytime load is higher anyway
Nope, in fact it's the opposite. Load typically peaks after sundown, at around 8pm. This is why solving the duck curve is so hard. Solar produces no energy during peak demand: https://en.m.wikipedia.org/wiki/Duck_curve
> A single digit percentage of that is going towards grid storage, it's mostly used for EVs and electronics.
That's fine, it makes sense to put it in EVs because EVs are de facto grid storage. You charge them during the part of the day with the most surplus generation and they even out the curves.
> Even if we completely halted EV production, and dedicated 100% of global battery production to grid storage it'd take over a decade to just satisfy the United States' grid storage requirements.
That's assuming production doesn't increase at all, and that you want a 100% renewable US grid instead of e.g. 60%.
> And that's just for diurnal storage, not seasonal storage.
Batteries aren't suitable for seasonal storage. Their contribution is to smooth out the load.
Suppose during a cloudy week you would have a 50GW shortfall from 6PM to 10PM and a 10GW shortfall from 12PM to 6PM, but a 20GW surplus from 8AM to 12PM. Well then what you really have is 7.5GW average shortfall and by smoothing out the load with batteries can get by with 7.5GW of peaker plants that week, instead of needing 50GW at 8PM which then sits idle from 10PM to noon.
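The arithmetic of that example, spelled out: the net energy shortfall over the day, divided by 24 hours, gives the constant peaker output that batteries can time-shift.

```python
# Hypothetical cloudy-week day from the example above.
shortfall_gwh = 50 * 4 + 10 * 6    # 6PM-10PM and 12PM-6PM shortfalls: 260 GWh
surplus_gwh = 20 * 4               # 8AM-12PM surplus: 80 GWh

net_gwh = shortfall_gwh - surplus_gwh   # 180 GWh net deficit for the day
print(net_gwh / 24)                     # 7.5 GW of flat peaker output
```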
> While battery production is increasing, so is electricity use.
The main driver of this is electrifying transportation. But because cars have built-in storage, you can charge your car during daylight and substantially all of the new generation capacity can be renewables without encountering short-term intermittency issues.
> Nope, in fact it's the opposite. Load typically peaks after sundown, at around 8pm.
You can take a look at your own graph. The load during peak solar generation (~3PM) is >25GW, compared to ~20GW after midnight, and that's during the autumn rather than the summer. Air conditioning load aligns extremely well with solar generation.
The peak load is around 8PM, but the peak load and the minimum load are both after dark, and the daytime average is higher than the nighttime average.
EVs can only be used as storage under very specific circumstances. It assumes people drive to work in the morning, have charging stations at work, and don't commute very far. In reality, lots of people don't have charging stations at work and charge their cars in the evening, when solar isn't producing. There are also loads of people doing errands during the day, and utility vehicles transporting cargo; these are going to be used consistently through the day and charged at night. Electric semi trucks aren't going to be helpful for grid storage, and they burn a lot more energy than a passenger sedan. And lastly, few grid operators are going to accept that their grid storage may drive away en masse during the holidays.
> Suppose during a cloudy week you would have a 50GW shortfall from 6PM to 10PM and a 10GW shortfall from 12PM to 6PM, but a 20GW surplus from 8AM to 12PM
> The load during peak solar generation (~3PM) is >25GW, compared to ~20GW after midnight, and that's during the autumn rather than the summer.
But again, peak demand isn't at midnight it's at 8pm. You can't just hand-wave away the challenge of meeting peak demand when intermittent sources aren't producing.
> Air conditioning load aligns extremely well with solar generation.
And it aligns even worse during the Winter: solar is producing the least amount of energy, cloud cover is at its greatest extent. Most homes are heated by gas, and when that gets decarbonized it's going to put even more load during the worst time of the year for renewables.
Definitely, rooftop solar is a great addition and a good way to reduce load due to air conditioning. It comports with what I've been saying this whole thread: intermittent sources are good when used opportunistically to supplement a grid primarily powered by non-intermittent sources. The downsides of intermittency are too great to be used as a primary source of energy, though.
Every battery you put in an EV becomes a battery you can use to balance the grid when the EV is scrapped and the battery still has 80% of its capacity left.
80% capacity does not mean 80% of life span remaining. Degradation is not linear [1] [2]: a battery that's gone through enough cycles to drop to 80% of its capacity is very close to the end of its life, just a few hundred more cycles before failure. That would drive up labor costs, as these recycled batteries would have to be replaced once a year or even more frequently.
Recycled EV batteries are popular in DIY communities because labor is free, and if they fail you still have the grid to fall back on. That's not the case for grid operators. New LFP batteries have better economics than used EV batteries for grid storage, especially since they're both cheaper per watt-hour and have over twice the life span of most lithium-nickel-cobalt batteries used in EVs.
So you're saying your point about every battery going into an EV is completely irrelevant because grid batteries will most likely use different chemistries than EV batteries? That's fine too.
> It assumes people drive to work in the morning, have charging stations at work
That's a pretty reasonable assumption once electric cars are the majority of cars.
> and don't commute very far.
There are popular electric cars with a range on the order of 300 miles. The average commute is 41 miles round trip.
> Also, there's loads of people doing errands during the day, or utility vehicles transporting cargo.
The average person does not drive 300 miles a day in total, and so does not need to charge more than once a day in total. Utility vehicles will have larger batteries and more range, and will strongly prefer to charge during daylight because charging from solar will cost less.
> And lastly, few grid operators are going to accept that their grid storage may drive away en masse during the holidays.
It's storing the energy that the vehicle itself will use. If the vehicle leaves the area then so does its demand for electricity.
> During a cloudy week, you're going to have a shortfall all the time. Solar drops to 10-25% effectiveness with cloud cover
If your grid has 40GW of nuclear and 60GW average of solar (which generates 120GW during daylight) then the 120GW drops to 30GW and you have 70GW of generation, which could be less than the demand before noon. You can also run the peaker plants even though you already had a surplus in those hours so you can put even more into the batteries to have enough later in the day.
> But again, peak demand isn't at midnight it's at 8pm. You can't just hand-wave away the challenge of meeting peak demand when intermittent sources aren't producing.
Suppose you have enough nuclear to meet the demand at midnight, which is enough to meet half of the demand at 8PM. So now you need to satisfy the other half from batteries for four hours, because before 6PM the sun is still up and after 10PM the demand falls off to what you're getting from nuclear. So you don't need enough batteries to satisfy all of the demand for the 12 hours when there is no sunlight, you only need enough to satisfy half of the demand for those four hours.
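With illustrative numbers (a 100 GW evening peak, and 50 GW of nuclear sized to the overnight minimum), the sizing argument works out like this:

```python
# Assumed: evening peak of 100 GW at 8PM; nuclear sized to the overnight
# minimum, which in this example is half the evening peak.
peak_gw, nuclear_gw = 100, 50

naive_batteries_gwh = peak_gw * 12                  # all demand, 12 dark hours
actual_batteries_gwh = (peak_gw - nuclear_gw) * 4   # half the demand, 6PM-10PM
print(actual_batteries_gwh, naive_batteries_gwh)    # 200 1200
```

The naive figure is an upper bound (demand isn't at peak all night), but even against that bound the battery requirement falls by a factor of six.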
> And it aligns even worse during the Winter: solar is producing the least amount of energy, cloud cover is at its greatest extent.
But there is more wind in the winter, and at night, when there is more demand for heating.
And one of the ways to decarbonize heating in cities is nuclear with cogeneration. Then you're not using electricity for heat, you're using the waste heat from the reactors, which chops off a large chunk of even the existing nighttime load.
Again, electric vehicles only work as storage if people can charge their cars at work, drive home, and then tap that storage over night.
Many vehicles aren't being used for commuting, they're used for work or running errands. They're going to be used during the day and charged at night. People aren't just going to become nocturnal in order to charge their cars. Someone who drives to work in the morning, goes shopping during the day, and picks kids up in the evening isn't going to be able to do all these chores at night.
What happens if someone's car is discharged over night right before a trip to Grandma's and now they don't have enough range? They're going to be pissed and stop letting the grid tap into their car.
> If your grid has 40GW of nuclear and 60GW average of solar (which generates 120GW during daylight) then the 120GW drops to 30GW and you have 70GW of generation
If you've got 50GW deficit over a week of cloud cover you need 4,200 GWh of storage to offset that. Consider that in many parts of the world, cloud cover can be present for a month or longer.
These are absolutely massive storage figures; even the one-week figure is nearly ten times the annual global battery output. And energy demand is much higher than 120 GW: US electricity demand is 500 GW, and global electricity demand is 3,300 GW.
> Suppose you have enough nuclear to meet the demand at midnight, which is enough to meet half of the demand at 8PM.
It's enough to satisfy 70% of the demand at 8PM. And since nuclear is just as cheap to run at 100% capacity all day long, it's better to scrap the bulk of your renewables because the nuclear plants make them redundant.
> But there is more wind in the winter, and at night, when there is more demand for heating.
This depends on geography. On the west coast, it's the opposite: wind produces the least energy during winter.
> Again, electric vehicles only work as storage if people can charge their cars at work, drive home, and then tap that storage over night.
They're storing the energy that they themselves use. You're not using it for your house, you're charging during sunlight and then driving home or to a restaurant after dark. It allows close to the entire demand for electric vehicles to be served from renewable generation.
> They're going to be used during the day and charged at night.
A supercharger adds 200 miles of range in 15 minutes. You go into the shop, plug into the charger in the parking lot and when you come back out in 15 minutes you don't have to charge again until tomorrow.
> What happens if someone's car is discharged over night right before a trip to Grandma's and now they don't have enough range?
They'll plug it in and charge it. Charging your car at night isn't going to be illegal, but it will cost more, so people will prefer not to, and then 95% of the charging will happen during the day.
Notice also that "overnight" is not synonymous with "darkness" -- if the sun comes up at 6AM and you leave for work at 8AM, you've had two hours of sunlight to charge your car while it was parked in your garage, and there are computers that can automate this.
> If you've got 50GW deficit over a week of cloud cover you need 4,200 GWh of storage to offset that.
Or 50GW of peaker plants, or 40GW of peaker plants and 20% of that amount of storage.
> And energy demand is much higher than 120 GW.
These are just example numbers for a hypothetical grid with 100GW of average demand.
> It's enough to satisfy 70% of the demand at 8PM.
In California, in October. Not in Texas in July.
> And since nuclear is just as cheap to run at 100% capacity all day long, it's better to scrap the bulk of your renewables because the nuclear plants make them redundant.
Nuclear is base load. It can make sense to build as much of it as there is base load demand, i.e. the minimum demand during any part of the day. But what do you want to do about the higher demand periods? Build extra nuclear plants that only operate 4 hours a day? Use battery storage with nuclear, even though that's way more expensive than battery storage with renewables?
> This depends on geography. On the west coast, it's the opposite: wind produces the least energy during winter.
The west coast has mild winters and thus less heating demand.
> A supercharger adds 200 miles of range in 15 minutes. You go into the shop, plug into the charger in the parking lot and when you come back out in 15 minutes you don't have to charge again until tomorrow.
Except very few parking lots have superchargers. And those that do usually have just 5 or 10 for a parking lot of hundreds of spaces. This whole idea that a grid operator is going to accept that traffic conditions could potentially cause blackouts is nowhere near feasible.
> Or 50GW of peaker plants, or 40GW of peaker plants and 20% of that amount of storage
Right so ultimately the solution is: keep using fossil fuels. Great...
And a 10GW deficit over a week is still 1,680 GWh of storage. That's massive: several times annual global battery production.
> Notice also that "overnight" is not synonymous with "darkness" -- if the sun comes up at 6AM and you leave for work at 8AM, you've had two hours of sunlight to charge your car while it was parked in your garage, and there are computers that can automate this.
Solar output is sinusoidal. Right after sunrise, it's still producing nearly zero energy.
> Nuclear is base load. It can make sense to build as much of it as there is base load demand, i.e. the minimum demand during any part of the day.
Base load is the significant majority of electricity demand. It's not a small part of demand; it's almost all of it.
> The west coast has mild winters and thus less heating demand.
Uh no? Maybe in California, but here in Seattle we regularly see nighttime temperatures below freezing.
> Except very few parking lots have superchargers. And those that do, usually have just 5 or 10 for a parking lot of hundreds of spaces.
The number of chargers increases with the number of cars.
> This whole idea that a grid operator is going to accept that traffic conditions could potentially cause blackouts is nowhere near feasible.
Pricing determines demand. Charging at the supercharger at the mall during sunlight is cheap so people go out of their way to do it, and the mall loves this because people come to the mall and buy things when they know it's a quick and cheap place to charge.
> Right so ultimately the solution is: keep using fossil fuels. Great...
Cloudy for an entire week happens, but it's rare. So 98% of the time you use nuclear and solar, then one week out of the year you use nuclear and fossil fuels instead, and you've got a 99% decarbonized grid.
Or you find something to burn instead of natural gas. You could quite plausibly have a one-week supply of biofuels or synthetic fuel, since it's for emergency use rather than everyday generation.
> And a 10GW deficit over a week is still 1,680 GWh of storage. This is massive several times annual global battery production.
Yeah, batteries don't really work for long-term storage. But the amount of batteries you need to avoid running the peaker plants on a normal day, combined with running the peaker plants 24 hours a day instead, gets you through a period of two or three days of low generation.
At that point you're doing demand suppression through pricing and similar measures, but only when the period of low generation lasts for many contiguous days. And getting 50% demand suppression probably isn't reasonable, but 10% is, especially when you only rarely have to do it.
> Solar output is sinusoidal. Right after sunrise is still producing nearly zero energy.
"Right after sunrise" is nearly zero, but two hours after sunrise is more than 50% of peak.
And human activity is sinusoidal too. You start getting some solar generation before anybody is awake and you can put it into charging vehicles. Generation increases as more people start to wake up. Moreover, some people leave for work at 8AM, some leave at 9AM, some will charge at work or the mall, so you're still spreading the load throughout the daylight hours rather than trying to fit everything into the period immediately after sunrise.
> Base load is the significant majority of the electricity demand. It's not a small part of electricity demand, it's almost all of it.
Peak load in summertime is nearly double the base load.
> Uh no? Maybe in California, but here in Seattle we regularly see nighttime temperatures below freezing.
The average winter temperature in California is 46 degrees F, and California is >75% of the population of the west coast. The average winter temperature in Washington state is 33 degrees. The average winter temperature in New York, despite a more southern latitude than Washington, is 23 degrees. In North Dakota, still slightly south of Washington, it's 12.
The west coast has mild winters and less heating demand.
It also has a ton of existing hydro and existing high voltage transmission lines because the hydro is in the northwest but powers much of California, so you could fill California to the brim with renewables and then use the existing hydro as your batteries.
> the first 10GW of peaker plants is economical against nuclear and the last 10GW is not.
I'm not sure how well this works given that peaker plants have to pay for their fuel whereas nuclear - to a first approximation - does not; everything is factored into the building cost. So it's more like the first 10GW of nuclear baseload can displace a lot of gas use, whereas the last 10GW shortfall might still be efficiently covered by peaker plants if they're cheaper to build for the same capacity and almost never have to burn fuel. It works out to mostly the same outcome, though.
> I'm not sure how well this works given that peaker plants have to pay for their fuel whereas nuclear - to a first approximation - does not; everything is factored into the building cost.
It's not about the cost of nuclear, it's about the utilization of peaker plants.
If you build 10GW of nuclear, you generate 10GW 100% of the time. If you build 10GW of peaker plants, do you use it 15% of the time or 0.5% of the time? The answer depends on whether it's the first 10GW of peaker plants (a 10GW shortfall happens frequently) or the last 10GW (only needed once the first 80GW of installed capacity is already in use; a 90GW shortfall happens very rarely).
Sure, but a peaker plant is providing capacity even when it sits unused. That's a valuable service, and it gets rewarded by usual electricity-market arrangements. So the relevant question is when can this service be provided more cheaply than the same amount of nuclear-supplied capacity.
The value of the service depends on how many other peaker plants there already are.
> That's a valuable service, and it gets rewarded by usual electricity-market arrangements.
This is the bureaucracy papering over the details like an abstraction layer. You want to look at the underlying economics when you're redesigning a system like this. This is napkin math but it should be directionally accurate.
Suppose we assign rates to the relationship between supply and demand in the power grid, with the price of electricity set by how much demand exceeds supply.
Then you're going to have a base rate, call it R0, essentially what electricity costs off-peak on a typical day. This is the rate which is going to prevail for around two thirds of the time. Then you have your normal day peak rate, R1, this is what it costs in the early evening when there is low renewable generation and high demand, basically the higher cost of intra-day energy storage. This might prevail for 1/6th of the hours in a year.
Two thirds plus 1/6th, you still have another 1/6th, that's when something bad is happening. It's a bit cloudy today, rate R2, happens 10% of the time. More cloudy, even higher rate R3, 3% of the time. Overcast for the whole day, rate R4, 2% of the time. Overcast for multiple days, rate R5, 1% of the time. Overcast for multiple days in the summer during peak air conditioning demand, rate R6, <1% of the time.
Now you can calculate how much value a nuclear plant has relative to the alternatives, because it generates all the time:
Solar has to recover its costs from just R0, which is the lowest rate, but it can do that by generating for a large number of hours a year relative to a peaker plant, and because it's cheap. Batteries recover their cost during R1, which isn't as many hours but the rate is higher.
The first peaker plant gets R2 through R6, because it gets turned on as soon as you hit R2, i.e. you need more than you get from current generation + energy storage. The second peaker plant only generates at R3 through R6 because it doesn't get turned on until then, since the first one was enough during R2.
The last peaker plant only generates during R6 and has to justify its entire cost by operating only 0.67% of the time. Which is why R6 is the highest rate, but even then it's a tough sell -- much harder than the other peaker plants which operate during R6 and R5, R4, etc.
Whereas the nuclear plant operates ~100% of the time, R0 through R6, replacing both renewable generation on a normal day and peaker generation during a supply crunch. But it costs more to build. So it should be worth it to replace the peaker plant that only operates during R6, but maybe not worth it to replace the peaker plant that operates from R3-R6, and definitely not worth it to replace the peaker plant that operates from R2-R6.
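As a toy model of the above: the prevalence fractions follow the text, but the dollar rates are invented for illustration.

```python
# Toy rate-tier model. Fractions of the year per tier follow the discussion;
# the $/MWh rates are made-up illustrative numbers.
HOURS = 8760
tiers = {   # rate name: (fraction of the year, assumed $/MWh)
    "R0": (2/3, 30),
    "R1": (1/6, 60),
    "R2": (0.10, 100),
    "R3": (0.03, 150),
    "R4": (0.02, 250),
    "R5": (0.01, 500),
    "R6": (1 - 2/3 - 1/6 - 0.10 - 0.03 - 0.02 - 0.01, 2000),  # ~0.67%
}

def annual_revenue_per_mw(active_tiers):
    """Revenue for 1 MW of capacity that runs only during the named tiers."""
    return sum(frac * HOURS * rate
               for name, (frac, rate) in tiers.items() if name in active_tiers)

print(annual_revenue_per_mw(tiers))                           # nuclear: all tiers
print(annual_revenue_per_mw({"R2", "R3", "R4", "R5", "R6"}))  # first peaker
print(annual_revenue_per_mw({"R6"}))                          # last peaker
```

Even with R6 priced at many times the base rate, the R6-only plant earns a fraction of what the R2-through-R6 plant does, which is the "tough sell" described above.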
That's one side of the equation. The other approach being explored is to vary demand: only use 90 GW when 90 GW is available, and run the aluminum smelters under capacity when there isn't the solar power to run them at full capacity.
It's an interesting balancing equation that someone could make an interesting game out of.
> Only use 90 GW when 90 GW is available, and run the aluminum smelters under capacity when there isn't the solar power to run them at full capacity.
That gets you a few percent for the things that work like that, but most of the load on the power grid can't do that. You can't turn off the heat in the winter, which is going to be an even larger proportion of the load in the future than it is now as we transition from fossil fuel heating to electric. People demand hot water. Refrigeration is a significant load, but if you turn it off your food spoils.
Electrifying transportation is also a mixed blessing here. It helps smooth out the day/night load because you can charge them whenever generation exceeds demand, but after a couple days you can't put it off anymore and then you've just got more unsuppressible demand.
I would expect heating to be one of the electricity uses that you _can_ adapt the easiest: first by using heat pumps instead of resistive heating, and then with some thermal inertia to buffer (e.g. water tanks, or something).
We're largely not using resistive heating now; we're using oil and gas. Heat pumps use less electricity than resistive heating, but still more than burning hydrocarbons on site, which uses essentially none.
Buffering systems can be effective for intra-day load shifting: you want to use electricity at noon and have heat at night. It doesn't work when you want to use electricity on Monday to have heat on Friday (the tank would have to be too big and the losses are too high), and it definitely doesn't work when you need heat on Monday but won't have the generation capacity until Friday.