It looks like the biggest problem with nuclear is not safety or the like; it's that, as a civilization, we can't seem to build reactors without cost overruns and massive delays. And this is a loss, because we need non-variable energy sources to augment wind/solar/tidal.
Perhaps the other way to solve the energy transition is to lean heavily into mass storage of variable energy sources. If you have enough energy storage, you don't need so much of the base energy suppliers like gas/coal/nuclear.
Just a point of clarity: we need dispatchable resources, i.e. output that varies at our command, as opposed to fixed output or output that varies with wind/solar conditions outside our control.
The economically viable mass-storage options (i.e. pumped-hydro storage) are basically all deployed. Batteries are getting there, but in most instances they still need some shenanigans in the ancillary-services market, where for complex reasons they get paid for not running, to make the economics work. As those shenanigans price natural gas out of those markets, it will be interesting to see how gas generators change their offer strategies to recoup the lost economic viability, and whether that makes batteries less profitable.
Batteries are already viable paired with solar. You see solar power purchase agreements in the 1.5 to 2 c/kWh range, and you can use that energy directly around half the time. Recharging and discharging batteries isn't 100% efficient, but it's close.
Current numbers are lower, but in 2022 you could get ~5,000 discharge cycles from LFP at ~$480/kWh of capacity, or ~9.6 c/kWh per cycle. Inverters etc. last longer than the batteries themselves, so you can amortize those costs across multiple generations of batteries; it's therefore not quite upfront cost divided by number of discharge cycles, but it's not far off. https://www.nrel.gov/docs/fy23osti/85332.pdf
Still, roughly 2 c/kWh * 105% + 9.6 c/kWh * 50% ≈ 6.9 c/kWh averaged over a day.
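That blended figure can be sketched directly from the numbers above (the 105% generation overhead for round-trip losses and the 50/50 direct-vs-stored split are the assumptions stated in the thread):

```python
# Back-of-envelope blended cost of solar + battery storage.
solar_ppa = 2.0          # c/kWh, solar power purchase agreement price
storage_cost = 9.6       # c/kWh per cycle ($480/kWh capex / 5,000 cycles)
stored_fraction = 0.5    # roughly half the energy is time-shifted via batteries
round_trip_loss = 0.05   # batteries are close to, but not, 100% efficient

# All delivered energy requires ~105% generation (losses on the stored half),
# and the stored half also pays the per-cycle storage amortization.
blended = solar_ppa * (1 + round_trip_loss) + storage_cost * stored_fraction
print(f"{blended:.1f} c/kWh")  # prints 6.9 c/kWh
```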
Currently batteries are discharged at peak demand, but the underlying economics scale just fine even if you double the number of solar panels for redundancy and aren't being paid a premium. Further, at scale, demand that once shifted to cheaper nighttime rates will instead shift to cheaper daytime rates.
We have to keep in mind that 6.9 c/kWh isn't yet the end-user rate, since grid fees come on top, but still.
There is also a ton of potential in demand shaping by offering hourly prices to customers.
The place where batteries fail is long-term, seasonal storage. The price per charge is still low, but an asset that takes 5000 years to amortise (one charge cycle per year) is not a competitive investment.
So we will need other approaches here, where the substance that stores the energy is cheap and the cost is shifted onto the energy transforming device (this is e-fuels/ ammonia/ hydrogen)
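The amortisation gap described above can be sketched as follows. The $480/kWh and 5,000-cycle figures come from earlier in the thread; the 20-year calendar life is my own assumption, since a pack cycled once a year dies of calendar aging long before reaching its cycle limit:

```python
# Why per-cycle economics break down for seasonal (once-a-year) cycling.
capex = 480.0        # $/kWh of LFP capacity (2022 figure from the thread)
cycle_life = 5000    # usable discharge cycles

daily = capex / cycle_life              # $/kWh per cycle under daily use
# At one cycle per year, calendar aging dominates; assume a generous
# 20-year calendar life, i.e. only ~20 cycles ever delivered.
calendar_life_years = 20
seasonal = capex / calendar_life_years  # $/kWh per cycle under annual use

print(f"daily cycling:    {daily * 100:.1f} c/kWh per cycle")
print(f"seasonal cycling: {seasonal * 100:.0f} c/kWh per cycle")
```

The per-cycle cost jumps from under 10 c/kWh to thousands of c/kWh, which is why a cheap storage medium with the cost shifted onto the conversion device makes sense for the seasonal case.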
There will be no need for "long-term, seasonal storage". We only need enough storage to last until a shipment of synthetic anhydrous ammonia, ordered from a nearby wind farm or from a solar farm in the tropics, can be delivered, to be burned in an existing combined-cycle turbine. That storage is some mix of batteries and tanked ammonia.
Well-provisioned solar farms can synthesize and tank their own ammonia during periods of excess production, and sell excess (over what local tankage holds) on the open market.
I’m not convinced we want significant seasonal storage as a separate system because depth of discharge impacts battery lifespan.
So rather than having a single battery doing nothing for 364 days a year and getting used on one day, you have a battery bank that gets discharged slightly deeper one day a year and recovers that deficit over some period.
For redundancy you want excess generation capacity in case something happens, which would most of the time allow for a full recharge soon afterwards.
PS: individual wind locations also tend to produce more power at specific times of the year, which can offset seasonal issues.
It's not about the battery being used for seasonal storage, but about the difference in rates of battery degradation.
Suppose a normal discharge cycle costs 10 c/kWh and a very deep discharge costs 100 c/kWh to access that last 10%. That means there are energy reserves too expensive for normal operations, but it also means that simply operating batteries efficiently automatically creates reserve capacity, which is the entire point of dedicated seasonal storage.
Put another way: if you design a 1 TWh battery fleet for daily use, it's going to have a 0.1 TWh reserve capacity just sitting there.
This isn't a lot of energy across a full season, but it addresses seasonal storage in terms of short-term abnormal peaks like heatwaves.
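The reserve-capacity argument can be sketched with the illustrative figures above (the linear ramp from 10 to 100 c/kWh in the last 10% of capacity is an assumed shape, not a measured curve):

```python
# Depth-of-discharge (DoD) marginal cost sketch: routine cycling is cheap,
# the last 10% of capacity is expensive, so daily operation leaves it untouched.
def marginal_cost_c_per_kwh(dod: float) -> float:
    if dod <= 0.9:
        return 10.0                       # cheap, routine cycling range
    return 10.0 + 900.0 * (dod - 0.9)     # ramps to ~100 c/kWh at 100% DoD

routine_dod = 0.9          # operators stop where cycling stays cheap
fleet_twh = 1.0            # fleet sized for daily use, per the comment
reserve_twh = fleet_twh * (1 - routine_dod)
print(f"implicit reserve: {reserve_twh:.1f} TWh")  # the 0.1 TWh "just sitting there"
```

Any fleet operated this way carries the expensive tail as de facto emergency storage, without anyone building a dedicated seasonal asset.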
PS: Proponents of seasonal storage argue for covering seasonal deficits in production rather than short-term gaps. However, the daily variability of renewable production already pushes grids toward significant excess generation capacity.
We can't predict seasonal demand that accurately, so trying to use a finite reserve for anything but outlier events is risky. Plan for a large deficit and the infrastructure you build to fill that reserve sits around in the off season, reducing the size of your deficit. Plan for a small one and your estimate may be wildly inaccurate.
What's left for "seasonal storage" is covering gaps in production from extreme outlier events using surplus production. That is useful, but also largely handled by batteries, as I described above.
Granted if someone comes up with cheap enough seasonal storage it might have a place, but that’s a possibility not a guarantee.
I assume that's for already built solar PV, though it still seems extremely low to me so I'm curious where you are seeing this.
For reference on new build, at the UK government's most recently completed CfD auction (which happened in September), the solar PV bids accepted were at £47/MWh, CPI indexed and specified in 2012 prices: https://www.gov.uk/government/publications/contracts-for-dif...
Yes, though the fact that it's 2012 prices and indexed to CPI does a lot - that's roughly £65/MWh in current prices. And despite all that it's still cheaper than offshore wind.
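The indexation arithmetic is simple to check; the ~1.38 cumulative CPI factor below is my assumption, chosen to be consistent with the ~£65/MWh figure quoted above:

```python
# CfD strike prices are specified in 2012 prices and CPI-indexed.
strike_2012 = 47.0   # £/MWh, accepted solar PV bid in 2012 prices
cpi_factor = 1.38    # assumed cumulative CPI inflation since 2012
print(f"~£{strike_2012 * cpi_factor:.0f}/MWh in current prices")  # ~£65/MWh
```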
You can also look at Spain - the CfDs aren't really comparable since the way CfDs are used there is different but LCOE is probably in the 40s-50s EUR/MWh for new build solar PV. But there are now meaningful curtailment issues in some places due to grid capacity there so developers will probably be working to higher numbers.
Yea, UK solar is still offsetting natural gas so it’s viable.
One issue for significantly higher UK solar production is that the UK uses more electricity in winter: January 2023 was 26 TWh, where July 2022 was only 20 TWh, the opposite of solar's peaks. https://www.nationalgrideso.com/electricity-explained/electr... (older reports at the bottom)
Yes, and this is when the vast majority of homes are still heated using gas central heating. If we switch away from that to heat pumps+resistive heating in anything like meaningful amounts, this difference will get much more pronounced.
A bit more clarity: right now, and for the rest of this decade, just rolling out a lot more renewables will be the cheapest way to replace carbon-emitting sources, which are right now burning fuel and spewing carbon into the atmosphere to generate electricity around the globe.
So there is a long-term need, in order to reach a goal of 100% carbon-free electricity (generally set for about 2035), for more on-demand resources (including responsive demand), but it is not the most pressing concern right now. And even when it is, it'll account for a small fraction of the total, like 10% or so.
(Luckily, batteries are already getting rolled out because they have positive economic value in various niches, like cars, frequency response, and avoiding congestion and network upgrades, so the ramp-up is looking good.)
What’s the plan for African nations pursuing industrialization agendas? My understanding is that wind and solar are too low-density for their tribal permitting environments. I don’t see the win in meeting a 2035 target that makes wildly optimistic assumptions about Africa’s trajectory over the next century if it means underinvesting in sources that will move the planet off coal.
We've used wind for 5000 years when it was available. It just takes a bit more time to wear out the machines if they sit there waiting most of the time.
Someone in Iran told me these are about 3500 years old
That's just America, and it's not just nuclear: civil-engineering and construction projects of all kinds have ballooned in complexity and cost due to myriad political and regulatory causes. France can build reactors cheaply and easily thanks to standardized reactor designs; I'm not sure the US has ever replicated a reactor build (i.e., every reactor is built in a different and unique way).
They can; this one project is just the exception. It was a failure from the start (a collaboration between France and Germany, which Germany later ditched in favor of its solar panel projects).
Would it be wrong to assume that we're dealing with an Apollo-style issue here (we "no longer have the technology to do it efficiently")? I'm certainly not an expert on nuclear power, but it is a little odd how the average nuclear reactor in the US is 42 years old [1].
It's just the U.S.; China, France, and Japan have all successfully built nuclear at scale. I won't be surprised if developing nations in Africa start lapping the U.S. on nuclear power in the next few decades. They certainly have enough uranium to do so.
Why would they take enormous risks on execution -- huge cost overruns and delays are par for the course with nuclear -- when solar + battery is so much cheaper and quicker?
> And this is a loss, because we need non-variable energy sources to augment wind/solar/tidal.
We need dispatchable energy to complement renewables. Nuclear is on the complete opposite side of the dispatchability equation with high CAPEX and low OPEX.
As the cancellation exemplifies, nuclear is wholly uncompetitive even when running at 100% utilization. Being dispatchable means vastly lowering that utilization rate.
Not exactly. I work as a project manager in the transportation field, and I can tell you that inflation has doubled (or more) the cost of several of my projects in the past few years. Most of the increases can be attributed to rising unit costs, not scope changes.
Combined with limited funding, the projects continue to get delayed.
These are to be mass-produced and shipped to the site. So really, the first few will have cost overruns and delays, but then they should be able to reuse all the tooling, processes, transportation equipment, etc., to get the rest going.
Only the reactor vessel, which is probably one of the most expensive parts of the entire project (21%), along with the turbine (18%). Construction, engineering services, and structures are probably where the bulk of the cost overruns happen.
There's been some work done on this in the US context (e.g. [0]) but not really enough. It seems like general large-project issues (the need to adapt to site-specific conditions as they are discovered, the difficulty of keeping work properly scheduled so that construction workers don't sit idle for long stretches, etc.) are a big part, and these are made worse by the knock-on impact when they interact with safety regulations and processes.
There's some hope that making designs smaller and more modular might reduce the chance of big overruns, at a slightly higher expected cost, but that's speculation until we actually do it.
Ultimately, if you're forced to send cluster munitions because your ammo factory is a historically listed building that can't expand to meet demand, it won't matter how much you cut everything else. You can't build another ammo factory, so you simply don't get to do things.
As a civilization we have been capable of this multiple times in multiple places; those technologies are not lost. In fact, there are now new reactor projects, simplified from older designs, that could potentially be built fast.
People generalize a lot! There are a large variety of nuclear reactors. Even construction projects of the same design vary somewhat.
I wonder why the Navy or the Corps of Engineers can't make civilian nuclear systems. It arguably would have made us safer as a country to have energy independence by investing dollars in reactors for domestic use instead of for warheads and nuclear submarines.
Because their customer, the US navy, is the world's least price sensitive customer. That does not translate into competitive products on cutthroat markets.