
Your comment above was not entirely wrong. The majority of nuclear energy's costs are the upfront capital costs of plant construction. If a plant is not operated for its intended service life, $/MWh will skyrocket. Opposition to nuclear power (which existed prior to Three Mile Island, and increased thereafter) caused plants to be closed prematurely. This drove up the effective cost of nuclear energy, fueling the push for more closures on the basis of higher costs, and so on.

Curiously, these sorts of high nuclear costs are not experienced by every country. France's electricity (70-80% nuclear) costs a fraction of Germany's, while simultaneously emitting less carbon. Countries like South Korea and Japan (until it cancelled nuclear plants following the tsunami) had cost-effective nuclear industries. Cost overruns happened, but not as large as in the US. The main explanation I have encountered is that US nuclear plants have tended to be one-off projects, whereas other countries implemented serialized production of a handful of designs. Building 10 instances of the same plant is easier than building 10 different plants.




Germany decided, as an act of global charity, to buy renewables when they were much more expensive (they were the top global market for PV ten years ago, when PV was far more expensive than it is now), to drive them down their learning curves. This was spectacularly successful, but they are still paying for that. It says nothing about what someone else would have to pay NOW to install the same renewable capacity.

At the same time, France has tried to get its nuclear industry going again with new reactors, to disastrous result.

So if you want to build energy infrastructure in 1980, sure, go ahead and choose nuclear over renewables. If you want to do it now, that choice would be crazy.


It was not spectacularly successful. Germany continues to pollute more than France, with roughly 65% of its electricity generation coming from fossil fuels [1]. In France, nuclear power generates 72% of electricity and fossil fuels account for 9% [2]. And yet, Germans pay twice the rate of France for electricity. It is the worst of both worlds: higher costs and higher carbon emissions.

> So if you want to build energy infrastructure in 1980, sure, go ahead and choose nuclear over renewables. If you want to do it now, that choice would be crazy.

Did technology magically get worse after 1980? Did the French somehow forget how to build power plants? Nuclear energy projects were cheap during the 1980s because France built a large number of plants from a handful of designs. Modern plants are usually one-off projects, with only a single-digit number of plants built of any given design. Building a dozen identical reactors, turbines, etc. is much cheaper per unit than building one. Political opposition has meant that only small amounts of funding have been dedicated to nuclear power, so serialized production has not been attempted recently.

1. https://en.wikipedia.org/wiki/Electricity_sector_in_Germany

2. https://en.wikipedia.org/wiki/Electricity_sector_in_France


It was spectacularly successful in driving down the cost of PV.

> Did technology magically get worse from 1980?

Nothing magic about what happened. Did you expect the relative rankings of technologies to remain fixed and unchanging across the decades?

Nuclear is now a clear loser technology, far more expensive than renewables. Why would anyone want to invest now in the loser technology? What weird psychological phenomenon is driving this fixation?


The only measure by which intermittent sources beat nuclear is nameplate capacity. Net generation is a fraction of that for intermittent sources. The capacity factor (the percentage of its nameplate capacity a plant actually generates in real-world use) of intermittent sources is low: roughly 1/4 for solar and 1/3 for wind. For nuclear it's over 90% [1].
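To make the capacity-factor point concrete, here's a quick back-of-envelope calculation. The capacity factors are the rough figures above, not measured data, and the 1,000 MW plant size is an arbitrary illustration:

```python
# Effective annual energy delivered = nameplate power x capacity factor x hours.
HOURS_PER_YEAR = 8760

def annual_output_gwh(nameplate_mw, capacity_factor):
    """Energy actually delivered in a year, in GWh."""
    return nameplate_mw * capacity_factor * HOURS_PER_YEAR / 1000

# A 1,000 MW plant under each assumed capacity factor:
for source, cf in [("solar", 0.25), ("wind", 0.33), ("nuclear", 0.92)]:
    print(f"{source:8s} {annual_output_gwh(1000, cf):8.0f} GWh/yr")
```

Under these assumptions the nuclear plant delivers roughly three to four times the energy of a solar or wind farm with the same nameplate rating.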

People keep saying that intermittent sources are winners and that nuclear is a losing technology. Real-world experience says otherwise. The only way intermittent sources are feasible is with cheap, high-capacity energy storage, which has never been built at grid scale. In fact, Germany has had to build more fossil fuel plants following the closure of nuclear facilities. If we want to replace all electricity generation with carbon-free sources, nuclear is the only known way to do that (besides geography-dependent sources like hydroelectricity and geothermal power). If there's some psychological fixation at hand here, it's the aversion to nuclear power and the draw to intermittent sources despite the latter's unsolved fundamental limitations.

1. https://www.energy.gov/ne/articles/what-generation-capacity


Not sure what you mean by "nameplate capacity" there, but renewables beat nuclear in at least three ways:

The first is levelized cost of energy (total lifetime cost divided by total energy generated). For solar and wind this is currently 3-4x better than new nuclear construction. This is an enormous difference.
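For anyone unfamiliar with the metric: LCOE is discounted lifetime cost divided by discounted lifetime energy. A toy sketch, where every input is an illustrative placeholder rather than real plant data:

```python
# Toy levelized-cost-of-energy calculation.
def lcoe(capex, annual_opex, annual_mwh, lifetime_years, discount_rate):
    """Levelized cost of energy in $/MWh: discounted costs / discounted energy."""
    costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                        for t in range(1, lifetime_years + 1))
    energy = sum(annual_mwh / (1 + discount_rate) ** t
                 for t in range(1, lifetime_years + 1))
    return costs / energy

# Capital-heavy plant run over a long life vs. the same plant retired early:
print(lcoe(6e9, 1e8, 8e6, 60, 0.05))  # full service life
print(lcoe(6e9, 1e8, 8e6, 20, 0.05))  # closed prematurely: $/MWh rises sharply
```

The second call illustrates the point made upthread: because the cost is dominated by upfront capital, shortening the operating life drives $/MWh up dramatically.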

The second is time to market. Renewables can be brought online much faster than a nuclear plant. This (along with staged introduction) allows renewables to track actual demand rather than needing to be able to predict future demand. Failure to do that was the cause of much utility pain in the first nuclear era, including the bankruptcy of the WPPSS ("Whoops"). Demand for electricity suddenly stopped growing. The same dynamic occurred more recently as the "nuclear renaissance" collided with suddenly cheap natural gas.

The final metric by which renewables are destroying nuclear is rate of improvement. Renewables have shown strong and consistent experience effects. Solar improves in cost by ~20% for each doubling of cumulative installed capacity. Nuclear, on the other hand, has not consistently shown any experience effects. This is likely related to the fundamental complexity of large, integrated nuclear systems, vs. the improvement trends seen in design and manufacture of small, highly replicated systems (a phenomenon well known in Silicon Valley).
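The ~20%-per-doubling experience effect mentioned above can be sketched with Wright's law. The starting cost and capacities here are made up purely for illustration:

```python
import math

# Wright's law: cost falls by a fixed fraction (the learning rate)
# for each doubling of cumulative installed capacity.
def learned_cost(initial_cost, initial_capacity, capacity, learning_rate=0.20):
    """Unit cost after cumulative capacity grows, at a constant learning rate."""
    doublings = math.log2(capacity / initial_capacity)
    return initial_cost * (1 - learning_rate) ** doublings

# Ten doublings (1024x growth) at 20% per doubling cuts unit cost
# to about 11% of the starting value:
print(learned_cost(1.0, 1.0, 1024.0))
```

This compounding is why early subsidized deployment (like Germany's PV buildout) can permanently lower the price everyone else pays later.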


Nameplate capacity is a plant's maximum rated output, without accounting for intermittency in power generation. In reality, plants do not generate their full capacity for the entirety of their operation.

> The second is time to market. Renewables can be brought online much faster than a nuclear plant. This (along with staged introduction) allows renewables to track actual demand rather than needing to be able to predict future demand. Failure to do that was the cause of much utility pain in the first nuclear era, including the bankruptcy of the WPPSS ("Whoops"). Demand for electricity suddenly stopped growing. The same dynamic occurred more recently as the "nuclear renaissance" collided with suddenly cheap natural gas.

You're hand-waving away the fact that intermittent sources are exactly that: intermittent. What happens when the demand for energy occurs while the renewable supply is unavailable? Germany and California both encountered this problem, and both used the same solution: fossil fuels. That is why Germany's carbon emissions have remained flat for the last decade [1] despite renewable power generation doubling in the same period [2]. It doesn't matter how much energy wind and solar produce if you need to revert to hydrocarbons in the evening.

The levels of energy storage necessary to run an electrical grid entirely on renewable energy remain the stuff of science fiction. Batteries are possible, but diverting battery production to grid storage would cause the price of electric vehicles to increase considerably and ensure that hydrocarbons remain the primary fuel for cars. California uses pumped-hydro storage, but round-trip losses are substantial.

By comparison, nuclear plants provide continuous output throughout the day. You paint this as a negative, when it's really not. If excess power is generated, then use that excess power to sequester carbon. Too much energy is an easy problem to solve, too little energy is not.

> The final metric by which renewables are destroying nuclear is rate of improvement. Renewables have shown strong and consistent experience effects. Solar improves in cost by ~20% for each doubling of cumulative installed capacity. Nuclear, on the other hand, has not consistently shown any experience effects. This is likely related to the fundamental complexity of large, integrated nuclear systems, vs. the improvement trends seen in design and manufacture of small, highly replicated systems (a phenomenon well known in Silicon Valley).

This is exactly what I explained in my comment above. Most nuclear power in the United States has consisted of one-off projects, due to the public's unwillingness to see widespread adoption. By comparison, countries that used serialized plant designs (most famously, France) saw much cheaper nuclear construction. Yes, replicated designs are much cheaper to build than unique designs. That is why nuclear power was so cost-effective in France while so much more expensive in the United States.

The reality is that there are only two proven methods of powering nations with carbon-free energy: geographically dependent sources like hydroelectric and geothermal power, and nuclear power.

1. https://www.economist.com/europe/2017/11/09/germany-is-missi...

2. https://www.economist.com/sites/default/files/imagecache/128...


No, I'm not handwaving. I'm demonstrating that your claim that renewables' only advantage is "nameplate capacity" (which is nonsensical on its face; the nameplate capacity of either source can be as high as you want; did you mean cost per unit of nameplate capacity?) was wrong, by giving three other metrics by which renewables are superior.

Intermittency has a cost, but the enormous levelized cost advantage of renewables is already dooming nuclear. There's a reason no merchant nuclear plants are being built -- it would be economically ludicrous to do so. I'll believe what the selfish investors who want to make money are doing before I believe what you think they should be doing.


If you don't know what a term means, try searching for the definition: https://en.wikipedia.org/wiki/Nameplate_capacity

First of all, your claim that countries aren't building new nuclear plants is factually incorrect. China, Korea, and other countries are continuing to build nuclear plants. What you really mean to say is that few Western countries are building nuclear plants, and that's primarily due to an unwillingness to build nuclear plants at any significant scale. Nuclear loses not because it is uncompetitive with solar and wind, but because it is uncompetitive with fossil fuels and because of irrational nuclear phobia that leads to intense opposition to nuclear power construction.

The levelized cost of intermittent sources may be low for the first 20-30% of electricity generation. But every country hits a wall around that figure, because there isn't any cost-effective storage solution. If levelized costs were adjusted to include the storage needed to deliver power when it is actually demanded, intermittent sources would look terrible outside their peak production hours. This is why Germany, California, and every other jurisdiction that tries to rely on solar and wind ends up building a matching amount of natural gas capacity to fulfill peak demand. Politicians sell the dream of intermittent energy production, then turn around and build gas plants when they realize it's a fantasy.


Yes, that's what I knew nameplate capacity was. And by that definition, your claim was absurd. Solar doesn't have a higher nameplate capacity; it has a higher nameplate capacity per dollar. Your units weren't even right.

The intermittency argument you are giving is quite flawed. Solar and wind can go far past 20-30% and still be cheaper than new nuclear. If they are not going higher now it's because of sunk costs in existing generating capacity, the slow growth of demand, and the slow turnover of existing generating capacity.


> The intermittency argument you are giving is quite flawed. Solar and wind can go far past 20-30% and still be cheaper than new nuclear.

I'll believe it when I see it. History so far has demonstrated otherwise. In the meantime, the French are laughing at all of us as they enjoy their cheap, carbon-free energy, with enough surplus to export 3 billion euros of it.


History has demonstrated no such thing.

The French are also not laughing, since their attempts to build new nuclear plants have gone disastrously wrong. Their nuclear construction industry is in a shambles, losing billions, and had to be bailed out by the government.


And yet, they're still paying well below average energy costs and emitting a fraction of the carbon of other countries.


We don't actually know what their old nuclear plants cost to build. That information has been conveniently lost, and in any case was mixed up with their military nuclear program.

But that's all water under the bridge. Their attempts NOW to build nuclear capacity have demonstrated that they cannot build nukes competitively. The facts on the ground trump what you think should be true.


Calling the German policy a success when they burned coal for 30 years while France produced carbon-free power for all this time is really rich.

And Germany clearly would have the chops to pull off a nuclear buildup if that were its focus.

Had Germany done so, it might now be close to done rather than still decades away from carbon-free energy.


It was a success at driving down the cost of PV. Do please try to read what I actually wrote.



