Commentaries on Criticisms of Magnetic Fusion [pdf] (pppl.gov)



This is all you really need to read from Stacey:

"Based on our present understanding, D-T tokamak fusion reactors project a cost-of-electricity that is about 50% larger than the projected cost-of-electricity from advanced light-water reactors in the middle of the next century."

We all know what happened to the projected cost of fission reactors -- the projections turned out to be hopelessly optimistic, because of complexity and loss of experience. Fusion would face these problems in even worse form (indeed, ITER's cost ballooned 4x or more past the initial projections.)

The experience with fission has enabled us to calibrate the optimism bias in these projections, with damning results.

Simply being competitive with fission is no longer good enough for fusion to succeed. It has to be significantly better than fission.


I think the costs of fission were well understood in 1999. The critics he was responding to were all taking the position that fusion would not be competitive with fission.

Fusion is better in terms of proliferation issues, long-term radioactive waste, and fuel cost. Being in the same ball park in terms of the hard cost of electricity is probably a good enough target for now.


It would be good enough, if fission were otherwise the top candidate for powering the world.

But fission is now a loser technology, hopelessly uncompetitive vs. the hard-charging renewables. It's not even close any more.


It's never clear what you are really arguing for. Why argue A>B and B>C if all you are trying to say is that A>C?

Solar with long-term chemical energy storage should play a huge role, but I think it is too soon to say whether one will push out the other or whether fission will stage a resurgence. A lot depends on incentives to phase out fossil fuels. If natural gas-fired electricity remains the cheapest option, the development of all energy alternatives will be affected.

If you are arguing that fusion will never compete economically with natural gas, you may be right in the short-term but in that case there may not be a long-term.


Because that's how you show A > C. Directly comparing fusion and PV would be difficult, but one can compare fusion and fission (because they share many of the same elements) and one can compare fission and PV (because they are directly competing in the actual market).

Fission provides a reality check on believing projections unmoored from empirical feedback. I would like to see the same methodology for fusion cost projections applied to fission, to see if the projections are anywhere close to what the plants actually cost to build in the real world.


Honestly, I think you are comparing apples to oranges. Average cost overrun for US nuclear fission plants was 207 percent, primarily due to mid-construction revisions and additional regulation. [https://www.cbo.gov/sites/default/files/cbofiles/ftpdocs/91x... page 17]

But what you seem to be talking about are cost projections for proposed pilot plants. It is possible that commercial fusion plants will have construction cost overruns, but most modern fission and fusion proposals incorporate modular construction to limit the problem of one-off cost overruns.


> but most modern fission and fusion proposals incorporate modular construction to limit the problem of one-off cost overruns.

Let's see how that works out, friends!

"Modular by Design

The AP1000 plant has been designed to make use of modern, modular-construction techniques. The design incorporates vendor-designed skids and equipment packages, as well as large, multi-ton structural modules and special-equipment modules. Modularization allows construction tasks that were traditionally performed in sequence to be completed in parallel."

Oh dear. Modularization doesn't seem to have helped Westinghouse at all. Sorry about the bankruptcy, Toshiba!


Are you saying that fusion plants will face the same regulatory issues that fission plants experience? Again, I'm not sure what your point is.


The problem wasn't regulation, it was mismanagement. The complexity of building the plant simply overwhelmed the firms involved.

Fusion reactors would be even more complex. I think they're well beyond any practical upper bound on the complexity of a workable energy source.

https://www.reuters.com/article/us-toshiba-accounting-westin...


What do you see as providing the majority of baseload power by 2060?


There will be no place for high-levelized-cost baseload generators. There will be intermittent sources, short-term storage, and various chemical fuels (natural gas with carbon capture, biomass, hydrogen) for long-term variability. Additionally, overproduction with curtailment, dispatchable demand, and long-distance transmission will be used to ameliorate variability of supply.
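
(For reference: "levelized cost" is total discounted lifetime cost divided by total discounted lifetime generation. A minimal Python sketch, with all numbers illustrative, not projections:)

    # Minimal LCOE sketch -- illustrative numbers only.
    def lcoe(capex, annual_opex, annual_kwh, years, rate):
        """Levelized cost of electricity, $/kWh."""
        disc_cost = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
        disc_kwh = sum(annual_kwh / (1 + rate) ** t for t in range(1, years + 1))
        return disc_cost / disc_kwh

    # Hypothetical 1 GW baseload plant: $6/W overnight cost, 90% capacity factor.
    print(lcoe(6e9, 1e8, 1e6 * 8760 * 0.9, 40, 0.07))  # ~$0.07/kWh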

By 2060, or likely well before, if solar continues down its historical learning curve, it should be absurdly cheap -- so cheap that resistive heat will be cheaper than burning any fossil fuel. We might even see artificial geothermal, where excess power is just dumped into heating rocks and water underground.
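
A quick arithmetic check on the resistive-heat claim (the gas price below is an assumption): one MMBtu is about 293 kWh of heat, so gas at $3/MMBtu delivers heat at roughly a cent per thermal kWh.

    # Back-of-envelope: resistive electric heat vs. burning natural gas.
    MMBTU_TO_KWH = 293.07                # thermal energy in one MMBtu
    gas_price = 3.0                      # $/MMBtu, assumed
    gas_heat = gas_price / MMBTU_TO_KWH  # ~$0.0102 per kWh of heat
    elec_heat = 0.01                     # $/kWh; resistive heating is ~100% efficient
    print(gas_heat, elec_heat)           # $0.01/kWh electricity already matches gas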


Current PV factories can only provide about 15% of the total energy need if they run at maximum capacity over the next 40 years. (Maybe much less if panels need to be replaced every 20-30 years.) Since the existing factories are losing money, how can you increase the number of factories by a factor of seven or more and still get absurdly cheap PV cells? New factories will only be built if they can justify the capital expense.
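
That back-of-envelope is easy to reproduce; every input below is an assumption (the comment's own inputs aren't given). With these illustrative numbers the figure comes out in the single digits; more generous factory-output or capacity-factor assumptions push it toward the quoted 15%.

    # Back-of-envelope: existing PV factories vs. world energy demand.
    # All inputs assumed for illustration.
    factory_output_gw = 120   # global PV module production, GW/yr nameplate
    years = 40
    capacity_factor = 0.20    # average delivered fraction of nameplate
    world_demand_tw = 18.0    # average world primary energy demand, TW

    delivered_tw = factory_output_gw * years / 1000 * capacity_factor
    print(f"{delivered_tw / world_demand_tw:.0%}")  # ~5% with these inputs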

Yes, PV cells will also get more efficient, but the problem of building all the factories to produce them, installing the low-energy-density solar farms, storing the energy across daily and seasonal cycles, and transmitting it to the population centers is a hugely wasteful process. Who is to say it is more practical than fusion? Spending a few billion to explore more economical ways to generate energy using one of the few methods that nature permits us seems like a drop in the bucket. If some optimists want to try it, good for them.

Edit: Most of the expanded PV factory production will be best utilized for industrial processes, further limiting the amount of PV available to replace baseload. https://kavli.berkeley.edu/kavli-ensi-retreat-solar-energy-f...


In an industry with a strong learning curve -- and solar has a strong one, better than wind's -- it makes sense to take near-term losses to accumulate experience.

Applying your same argument, one could conclude nuclear has no chance, since there is limited capacity to build nuclear power plants, and those building them have been losing large amounts of money. And unlike solar, there are no strong experience effects there.

Combustion turbines have also shown economic trouble lately. GE's troubles stem in part from betting on that market just before demand started collapsing.

Who says solar will be more practical than fusion, you ask? Extrapolating historical learning curves, the levelized cost of solar will drop to $0.01/kWh or so by the time the world transitions to mostly solar, especially in the sunniest areas. This is vastly lower than the projected cost of energy from fusion. If you say fusion will show experience effects too, then I'll note that fusion is most like fission, and fission (as I mentioned above) has not shown such effects, probably due to the inherent complexity and long construction time scales.
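
The extrapolation being invoked is Wright's law: cost falls by a fixed fraction (the learning rate) with each doubling of cumulative production. A sketch with assumed inputs (a ~20% learning rate is commonly quoted for PV modules; the starting cost and remaining doublings are guesses):

    # Wright's law: cost falls a fixed fraction per doubling of cumulative output.
    learning_rate = 0.20   # per-doubling cost drop, commonly quoted for PV
    cost_today = 0.04      # $/kWh in sunny regions, assumed
    doublings = 6          # assumed doublings left before a mostly-solar world

    cost_then = cost_today * (1 - learning_rate) ** doublings
    print(f"${cost_then:.3f}/kWh")  # ~$0.010/kWh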

An ultimate low cost for solar will excuse many sins of variability and seasonality, allowing wasteful overinstallation and inefficient long term storage.


Learn from history. Nuclear power with all its cost overruns and other issues claimed 15% of the US electric market in 15 years. Where is solar after 40 years?

The difference is between building factories to produce cells, which are made into modules that must be installed in a sunny location, connected to the grid, and then incorporated into a 24/365 electric demand cycle, versus building a plant that can produce 100% of its power on day one.

Part of using the learning curve is knowing where you are on the curve. And remember that it is log-log, so you need to keep doubling cumulative production to keep getting those cost reductions (if you don't bottom out against reality). High-temperature superconductors are presumably at the starting point of their learning curve. Increased demand for magnets for fusion research and other applications, along with power-transmission devices, could provide the production scale needed to drive down the cost of REBCO tape, measured in $ per A·m.
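
To make the doubling requirement concrete, here is how many doublings a given cost target implies under Wright's law (the learning rate is assumed for illustration):

    import math

    # Doublings of cumulative production needed to reach a cost target
    # under Wright's law.
    def doublings_needed(cost_now, cost_target, learning_rate):
        return math.log(cost_target / cost_now) / math.log(1 - learning_rate)

    # A 10x cost cut at an assumed 20% learning rate:
    print(doublings_needed(1.0, 0.1, 0.20))  # ~10.3 doublings, i.e. ~1000x production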


No, you learn history.

Where is solar? A factor of 200 cheaper than it was then.

Nuclear has NEVER shown that kind of cost improvement. Its costs have stubbornly refused to fall. That alone marks it as a doomed technology. The only question was when the improving technologies would pass it. They have now done so.

You should pay attention to what the people with the money are doing with their money. They are, by and large, not building nukes, they are funding renewables. This is not because they are fools or cranks or green fanatics; it's because investment in nuclear just doesn't pencil out. It's been judged in the market and found wanting.

BTW, if you look at the cost figures for ARC, even if the magnets were totally free the thing would still be far outside the range of economic competitiveness.


any more recent commentaries?


There would be little benefit. ITER was delayed by 20 years due to a drop in funding, so most of the primary issues to address have been put on hold. The plasma physics and machine design have seen progress, but no one has claimed that wouldn't be the case.

W7-X’s success has shifted some more funding into stellarators, but outside of that not much has changed in the landscape in the past 20 years. YBCO is still prohibitively expensive and fragile, but the use of high-temperature superconductors with high critical currents would increase confinement time by a large amount.
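
For context on why high-field HTS magnets loom so large (a commonly cited scaling, not something from this thread): at fixed plasma beta, fusion power density goes roughly as B^4, so the achievable field sets the machine size. A toy comparison with illustrative field values:

    # At fixed plasma beta, fusion power density scales roughly as B^4.
    # Field values are illustrative (ITER-class LTS vs. a REBCO design target).
    B_lts = 5.3    # tesla on axis, roughly ITER's Nb3Sn coils
    B_hts = 12.0   # tesla on axis, roughly what REBCO designs aim for
    print(f"{(B_hts / B_lts) ** 4:.0f}x power density")  # ~26x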


Based on YBCO prices from a few years ago, the estimate was $200M in material cost for the magnets of a 200MW reactor, not an outrageous amount considering all of the other costs. The real issue was getting the construction cost of the magnets down to 2x the material cost, from the roughly 10x the material cost typical of large superconducting magnets today.
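
The arithmetic behind why that multiplier dominates, using the comment's own figures:

    # Magnet cost per watt at different construction-cost multipliers,
    # using the $200M material / 200 MW figures from the comment above.
    material_cost = 200e6   # $
    plant_power = 200e6     # W

    for multiplier in (10, 2):   # construction cost as a multiple of material cost
        print(f"{multiplier}x: ${material_cost * multiplier / plant_power:.0f}/W for magnets alone")

For scale, whole power plants historically come in around a few dollars per watt, so $10/W for the magnets alone would sink the project, while $2/W leaves room for everything else.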


The other argument is that if you use YBCO, then the overall scale of the development effort is reduced by orders of magnitude.


A delay of a project of 20 years or more is beyond my comprehension. That's people's entire careers. That's the entire senior staff of the project dying out or retiring. That's an extremely non-negligible amount of time from a people perspective.


And with this delay, and the decline of fission, when ITER is done there may be no tritium left to fuel any follow-on.


ITER was delayed because of incredible lowballing of the initial cost estimate, along with management screwups. They knew or should have known that estimate was fraudulent.


Oh? I assume you have a source to back that up. All accounts I’ve seen (Wikipedia and the timeline on ITER’s website) show that it is due to global politics, namely the US’s withdrawal (which was not motivated by finances).

You seem like you have a strong narrative you’re sticking to. Is there something you’d like to disclose?


https://fire.pppl.gov/FPA13_Freidberg2_2013.pptx

Look at the chart on page 13, and his comments on the next page.


That’s a retrospective look. That was not what happened in the early 2000s and is not why ITER stalled. ITER stalled because the US pulled out. You can’t change history to fit a narrative.


He is saying that if they had done due diligence, if they had honestly extrapolated the costs of previous experiments using the demonstrated empirical scaling laws, they'd have gotten costs closer to what ITER has ended up costing. They either knew or should have known the initial cost claims were dishonest.

The US opted out of ITER in 1999 and returned in 2003. The cost of constructing ITER was estimated at €5B in 2006. How could the earlier US pullout excuse the inaccuracy of that estimate? Your narrative there makes no sense.


Well, there's this from Abdou (2016) at UCLA (which seems to be the center in the US for the engineering side of fusion):

"MTBF for Blanket/FW/PFC in any DT fusion device is estimated top be very short while MTTR is predicted to be too long -- leading to very low availability of only a few percent"

http://www.fusion.ucla.edu/abdou/abdou%20presentations/2016/...
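
The formula behind that conclusion is steady-state availability = MTBF / (MTBF + MTTR), so a short time between failures against a long repair time collapses quickly. The hours below are assumed for illustration, not Abdou's figures:

    # Steady-state availability = MTBF / (MTBF + MTTR).
    mtbf = 500.0    # hours between blanket/first-wall failures, assumed
    mttr = 8000.0   # hours to repair inside an activated vacuum vessel, assumed
    print(f"{mtbf / (mtbf + mttr):.1%}")  # ~5.9% -- "a few percent"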

Contrast that with Stacey's optimism about reliability.



