"Based on our present understanding, D-T tokamak fusion reactors project a cost-of-electricity that is about 50% larger than the projected cost-of-electricity from advanced light-water reactors in the middle of the next century."
We all know what happened to the projected cost of fission reactors -- the projections turned out to be hopelessly optimistic, because of complexity and loss of experience. Fusion would face these problems in even worse form (indeed, ITER's cost has ballooned to 4x or more of the initial projection).
The experience with fission has enabled us to calibrate the optimism bias in these projections, with damning results.
Simply being competitive with fission is no longer good enough for fusion to succeed. It has to be significantly better than fission.
Fusion is better in terms of proliferation risk, long-lived radioactive waste, and fuel cost. Being in the same ballpark on the hard cost of electricity is probably a good enough target for now.
But fission is now a loser technology, hopelessly uncompetitive against the hard-charging renewables. It's not even close any more.
Solar with long-term chemical energy storage should play a huge role, but I think it is too soon to say whether one will push out the other or whether fission will stage a resurgence. A lot depends on incentives to phase out fossil fuels. If natural-gas-fired electricity remains the cheapest option, the development of every energy alternative will be affected.
If you are arguing that fusion will never compete economically with natural gas, you may be right in the short-term but in that case there may not be a long-term.
Fission provides a reality check on believing projections unmoored from empirical feedback. I would like to see the methodology behind these fusion cost projections applied to fission, to see whether it comes anywhere close to what the plants actually cost to build in the real world.
But what you seem to be talking about are cost projections for proposed pilot plants. It is possible that commercial fusion plants will have construction cost overruns, but most modern fission and fusion proposals incorporate modular construction to limit the problem of one-off cost overruns.
Let's see how that works out, friends!
"Modular by Design
The AP1000 plant has been designed to make use of modern, modular-construction techniques. The design incorporates vendor-designed skids and equipment packages, as well as large, multi-ton structural modules and special-equipment modules. Modularization allows construction tasks that were traditionally performed in sequence to be completed in parallel."
Oh dear. Modularization doesn't seem to have helped Westinghouse at all. Sorry about the bankruptcy, Toshiba!
Fusion reactors would be even more complex. I think they're well beyond any practical upper bound on the complexity of a viable energy source.
By 2060, or likely well before, if solar continues down its historical learning curve, it should be absurdly cheap -- so cheap that resistive heating will be cheaper than burning any fossil fuel. We might even see artificial geothermal, where excess power is simply dumped into heating rocks and water underground.
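A quick sanity check on the resistive-heat claim, as a sketch: the $0.01/kWh solar price and the $3/MMBtu gas price below are assumptions for illustration, not figures anyone in this thread has established.

```python
# Compare the cost of a gigajoule of heat from cheap solar electricity
# (via resistive heating, ~100% efficient) with heat from natural gas.
# Both prices are assumptions, not figures from the thread.
KWH_PER_GJ = 1e9 / 3.6e6  # 1 kWh = 3.6 MJ -> ~277.8 kWh per GJ

solar_cost_per_kwh = 0.01  # assumed ultra-cheap solar, $/kWh
resistive_heat_per_gj = solar_cost_per_kwh * KWH_PER_GJ

gas_price_per_mmbtu = 3.00  # assumed Henry-Hub-like price, $/MMBtu
GJ_PER_MMBTU = 1.055056
gas_heat_per_gj = gas_price_per_mmbtu / GJ_PER_MMBTU

print(f"resistive heat: ${resistive_heat_per_gj:.2f}/GJ")  # ~$2.78/GJ
print(f"gas heat:       ${gas_heat_per_gj:.2f}/GJ")        # ~$2.84/GJ
```

Under these assumed prices the two are already neck and neck, before counting boiler losses on the gas side; any further drop in solar tips the balance.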
Yes, PV cells will also get more efficient, but building all the factories to produce them, installing the low-energy-density solar farms, storing the energy across daily and seasonal cycles, and transmitting it to population centers is a hugely wasteful process. Who is to say it is more practical than fusion? Spending a few billion to explore more economical ways to generate energy, using one of the few methods nature permits us, seems like a drop in the bucket. If some optimists want to try it, good for them.
Edit: Most of the expanded PV factory production will be best utilized for industrial processes, further limiting the amount of PV available to replace baseload. https://kavli.berkeley.edu/kavli-ensi-retreat-solar-energy-f...
Applying your same argument, one could conclude nuclear has no chance, since there is limited capacity to make nuclear power plants, and those making them have been losing large amounts of money. And unlike solar, there are not good experience effects there.
Combustion turbines have also shown economic trouble lately. GE's troubles stem in part from betting on that just before demand started collapsing.
Who says solar will be more practical than fusion, you ask? Extrapolating historical learning curves, the levelized cost of solar should drop to roughly $0.01/kWh by the time the world transitions to mostly solar, especially in the sunniest areas. This is vastly lower than the projected cost of energy from fusion. If you say fusion will show experience effects too, then I'll note that fusion is most like fission, and fission (as I mentioned above) has shown no such effects, probably due to its inherent complexity and long construction time scales.
An ultimately low cost for solar will excuse many sins of variability and seasonality, allowing wasteful overinstallation and inefficient long-term storage.
The difference is between building factories to produce cells, which are made into modules that must be installed in a sunny location, connected to the grid, and then incorporated into a 24/365 electric demand cycle -- versus building a plant that can produce 100% of its power on day one.
Part of using the learning curve is knowing where you are on the curve. And remember that it is log-log, so you need to keep doubling cumulative production to keep getting those cost reductions (if you don't bottom out against reality). High-temperature superconductors are presumably at the starting point of their learning curve. Increased demand for magnets for fusion research and other applications, along with power transmission devices, could drive scale and efficiency in the production of REBCO tape and push down its cost measured in $ per kA·m.
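The log-log arithmetic can be made concrete with Wright's law; the 20% learning rate below is an assumed, roughly PV-like figure, not something established in this thread.

```python
import math

def wrights_law_cost(c0, cum_prod, cum_prod0, learning_rate):
    """Wright's law: unit cost falls by `learning_rate` with each
    doubling of cumulative production (a straight line on log-log axes)."""
    doublings = math.log2(cum_prod / cum_prod0)
    return c0 * (1 - learning_rate) ** doublings

# Illustrative only: with an assumed 20% learning rate, how many
# doublings of cumulative production does a 200x cost drop require?
doublings_for_200x = math.log(200) / math.log(1 / (1 - 0.20))
print(round(doublings_for_200x, 1))  # prints 23.7
```

Roughly 24 doublings -- which is why a technology at the very start of its curve (like REBCO tape) has a long road ahead, and why one that shows no learning at all (like fission) never gets cheaper no matter how much is built.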
Where is solar? A factor of 200 cheaper than it was then.
Nuclear has NEVER shown that kind of cost improvement. Its costs have stubbornly refused to fall. That alone marks it as a doomed technology. The only question was when the improving technologies would pass it. They have now done so.
You should pay attention to what the people with the money are doing with their money. They are, by and large, not building nukes, they are funding renewables. This is not because they are fools or cranks or green fanatics; it's because investment in nuclear just doesn't pencil out. It's been judged in the market and found wanting.
BTW, if you look at the cost figures for ARC, even if the magnets were totally free the thing would still be far outside the range of economic competitiveness.
W7-X’s success has shifted some more funding into stellarators, but outside of that not much has changed in the landscape in the past 20 years. YBCO is still prohibitively expensive and fragile, but high-temperature superconductors carrying high critical currents would increase confinement time by a large amount.
You seem like you have a strong narrative you’re sticking to. Is there something you’d like to disclose?
Look at the chart on page 13, and his comments on the next page.
The US opted out of ITER in 1999 and returned in 2003. The cost of constructing ITER was estimated at €5B in 2006. How could the earlier US pullout excuse the inaccuracy of that estimate? Your narrative there makes no sense.
"MTBF for Blanket/FW/PFC in any DT fusion device is estimated to be very short while MTTR is predicted to be too long -- leading to very low availability of only a few percent"
Contrast that with Stacey's optimism about reliability.