Fusion Turns Up the Heat (aps.org)
36 points by swamp40 on Aug 12, 2022 | 37 comments



> The current results are thus tantalizingly close to achieving unit gain—at the current rate of improvement, I expect this to happen within the next couple of years.

Does anyone have a graph of the rate of improvement? I'm curious.

> But for a fusion reactor to be commercially viable and deliver a sizeable amount of electricity to the grid, much higher gains (of order 100) are needed to compensate for the wall-plug efficiency of the laser and for the losses in energy collection and in the electricity production and distribution system.

Can anyone ELI8 why it needs to be 100? That seems excessive. IIUC, almost all the heat (at fusion temps) can be turned to electricity, and only about 5% of energy is lost during transmission. Is ~90% of the power really being lost somewhere else? And if so, where!? This seems like a problem more easily solved than fusion itself...

In my naive mind, a G of even 2 seems like literally a money-printing machine... I get it, a ridiculously expensive money-printing machine to build. But does it print dollars fast enough to pay for building it, maintaining it, etc.?

Also, I know Newsweek isn't a great source, but they have another article about this experiment at the NIF: https://www.newsweek.com/nuclear-fusion-energy-milestone-ign...


>Does anyone have a graph of the rate of improvement?

It's not exactly a timeline, but you can roughly see how the Lawson criterion has developed over the decades in figure 4 of this paper: https://link.springer.com/article/10.1007/s13280-015-0732-y

>In my naive mind - G of even 2 seems like literally a money printing machine...

That will get you nothing when you consider that a steam turbine has a maximum efficiency below 50% - and that's already pretty much the best way we have of turning heat into electricity. In practice there are also tons of other losses, so a G of at least 10 is probably required to get anything economical out of it, and only a G of 100 or more would be a real money-printing machine.
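To make the arithmetic concrete, here's a rough sketch (my numbers, not the article's: I'm assuming ~10% laser wall-plug efficiency and ~40% heat-to-electricity conversion) of how the target gain multiplies through to electricity out per unit of electricity in:

    # Illustrative only: how target gain G translates into electricity out per
    # unit of electricity fed to the laser, under assumed efficiencies.
    laser_wallplug_eff = 0.10    # assumed electricity-to-light efficiency of the driver
    heat_to_elec_eff = 0.40      # assumed steam-cycle efficiency

    for target_gain in (1, 2, 10, 100):
        engineering_gain = laser_wallplug_eff * target_gain * heat_to_elec_eff
        print(f"G = {target_gain:>3}: electricity out per unit in ~ {engineering_gain:.2f}")

With those assumed numbers, G = 2 returns only ~0.08 units of electricity per unit spent on the laser, G = 10 still loses at ~0.4, and only G = 100 comes out clearly ahead at ~4.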


And, not even then, because it would still be overwhelmingly more expensive to operate than a fission plant, which is already not competitive.

So, under no circumstances could it ever be a "money-printing machine", or even profitable at all.


>more expensive to operate than a fission plant, which is already not competitive.

You're probably referring to the study that claimed fission power plants are not economical anywhere in the world. FYI, this was published by a German institute to justify their politicians' decision to phase out nuclear power, but it has since been disproven (source is also in German): https://www.kernd.de/kernd-wAssets/docs/fachzeitschrift-atw/... If you use realistic real-world data, nuclear is very much profitable and competitive in many sectors.


Nukes have only ever been "profitable" where heavily subsidized with public money. These subsidies are often cunningly concealed, as for instance the liability cap, which amounts to a taxpayer-funded insurance policy. No nuke could afford to operate if it paid market insurance rates -- if indeed anybody could afford to offer to insure one at all. Similarly, decommissioning cost is never included.

And, a system an order of magnitude more costly, as fusion would necessarily be, would certainly be far from competitive, even were regular nukes honestly viable.

Neglecting all subsidies, and also construction cost, current nukes are considered about on par with renewables, but the cost of renewables is still falling very sharply. So any nuke started today, without neglecting CAPEX, absolutely could not compete with the renewables being built at the time it is finally fired up.

The corollary is that existing nukes, where not explicitly propped up by coercive funding, will be mothballed long before their design life is up, and their CAPEX, amortized over the many fewer kWh actually produced, will mean they cost way more per kWh than originally projected.


> And, a system an order of magnitude more costly, as fusion would necessarily be, would certainly be far from competitive, even were regular nukes honestly viable.

This is a strawman.

We don't even have fusion. We don't know what it would cost.

Fusion doesn't have the same environmental issues, so decommissioning and insurance are lesser concerns.


We do, in fact, know what it would cost to extract the heat in usable form, and that would be enormously more costly than what we do with fission. Extracting a few grams of tritium or helium-3, at a few parts per billion, from a thousand tons of molten, radioactive lithium every day, for fuel, is no picnic. We can at least be glad that most of the lithium itself is not radioactive, just the metal impurities spalled from the surfaces of the pipes it ran through.

Decommissioning would necessarily be at least as big a job as a regular nuke, because the whole reactor, thousands of tons of embrittled metal, would have been blasted with hot neutrons for months or years. A good home for a thousand tons of what was molten neutron-irradiated lithium is no easier to find than for spent uranium.


This argument about subsidies was disproven in the linked paper. And don't forget that fossil fuels are still the most subsidised industry, so it's not like we'd lose anything by transitioning.


I agree that fusion is unlikely to be a cheap source of power, but I can see it being cheaper than fission because you need fewer safety measures if you don't have large amounts of highly radioactive material around.


A thousand tons of molten radioactive lithium needs containment as urgently as a melted core.


There is no risk of any runaway reaction. It's much easier to contain, and also much less radioactive.


I can tell you have never seen an alkali metal fire.


We handle far more dangerous chemicals in many chemical plants.


The lasers are around 10-15% efficient in converting electricity to light.

These pellet experiments are not about energy production. They are about telling scientists how these materials would handle the pressure and heat of a collapsing uranium cylinder, how efficiently they would fuse, and how much energy and how many neutrons they would produce.


Yes: weapons research. Period.

Absolutely no prospect of useful civil power, ever. And, no intention ever to try for it.


No graph, but it's always been two to three years away for the last 50 years, give or take.


As a kid I’d always hear it was 10-20 years away, or conservatively more like 50.

It seems like real progress is being made. 10-20 years was wrong, but I could see 50 being close (which would put us around 2035)


AFAIK the predictions were wrong because funding was reduced continuously, at pretty much the same rate as progress was made.


Funding was reduced as the realistic prospect of producing anything of value, ever, was seen to drop without bound.


> Without doubt, fusion reactors are going from strength to strength

A cheeky way to say "from nowhere near where they need to be to not quite there," fusion fanbois!


20 years away, just like 20 years ago...


Can we stop posting this same comment under every article mentioning fusion? Inevitably someone then answers with the graph that shows funding over time and a whole chain of comments repeats for each of these articles.


Yes, it promotes the fiction that there will ever be commercially competitive fusion.

The more we learn, the farther into the future the prospect recedes.


Reminder that NIF is research for weapons, not power generation.


From their website:

> [We] conduct one-of-a-kind experiments that help ensure the nation’s security through stockpile stewardship, make important advances toward achieving fusion ignition in the laboratory for the first time, and lay the groundwork for a safe, carbon-free, secure energy future through inertial fusion energy.

It seems their main thing is safe maintenance/storage of nuclear stockpiles, but it's not their only thing


When energy people express skepticism about the value of NIF, the NIF team says it's for weapons work.

When weapons people express skepticism about the value of NIF, the NIF team says it's for energy work.


Maybe they have dual missions? That’s not unusual in government agencies

For example, when a military is standing around, it tends to do peaceful things like helping deliver emergency supplies to areas hit by natural disasters. Militaries have great logistics and are used to going to “hard to reach” locations.

And after the Cold War, nuclear subs have supported whale researchers by tracking the whales.


The point is that unless the weapons and energy people team up and compare their criticisms, each can be deflected by reference to the other.


The word "safe" has a very specific, legal meaning as applied by the US DoE. It means, informally, "it goes bang when you push the button". Period. It does not, in particular, mean "it doesn't go bang when you don't push the button". Nor, "it doesn't leak radioactives into ground water", nor "it doesn't land in terrorists' hands", or "doesn't fall into farmers' fields and blow up", or literally anything else.

Similarly, DoE's one other statutory responsibility, "reliability", means "it makes as big a bang as designed for", and literally nothing else.

Every other thing you can think of is literally somebody else's job, if indeed anybody's.


> Fulfilling the Lawson criterion for a burning plasma doesn’t yet mean that we can produce useful energy with fusion power plants. The next step toward that goal would be to demonstrate a fusion scheme that produces as much energy as that contained in the laser pulses driving the reaction. In other words, the scheme should have a net gain, G, of 1.

This is what I had always kinda sorta thought, but seeing someone put it in words made me realize that it actually doesn't make sense at all.

If 1 unit of laser energy gets you 0.2 units of fusion, that means you end up with 1.2 units of heat energy total. Okay so 1.2 units of heat just sounds like a crappy heat pump. But given that the heat is so damn hot (meaning ~0 entropy) you could in theory convert it all to electricity and come out ahead.


The main problem is that -- especially with laser fusion -- you are not inputting 1 unit of electricity and getting out 1.2 units of high-grade heat. You are inputting 100 units of electricity, most of which is lost as low-grade heat through wires, cooling devices, etc., to power a laser that outputs 1 unit of photons. The fusion adds a bit of energy, but not enough to surpass the 100 units you put in. I don't know the exact efficiency of these lasers, so 100x may be wrong, but I think I'm in the right order of magnitude.
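A minimal sketch of that chain, using my illustrative 1% wall-plug figure and the ~0.72 gain the article reports for this shot (both numbers are just for illustration):

    # Rough energy chain for laser fusion, illustrative numbers only
    electricity_in = 100.0                 # units of grid electricity
    laser_light = electricity_in * 0.01    # assumed ~1% wall-plug efficiency of the driver
    fusion_yield = laser_light * 0.72      # target gain ~0.72 reported for this shot
    total_heat = laser_light + fusion_yield
    print(total_heat)                      # ~1.72 units of heat for 100 units of electricity spent

Even converting all of that heat back to electricity perfectly would recover less than 2% of what went in.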


Ah that makes sense. I had no idea the lasers were that inefficient. Still it's not an inherent feature of the problem. But maybe getting the fusion reaction to self-sustain is the easier thing to solve.


Lasers are likely 10% efficient, not 1%, but it's still an order of magnitude less than needed to break even.


The laser at NIF has a 0.5% efficiency [1]. Some lasers are highly efficient (for example diode lasers). High-energy pulsed lasers are usually not.

[1] https://physics.aps.org/articles/v14/168


How would you convert it into electricity without losses?

For your example of a 0.2 gain, you need 83% overall efficiency, or 91% efficiency on each of the input and output stages.

Steam turbines may not hit that level of efficiency [1].

If you assume that input and output are each 80% efficient, you need to generate roughly 0.5 units per unit put into the system to break even; if they're 70% efficient, you need to generate roughly 1 unit per unit put into the system.

[1] — https://www.epa.gov/sites/default/files/2015-07/documents/ca...
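For reference, here's the arithmetic behind those break-even numbers, under the simplifying assumption that the recovered heat is the laser energy plus the fusion yield and the only losses are at the electricity-to-laser and heat-to-electricity stages:

    # Break-even target gain G for given input/output efficiencies (simplified accounting)
    def breakeven_gain(eta_in, eta_out):
        # need eta_in * (1 + G) * eta_out >= 1
        return 1.0 / (eta_in * eta_out) - 1.0

    for eta in (0.91, 0.80, 0.70):
        print(f"{eta:.0%} in and out -> need G >= {breakeven_gain(eta, eta):.2f}")
    # 91% -> G >= 0.21, 80% -> G >= 0.56, 70% -> G >= 1.04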


I just picked a random number. The article claims you actually get 0.72 units of heat rather than 0.2, so steam turbines should be fine. But as sacred_numbers pointed out, the laser issue is more serious.


In short, you never, ever will.



