Hacker News

> The current results are thus tantalizingly close to achieving unit gain—at the current rate of improvement, I expect this to happen within the next couple of years.

Does anyone have a graph of the rate of improvement? I'm curious.

> But for a fusion reactor to be commercially viable and deliver a sizeable amount of electricity to the grid, much higher gains (of order 100) are needed to compensate for the wall-plug efficiency of the laser and for the losses in energy collection and in the electricity production and distribution system.

Can anyone ELI8 why it needs to be 100? That seems excessive. IIUC, almost all the heat (at fusion temps) can be turned to electricity. IIUC only about 5% of energy is lost during transmission. Is ~90% of power really being lost somewhere else? And if so, where!? This seems like a problem more easily solved than fusion...

In my naive mind - G of even 2 seems like literally a money printing machine... I get it - a ridiculously expensive money printing machine to build. Does it print dollars fast enough to pay for building it, and maintaining it, etc...

Also, I know Newsweek isn't a great source, but they have another article about this experiment at the NIF: https://www.newsweek.com/nuclear-fusion-energy-milestone-ign...




>Does anyone have a graph of the rate of improvement?

It's not exactly a timeline, but you can roughly see the development of the Lawson criterion over the decades in figure 4 of this paper: https://link.springer.com/article/10.1007/s13280-015-0732-y
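For context on what that figure is plotting: the Lawson criterion for D-T fuel is usually quoted as a threshold on the "triple product" of density, temperature, and confinement time. A quick sketch (the ~3e21 keV·s/m³ threshold is a rough textbook value, not taken from the paper, and the sample numbers are illustrative):

```python
# The Lawson criterion for D-T ignition, roughly:
#   n * T * tau  >=  ~3e21  (keV * s / m^3)
# where n is ion density, T is temperature, and tau is the
# energy confinement time. Threshold is an approximate textbook figure.

IGNITION_TRIPLE_PRODUCT = 3e21  # keV * s / m^3, approximate D-T threshold

def meets_lawson(n_per_m3, T_keV, tau_s):
    """Rough check of the D-T ignition criterion."""
    return n_per_m3 * T_keV * tau_s >= IGNITION_TRIPLE_PRODUCT

# Ballpark, tokamak-like numbers (illustrative only):
print(meets_lawson(1e20, 10, 1))  # 1e21  -> False, short of ignition
print(meets_lawson(1e20, 15, 3))  # 4.5e21 -> True
```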

>In my naive mind - G of even 2 seems like literally a money printing machine...

That will get you nothing when you consider that a steam turbine has a maximum efficiency below 50% - and that's already pretty much the best way we have of turning heat into electricity. In practice there are also tons of other losses, so a G of at least 10 is probably required to get just about anything economical from it, and only a G of 100 or more would be a real money printing machine.
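To make the arithmetic concrete, here's a back-of-envelope sketch. The numbers are illustrative: ~10% laser wall-plug efficiency (the 10-15% figure mentioned elsewhere in this thread) and ~40% turbine efficiency:

```python
# Back-of-envelope: what target gain G does a laser-fusion plant need
# just to break even on electricity? Illustrative numbers only.

eta_laser = 0.10    # laser wall-plug efficiency (electricity -> light)
eta_turbine = 0.40  # heat -> electricity, below the ~50% ceiling above

# Electricity out per unit of electricity fed to the lasers:
#   E_out = E_in * eta_laser * G * eta_turbine
# Breakeven (E_out = E_in) therefore requires:
G_breakeven = 1 / (eta_laser * eta_turbine)
print(round(G_breakeven))  # 25
```

So at those efficiencies a gain of 25 only breaks even; running the rest of the plant and leaving any margin at all is what pushes the required gain toward 100.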


And, not even then, because it would still be overwhelmingly more expensive to operate than a fission plant, which is already not competitive.

So, under no circumstances could it ever be a "money-printing machine", or even profitable at all.


>more expensive to operate than a fission plant, which is already not competitive.

You're probably referring to this study that claimed fission power plants are not economical anywhere in the world. FYI, this was published by a German institute to justify their politicians' decision to phase out nuclear power, but it has since been disproven (source also German): https://www.kernd.de/kernd-wAssets/docs/fachzeitschrift-atw/... If you use realistic real-world data, nuclear is very much profitable and competitive in many sectors.


Nukes have only ever been "profitable" where heavily subsidized with public money. These subsidies are often cunningly concealed, as for instance the liability cap, which amounts to a tax-paid insurance policy. No nuke could afford to operate if it paid market insurance rates -- if indeed anybody would offer to insure one at all. Similarly, decommissioning cost is never included.

And, a system an order of magnitude more costly, as fusion would necessarily be, would certainly be far from competitive, even were regular nukes honestly viable.

Neglecting all subsidies, and also construction cost, current nukes are considered about on par with renewables, but renewables' costs are still falling very sharply. So any nuke started today, without neglecting CAPEX, absolutely could not compete with renewables built at the time it is finally fired up.

The corollary is that existing nukes, where not explicitly propped up by coercive funding, will be mothballed long before their design life is up, and their CAPEX, amortized over the many fewer kWh actually produced, will mean they cost far more per kWh than originally projected.


> And, a system an order of magnitude more costly, as fusion would necessarily be, would certainly be far from competitive, even were regular nukes honestly viable.

This is a strawman.

We don't even have fusion. We don't know what it would cost.

Fusion doesn't have the same environmental issues, so decommissioning and insurance are lesser concerns.


We do, in fact, know what it would cost to extract the heat in usable form. And that would be enormously more costly than what we do with fission. Extracting a few grams of tritium or helium-3, at a few parts per billion, from a thousand tons of molten, radioactive lithium every day, for fuel, can be no picnic. We can anyway be glad most of the lithium itself is not radioactive, just metal impurities spalled from surfaces of the pipes it ran in.
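The parts-per-billion figure checks out arithmetically, for what it's worth: a few grams dissolved in a thousand metric tons is a few parts per billion by mass:

```python
# Sanity check on the concentration claim above: a few grams of tritium
# in a thousand tons of lithium is a few parts per billion by mass.

tritium_g = 3.0                 # "a few grams" per day
lithium_g = 1000 * 1_000_000.0  # a thousand metric tons, in grams

ppb = tritium_g * 1e9 / lithium_g
print(ppb)  # 3.0
```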

Decommissioning would necessarily be at least as big a job as a regular nuke, because the whole reactor, thousands of tons of embrittled metal, would have been blasted with hot neutrons for months or years. A good home for a thousand tons of what was molten neutron-irradiated lithium is no easier to find than for spent uranium.


This argument about subsidies was disproven in the linked paper. And don't forget that fossil fuels are still the most subsidised industry, so it's not like we'd lose anything by transitioning.


I agree that fusion is unlikely to be a cheap source of power, but I can see it being cheaper than fission because you need fewer safety measures if you don't have large amounts of highly radioactive material around.


A thousand tons of molten radioactive lithium needs containment as urgently as a melted core.


There is no risk of any runaway reaction. It's much easier to contain, and also much less radioactive.


I can tell you have never seen an alkali metal fire.


We handle far more dangerous chemicals in many chemical plants.


The lasers are around 10-15% efficient in converting electricity to light.

These pellet experiments are not about energy production. They are about telling scientists how these materials would handle the pressure and heat of a collapsing uranium cylinder, how efficiently they would fuse, and how much energy and how many neutrons they would produce.


Yes: weapons research. Period.

Absolutely no prospect of useful civil power, ever. And, no intention ever to try for it.


No graph, but it's always been two to three years away for the last 50 years, give or take.


As a kid I’d always hear it was 10-20 years away, or conservatively more like 50.

It seems like real progress is being made. 10-20 years was wrong, but I could see 50 being close (which would put us around 2035)


AFAIK the predictions were wrong because funding was reduced continuously, at pretty much the same rate as progress was made.


Funding was reduced as the realistic prospects of producing anything of value, ever, were seen to drop without bound.



