US Department of Energy: Fusion Ignition Achieved (energy.gov)
2786 points by novateg on Dec 13, 2022 | 1021 comments



> LLNL’s experiment surpassed the fusion threshold by delivering 2.05 megajoules (MJ) of energy to the target, resulting in 3.15 MJ of fusion energy output, demonstrating for the first time a most fundamental science basis for inertial fusion energy (IFE)

Yesterday, everyone was complaining about the 2.2:2.0 ratio, but now we're working with 3.15:2.05.

With modern lasers, that'd be a total Q of 0.375 assuming 100% efficiency through direct-energy-capture.
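For concreteness, a minimal sketch of the arithmetic behind that 0.375, assuming it is simply (output/input) scaled by wall-plug laser efficiency (the ~24% efficiency is my inference from the quoted figure, not from the article):

```python
# Back-of-envelope check of the Q = 0.375 figure quoted above.
e_in = 2.05   # MJ of laser energy delivered to the target
e_out = 3.15  # MJ of fusion energy released

target_gain = e_out / e_in             # "scientific" gain, ~1.54
implied_laser_eff = 0.375 / target_gain  # efficiency the 0.375 figure assumes

print(f"target gain: {target_gain:.2f}")
print(f"implied wall-plug laser efficiency: {implied_laser_eff:.1%}")
```

(NIF's actual flashlamp-pumped laser draws far more from the wall, which is where the ~300 MJ figure elsewhere in the thread comes from.)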

The jumps to get here included

- 40% with the new targets

- 60% with magnetic confinement

- 35% with cryo-cooling of the target

The recent NIF experiments have jumped up in power. The first shot that started this new chain of research delivered about 1.7 MJ of energy. Now it's 2.05 MJ. The output, however, has jumped non-linearly, demonstrating the scaling laws at work.

> I’ve helped to secure the highest ever authorization of over $624 million this year in the National Defense Authorization Act for the ICF program to build on this amazing breakthrough.”

It's nice to see this milestone recognized, even if the funding is still rather small.


Their total power draw from the grid was 300 megajoules and they got back about 3 megajoules, so don't start celebrating yet. Source: New York Times.


I guess we should take this as a lesson in communications. The "breakeven" thing is a red herring that should have been left out of the message, or at least mentioned only as a footnote.

The critical ELI5 message that should have been presented is that they used a laser to create some tiny amount of fusion, which we have been able to do for a while now. The important thing is that they were then able to use the heat and pressure of the laser-generated fusion to create even more fusion: a positive feedback loop. The secondary fusion is still small, but it is more than the tiny amount of laser-generated fusion. The gain is greater than one. That's the important message.

And for the future, the important takeaway is that the next step is to use the tiny amount of laser fusion to create a small amount of fusion, and that small amount of fusion to create a medium amount of fusion. And eventually scale it up enough that you have a large amount of fusion, but controlled, and not the gigantic amount of fusion you have in thermonuclear weapons, or the ginormous fusion of the sun.


> [...], or the ginormous fusion of the sun.

The sun actually has very little fusion per cubic metre or per kg.

Per volume the core of the sun produces only a quarter of the heat of the human body (and per kg it's even less, owing to high density).

That's why our fusion reactors can't just mimic stars, they have to far surpass them to be useful to us.
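A quick sanity check of the quarter-of-a-human-body claim, using the ~276.5 W/m³ core figure Wikipedia gives for the sun and assuming a ~100 W resting metabolic output over ~0.07 m³ of body volume (round-number assumptions on my part, not from the thread):

```python
sun_core_w_per_m3 = 276.5  # approximate peak power density of the sun's core

human_power_w = 100.0      # assumed resting metabolic output
human_volume_m3 = 0.07     # assumed body volume (~70 kg at roughly water density)
human_w_per_m3 = human_power_w / human_volume_m3  # ~1430 W/m^3

ratio = sun_core_w_per_m3 / human_w_per_m3
print(f"sun core / human body power density: {ratio:.2f}")  # roughly a fifth
```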


This is incredible.


To give some more detail: hydrogen to helium fusion (even with intermediate steps) is extremely unlikely to happen. That's part of why the sun will last for billions of years. And that's also why first human attempts at fusion are not trying to use straight up hydrogen as the fuel.

Good old Wikipedia has this gem:

> The large power output of the Sun is mainly due to the huge size and density of its core (compared to Earth and objects on Earth), with only a fairly small amount of power being generated per cubic metre. Theoretical models of the Sun's interior indicate a maximum power density, or energy production, of approximately 276.5 watts per cubic metre at the center of the core,[63] which is about the same power density inside a compost pile.

https://en.wikipedia.org/wiki/Sun#Core

Another fun fact: there's a decades-old design for a gadget that fits on top of your desk and does nuclear fusion. You could build one yourself, if you are sufficiently dedicated. Unfortunately, no one has ever worked out how to run one as a power plant, i.e., how to get more useful energy out than you have to put in.

https://en.wikipedia.org/wiki/Fusor


Those fusors can make a decent neutron source, however. They were invented by Philo Farnsworth, a pioneer of electronic television.


I’m so disappointed I can’t just buy a fusor online.

But here is a MAKE magazine article explaining how to build one! https://makezine.com/projects/nuclear-fusor/


The per-volume claim rings alarms in my head.

If it produced a quarter of the heat of the human body per volume, its temperature would be lower as well (less than 37 degrees Celsius).[1] This is obviously not the case.

[1] Obviously heat and temperature are not the same, I know that. But when something’s temperature is higher than another thing’s, then heat is exchanged along that gradient. Meaning if the sun produced less volumetric heat than the human body, a human body placed within the sun would warm the sun and cool the human.


Heat and temperature are indeed not the same.

For both the sun and a human on earth there are two processes going on:

1. Heat production per unit volume.

2. Heat loss per unit surface area.

The volume to surface area ratio for the sun is much larger than for the human, for a minor reason (the sun is a sphere) and a major reason (the sun's linear size is much bigger). So the equilibrium temperature of the sun in the same ambient outside environment is higher than the human's.

Your thought experiment about placing a human inside the sun would in fact work as you say, if a human body continued to produce heat once it had achieved thermal equilibrium with the surrounding plasma.


Yeah, this is a morbid analogy, but if you get a bunch of people enclosed in a small area, the people in the middle will get so hot they can suffer heatstroke, even if it's freezing outside. See the recent crowd crush in Seoul.


Don't give them ideas, harnessing people power would solve all the major problems. Overpopulation, global warming, energy crisis,... Reminds me of that 'Mitchell and Webb - Kill all the poor' sketch.


Well, plant the idea that there is a fusion reaction going on in mitochondria at a very low scale. Throw in terms like "proton gradient", "electron transport chain" and create a science conspiracy.


There's no overpopulation problem, and no energy crisis.



Energy was a bit more expensive than usual for a short while.

I guess you can call that a crisis.


Can you say more about the ways in which fusion reactors need to surpass stars, and why people believe it's feasible that we can sufficiently get to that point?

(Also - thanks for sharing one of the most interesting comments I've read on the internet in quite a while.)



That, but also mimicking the sun's pressure here is just not possible (yet? :D), which is why we have to work with higher temperatures, with their own problematic consequences.


Well, have a look at https://qr.ae/prpVIL which says:

> The highest instantaneous pressures we can obtain here on Earth are in the Fusion reactor at the National Ignition Facility and in Thermonuclear weapon detonations. These achieve pressures of 5 x 10^12 and 6.5 x 10^15 Pascal respectively. For comparison, the pressure inside our Sun’s core is 2.5 x 10^16 Pascal.
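Plugging in the quoted numbers shows just how far even NIF is from the sun's core (the ratios below only restate the figures above):

```python
nif_pa = 5e12        # NIF implosion pressure (quoted above)
bomb_pa = 6.5e15     # thermonuclear weapon detonation (quoted above)
sun_core_pa = 2.5e16 # sun's core (quoted above)

print(f"sun core / NIF:  {sun_core_pa / nif_pa:,.0f}x")  # 5,000x
print(f"sun core / bomb: {sun_core_pa / bomb_pa:.1f}x")  # 3.8x
```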


Okok, amazing! But far from the controlled thing we'd like to have? :D


How is the sun so hot then?


Presumably it's so big that this amount adds up, and it's sufficiently isolated to heat up without losing too much power towards the outside


One square inch of your skin has a couple inches of flesh behind it making heat.

One square inch of sun has billions of inches of hydrogen behind it making heat.


Total power radiated by a black body per unit surface area scales as T^4 (in Kelvin).

So for black bodies with identical shape and linear dimensions R1 and R2, with identical power production per unit volume, both in thermal equilibrium with whatever is outside them, you would expect:

R1/R2 = (T1/T2)^4

(because setting power produced equal to power radiated gives R proportional to T^4).

Pretending humans are spheres with radius 1m and the sun is a sphere with radius 7*10^8m, you would expect the sun to have ~160 times the temperature of a human at equilibrium in vacuum. It's going to be lower because not all of the sun is power-producing, of course. But higher because a human is not 1m in radius. And again higher because humans are not spheres and lose heat more than a sphere would for the same volume (more surface area).

The sun is about 6000K on the surface. That would give us ~40K for the equilibrium temperature of a human in vacuum, which at least seems truthy.

TL;DR: the sun is big, with a small surface area compared to its volume, because it's big.
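The scaling argument above, as a sketch (radii and surface temperature as assumed in the comment):

```python
r_sun = 7e8    # m, solar radius as used above
r_human = 1.0  # m, the generous "spherical human" radius used above
t_sun_k = 6000 # K, solar surface temperature

# Equating volumetric heat production (~R^3) with blackbody losses (~R^2 T^4)
# gives R proportional to T^4, i.e. T proportional to R^(1/4).
temp_ratio = (r_sun / r_human) ** 0.25
t_human_k = t_sun_k / temp_ratio

print(f"T_sun / T_human ~ {temp_ratio:.0f}")             # ~163
print(f"spherical human in vacuum ~ {t_human_k:.0f} K")  # ~37 K
```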


> and not a gigantic amount of fusion that you have in thermonuclear weapons,

Ironically, this experiment was designed primarily to simulate the fusion you have in thermonuclear weapons. That's the NIF's purpose and the purpose of this experiment. From Nature https://www.nature.com/articles/d41586-022-04440-7

"Herrmann acknowledges as much, saying that there are many steps on the path to laser fusion energy. “NIF was not designed to be efficient,” he says. “It was designed to be the biggest laser we could possibly build to give us the data we need for the [nuclear] stockpile research programme.”"


How would it help build better weapons? I thought the existing thermonuclear bombs do the job perfectly fine. Sure, the military might want to make them a bit smaller or a bit cheaper, but is it really such a big deal as to warrant a major announcement?


"Nuclear stockpile research" is only making weapons "better" in the sense that they are reliable despite not having been tested in decades. There's probably a component of "unemployed nuclear weapons designers is a bad thing" also.

The fusion for power experiments are using the same laser equipment but different targets and sensors.


I wonder if they achieved a high Q value decades ago, but using targets more in line with the bomb materials.


The DOD also probably likes having some nuclear weapons designers available on short notice


I guess they also don't want to blow up random civilians by accident again.

When Castle Bravo was tested, we didn't know that lithium-7 would also breed tritium under neutron bombardment and add to the yield. The bomb had a lot of lithium-7 because it was cheaper than lithium-6. Castle Bravo then proceeded to explode with far more power than intended: it vaporized the measurement instruments, ruined the test site, damaged civilian property, and caused a horrible amount of fallout that harmed an enormous number of people from more than one country.

Even during war, I suppose you want your explosions to behave in the way you expect... so you need to figure out all the physics related to them.



Given the limitations on nuclear research for weapons purposes any information that can be gleaned from these experiments that is 'dual use' is more than welcome with the parties that are currently stymied by various arms control agreements. This is also why you will see a lot of supercomputer capacity near such research, it allows simulation of experiments with high fidelity rather than the experiments themselves. These are all exploitation of loopholes. The biggest value is probably in being able to confirm that the various computer models accurately predict the experimental outcomes. This when confirmed at scale will allow for the computer models to be used for different applications (ie: weapons research) with a higher level of confidence.


Presumably these computer models are mostly useful for creating new designs (since the old designs were proven by real tests). Would such new designs be convincing enough to the adversary to fulfill its role as a strategic deterrence?

When (in XX years?) almost all US nukes are only simulated on computers and not actually tested, the Russians may start wondering if the US arsenal actually works, no? That would be a horrible outcome, since it means the Russians would be taking somewhat greater risks in their decision-making. Wouldn't that far outweigh any operational or financial benefits the newer designs offer?

I suppose one could argue that if the loss of confidence in strategic weapons matched the actual loss in reliability, it might be a "no op" (although even this is arguable). But if the Russians think the US simulations suck, while the US is actually building really good simulations, the loss of confidence would be greater than the actual loss in reliability. In the extreme case, the nukes work great, but everyone thinks they are scrap metal.

Of course, the same happens in reverse: if the Russians are upgrading their weapons to untested designs, the US may start underestimating the risk.


> the Russians may start wondering if the US arsenal actually works, no?

If anything, the last year or so has probably made the reverse happen, and the US and its adversaries likely both have very high confidence that the US arsenal actually works.


Or the US finds that after X many years bombs degrade in unexpected ways, and while we were able to figure this out and fix it, we speculate that the Russians probably haven't fixed theirs in the same way and their bombs aren't good anymore. Which means the risk of a nuclear war jumps up, since MAD is compromised.


To learn about their degradation modes and how to maintain them (since full-scale nuclear tests are now verboten, and subcritical experiments only deal with fission part of the entire assembly -- we now also need some experimental setup to test the fusion part: radiation pressure, X-ray reflection, ablation modes etc. NIF is this setup).

Also, there is a constant need to improve the fusion-to-fission ratio in the total energy output, and perhaps eventually design pure-fusion weapons, though this is still probably out of reach.


The USA no longer has the technical capability to manufacture the necessary parts to maintain the current stockpile. The whole strategic arms reduction treaty regime is basically a fig leaf to cover for the fact that we have to cannibalize some legacy weapons to maintain the rest. If nothing changes the USA probably won't have an effective arsenal within a century. Given the likely state of the USA by that time, that's probably for the best.


> USA probably won't have an effective arsenal within a century.

If other countries joined, it would be a great outcome.


> > USA probably won't have an effective arsenal within a century.

> If other countries joined, it would be a great outcome.

Why the optimism? Without MAD, it's nearly certain that we'd have a world war at some point in time. Sooner or later, it will surely happen. If you think it won't happen, or won't cost millions of lives, or won't employ re-developed nukes eventually, please tell me why you think so. (No sarcasm.)


I don’t think MAD is all that effective. At this moment in time we’re seeing:

* Several concurrent arms races in the Middle East, Asia, and Europe

* A high intensity conflict in Ukraine

* China threatening a land invasion into Taiwan

MAD might be preventing a country like Poland from jumping into the Ukraine conflict, but more likely it’s because of its involvement in NATO.

I think collective security organisations are a far more potent force for peace than nuclear weapons. If countries had abided by their security agreements before WW2, then we'd have nipped the entire thing in the bud.


MAD means that the leaders know they risk their own lives. That's the elephant in the room.

Before MAD, for thousands of years, all the big populations were shaped/educated/pushed into limitless sacrifice for the motherland.

> collective security organisations are a far more potent force for peace than nuclear weapons

I think if you take MAD away, the "collective security organisations" would quickly break into good ol' alliances.


For most of the thousands of years of warfare, kings led their armies into battle. Only in the last 500 years or so has that changed.


> I don’t think MAD is all that effective. At this moment in time we’re seeing:
> * Several concurrent arms races in the Middle East, Asia, and Europe
> * A high intensity conflict in Ukraine
> * China threatening a land invasion into Taiwan

I mean... Only one of those is an actual fight. And there MAD doesn't apply because the defender doesn't have the Assured Destruction capability needed.


> Several concurrent arms races in the Middle East, Asia, and Europe

Which is to say, possible future proxy wars between the great powers where MAD will supposedly restrict conflict intensity. See below.

> A high intensity conflict in Ukraine

What's going on in Ukraine is a bog standard cold war style proxy war. The NATO plan is basically to turn it into another Afghanistan for the Russians. It's the exact thing that MAD is meant to keep from spilling over into a world war between the principals.

> China threatening a land invasion into Taiwan

This is more interesting. US conventional forces almost certainly have no hope of beating China that close to home. Therefore, any effective US response would require nuking China and China is presumably deterring that with their nukes. There is an argument to be made here that a non-nuclear Chinese military would be in Taiwan's best interests. However, I see no scenario where either a nuclear or non-nuclear China and a non-nuclear USA is in Taiwan's best interests. So while the MAD case isn't the best case for Taiwan here, it's also not the worst.


There are a lot more armed conflicts throughout the world that are proxies or partial proxies between the various powers in the world (US, Russia, and more). See Yemen, Syria, and many more throughout Africa. World powers tend to get on one side or another as they see their interests align and oftentimes this prolongs the conflict rather than bring it to any resolution.


I'm not really optimistic about this option, just a bit of wishful thinking about peace upon the world and such. OTOH, even with MAD we still have wars and probably much more than 100K people on average get killed in wars every year. Even with MAD, NATO is pushing conflict with Russia well beyond a proxy war at this point. MAD doesn't work if the world goes mad...


Interesting, but:

1. How does this research help address this problem?

2. What are the sources for your opinion?


> 1. How does this research help address this problem?

I see it the other way around, this problem makes me doubt that this research will ever actually lead anywhere, supposing it's even as good a result as it first appears.

> 2. What are the sources for your opinion?

It's a fact. And my source is dead tree media. I don't recall all the details, but there are some very finicky parts that go into a state of the art warhead and we have lost the capability to manufacture them. Is this really so surprising? We can't even build new F-22s anymore!


The most famous key component, per speculation and leaks, is that we lost the makeup of, and the reasoning behind, the Styrofoam that holds the primary and surrounds the secondary.

This research successfully initiated fusion, using a capsule of hydrogen made of some material, surrounded by something, with an outer layer. This outer layer is turned into X-rays by the laser, which then ablate the hydrogen capsule's casing, causing the inward pressure. You could speculate that they just found the makeup for something that would replace the Styrofoam, or improved upon it.


This is incorrect.

Similar to the "we can't make concrete as good as the romans" line of woo. Caveat Emptor


I like how you topped off the Roman non sequitur with some Latin. Alas, it is correct. Here[1] is an article about "Fogbank."

[1] https://www.motherjones.com/politics/2009/05/fogbank-america...


We don't need to make exact replicas of every single old material. "Won't have an effective arsenal within a century" is silly.

And that absolutely was not a non-sequitur.


Thermonuclear weapons have considerably tighter engineering requirements than sidewalks. So much so that the comparison is ludicrous.


Even if you think the comparison is very bad, it's not a non-sequitur.


"If Roman cement misconceptions exist then the US hasn't forgotten how to make Fogbank" is literally a non sequitur.


experimental verification of models with actual materials used in existing bombs, for one.


From what I can tell this is mostly a useful cover story, a fallback for the moon-shot-type energy potential. But are there serious downsides to having this be the case? Does it significantly distract from the pursuit of energy? Or somehow fundamentally constrain the project's scope?


Exactly the opposite: their task, assigned by Congress, is weapons work. Period.

But pretending to work on "carbon-free energy" is good for funding, just now. Four years ago, being all about weapons opened the tap.

Make no mistake, there is no story here. There will be no "unlimited free energy" from this, or any other fusion project. The fusion startups are spending down investors' money with zero possibility of payback (Helion conceivably excepted), because if they did get Q>1, they have no workable way to harness it. ITER will not even try to produce one watt-second of electricity. Its follow-on demo reactor won't start building until 2050, if not further delayed.

We know the way to get unlimited free energy: solar. Build more, get more. It doesn't have bomb scientists making inflated claims; it just works, and better every year.


> The fusion startups are spending down investors' money with zero possibility of payback (Helion conceivably excepted), because if they did get Q>1, they have no workable way to harness it. ITER will not even try to produce one watt-second of electricity. Its follow-on demo reactor won't start building until 2050, if not further delayed.

You seem to be awfully certain of that. I'm not an expert in this area, but my understanding is that the MIT Arc reactor is planned to use FLiBe as a liquid coolant that absorbs heat/neutrons/etc from the fusion reaction and is pumped into heat exchangers to boil water to run turbines. I mean, maybe there's some details not worked out and maybe I'm misunderstanding how it works, but it seems like a plan to generate electricity to me.

There's no plan to hook ITER up to a thermal plant because it's a research reactor not a power plant, but there's no conceptual reason they couldn't do it. (Not that ITER is a great example; the design is already antiquated before it's even finished.)


"Some details"... Have you ever built anything?

Driving steam turbines, even with other costs at zero, leaves you uncompetitive with renewables. But other costs would be very, very far from zero. Extracting the grams of tritium at PPB concentration dissolved in 1000 tons of FLiBe every day so you have fuel for tomorrow is an expensive job all by itself.

Making a whole new reactor every year or two because it destroyed itself with neutron bombardment is another.
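To gauge the parent's tritium-handling numbers, a rough sketch assuming a 1 GW-thermal plant and the textbook 17.6 MeV released per D-T reaction (the plant size is my assumption; the 1000-tonne FLiBe inventory is the parent's figure):

```python
MEV_TO_J = 1.602e-13
E_PER_REACTION_J = 17.6 * MEV_TO_J   # energy of one D-T fusion
TRITON_MASS_KG = 3.016 * 1.661e-27   # mass of one tritium nucleus

plant_power_w = 1e9                  # assumed 1 GW thermal
reactions_per_day = plant_power_w * 86400 / E_PER_REACTION_J
tritium_per_day_g = reactions_per_day * TRITON_MASS_KG * 1e3

flibe_g = 1e9                        # 1000 tonnes of FLiBe, per the parent
ppb = tritium_per_day_g / flibe_g * 1e9

print(f"tritium burned per day: ~{tritium_per_day_g:.0f} g")
print(f"daily concentration in the loop: ~{ppb:.0f} ppb by mass")
```

So on the order of 150 g of tritium per day, at roughly 150 parts per billion by mass in the coolant, consistent with the parent's "grams at ppb concentration" characterization.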


> Driving steam turbines, even with other costs at zero, leaves you uncompetitive with renewables.

Everything gets beaten by renewables when they're at high output.

But renewables plus reliable storage costs a lot more, and there's no way it's cheaper than steam turbines attached to a bottomless source.


You seem to forget that storage also has no operating expense.

The cost of operating a steam turbine far exceeds the cost of the coal or uranium driving it. But the steam turbine would not be the only operating expense for fusion. We don't know exactly what it would cost to sieve a thousand tons of molten FLiBe every day to get out the tritium produced that day, because no one even knows any way to achieve it at all. But it would certainly be a huge daily expense, if achieved.


> You seem to forget that storage also has no operating expense.

Who said operating expense only? I was definitely considering capital costs too, amortized over a few decades.

Steam turbines are fine for backing up unreliable renewables.

> But the steam turbine would not be the only operating expense for fusion. [...]

Okay, but I was only addressing the idea of steam turbine costs by themselves.


Combined-cycle gas turbines are fine for backing up renewables and storage. They are used for that today. As the amount of renewables and then storage built out increases, the amount of time the gas turbines must run, and thus total operating expense, declines.

It should be clear that to build out storage when there is not surplus renewable generating capacity to "charge" it from would be foolish. The immediate exception is to time-shift renewable energy generated at midday peak for evening delivery, as is being done successfully today.

Steam turbines, by contrast, are expensive to operate, and slow to start up and shut down.

Capital expense of renewables is very low already, and still falling. Even substantial overbuild to charge storage from does not change this. Cost of various forms of storage is falling even faster. By the time much storage is needed, it will be very cheap.


> Combined-cycle gas turbines are fine for backing up renewables and storage.

So are plain steam turbines, if you have cheap steam.

> Steam turbines, by contrast, are expensive to operate, and slow to start up and shut down.

Huh? Combined cycle setups use steam turbines as part of the system. Steam turbines can ramp up and down plenty fast. It's traditional heat sources that don't ramp well.

> By the time much storage is needed, it will be very cheap.

That would be nice but I'm not depending on it, and I'm definitely not going to assume that long term storage will ever be cheaper than steam turbines.


> There's no plan to hook ITER up to a thermal plant because it's a research reactor not a power plant

Interestingly, that's not really true: IIRC the Japanese team working on the WCCB breeder module (which uses supercritical water as coolant) plans on connecting the water loop to a small turbine. If they succeed, it would be the first electrical power ever produced from fusion.


Ability of steam to drive a turbine doesn't seem to be in dispute.

Making tea would be better theater, if they need that, and cheaper: "the first tea ever brewed by over-unity fusing neutrons".


At least ITER has a plausible, in a sci-fi story sense, way of creating a sustainable fusion reaction to which more fuel could be given. The fact that the neutrons produced, if it did reach high Q ratios, would irradiate and break down everything involved is probably a problem for the scientists and engineers of the 2070s to solve.


In that paradise of abundant cheap solar and wind power, only weapons people will be interested in fusion.

They will probably be trying to come up with zero-fallout things they can use tactically.


Why won't fusion still be a worthwhile goal even if we do have abundant solar/wind?

Abundant and cheap are relative terms. Solar and wind could be abundant in a "powers everything we have now and for the foreseeable future of population growth" sense, but maybe not in a "gigantic power-hungry megaprojects which aren't remotely possible today" sense?


Also, is solar truly renewable? Don’t we need some rare-earth components to manufacture the panels?

What happens when we run out of those?


Solar needs nothing that can be run out of. Likewise, storage. There are plenty of lies repeated, about that and about missing "breakthroughs".


Never heard the one about space, and that sounds outright silly. We do parking lots, and we have space for that crap.

The other was deeply ingrained in my understanding. That’s good news. I need to do my own research, like the crackpots are saying.


Solar and wind power are not sufficient for spacecraft, and many other applications. Fusion power still has very interesting applications.


Most likely, spacecraft will rely on power delivered via laser.

If anybody succeeds in working out D-3He fusion, that could work in a spacecraft. (D-T, no.) We could probably scare up enough 3He to use for that, if there weren't too many of them.


If anybody in the universe is doing interstellar travel, I think they would have developed D-D fusion, which is somewhat more difficult than D-³He or D-T fusion but probably possible with a scaled-up version of the same machine.

Outside the frost line there is a lot of water and a higher percentage of D relative to H, so it seems possible to "live off the land" between the stars without being dependent on starshine. A D-D reactor would produce ³He and T; a lot of those products would burn up in the reactor because the reaction rates are high, but it would probably be possible to separate some of them out and use it as a breeder reactor that makes fuel for D-³He and D-T reactors elsewhere. I could picture the big D-D reactor running on a large comet or dwarf planet like Pluto, producing D-³He for smaller reactors on spacecraft. (D-T not only produces a lot of neutrons, but the T has a half-life of 12 or so years and won't last for long journeys.)

My guess is that interstellar travelers would develop a lifestyle that works around the frost line, where generic bodies above a certain size have liquid water inside. If they were grabby they might consume Ceres or Pluto but might not really care about dry, idiosyncratic worlds like the Earth and Mars.


Anybody doing interstellar travel should hang their collective head-analog in shame if they haven't mastered aneutronic p-11B fusion yet. (They will need to have figured out how to reflect X-rays.)

Having got used to spending interminable ages out in the infinite chill void, they probably have come to prefer being there, so have no desire to roast deep in a stellar gravity well. Their equipment might not even work if warmed too much.


Solar is completely sufficient for 99% of applications.


Honest question, because I want that to be true at scale.

Do we know how to store it properly yet? How does solar pan out in case of a surge (e.g., a very cold winter night)?


I think parent was talking about space applications.

Anyway, unlike fusion, seasonal thermal storage is viable and available now, and will be scaled up in the immediate future. Also, with electric vehicles inducing massive investment in the grid, there will be both pressure and resources to solve the rest.


How do you power a moon base with solar? 14 days of battery?

Fine. How do you power a Europa base with Solar? A Neptune probe?


Since we're talking in present tense that's the remaining 1%.

A moon base can be fine with power beamed from a satellite or plain mirrors in orbit; there's no atmosphere in the way. It might still end up cheaper than hauling a nuclear reactor there, plus all the infra needed to reliably dump its waste heat.


"Solar and wind power are not sufficient for spacecraft"

I guess you're right that solar is useful for 99% of spacecraft -- in that they use it currently. Not a very useful observation

"and many other applications"

Power in polar regions. How well does Solar work in Antarctica? Or Alaska for that matter?


It works super-great, collected in the tropics and shipped in chemical form. Before you object to depending on imported liquid fuel, consider that most of the world does already.

The main difference is that literally anybody can make it, not just "oil exporting countries" and "fuel refiners". And, will. And export excess production when local tankage is full.


Last I looked, round-tripping solar via liquid fuel and back to electricity was under 2% before transport costs


Maybe look again without assuming hydrocarbon. Ammonia is a good transport medium, liquid at room temperature under low compression.


> How do you power a moon base with solar? 14 days of battery?

Either that or a place near one of the poles where you get water and lots of sunshine. A small fission reactor is a handy thing to have, however.

> Fine. How do you power a Europa base with Solar?

A lot more solar panels, or wire loops harnessing Jupiter's magnetic field and Europa's momentum, etc. Fission is still cool for that.

> A Neptune probe?

Now we enter the nuclear fission territory. Maybe fusion, some day.


>How do you power a moon base

For those interested in near term answer:

https://www.nasa.gov/mission_pages/tdm/fission-surface-power...


Laser beams from solar concentrators are likely until aneutronic fusion pans out, if ever. Maybe after that, too.


I do software for a living so that gives you my level of ignorance of the real world.

I do have two questions about solar:

- is it “drivable” / “pilotable”?

meaning, can it react to surges in the grid? My understanding is that this capability is highly desirable for a grid.

- can we actually build enough solar panels, physically?

Don’t we need some rare-earth thingy that is not available in sufficient quantity on our planet, as far as we know? (Follow-up: if there is enough, will there be enough in 200 years?)


The "rare-earths thingy" is a common, transparent lie told frequently about both solar and wind. No rare-earths are used in any solar panel. Some wind turbine generators have used them, but not the big ones. And, "rare-earths" are not in fact rare. So, a double lie.

Solar panels provide cheap power generation on a schedule. For dispatchability, you rely on storage. There are many different kinds of practical, efficient storage; which are used where will depend on local conditions. Which will be cheapest isn't clear, but probably not batteries. The batteries used won't need lithium, or rare-earths, either.

The lie most frequently repeated is that storage needs some sort of "breakthrough". The second is that the small amount built out so far means anything more than that there is not enough renewable power yet to charge it from; when there is, that will be the time to build it. In the meantime, we fill in with NG burning. The third is that "pumped hydro", the kind most commonly used just now, needs "special geography". Hills are very common.

The lie most frequently repeated about solar is that there is any shortage of places to put it. It is most efficiently floated on water reservoirs, where it cuts evaporation and biofouling, although efficiency is only one consideration. It shares nicely with pasture and even crop land, cutting water demand and heat stress without reducing yield.

There will never be any shortage of wind or solar: need more, build more; materials needed are all abundant. Likewise storage. Costs are still falling as fast as ever, but are already lowest of any energy source ever known.


> And, "rare-earths" are not in fact rare.

It's not a lie, but it is an unfortunate synonym. They are rare in the sense that they are rarefied: spread thin everywhere, not concentrated in ores.


They are in ores, but are mixed with other lanthanides that they are expensive to separate from. Two of them, yttrium and scandium, are not lanthanides and are relatively easy to separate out.

A new powerfully magnetic iron-nickel allotrope may eliminate much of the market for several of them.


In regards to your first question, the word you're looking for in google-able energy industry jargon is "dispatchable". And yes, dispatchability of intermittent generation is achieved in a couple of ways in contemporary electricity networks:

  1. Deliberately backing off wind or solar generation from full capacity to provide reserves for demand spikes, transmission/generator outages, etc.  This means other generation that may otherwise not have generated at all over that period, is brought online to cover the shortfall.

  2. Co-locating grid-scale batteries at intermittent generation sites ("hybrid generation facilities" in energy industry jargon) to cover short-term contingency events.


Thank you, not my industry and not my language so “dispatchable” is a valuable keyword for me. ( it would be “pilotable” in French; if you ever have to discuss that abroad with my snotty kind )

Anyway. What I read is: having something else on the side can make solar dispatchable. Realistically, what would that other thing be?

Nuclear doesn’t like to be turned on/off. Wind has the same issue… are we saying the good ol’ coal-burning kettle?


> is probably 2070's scientists and engineers problems to solve.

With some clever choices in building materials, we may be able to dismantle ITER and its successors to use them as fission fuels.


If they can get a continuous fusion reaction going, converting the heat energy from it to electricity won't be a problem. Getting a contained fusion reaction that gives out more energy than it takes in is the problem; how to convert that into electricity is not.


> If they can get a continuous fusion reaction going converting the heat energy from that to electricity won't be a problem.

Even accepting the qualification, that's not a mere matter of engineering: capturing heat from a source that hot is not without trouble. A bit like how there is plenty of energy in a single lightning strike and yet we can't easily catch it, even though in principle 'just build a large enough capacitor and connect it to a lightning rod' is a workable recipe.

> Getting a contained fusion reaction that gives out more energy than input is the problem

Not least because the container itself is a very hard problem to solve.

> how to convert that into electricity is not going to be a problem.

It is also a problem, albeit a lesser one.

The better way to look at all of these fusion projects is as a way to do an end run around arms-control limitations, with the possible future generation of energy as a very unlikely by-product. But I would not hold my breath for that. Meanwhile, I'm all for capturing more of the energy output by that other fusion reactor that we all have access to, and learning how to store it over longer periods. Preferably, to start, a couple of days' worth in something that doesn't degrade (think very high-density supercapacitor rather than a battery), but I'll take advanced battery technology if it can be done cheaply enough per storage cycle. We're getting there.


There’s not plenty of energy in a lightning strike.

https://engineering.mit.edu/engage/ask-an-engineer/is-there-...


Funnily enough, it's about the same order of magnitude as the fusion event discussed in TFA: about 1 MJ.

Energy from one hohlraum ≈ energy from two lightning strikes


It doesn't make heat. It makes fast neutrons. Turning them into usable heat is a project of its own.

Turning dumb heat into electric power is expensive. Nothing that depends on doing that can ever compete with wind and solar, anymore.

Tritium doesn't grow on trees. Making it by blasting those hot neutrons into a thousand tons of FLiBe is easy enough. Getting your few grams a day, at PPB concentration, out of that thousand tons of stuff is... nobody has any idea how. But you need to, to have fuel for tomorrow.

No, there won't be any of that. It would be fantastically more expensive than fission. Fission is not competitive, and gets less so by the day. Fusion is nothing but a money pit (with the just barely-possible exception of D-3He).


You've been all over this thread with some aggressive takes that are pretty seriously not supported.

> Turning dumb heat into electric power is expensive. Nothing that depends on doing that can ever compete with wind and solar, anymore.

This isn't even strictly true today when focusing on current "dumb heat sources":

https://en.wikipedia.org/wiki/Cost_of_electricity_by_source#...

"cost of extension of operations of existing nuclear power plants (LTO, long-term operations) has the lowest LCOE of low-carbon energy sources; "

I'm going to go with the OECD and NEA on this one.

> Tritium doesn't grow on trees.

https://en.wikipedia.org/wiki/Breeding_blanket

Saying that there are unknown engineering challenges is kind of a "duh"; otherwise we wouldn't be researching, we would be implementing. As you also mentioned, there are alternatives other than tritium which we could consider.

> Fission is not competitive, and gets less so by the day.

https://www.sciencedirect.com/science/article/abs/pii/S03062... yeah that's not true

> Fusion is nothing but a money pit (with the just barely-possible exception of D-3He).

We genuinely don't know if fusion is a money pit or not, because we don't have any idea how much a successful form will cost. Tritium blankets may be easy or not. Maybe Helion's D-3He will have a breakthrough. Maybe it's ICF.

I've not seen suggestions by anyone that wind and solar build-outs stop, or get diminished. Indeed, at this point, because the costs are low, industry will continue to invest in them regardless.

However, we will need a lot more energy production than folks think. We need to decarbonize the atmosphere. And that's going to require a lot of power.

All that aside, solar and wind are not getting you to Mars in a timely fashion. We have reasons to research fusion beyond large commercial power generation.


Cost of solar and wind are still plummeting. Anything that can match them today won't tomorrow. OECD and NEA agree.

Not going to Mars sounds like a great plan. Sign me up!


We certainly need a lot more energy production, but to decarbonize the atmosphere there has to be a target level with an evidential basis on which to proceed. If it's considered (as by many) that we have a climate emergency (though this is not reported at all in the IPCC report), then any decarbonization at all will obviously serve for starters. If this is not the case, then there are other considerations, such as the fact that as CO2 levels have risen, so has global food production: about 30% over the last 30 years.

Simulations with multiple global ecosystem models suggest that CO2 fertilization effects explain 70% of the observed greening trend, followed by nitrogen deposition (9%), climate change (8%) and land cover change (LCC) (4%). CO2 fertilization effects explain most of the greening trends in the tropics, whereas climate change resulted in greening of the high latitudes and the Tibetan Plateau. https://sites.bu.edu/cliveg/files/2016/04/zhu-greening-earth...

This is not a surprise given that carbon is needed for plant growth, a fact well understood by commercial growers who pipe CO2 into their greenhouses. So one issue might be: if decarbonization is successful, then what might be the acceptable level of reduction in global food supply?

Another issue relates to temperature. From an analysis of 974 million deaths in 384 locations across 13 countries it’s been concluded that twenty times more people die from the cold than from the heat. https://composite-indicators.jrc.ec.europa.eu/sites/default/... A recent paper (Dec 12, 2022) regarding heart attacks states “extreme temperatures accounted for 2.2 additional deaths per 1,000 on hot days and 9.1 additional deaths per 1,000 on cold days.” Circulation. doi.org/10.1161/CIRCULATIONAHA.122.061832.

Do any of the reports present an ethical problem? No, they do not given an extreme interpretation of climate models.


Climate catastrophe denialism is a strange hill to choose to die on.

Direct influence on economic variables and individuals are the smallest of its effects.


Eh, the ‘mistake’ was intentional, near as I can tell.


They put 2.05 MJ in and got 3.15 MJ out. The 300 MJ figure comes from equipment inefficiency; remember this is a research facility built to prove plasma physics and ignition itself, mainly designed for weapons research. The fact that they got definite fusion at 1.53x input energy is not just breakeven, but net gain. This is the first time ever we've gotten net gain in any fusion device. A lot of people in government leadership thought fusion was impossible; the fact that they can say they did it, despite 1980s-era technology and a shoestring budget for decades, really says something. ICF is just one fusion technology, and probably not the best one, but fusion in general will probably see a massive budget increase worldwide now that it's been proven it can be done.

I fully expect a working fusion plant of some kind by 2030, assuming funding increases. Once we get them commercialized, coal, wind, solar and other power production will be obsolete for the most part. We can also use fusion heat to separate waste into its base elements (you can recycle anything!), and help make any process needing a lot of thermal or electrical energy more efficient.
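To make the two different "gains" discussed in this sub-thread concrete, here is a minimal sketch using the numbers quoted above (2.05 MJ laser, 3.15 MJ yield, ~300 MJ drawn from the grid per the NYT figure):

```python
# Energy figures quoted in this thread, in megajoules
laser_energy_mj = 2.05    # delivered to the target
fusion_yield_mj = 3.15    # fusion energy released
wall_plug_mj = 300.0      # grid energy used to charge the laser banks

target_gain = fusion_yield_mj / laser_energy_mj  # the "ignition" Q, ~1.54
wall_plug_gain = fusion_yield_mj / wall_plug_mj  # facility-level Q, ~1%

print(f"target gain: {target_gain:.2f}")
print(f"wall-plug gain: {wall_plug_gain * 100:.1f}%")
```

The headline "net gain" refers to the first number; the NYT's 300 MJ point refers to the second.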


Regular fission plants take 5+ years to construct. So in the next 2-3 years you're expecting this technology to become mature enough to roll out and hook up to the power grid?


> I fully expect a working fusion plant of some kind by 2030

Right. The first law of fusion physics: working fusion power generation in 10 years. It has been proven true for the last 7 decades. Rock solid.


The joke used to be fusion was always 30 years away. Then the joke was it was always 20 years away. Now the joke is it's always 10 years away.

People paying attention over this time have noticed it getting closer as we improve our capabilities.

The next iteration of the joke will be 'fusion was always inevitable'


At some point, people will still be predicting fusion in the next 10 years, and then it will happen in the next.


By 2030? Lol. Lmao, even. 2050 if we're lucky--we're still not even close to developing a reactor that can produce meaningful amounts of power for a sustained period. It'll take decades more to get to that point, and once all the technical limitations have been crossed and we figure out how to get enough tritium, the actual hard work begins of getting it online and integrated with our current system. Fusion is distant future technology.


300 MJ is about 83 kWh.

In the UK, 1 kWh costs about £0.34.

So it cost roughly £28 in electricity to run this experiment. The experiment is a momentary thing.

Clearly there is still some work to do, but this is now becoming an engineering problem: how to extend, sustain, and scale this process.
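The arithmetic above, spelled out (the £0.34/kWh retail price is the parent's figure; a lab would pay industrial rates, so this is only an order-of-magnitude sketch):

```python
MJ_PER_KWH = 3.6              # 1 kWh = 3.6 MJ by definition
grid_draw_mj = 300.0          # energy drawn from the grid for the shot
price_gbp_per_kwh = 0.34      # quoted UK retail price

shot_kwh = grid_draw_mj / MJ_PER_KWH          # ~83.3 kWh
shot_cost_gbp = shot_kwh * price_gbp_per_kwh  # ~£28
```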


To have a hope of supplying grid power, they need to scale up the energy gain by four orders of magnitude and reactor run time by 12 orders of magnitude.

Those are just two of the engineering problems. It'll be a while, and I doubt it will ever compete with solar, wind, and storage.
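For a rough sense of what closing that gap means: ICF power-plant concepts are usually described as repeating shots several times per second. As an illustrative sketch only (the 10 Hz rate and 40% conversion efficiency are assumptions, not figures from the article):

```python
shot_yield_mj = 3.15        # yield of the shot discussed here
shots_per_second = 10.0     # assumed repetition rate for a hypothetical plant
thermal_to_electric = 0.4   # assumed heat-to-electricity conversion efficiency

thermal_mw = shot_yield_mj * shots_per_second   # 31.5 MW thermal
electric_mw = thermal_mw * thermal_to_electric  # ~12.6 MW electric, gross
```

NIF currently fires roughly once a day, which is where the many-orders-of-magnitude run-time gap comes from.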


Remember that the human genome project was started in 1990 and completed in 2003. In 2003 it took a year to sequence 80% of the human genome; now it takes a day to do 98%. In 1990 they had the understanding of how to sequence DNA, but scaling that across the whole human genome seemed like a monumental task.

I'm guessing in a decade we'll have a viable early stage industrial process, and in 2 decades we have commissioned fusion reactors.

If we don't kill the planet with nuclear war or the climate crisis.


> [...], and I doubt it will ever compete with solar, wind, and storage.

Maybe not on earth, but there are applications in deep space.


> Maybe not on earth, but there are applications in deep space.

Depends on how long the interstellar craft is supposed to travel. If it's under 100 years, fission should be able to do the trick of keeping the craft warm and the lights on for the sealed ecosystem to function during the decades of coasting between stars.

Fusion rockets would be more convenient than fission ones because you can store the hydrogen you need in the form of water and water also acts as a great radiation shield while in deep space. Then, to brake, you use your radiation shield as reaction mass for fission or fusion rockets.

If we are talking about much more than that, fusion is probably a better answer as fission fuel will half-life itself into paperweights over a grand transgalactic tour.


> Depends on how long the interstellar craft is supposed to travel.

I didn't have (only) travel in mind. I was thinking of living between the stars.


You'll need a constant power supply for the entirety of the trip. After doing the math, it turns out, fission seems quite viable - all usual fuels, in storage, have half-lives of more than 10,000 years.

So, if you have enough fissiles for keeping the closed ecosystem happy for the duration of the flight, you can go quite far.

The ship/colony will need to enter orbit around a star and drop by a rocky planet at some point, to gather more fissiles and reaction mass (and other materials needed for fixes and upgrades), so it wouldn't be able to stay indefinitely in deep space. If it's fusion-driven, a gas giant may be a good option for both fuel and reaction mass, and icy moons may work well for replacing water.


I'm not talking about travel. I am talking about permanently living between the stars, and essentially living off hydrogen harvested from the interstellar medium.


This is exactly why we need to keep plugging away at fusion - one day we’ll need it in space. It doesn’t have to be cheap, it just has to work.


Of course, whether that's a good investment of resources right now is another question.

Even if we had no other goal than becoming an intergalactic species as soon as possible, we might still benefit from working on other things first.


You are operating under the rather naive assumption that the people and resources used in fusion research are fungible with the people and resources used in other things.

When you have a bunch of people who know how to build nuclear bombs sitting around with nothing to do, you damn well keep them busy before another country finds them a job.


For further discussion, we'd need to be careful whose perspective we are taking and what timescales we are talking about.

You seem to be taking the perspective of individual countries? And not eg humanity.

And timescales of perhaps decades?

On longer timescales: people don't get born knowing how to build nuclear bombs. They are trained up.


Having worked in the field for 6 years, the estimated cost per shot at NIF is roughly $1MM. The estimated cost for a day of shots at OMEGA is $250k-$300k.

The cost per target varies a lot due to the precise manufacturing tolerances and the methods to get them. For example, the sphere with the fuel in it is made by dropping liquid glass from a drop tower. And then metrology is done on hundreds and hundreds of glass spheres.

So though the electricity might cost that, we are talking about a building in which just the lasers and their optical paths take up three football fields of advanced warehouse space. The target chamber, which is 10 meters in diameter, is at ultra-high vacuum. There are also countless diagnostics, computers, and other electronics, the lights for the whole facility, and the number of people required to run it so this delicate experiment goes off without a hitch.

Honestly, it's almost not worth talking about as a power source anytime soon. Even if Q > 2 on NIF, there are countless engineering problems that would have to be overcome (and that haven't really been thought hard about in the ICF field) to get a power reactor out of this tech.

My two cents: look towards MIT and CFS for news on their SPARC tokamak and plans for the ARC tokamak. Based on some data I have seen, SPARC should hit Q > 1 pretty easily, with some estimates reaching Q of 3 to 9. And before you scoff at it, this reactor design is using magnet tech that has proven it can withstand and produce a 20 T magnetic field! In MCF, field strength and heating are the two key metrics. To put this into perspective, the massive tokamak being built in Europe has a MAX possible field strength of 13 T, assuming it's run to the edge of its theoretical design limitations. The SPARC one hasn't even been run to its design limitations, most likely due to the mechanical stresses a 20 T field produces in a 3-4 meter D coil.


Plus $10M for the target bead.


Where did you see this estimate for hohlraum cost?

The NIF fired 368 shots in 2021:

https://lasers.llnl.gov/for-users/nif-target-shot-metrics

At $10 million per target that would cost $3.7 billion. The annual LLNL budget (which includes NIF) is only $2.8 billion:

https://www.llnl.gov/doing-business/economic-impact

As of 2004, the targets were reported to cost $2500 each:

https://www.osti.gov/servlets/purl/828518


Got it from a comment on here.

The 2004 targets didn't work. Neither did the 368 shots in 2021. Maybe ones that work cost more?

Even $2500 for about 1 kWh is rather steep.
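Even at the 2004 price, the fuel-cost arithmetic is brutal. A quick sketch using the 3.15 MJ yield discussed upthread (the grid-price comparison is approximate):

```python
yield_mj = 3.15          # fusion yield of the record shot
target_cost_usd = 2500   # 2004 per-target figure from the OSTI report above

yield_kwh = yield_mj / 3.6                 # ~0.875 kWh of raw heat per shot
usd_per_kwh = target_cost_usd / yield_kwh  # ~$2,860 per thermal kWh
```

That is roughly four orders of magnitude above a typical ~$0.10/kWh grid price, before counting any other costs.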


A quick peruse through this wikipedia article will tell you what still needs done, and what this means:

https://en.wikipedia.org/wiki/Laser_Inertial_Fusion_Energy

Give it a read. Inertial confinement fusion with a Q > 1 may very well point the way towards a realistic power plant. Fusion is at a point where it deserves investment. The NIF cost about the same as a single B-2 bomber.


Total annual US public research funding for fusion that is not ITER or this project (which does not come from the fusion pot; as others have said, this facility was built for weapons research) is about $300 million per year [1]. We, as a country, do not give a rat's ass about having fusion as a power source. VC funding recently (for the first time) supported many fusion startup ideas at or above the $300 million level. Too bad ZIRP is over and private funding is now likely to dry up. The recent private funding in fusion was really amazing. Many new ideas and methods are being tried. This is what fusion research needed, not the billions spent on magnets and cement in France that is ITER. Hopefully some of them have a long enough runway to get promising work done and get follow-up funding.

[1]http://large.stanford.edu/courses/2021/ph241/margraf1/images...


We may be certain, anyway, that all the startups will spend all the money they get. There will be no power generated. The investors will not get any of their money back.

That would be fine, except some of the investors are pension funds.


As I understand it that's largely because some of their equipment is outdated or not optimised. Building a new facility from scratch would result in a much lower power consumption from the grid.


given how long it takes to build these places, I feel like the number of iteration cycles is an important metric, especially when other options are not standing still.

I do understand that with more focus things can happen faster but you can really only pour so much concrete per day. Hopeful that we can figure out "leaner" ways to get this done.

I would love to see progress on this stuff, just don't like the idea of betting on successive megaprojects in the age of "a website is hard".


Iteration is often overlapping. Day 2 of operations is usually planning the next generation (for a given project)

Adding more funding will probably see more parallel projects as well, especially at major institutions


I think the "With modern lasers" part is addressing that.


If it takes 300 MJ one time in the future to start a long-term reaction, that’s a price I think we would all pay. The fact that we proved the idea now makes this just an engineering problem.


I'm not trying to undermine this, but more to act as a reminder to some folks: there is no such thing as "just an engineering problem". These things do not exist in a vacuum. Engineering is one leg of a tripod with economics and politics. If any one of the three buckles, the whole thing doesn't happen. Concorde was an engineering problem we solved, but the economics and politics around it made it a white elephant. Here's hoping that doesn't happen here.


Now add in the 50% efficiency of conversion of heat to electricity (and that's very optimistic), and it's only 1.5 megajoules, or 0.5%, right where I calculated it based on initial information.

IANAP, but I see no path forward to a sufficient total Q using plasma fusion to put this to any practical use. Unless the reaction can somehow be self-sustaining, I do not believe this will ever work.


don't you think it's nuts that your arrogance would have you believe that the people on the ground doing the experiment know less about the power consumption and output than you do? Because I do.

There's a lot of supporting equipment, and energy to drive that equipment, that goes into leading-edge tech development like this, and it doesn't matter in terms of the reaction itself.

If they say they achieved more output than input, then I will believe them over a random HN comment snob any day of any week.


I'm sorry but you sound much more arrogant here with your strange assumptions. AFAIK all of those things he said are based on the data from the experiment. And I've read the same (i.e. that it's very far from being energy efficient).

> will believe

Wouldn't it be better if you were able to verify stuff to some degree yourself instead of blindly trusting every expert? (Not at all implying that the people who did this experiment are untrustworthy, but you're bound to run into some bad apples with this attitude eventually.)


The paper was very clear about what was involved, and that information was poorly communicated. The experiment used X energy from a laser pulse to get Y thermal energy. The issue w0mbat referred to is that to be self-sustaining you need to use the output energy to drive the process. I.e., the important number is the one after all relevant losses, including electricity > lasers > fusion > heat > electricity, with each of those stages having losses. The reason they don’t communicate end-to-end efficiency is that from a scientific standpoint it’s meaningless, as they aren’t converting any thermal energy into electricity.

For example p + p Fusion releases neutrinos which then escape any practical device without depositing their energy as heat. This isn’t a concern with DT fusion but again the point is we don’t really care about the actual mass to energy conversion but rather the amount of useful energy obtained.


none of what anyone said is about a self-sustaining reaction.

this is about more power leaving the reaction chamber than what entered it. that's all the announcement is about.

this is NOT about how much energy it takes to ready the lasers. this is NOT about the electricity consumed by lighting, computers, cooling, or measurement or anything else-- none of that counts when you are measuring the efficiency of the reaction itself.

this is about more energy leaving the reaction chamber than went in.

understanding that is key to understanding the significance of the announcement, and this is significant.

and I maintain that the poster I originally called arrogant is arrogant, because they indicated in their comment that they knew how to calculate reaction efficiency better than the physicists doing the work. I called it arrogant because it is --objectively-- an arrogant position to take.

if that makes me arrogant, then so be it. my arrogance is independent of theirs and has no bearing on comments made before my own, and my comment did not influence theirs. (they were being arrogant before I pointed it out.)


> none of what anyone said is about a self-sustaining reaction.

Actual power plants are self-sustaining in the sense that they use the electricity they produce to operate; that's necessary, though not sufficient, for any commercial fusion power plant.

So, this isn’t about a different way to “calculate reaction efficiency better than the physicists doing the work” he was directly quoting their numbers from the paper. It’s only a question of communicating the meaning of efficiency.

> this is about more energy leaving the reaction chamber than went in.

The exact same energy was there before and after fusion only it’s form changed. It might seem pedantic to point that out, but if you don’t make it clear people will misunderstand.

Also, the applied laser energy also leaves the reaction chamber so any fusion would be net positive thermal energy by that yardstick.


> Actual power plants are self sustaining

Actual power plants are not necessarily self-sustaining, and practically none of them can start if grid power is not available. See: black start power plants and https://youtu.be/uOSnQM1Zu4w


If it on average produces less electricity than it consumes then it isn’t a power plant.

Black start is obviously a different question than being self sustaining.


Explains why the experiment took place at 1am.


I agree! These results are NO cause for celebration. To do so would be deliberately misleading, and potentially devalue celebrations in general.


This is a stupid question but I don't know anything about fusion:

How is it possible for X energy to create X+Y energy in output? Doesn't that violate some fundamental law of physics?


If you look at the mass of their fuel beforehand, 1 deuterium atom + 1 tritium atom:

    2.01410177811 u + 3.01604928 u = 5.03015105811 u

vs the mass of the fusion products, 1 helium atom and 1 neutron:

    4.002602 u + 1.0086649 u = 5.0112669 u

You'll notice that even though we started with 2 protons and 3 neutrons and ended up with the same count, some binding energy is unaccounted for in the new configuration. This is the energy released by the fusion reaction, via E = mc^2. Here the mass difference is:

    5.03015105811 u - 5.0112669 u = 0.0188842 u

Converting that to energy, you find it is about 17.6 MeV. As you go up the periodic table fusing nuclei, you get less and less marginal energy until you reach iron; past iron, fusion becomes net negative and fission takes over, where breaking nuclei apart releases energy, marginally more the further up the periodic table you go. That's why you want to fuse light nuclei and fission very heavy ones. It is also why there is so much iron: it is effectively the ground state of both reactions.
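As a check on the arithmetic, using the standard conversion 1 u ≈ 931.494 MeV/c² (the neutron mass here is the CODATA value, 1.0086649 u):

```python
U_TO_MEV = 931.494           # 1 atomic mass unit in MeV/c^2

m_deuterium = 2.01410177811  # u
m_tritium = 3.01604928       # u
m_helium4 = 4.002602         # u
m_neutron = 1.0086649        # u

mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * U_TO_MEV

print(f"{energy_mev:.1f} MeV")  # 17.6 MeV, the textbook D-T yield
```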


Tangentially related, but I think this is an interesting fact: all the atoms in our universe/galaxy/solar system with a mass up to that of iron were formed in the cores of stars by stellar fusion. Hydrogen fuses into helium, and as a star nears the end of its lifetime you get heavier elements like lithium, carbon, and so on. Under normal stellar fusion no elements heavier than iron are produced, and iron is only element number 26. If you just looked at nucleosynthesis through the lens of stellar fusion, it wouldn't be obvious that there should be any heavier-than-iron atoms at all in the universe.

These heavier-than-iron elements are created in a very interesting and exotic process. When a large enough star dies, it explodes in a supernova, and a huge amount of energy and neutrons is released in a very short period of time. The supernova provides enough energy and neutron material that small amounts of heavier elements like gold, platinum, etc. are created through exotic nuclear fusion reactions, even though these heavy fusion reactions are energy-absorbing.

It's interesting to think when you're wearing jewelry made from gold or platinum, all of those atoms in your jewelry were created during the death of a star.


  “The nitrogen in our DNA, 
  the calcium in our teeth, 
  the iron in our blood, 
  the carbon in our apple pies
  were made in the interiors of collapsing stars.
  We are made of star stuff”.
  – Carl Sagan


  We have calcium in our bones, 
  iron in our veins, 
  carbon in our souls, 
  and nitrogen in our brains.
  93 percent stardust,
  with souls made of flames, 
  we are all just stars 
  that have people names"
  Nikita Gill


I understand this is a poem that is focused on artistic expression and not scientific accuracy, but I find the line about “carbon in our souls” to be out of place. I guess the rest of the poem is incidentally correct (when not abstract)


>Carbon in our souls

I think it might be an allusion to alchemy. Basically, the alchemists believed that ash (what was left after burning something) was the soul of all things... And-- this is where my complete lack of understanding about science shows-- I'm pretty sure ash has lots of carbon? It's, you know, poetic. Many have claimed that poetry is the "language of paradox", so it's okay for it to be a little non-literal. My interpretation, though, is that the soul is something impure that you must burn away, or maybe that the soul is polluted by our own words and behavior. It's definitely not meant to be scientifically accurate.


ash has no carbon

carbon's oxides are all gaseous at standard temperature and pressure


Not quite. Plenty of carbon is released as carbon oxides, but calcium carbonate is also a major component of ash (or at least wood ash).


good point

little carbon, then, not none


Going by what wikipedia says, wood ash can have carbon. If I understand correctly, how much depends largely on how hot it burned.

https://en.wikipedia.org/wiki/Wood_ash#Elemental_analysis


Sure, the word “soul” comes from the Proto-Germanic “saiwiz” (for sea or ocean).

But not because “you are like a drop in the ocean,” but because “you are like an ocean in a drop.”

The idea of soul can be objectionable when it is based on an immortal being or on a vitalist life-force (like “anima” of the Latin). But it seems fine when it is based on the psyche (like the “Psuche” of the Greek).

I embrace taboo words like soul because they 1. are common 2. are useful for referring to things that seem pretty important (like avoiding soulless companies or products or buildings) and 3. are challenging to my normal (scientific) understanding of the world.

Still, I’d be more comfortable if the poem referred to the “carbon of our souls” rather than “carbon in our souls.” Hmm…


One who is a materialist could argue that your "soul" comes from the stuff you're made out of, so your "soul" probably has carbon it somewhere.


You could define soul as the fuel engine for life, which is basically burning carbon. As long as that furnace is functioning you're alive == you have a soul.


Oh wait I can play this game:

You and I are complicated but we're made of elements Like a box of paints that are mixed to make every shade They either combine to make a chemical compound or stand alone as they are

- They Might Be Giants


I find this poem inaccurate.


?


Souls are clearly not made of both carbon and flames.


The ones in Hell could be.


Was that originally in a different language? "Heart" makes more sense than "soul", and could be translated loosely.


It’s funny that alchemy was kinda onto something, but underestimated the energy requirements by orders of magnitude of orders of magnitude


True, all you need to turn lead into gold is a moderately powerful collider and some hydrogen.

The conversion is also very slow. And expensive. To make it this way it would cost a Quadrillion dollars an ounce.

https://www.scientificamerican.com/article/fact-or-fiction-l...


A quadrillion now, but costs will come down, and once it's turned to gold, it stays gold. In a far future where all gold has been mined, this will be the only process left to get more. Or explode stars and capture gold from them.


I joke with my son all the time about turning mercury into gold (he works in the nuclear industry).

Once you get past the enormous energy costs, you have a secondary problem: all the gold produced this way is radioactive, and it beta-decays back to... mercury.


Actually, current modeling has supernovae as being only a small contributor to the measured abundance of heavy nuclei. These tend to come from a more exotic source still: material thrown off as a neutron star is tidally disrupted during a merger event with another neutron star or black hole.

The wikipedia page is pretty good, as always: https://en.wikipedia.org/wiki/Nucleosynthesis

Almost everything with mass of 90 or above comes predominantly from neutron star mergers, basically.


Good Lord the universe is old. To think of how rare all that must be and how it had to have time to somehow get here to our planet.


If they weren't here, we wouldn't be able to talk about it.

I'd be interested to know if we're in an element rich vein of the wider universe or if all the good stuff is more or less evenly distributed?


I'm not sure but this has some interesting info such as-

>Some whole galaxies have average metallicities only 1/10 of the Sun's. Some new stars in our galaxy have more metals in them than the original solar nebula that birthed the Sun and the planets did. So the amount of "metals" like oxygen and carbon can vary by a few orders of magnitude from star to star, depending upon it's age and history.

https://www.reddit.com/r/askscience/comments/9tujxn/are_the_...


Phosphorus is supposed to be rare in the wider universe. Considering how central it is to everything important, life might find it very difficult to start in our temperature range without it.


After all, is there anything better to create new elements than a hot, dense neutron soup?


I'm always fascinated by the sheer and unfathomable amounts of energy that is thrown around in these events. Just thinking about the fact that a single spoonful of neutron star matter contains more mass than Mount Everest fills me with wonder about the world we live in.


What happens when a tablespoon of neutron soup gets thrown out of the well of a neutron star? Does it suddenly expand to the size of Everest? Where do the electrons come from?


In terms of "where do the electrons come from", an ordinary neutron in free space has a half-life of about 10 minutes, decaying through beta decay which produces a proton, an electron and a neutrino.


That sounds fairly energetic... So after 10 minutes you'd have some odd mix of heavy elements, probably approaching a decent fraction of the volume of Everest: half the volume of Everest × (density of granite / density of lead).

That kind of expansion rate has to rival any explosion imaginable.


It's essentially a giant atomic nucleus, so absent a star's worth of gravity holding it together it's going to decay rapidly into stable isotopes. So essentially it would act more or less the same as a huge fission bomb of the same mass.


I'd imagine some of the energy and degenerate matter consisting of neutrons would convert to protons and electrons, and nucleosynthesis would take place to form elements.

I have no idea, though, but I'm pretty sure I watched a video about this.


So that means that for life to form, we probably need a star to die so that the heavier atoms used in life's complicated chemistry are available (correct me if I am wrong here, as what I'm about to say depends on it). Hence it could be the case that, if the universe is 13.5 billion years old, we humans are appearing in the universe at the earliest possible time.

13.5 billion years seems like the time required to create a star, have the star die and blow up, have all that material settle and create a new star, then have the planets form, then enough time on one of those planets needs to pass for life to form, then complicated life.


Not necessarily. First-generation stars were, theoretically, enormous, both due to the low metallicity of the collapsed medium and a higher average concentration of said medium. These stars' lifespans were extremely short, shorter than the blue giants we see today. So novas due to the death of these stars happened fairly early in the lifespan of the universe (we're talking a few million years after the big bang).

Therefore, life could have developed in a few tens to few hundreds of millions of years after the big bang. That's still true even if we assume that heavier elements are created mainly when neutron stars collide and not by super/hypernovas as we theorized before LIGO/Virgo observatories.

Consequently, we likely are not a "progenitor" civilization in the universe if we only consider planets formation. We might not see anyone out there either because there's a great filter for intelligent life to emerge (so the bottleneck is in our past) or because few/no civilizations get to have an impact on their host stars (the filter is in our future) that would allow us to see them.


Basic life (single-celled?) requiring the elements above lead might have a chance at that time, but complex life like us wouldn't do so well if there were still supernovas going off left and right. There's a theory with decent evidence that at least one of the mass extinctions was caused by a supernova: https://www.space.com/supernova-caused-earth-mass-extinction...

That being said, I wasn't aware of how LIGO changed the understanding of how heavier elements are usually formed, guessing it changed the expected neutron star prevalence? Do you have any additional reading on that?


You are right about supernova hampering life evolution, but it's unclear how long the fireworks lasted. In my comment I argued that it is possible to have the conditions of life emerge much earlier than 13.5 billion years. Not that it necessarily happened.

Regarding the second point, have a look at https://www.ligo.org/science/Publication-GW170817Kilonova/in... . That isn't my field of specialization, so I am not sure about recent publications. At the time, though, this was a big deal, as kilonovas seem to be the primary source of heavy nuclei in the universe. That particular event created between 1/100th and 1/1000th of a solar mass worth of heavy (heavier than iron) nuclei. This is a greater rate than supernova estimates.


> how LIGO changed the understanding of how heavier elements are usually formed, guessing it changed the expected neutron star prevalence?

It's not about the prevalence, but about the light curves observed during the event AT 2017gfo. They indicate significant heavy metal ejection but, what's interesting, also production.

> mergers of neutron stars contribute to rapid neutron capture (r-process) nucleosynthesis

These two articles cite the relevant papers:

https://en.wikipedia.org/wiki/GW170817#Scientific_importance

https://en.wikipedia.org/wiki/Nucleosynthesis#Neutron_star_c...


I'm just a layman but I believe by the time our sun has formed, we've gone through multiple star cycles. The early stars were very pure - made basically purely of hydrogen (maybe some helium?). They were huge, burned very bright and died comparatively quickly. Each time stars died, more heavy elements (and heavier elements than before) were produced. Over time the heavy element content (called metallicity) has increased in all stars. I believe there are also theories of white dwarf mergers undergoing runaway fusion and a lot of heavy elements being generated during the explosion.

You raise an interesting question though: what is the earliest point in time when the heavy elements were abundant enough for life (as we know it) to form? Just because we started existing at +13.5 billion years, it doesn't mean carbon-based life couldn't have formed much earlier.


Very much a layman also, but funnily enough I was listening to a BBC programme called In Our Time a couple of nights ago where a similar topic was discussed. One comment was that life is carbon based, and for carbon to exist a star has to die, so yes, we are therefore in the early stages. Will try to find the episode…


I have zero ability to answer your question, but I would love to know more about this. If life (like we know it) requires the explosion of aged stars, what is the earliest it could have happened? What is the minimum time needed to form, grow, and explode a single star? Has there been time for this to occur 10s, 100s of times since the Big Bang? (Obviously they can happen in parallel, but I'm thinking about how many in series.)


There's a large range of stellar lifetimes, from a few million years up to trillions of years.


> 13.5 billion years seems like the time required to create a star, have the star die and blow up, have all that material settle and create a new star, then the planets are formed, than enough time on one of those planets needs to pass for life to form, then complicated life.

Maybe for a main sequence star, but there other processes that involve nucleosynthesis.


That assumes that life as we know it is the only possible form of life.


Not necessarily, but look up the phosphorus problem.


> all the atoms in our universe/galaxy/solar system with a mass up to that of iron are formed in the core of stars

Note that this is not, strictly speaking, true.

Roughly 90% of the helium atoms in the universe were created via Big Bang nucleosynthesis.

https://en.wikipedia.org/wiki/Big_Bang_nucleosynthesis


Iron is always spoken of as the dividing line, but I'd like to know whether iron is exactly on the line, on one side (which?), or it depends. IOW, does fusion of iron atoms release energy (hydrogen side of the line), absorb energy (uranium side of the line), neither, or either (depending on conditions)?


My son has a master's in nuclear physics, and I've always been curious as to why iron causes stars to explode.

He does an excellent job explaining things and put it to me like this.

Elements to the left of iron can undergo fusion and release energy; elements to the right can undergo fission and release energy.

Iron IS the line because it needs energy input to do either of these.

All elements want to find stability, and iron is that element because it needs energy for either fission or fusion.

So yes, iron is the dividing line, and this is what makes it so stable.

Edit: forgot to link the chart when referencing left or right..

http://www.splung.com/content/sid/5/page/benergy
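For anyone wondering (per the question upthread) whether iron is exactly on the line: here's a quick sanity check of the binding-energy curve, with published atomic masses hardcoded below (treat them as approximate). It shows the peak is technically at nickel-62, with iron-56 just barely behind it, and both well above helium-4 and uranium-238 on either slope.

```python
# Binding energy per nucleon from atomic masses (electron masses cancel
# because we use the hydrogen-1 atomic mass for each proton).
M_H = 1.00782503207   # hydrogen-1 atomic mass, u
M_N = 1.00866491588   # free neutron mass, u
U_TO_MEV = 931.494    # energy equivalent of 1 u, MeV

def be_per_nucleon(Z, N, atomic_mass_u):
    """Binding energy per nucleon in MeV."""
    mass_defect = Z * M_H + N * M_N - atomic_mass_u
    return mass_defect * U_TO_MEV / (Z + N)

for name, Z, N, m in [("He-4", 2, 2, 4.002602),
                      ("Fe-56", 26, 30, 55.934937),
                      ("Ni-62", 28, 34, 61.928345),
                      ("U-238", 92, 146, 238.050788)]:
    print(f"{name}: {be_per_nucleon(Z, N, m):.3f} MeV/nucleon")
# He-4  ≈ 7.074, Fe-56 ≈ 8.790, Ni-62 ≈ 8.795, U-238 ≈ 7.570
```

So "iron is the peak" is close enough for a forum comment, but strictly speaking Ni-62 edges it out.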


Fusion to the left of Fe, fission to the right, here I am stuck in the middle with glu(ons)

(Yes I know it's technically mesons at the scale we're talking about but it doesn't rhyme so there)


Mesons should have been called gluons; way stickier for sure.



The periodic table is pretty awesome. But I wonder if there's a layout that takes into account this fission/fusion division? Would it just be a line?

-- Yeah there are a few alternative designs, but nothing that seems to do the division. https://en.wikipedia.org/wiki/Alternative_periodic_tables


That's because the periodic table is essentially about chemistry (i.e. about electron orbitals), not about nuclear physics (i.e. the atom's nucleus). For example it doesn't talk much about isotopes, aside from usually reporting the average atomic mass.


My point is that there are other ways of organizing the same information to perhaps reveal unexpected connections. From the same link, a 'physicists periodic table' - https://en.wikipedia.org/wiki/Alternative_periodic_tables#Th...

And really, i guess everything is physics if you get specific enough.


As I understand it, iron is the first element that absorbs energy under fusion, and therefore won't fuse further. Could be wrong, though.


It will happily fuse further. It just won't support the outside of a star against gravity, while doing it. So the star collapses, fuses lots more stuff even heavier than iron, and then explodes. Most of the iron and heavier stuff fuses into the core of a neutron star, the ultimate in energy-consuming fusion.


None of this is accurate.

Iron will not happily fuse further because this NEEDS energy and where would that energy come from?

"heavier than iron" elements are produced when a star explodes because that collapse produces enormous amounts of energy.

During the collapse, the outer edge of the star is accelerated to something like 20% of the speed of light, that is an ENORMOUS amount of energy slamming down on the core.

Lastly, neutron stars don't produce energy; they are the incompressible remnants of a dead star.


You answer your own question: the energy for further fusion, all the way to neutron degeneracy, is provided by gravitational collapse. The outer layers fusing provide energy for the explosion.


... which, it must be said, is probably more important than gravitation in overcoming nuclear charge repulsion.


> all the atoms in our universe/galaxy/solar system with a mass up to that of iron are formed in the core of stars in stellar fusion.

all is a strong word

for example: the nuclear fusion experiment in the article, which would have produced a few atoms heavier than hydrogen

(not to mention all the nuclear bombs that had a fusion stage)


If I understand this right, I think some of the lighter than iron elements are also created in those exotic processes. But yes, unlike for the heavier-than-iron elements, the lighter ones are _also_ created in normal stellar fusion.


Heavier elements supposedly come from two neutron stars colliding because even a single star exploding isn't enough energy.

So think about two stars killing each other to make your jewelry.


we are all made of star stuff


or as I like to say to encourage myself that small things matter

"Stars are made of quarks"



Star stuff sounds like some German element.


Exactly what I thought! I checked with Google Translate.

star stuff = Sternzeug. Sternstoff = star fabric


Stoff has more than one translation into English. One is cloth. Another refers to substances. E.g., hydrogen is called Wasserstoff in German: water-stuff.


See “waterstuff” in the Uncleftish Beholding:

https://www.ling.upenn.edu/~beatrice/110/docs/uncleftish_beh...


yet we all want to be


It is also interesting that the Milky Way will collide with Andromeda and then we will be invaded by zerg.


And on a total tangent, this fact played a part in worldbuilding done by the author L. E. Modesitt, Jr.

> When I initially decided to write The Magic of Recluce in the late 1980s, I'd been writing science fiction exclusively... I conveyed a certain dismay about the lack of concern about economic, political, and technological infrastructures in various fantasies then being written and published in the field...

> I faced the very real problem of creating a magic system that was logical... Most fantasy epics have magic systems. Unfortunately, many of them, particularly those designed by beginning authors, aren't well thought out, or they're lifted whole from either traditional folklore or gaming systems and may not exactly apply to what the author has in mind.

> I began by thinking about some of the features and tropes of traditional fantasy. One aspect of both legend and folklore that stuck out was the use of "cold iron" to break faerie magic, even to burn the creatures of faerie, or to stand against sorcery. Why iron? Why not gold or silver or copper? Not surprisingly, I didn't find any answers in traditional folklore or even contemporary fantasy. Oh, there were more than a few examples, but no real explanations except the traditional ones along the lines of "that's just the way it works."

> For some reason, my mind went back to astronomy and astrophysics and the role that nuclear fusion has in creating a nova... Each of these fusion reactions creates a heavier element and releases energy... The proton-proton reaction that produces iron, however, is different, because it is an endothermic reaction...

> At the same time, the fact that metals such as copper or silver conducted heat and electrical energy suggested that they were certainly less than ideal for containing electrical energy. Gold and lead, while far heavier than iron, do not have iron's strength, and other metals are too rare and too hard to work, particularly in a low-tech society.

> At this point, I had a starting point for my magic system. I couldn't say exactly what spurred this revelation, but to me it certainly made sense. Iron can absorb a great amount of heat. If you don't think so, stand on an iron plate barefoot in the blazing sun or in the chill of winter. Heat is a form of energy. In fantasy, magic is a form of energy. Therefore, iron can absorb magic and, by doing so, bind it.

https://www.lemodesittjr.com/the-books/saga-recluce/recluce-...


Do we know how much tritium is needed for a city's energy generation? What about a state, etc.? Reason I ask is the only uses I have seen for tritium are on old watch dials made pre-'90s. Curious how much of this resource is out there.


Take spullara's numbers:

2.01410177811 u = 3.34449439340696e-24 g deuterium

3.01604928 u = 5.008267217094e-24 g tritium

17.6 MeV = 7.832863e-19 kWh energy

Divide through, and you will see that you need 4.27 ug/kWh of deuterium, and 6.39 ug/kWh of tritium.

A random source [1] says that New York will use 50.6 TWh per year by 2027. That would require ~216 kg/yr of deuterium and ~323 kg/yr of tritium.

This is all assuming 100% efficiency. A quick read suggests 50% efficiency might be practical, so double those quantities.

Also, i could easily have messed up that calculation somewhere, so please do check it!

[1] https://www.buildingcongress.com/advocacy-and-reports/report...
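Since the parent asked for a check: here's the same arithmetic in Python (the 50% efficiency factor is left out, as above, and the 50.6 TWh figure is taken from the parent's source).

```python
# Check of the back-of-envelope D-T fuel estimate above.
U = 1.66053906660e-24                  # atomic mass unit in grams
MEV_TO_KWH = 1.602176634e-13 / 3.6e6   # 1 MeV expressed in kWh

m_d = 2.01410177811 * U                # grams of deuterium per reaction
m_t = 3.01604928 * U                   # grams of tritium per reaction
e_per_reaction = 17.6 * MEV_TO_KWH     # kWh released per D-T fusion

d_per_kwh = m_d / e_per_reaction       # ≈ 4.27e-6 g/kWh
t_per_kwh = m_t / e_per_reaction       # ≈ 6.39e-6 g/kWh

nyc_kwh = 50.6e9                       # 50.6 TWh per year, in kWh
print(f"deuterium: {d_per_kwh * nyc_kwh / 1e3:.0f} kg/yr")  # ≈ 216 kg/yr
print(f"tritium:   {t_per_kwh * nyc_kwh / 1e3:.0f} kg/yr")  # ≈ 324 kg/yr
```

So the parent's numbers check out to within rounding.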


Tritium has a half-life of 12 years, so there is no deep pool of tritium upon which to draw. The primary source of natural tritium on earth is cosmic-ray interactions in the upper atmosphere, which produce about 7.5 kg of tritium a year worldwide.[1]

Isn't that going to cause a serious problem if it requires 323 kg/yr of tritium just to power New York City?

[1] https://www.sciencedirect.com/topics/earth-and-planetary-sci...


Apparently no, because it can be made by the fusion process itself, via contact with lithium, and there's enough proven reserves of the latter to supply us for 100s of years.
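The breeding reaction in question is Li-6 + n → He-4 + T. A quick mass-defect check (atomic masses hardcoded below, approximate) shows it's exothermic, releasing roughly 4.8 MeV, so the blanket doesn't eat into the reactor's output:

```python
# Q-value of the tritium-breeding reaction Li-6 + n -> He-4 + T.
U_TO_MEV = 931.494                 # MeV per atomic mass unit
m_li6, m_n = 6.0151228, 1.0086649  # reactant masses, u
m_he4, m_t = 4.0026025, 3.0160493  # product masses, u

q_value = (m_li6 + m_n - m_he4 - m_t) * U_TO_MEV
print(f"Q = {q_value:.2f} MeV")    # ≈ 4.78 MeV, i.e. exothermic
```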


Nobody knows how to get tritium at PPB concentration from the thousand tons of radioactive molten FLiBe it is dissolved in. You have to process all of it every day because you need that tritium for fuel tomorrow.


I wonder if that is much or not. I have no idea how hard this is to produce. Do you have an idea about that?


Oooof, looks like the primary way of creating it is through... fission reactors.

https://en.wikipedia.org/wiki/Tritium#Production

And there's not much of it:

> According to a 1996 report from Institute for Energy and Environmental Research on the US Department of Energy, only 225 kg (496 lb) of tritium had been produced in the United States from 1955 to 1996.[a] Since it continually decays into helium-3, the total amount remaining was about 75 kg (165 lb) at the time of the report.


The concentration of deuterium in the ocean is about 150-160 parts per million, and with 1233.91 quintillion liters covering the earth we have approximately 8.2260667e+12 kg worth of it to extract, so we've got a bit to work through!

Tritium however is far more rare with only trace amounts of it being available within nature and barely more than a kg produced per year. Producing the 100s of kgs required per year still seems to be an unsolved problem, although my quick searching shows there's a couple viable solutions for it.


The solution is that fusion power plants can breed tritium and become net producers of it...

Though in practice enough will be lost that probably they'll still be somewhat net consumers-- just not nearly to the extent predicted by a simple thermodynamic model.

Still, even if fusion becomes a net producer of tritium, the whole tritium-is-hard-to-get problem will likely be a constraint that we'll be fighting as we ramp up use of fusion power in the future.


That such a small amount of matter could generate so much power is pretty remarkable. I might be way off base but from what I can find online it seems you'd need over 2000 times as much uranium?


Tritium is very popular on gun sights as well- as it's a glow in the dark sight that doesn't need to be charged. I'm now questioning the practice of appendix carrying with tritium sights.


Tritium decays by beta decay (emitting an electron). The electron cannot travel very far in air (about 1/4 inch) and is stopped by even the thinnest piece of metal. It's even stopped by the dead outer layer of your skin.

I.e., it's completely harmless unless you ingest it.


Not quite: beta decay will penetrate the skin enough to damage living tissue - beta burns are what caused the fatalities of the Chernobyl first responder fire fighters.

They spent a few hours covered in dust on their coats, and did a bunch of subsurface skin damage which manifested as third degree burns. Sepsis, not radiation poisoning, generally killed them.


Well, it lasts for several years, but considerably less than even a human lifetime: tritium's half-life is only about 12 years, so gun sights, dark-proof glow-in-the-dark signage (usually reserved for critical industrial plants, ships and offshore platforms due to expense), etc., will become seriously degraded in just a few years. The glow is directly proportional to the remaining low-level beta radioactivity, which can barely penetrate the glass envelope in the first place; you'd get more radiation (from radium) living in a brick house than from carrying 24-7.

FWIW, tritium and a phosphor granule encapsulated in glass microspheres have been developed for self-illuminating runway paint, but again, no one really uses it because tritium is stupid expensive, and again, it loses half its brightness in only a decade.

On the other hand, I've been told that Trijicon will replace their tritium gun sights for the lifetime of the original owner. I plan to live long enough to cost them money...


> Tritium's half-life is only about 11 years, so gun sights ... will become seriously degraded in just a few years.

We'll all be using RDSs by then ;) ?

> ...I've been told that Trijicon will replace their tritium gun sights for the lifetime of the original owner.

I didn't know this, thanks.


Real Engineering recently made a video that covers fuel needs.

https://www.youtube.com/watch?v=BzK0ydOF0oU


And the (otherwise excellent) channel is supposed to soon post an adv... I mean informer... I mean "exclusive documentary" about "repeat after me we're totally not a scam - we just play one on YouTube" fusion startup Helion.

Which I'm going to watch, because even though everything I hear about this company gives me insane Theranos vibes... Well, if they pull it off... They might light a bulb with fusion in my lifetime.


The Helion approach seems plausible to me. Can you point me towards some of the sources that give you vibes?


A recent episode on Matt Ferrel's Undecided [1]

(And I though I had seen a similar segment on "Answer with Joe", but I can't find it.)

None of those would be authoritative sources, of course - there won't be any until they open a plant, publish data, etc...

In the meantime, I see a hype cycle brewing, and it makes me a bit uncomfortable - but, we'll see.

[1] https://youtu.be/yNP8by6V3RA


Tritium is also used in gun sights.


TLDR is there isn't nearly enough tritium, but fusion reactors can make more (while still generating energy)


Can you collect them easily in the reactor ?


If I understand correctly, tritium is the result of bombarding lithium with neutrons.

I'm not sure how tokamaks are expected to work; do you just add lithium and expect the tritium to get where it needs to go to keep the reaction going, or do you actively remove gases from the vessel, filter out the tritium, and re-use it as fuel?

Either way, I don't imagine it'd be too hard to recapture the stuff.

https://en.wikipedia.org/wiki/Breeding_blanket


> It is also why there is so much iron as it is kind of the base state of both of these reactions.

I always thought this is the interesting takeaway, both fusion and fission funnel their constituent matter towards iron, as the final stable state of matter. Far future civilizations will have a lot of iron on their hands.


This is an A+ comment—helpful, well-formatted, concise, with bonus interesting additional detail.

Thank you.


Thanks, finally got to use my physics masters degree.


First time for everything!!


Yes well done .


Great comment. But you meant 2 helium atoms and 1 neutron in the second equation, correct?


Nope. Helium has 2 protons and 2 neutrons. Here is an image that shows the reaction:

https://www.energy.gov/sites/default/files/styles/full_artic...


It's not "creating" energy, it's releasing energy.

The original atoms (exactly which atoms depends on the reactor, but let's assume it's deuterium and tritium) have a certain starting energy. When you fuse them together, the resulting atom (helium-4, if you start with deuterium and tritium) is in a lower energy state.

Since the fused atom has lower energy than the input atoms, the fusion reaction releases the difference in energy, which you can then capture.


You're using X energy to release Y energy from the fuel (the pellet, containing deuterium and tritium). It was there already, just not in a usable form.


You have a piece of wood. You ignite it with a match. This causes a self sustaining reaction in the form of fire that releases far more energy than the match could ever create.

This is effectively what is happening with any energy generator.


Mass and energy are equivalent. You're using X energy to reduce the mass of your fuel and converting that mass into Y energy. When X < Y you have useful energy production at the cost of the mass of your fuel. Energy is conserved.

The reason nuclear fusion is such a desirable goal is because it only takes a relatively small amount of mass to convert into a relatively large amount of useful energy, and the mass (the fuel) is relatively easy to obtain.

Like all energy generation, it's converting one type of energy into another, more convenient type, to do useful work. Like a hydroelectric dam converting the potential energy of water into more useful electrical energy. Energy is conserved when water spins a turbine, it's just that electrical energy is more useful for work than the potential energy of the water. Of course you can still use the potential (or kinetic) energy of the water directly, such as with a water mill. But the energy to work ratio is worse in that form (especially if the work to be done is far away from the watermill).

Whenever you build a fire you need to input some amount of energy to begin the chemical reaction that releases energy. In this instance we get not electrical energy, but energy in the form of infrared and visible light, to heat our home and light our way. Yet the total energy released by the fire far surpasses the energy you used to start the reaction, but because the wood's mass is consumed, energy is ultimately conserved. You have converted wood (not useful for heating your home) into infrared light (useful for heating your home).


> Mass and energy are equivalent.

Mass is energy at rest, hence equivalence with exception of massless particles like photons that have zero mass and non-zero energy. Also, photons travel with the speed of light in vacuum and cannot be found at rest in any frame of reference. Modern physics is fun, isn't it?

P.S. Neutrinos were thought to have zero mass as well, but according to the Standard model they have mass.


Yeah, my explanation is very much simplified.


A daily life parallel: You can use a lighter to put a small amount of energy into a bunch of wood to extract more energy than you put it. In the case of fusion, this energy is coming from fusing hydrogen atoms, rather than a chemical reaction.


Energy is released when two atomic nuclei combine to form a larger atomic nucleus.

There's a threshold of energy required to attain this fusion reaction (otherwise there would be no light nuclei in the universe), and once the nuclei combine there's energy that is released, similar to how some chemical reactions can be exothermic in nature.


You take a lighter and light gasoline on fire, using X energy. Then once the gasoline catches on fire, it'll burn on its own, releasing Y energy.

Same principle, different means.


The Y comes from the mass of matter converted in to energy. The mass of the material before the reaction is greater than the mass of its products.

The law you're referring to might be the conservation of energy; for nuclear reactions it is more accurately called the law of conservation of mass-energy. The energy in plus the mass-energy at the start is still equal to the energy out plus the mass-energy at the end. For the energy to increase, the mass must decrease to produce the Y in your equation.


The energy is "released" from the binding energy of the nuclei. It's similar to how throwing a bottle of nitroglycerine can generate a huge explosion, even though you use a tiny amount of energy to throw it.


The input energy X is used to create the conditions of high temperature and pressure that are needed for fusion to take place.

When fusion happens, two hydrogen nuclei fuse together into one, losing a bit of mass in the process. The mass difference is converted into energy Y (using E=mc²).

In this case, Y was greater than X, so there was a net gain in useful energy.
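The arithmetic behind that mass difference is easy to check. A minimal sketch for the deuterium-tritium reaction NIF burns (the masses and conversion factor are standard textbook constants, not values from the announcement):

```python
# Energy released by D + T -> He-4 + n, computed from the mass defect.
# Masses in atomic mass units (u); 1 u of mass is 931.494 MeV of rest energy.
M_DEUTERIUM = 2.014102
M_TRITIUM = 3.016049
M_HELIUM4 = 4.002602
M_NEUTRON = 1.008665
MEV_PER_U = 931.494

def dt_fusion_energy_mev():
    """Energy (MeV) released per D-T fusion reaction."""
    mass_before = M_DEUTERIUM + M_TRITIUM
    mass_after = M_HELIUM4 + M_NEUTRON
    return (mass_before - mass_after) * MEV_PER_U

print(f"{dt_fusion_energy_mev():.2f} MeV per reaction")  # ~17.6 MeV
```

Only about 0.4% of the fuel's rest mass comes out as energy, but because c² is so large, that tiny mass difference is why Y can dwarf X.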


The answers you got are excellent but a short response might just say you don't start with just X energy.

In a sense it's no more mysterious (conservation of energy-wise) than adding the energy of a spark results in the energy of the wood fire.

i.e. energy is transformed, not created as you quite accurately write.


Same way you use spark plug to get the engine going in an ICE car


You know how when you burn something you release the energy in the chemical bonds?

Fusion and fission are like that but for atoms instead of molecules.


Except… chemical bonds don’t ‘store’ energy. Molecules are a low energy configuration. It takes energy to rip them apart!

But, O2 molecules, with their double bond, don’t take much energy to break apart. If they do, and then pair up with say a bunch of Hydrogen and Carbon atoms that were nearby in some long chain or something, they form bonds that are stronger - that take more energy to break - and you end up with some leftover energy. Water and CO2 molecules are an even lower energy configuration.

But the extra energy you get wasn’t exactly ‘in’ the oxygen bond, any more than a ball at the top of a hill has potential energy ‘in’ it.


There's a fundamental tradeoff between technical precision and explaining things in a way that's relatable to simpler stuff. Deal. With. It.


Not if the original description is exactly the opposite of what happens. You get energy out by making stronger bonds.


Stronger bonds by definition are lower energy bonds. So you get energy out by making the total bond energy go lower or "releasing it".


Sometimes, when two small nuclei fuse, they become a single larger nucleus, but the larger nucleus's mass is slightly less than the sum of the masses of the two smaller ones. The slight difference becomes energy released, and the amount of energy released, roughly speaking, is E = mc^2.


I think it's more like a release of potential energy; kind of like how you can lightly nudge a large object teetering on the edge of a cliff and it'll make a huge splash at the bottom. It took a lot of energy to create the big splash, but you didn't need much to trigger it.


The analogy is that setting a pile of wood on fire requires a small amount of fire to start a big fire.

Energy in total is still conserved, but it makes engineering sense to compare the size of the starting fire to the total inferno created.


Y is the energy that was holding some hydrogen atoms together which you have liberated (while destroying the atoms in question but that's OK cause it's abundant).


I think you should think of it as “this lumber only needs 1 match to light it on fire while that lumber requires 10 matches to get started.”


E = mc^2

the "extra" comes from the mass


Equally confusing is the BBC article, which mentions that the quoted input energy does not include the energy needed to power the lasers; counting that energy renders the experiment a net loss.


Nothing is being actually "created", just converted.


To answer concisely: because matter is energy, a fusion reaction converts a small amount of matter into the energy stored in it.



This tweet claims that 500MJ of energy was required: https://twitter.com/latzenpratz/status/1602686252486217728

> "had to put 500 megajoules of energy into the lasers to then send 1.8 megajoules to the target - so even though they got 2.5 megajoules out, that's still far less than the energy they originally needed for the lasers," says Tony Roulstone of the University of Cambridge.

But it's good to finally see progress. Very few technologies can transform the world the way a practical fusion reactor could.


It was discussed a lot in the threads about this yesterday, but apparently the lab had relatively inefficient lasers. Newer ones are an order of magnitude more efficient


This is what I captured from the press conference:

300 megajoules were used to power the lasers (this is also captured in [1]). They also mentioned that newer lasers have 20% wall-plug efficiency. If so, the fusion output would need to reach about 5x the delivered energy (versus the current ~1.5x) in order to break even relative to wall-plug energy consumption.

[1] https://techcrunch.com/2022/12/13/world-record-fusion-experi...


“Wall plug energy consumption”

I don’t know why, but this caused me to picture Alec from Technology Connections in a few years’ time, showing off his fusion laser plugged into a Kill A Watt, while he explains carefully, through the magic of buying two of them, why you can get more power out than you put in, and why these old inertial confinement fusors were pretty neat, actually.


That's just the lasers, the rest of the plant needs power too. Big water pumps are big power hogs, as is the rest of the supporting equipment that any power plant requires to operate. Far over "wall plug" break even is required for commercial viability.


Just bootstrap a second fusion power plant with the first, then continue on, similar to how compilers for a language can be written in the language itself.


Bootstrapping the power plant isn't the problem. The economics of the power plant are the problem, namely producing a worthwhile surplus of power after accounting for all the power needed by the plant itself.


So if we had the 20% wall plug efficiency, then we need that 3.2:2 ratio to become 10:2 to hit break-even. That looks like a huge gap but maybe there are tricks here that resolve this once you get over the initial gap.
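To sanity-check those numbers (a quick sketch; the 20% wall-plug figure is the one quoted upthread, and the shot energies are from the announcement):

```python
# How far the 3.15:2.05 result is from wall-plug break-even at 20% laser efficiency.
E_DELIVERED = 2.05    # MJ delivered to the target by the lasers
E_OUTPUT = 3.15       # MJ of fusion energy output
WALL_PLUG_EFF = 0.20  # assumed efficiency of modern lasers

current_gain = E_OUTPUT / E_DELIVERED     # ~1.54
required_gain = 1.0 / WALL_PLUG_EFF       # 5.0: output must be 5x the delivered energy
shortfall = required_gain / current_gain  # ~3.3x more output still needed

print(f"gain {current_gain:.2f}, need {required_gain:.1f}, shortfall {shortfall:.1f}x")
```

So the gap is roughly a factor of 3.3 in output, before counting any of the rest of the plant's power draw.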


That’s only because they’re using old school flash pumped lasers, not the new solid state lasers you’d use today if you wanted to make a power plant demo.


Can solid state lasers produce the high energy needed here?



I'm not really seeing any convincing numbers there. Mercury lasers seem to only be 10% efficient. I get that this is better than the lasers that were just used at NIF, but that still seems pretty far from useful.


It’s useful enough to produce significant net energy if they continue with the gain scaling that ignition allows.


Why are they using these "flash pumped" lasers if more efficient ones are available?


Because they are researching inertial confinement fusion, not trying to build a working power plant. The efficiency of the lasers doesn't matter, since it doesn't affect their research.


Is energy on the order of 300MJ so cheap? You’d think that cutting it down to 150MJ would allow them to do more experiments.


300 megajoules is 83 kilowatt hours

a typical power price at trading hubs is US$40 per megawatt hour, though this varies considerably depending on many factors and is sometimes actually negative

a typical retail price is US$120 per megawatt hour

so this is about US$10 worth of electrical energy
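The conversion, for anyone checking the arithmetic (the prices are the rough figures above, not authoritative rates):

```python
# Convert the 300 MJ grid draw to kWh, then to dollars at the quoted prices.
ENERGY_J = 300e6   # 300 megajoules
J_PER_KWH = 3.6e6  # 1 kWh = 3.6 MJ

kwh = ENERGY_J / J_PER_KWH     # ~83.3 kWh
wholesale = kwh / 1000 * 40.0  # at US$40 per MWh
retail = kwh / 1000 * 120.0    # at US$120 per MWh

print(f"{kwh:.1f} kWh: ~${wholesale:.2f} wholesale, ~${retail:.2f} retail")
```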


300MJ ~= 83kWh which is like, $2000 in CA


I think you're out by some orders of magnitude. With the current energy issues in the UK it'd be under £100. Other things suggest in California it's more like 20 cents per kWh so were you thinking ~$20?


You're right of course, messing up my units again.


It's a very old lab, and replacing them isn't cheap/easy.

You don't need to use efficient lasers to get the scientific results they're after - other people have already very accurately measured the properties of modern lasers, so we can predict how they would perform without having to actually use them.


It’s a proof of concept. Upgrading lasers that already work is not necessarily the best use of limited funds.


They could get higher power out of more efficient lasers, enabling research at higher energies or with bigger targets.


That's not the purpose of the research, though. They are solely focusing on the energy transfer between the lasers themselves, and the output from the reaction. It's not clear that higher energies or bigger targets will teach us anything new.

Upgrading the lasers would slow the project down as new hardware is installed and issues are worked out. Not to mention I doubt the new hardware is cheap, and may be more expensive than burning excess energy using old laser tech in the meantime.

Other research groups work on laser efficiency, and the "final product" using this method (if it ever proves viable) would put together all the best pieces to get the best efficiencies.


Not to mention the money they would save on electricity bills.


The electricity bill is like $2000. It'd waste more money for a manager to think about how to replace equipment, at California pay rates. This whole thread is making me lose faith in HN.


Actually 83 kWh costs only about $20 in CA (avg $0.25/kWh, and LLNL probably pays even less).


It just seems a little strange to take credit for a milestone when the milestone everyone cares about is yet to be reached. (More energy out than in.)

Good to hear that there's a laser design that might achieve that.


Don't take it too personally, but you, and many others here, need to rethink your approach. You see a short tweet without context, about a topic you clearly know nothing about (which is totally, fully okay, it's a complex topic), and think you are now able to criticize milestones in this impossibly complex field.

Not even ask questions, not something like "hey, I saw this tweet, I know it's just a tweet, but can someone help me understand context?", no, you actually go ahead and criticize work that you know nothing about, and when confronted, you double down.

On some level, you must know yourself that it might be better to ask as many unloaded questions as you want, but otherwise sit this one out in terms of assessment.


No thanks.

I get that people are emotional about this, but it’s important to treat science with a critical eye.

The claim is that more energy came out than was put in. This is false.

It’s not just me saying it. https://www.tiktok.com/t/ZTRVP5Pmg/

There is no “context” to understand. Yes, it’s an impressive feat. Yes, other laser designs might fix the huge ignition costs. But that hasn’t happened yet, and until it does, it’s completely fair to point that out.

Will it win me any friends? Probably not. It’s like showing up to a party and saying the reason for the party is mistaken. Very few people care.

But scientists should, and I am one. Doubly so for incorrect reporting to laymen. We have a responsibility to convey what was actually achieved, not what we wish was achieved.


[flagged]


Which criticism am I dismissing?

People do seem to be getting emotional about fusion, and pointing that out is hardly edgy.

Once fusion achieves more output than input, I’ll be celebrating right there with you. But until then, ignoring the Doberman in the room is a worse look, from a scientific standpoint.

I even cited a source from someone with a PhD in mathematical physics, who is likely far more qualified to talk about this than most of us here. So in terms of dismissing criticism, the dismissal seems to be running in the other direction.

Scientific reporting matters. Reporting something false is generally a bad idea. Saying “we got more energy out than we put in” is false. Which link in this chain of reasoning is invalid?


> It just seems a little strange to take credit for a milestone when the milestone everyone cares about is yet to be reached. (More energy out than in.)

That comment/criticism is a little strange in and of itself. I would say it's the oddness or seeming petulance of the above comment that brought on boc's comment.

A silly, but illustrative analogy:

  Kid:    Dad, look! I scored a home-run!  
  Father: Who cares? Have you won the game yet? Stop celebrating until you do something that everyone cares about!


There's a difference between encouraging a kid and discussing energy technology.

It's entirely valid to remark that fusion still is far from viable as a source of energy.


I agree. To explain further, my post and the (admittedly silly) analogy were meant to explain the aspect of sillysaurusx's posts that boc seemed to be criticizing.


Your "honest feedback" is nothing more than naked insults.

Sillysaurusx is right. The "impossibly complex" matter is actually quite simple: Q=1 is little more than a psychological milestone, not some sort of technical tipping point where further progress becomes easier. And they haven't even gotten to Q=1 unless you buy into the justifications they give for their dodgy accounting of the input energy. Commercial fusion is likewise simple to state: it needs to put out a lot more energy than you put into it, after you fully account for all the energy you put in. They aren't even close to this.


When a baseball batter hits a ball at a record 120mph, you calculate the impulse of force (∆p) they put into the swing to cause that result, not the total calories the player consumed during the past year in order to build their muscles.

You're arguing that the process of charging some inefficient lasers (aka eating food throughout the year) is invalidating this entire result. That was never part of the experiment, nor is it relevant to this test.

I understand exactly what you wrote above, and I'm telling you that it's not relevant to this discovery. You're arguing a non-sequitur in the classic definition.


Just as a note, since I made the same mistake initially, the person you're replying to didn't make the post from which you are quoting "impossibly complex".

It seems, to me, that boc was criticizing the unnecessarily dour tone of sillysaurusx's previous comment and not the technical aspect of the achievement.

The whole thing seems to come down to whether one interprets the announcement as an attempt to deceive the public at large or simply a celebration of a milestone that many in the fusion research community have been trying to achieve for a long time. I can understand it being interpreted both ways, but I think the more charitable interpretation is that science reporting, in general, doesn't usually properly explain the levels of nuance of various achievements and, as such, something that is genuinely exciting for those in the community is not necessarily as exciting for those outside of it - which comes across as deceptive.


You are entirely correct. I'll just add that it's not only about the energy put in, but ultimately about the cost. Net positive energy output is the absolute basic requirement. We're not there, we're not close, and even if we were, the hurdle would be to make it economically viable.


I think the confusion here is at least partially due to most articles obscuring the primary purpose of the NIF. It's not supposed to support commercial energy development; it's supposed to support nuclear weapons development under the Nuclear Test Ban Treaty, where testing bombs by setting them off is banned.

So the NIF is supposed to give a testbed to study implosion-created fusion reactions that produce enough energy to "ignite", that is, propagate the reaction to the rest of a hypothetical bomb. In that case, the amount of energy needed for the infrastructure to produce the initial implosion doesn't matter; what matters is that the energy coming out is more than the energy that actually triggered the reaction, so that the hypothetical bomb would blow up and not fizzle.


Exactly! It's so strange this is somehow made to be about fusion as an energy source. I'm happy to announce that Q>>1 has already been achieved on multiple occasions. In the cores of the thermonuclear devices tested by the US and USSR in the 50s. (Unfortunately that technology proved unviable as a path to civilian fusion power.)


It is a milestone, and I do think the researchers deserve credit for that. Getting more energy out of the reaction than was delivered to it by the lasers is actually important.

No one (except perhaps poor science "reporters") is claiming that this means we now have free and cheap fusion power. Of course the energy put in to operate the lasers themselves needs to be accounted for -- and it is! -- but that doesn't make what they've achieved useless. It's also useful to remember that the researchers involved are not the people writing press releases and articles; let's not minimize their achievement just because of sloppy, sensationalist reporting.

I like the analogy downthread of a kid being excited about scoring a home run in baseball, but the dad chastising the kid for celebrating before actually winning the game. That's what it feels like is happening here.

This is a huge step in the right direction, and it should be celebrated as such.


It's a significant milestone because demonstrating you can get net energy from the reaction removes a lot of uncertainty of whether it's possible in the real world. It starts to turn inertial fusion into an engineering problem of how you increase the efficiency of each stage.


Worth noting that the milestone achieved was positive Q_plasma (more energy out of the plasma than in).

They are using inefficient lasers because they are cheaper to buy/maintain/modify for research purposes.

Determining the conditions for positive Q_plasma is largely a matter of science/research so the external system doesn't matter as long as the variables are controlled and results are reproducible.

Once positive Q_plasma is well understood/reproducible, achieving positive Q_total (more energy produced than spent running the infrastructure) is just a matter of engineering and potentially waiting for the SOTA for components (like lasers or materials) to catch up.

TLDR: This is the scientists proving the theory. Now it's the scientists' job to refine the theory. Then the engineers get to put it into production.


The theory was proven to everybody's satisfaction in 1950.

There will be no production, except of new, smaller thermonuclear warheads. That is their legislated remit.


The funding does seem minuscule, given that we spend 100x more funding military excursions in the name of energy security.


Is the bottleneck for fusion money though?


Yes. Fusion funding has been sliding downward for decades, which is a large reason why it takes so long to do anything.

It's largely the same reason why NASA takes so long to do anything.

1. Shortage of funding

2. failure can result in loss/exhaustion of funding

3. extremely low risk tolerance

4. physical experiments needing new HW only happen when the likelihood of success is extremely high

5. projects are over engineered to reduce chance of failure

6. projects are over budget and over schedule

7. projects only make minor incremental progress

8. lack of fast/exciting progress drives decrease in funding

9. GOTO 1


I can't agree that funding is "largely the reason" why NASA takes so long to do anything. I doubt funding is a top 3 reason.

NASA just isn't about high-risk / high-reward "moonshots" anymore. The overarching political environment doesn't allow it, never mind the office politics.

NASA will get back to the moon using easily an order of magnitude more funding than it should have taken, with a launch system that costs an order of magnitude more money for each launch than it should. (almost two?)


Have to +1 this. A lot (most?) of NASA's funding is directed toward keeping people employed and skilled, as opposed to accomplishing goals, as with a lot of government money. NASA could do a LOT more with the funding they already have, if they were willing to divest from older technologies and vendors, but the politics of its funding doesn't allow that.


> A lot (most?) of NASA's funding is directed toward keeping people employed and skilled, as opposed to accomplishing goals

This is so hilariously wrong I don't even know where to start.

> but the politics of its funding doesn't allow that.

The post that you are agreeing with says that "funding" is not the reason for their plan.


> This is so hilariously wrong I don't even know where to start.

Anywhere at all would be better than nowhere. I worked for a defense contractor for a few years, so I'm basing my comment on my experience there.

> The post that you are agreeing with says that "funding" is not the reason for their plan.

Not sure what you are saying here.


I agree. However, that culture was caused by a lack of funding.

You can't be swift and lean when you are given very limited, budgeted funding. You can't take risks or you risk putting people out of a job and killing the program.

That leads to an overly conservative culture that restricts any risk taking and over-engineers everything to the point failure is effectively impossible.

This slow-moving, overly conservative, design-by-committee approach helps limit risk, but it absolutely balloons costs in the long run and horrifically delays progress. Of course, if they were a company they'd eventually run out of money, but that's not really an option for government orgs, so when the overly conservative, limited-run designs end up encountering production issues, the projects explode in cost with nearly no upper limit.

TLDR: The political climate is a direct consequence of the lack of budget and continued restriction of that budget only worsens the problem.


> I can't agree that funding is "largely the reason"

> NASA just isn't about high-risk / high-reward "moonshots" anymore. The overarching political environment doesn't allow it, never mind the office politics.

Why doesn't the political environment allow for it? What could happen? What could regulatory bodies do to NASA for taking a risk and failing? What sort of constricting change could political bodies make in such a situation?


> Why doesn't the political environment allow for it? What could happen?

Three astronauts were incinerated alive. That was when they started to take safety more seriously. Subsequent accidents have only reinforced this.


Yes, but what happened after the astronauts died as a response?

The entire point of my comment is that funding cuts happened. That is the reaction.


The senator who secured the funding became the administrator of the current Moon attempt, and the funding insists on using the old technology. All of this sounds bad; NASA would do better with a freer hand, but then the funds would not flow back to the states …


"Funding" isn't really a good answer IMO. I don't know a ton about Fusion research specifically, but NASA is horrifically inefficient with money compared to private competitors. Giving them more money won't magically make them more efficient. Reasons why include:

- Their incentive is to optimize for political approval, which means spreading facilities among as many congressional districts as possible, which creates a ton of inefficiency from poor communication and the need to constantly ship things around

- Public approval is the goal and failure is the worst possible option, so things tend to be optimized to take as few engineering risks as possible and have huge amounts of bureaucracy to spread the blame for any possible failure

There's a reason why SpaceX started landing rockets with a fraction of the money that NASA spent on building ridiculous boondoggles.


Also, SpaceX did it 60 years later.


>> Their incentive is to optimize for political approval, which means spreading facilities among as many congressional districts as possible, which creates a ton of inefficiency from poor communication

Ummm.. I thought remote work was no less efficient?


Remote work is fine in some circumstances. One of the circumstances where it is definitely not fine is in designing, manufacturing, and testing high-precision aerospace hardware. You aren't gonna put a 5-axis CNC mill in your garage.


It has nothing to do with funding. SpaceX had far less than NASA.

For another example from a different angle, the military has limitless supplies of funding but innovates even less than NASA.


You are comparing a company that makes trucks with a company that makes precision scientific instruments, and declaring that the truck company is more efficient per kilo of product. This is stupid.

NASA develops nuclear reactors, has landed on Titan, and has reached Pluto. A SpaceX vehicle has never left the Earth-Moon system.


SpaceX is not Tesla. It's disingenuous to call SpaceX a "company that makes trucks". Just like NASA, they also make precision scientific instruments. They're the first privately funded mission to the ISS and run a massive satellite constellation.

They may not have the same accomplishments as NASA, but they're far from a "company that makes trucks".


> Just like NASA, they also make precision scientific instruments.

Which scientific instruments does SpaceX make? I have never heard of a single one.

In contrast, curiosity rover has dozens.

https://youtu.be/GUQ91fdYbP4


You think their satellite fleet has absolutely no precise equipment for knowing and maintaining its position? Or for communicating with terminals and each other?

Like, come on. I get shitting on Musk is the cool new thing, but this is genuinely the case where SpaceX is doing cool things in space, and at an extremely fast pace. Get over yourself if you can’t see through your Musk hate and only see them as “a company that builds trucks”.


The analogy is apt in at least defining a separation between the overall complexity of what SpaceX produces compared to NASA, to say something of how the two different models of R&D work, but maybe off in degrees as you discussed.

"NASA makes precision scientific instruments and SpaceX makes precision scientific instruments that have higher tolerances with a higher focus on throughput, and there are rapidly diminishing returns in how much funding can be used to close the gap" is probably the right take if not as fun.


> Spacex vehicle has never left the Earth-moon system.

https://www.whereisroadster.com/


Not your typical space vehicle


Lol “spacex makes trucks”. Way to throw on the anti-Musk blinders and completely ignore the back-half of the comment and misunderstand the first half.

Also, spacex has launched outside of the earth-moon system. It was a roadster


There are like 20 designs of various fusion reactors on the drawing board that need to be built and tested.

Scientists also don't work for free. They aren't mushrooms that grow by themselves.


One of the things that I think I noticed from the press conference, is that funding is going to be the bare minimum to meet some goal for a design they select.

This seems like a gross mistake.

If we are going to avert a climate catastrophe we will need TW of power to "unburn" the carbon we put into the environment (ocean and atmosphere). Instead of barely hitting this target, we should over deliver since we are running out of wall-clock time.

Every project that meets a bar for feasibility and organizational/operational capability (if they don't have it, either fix it or transfer the design to a capable team) should be given funding (50-100M). We should be dropping BILLIONS on this; if we can drop $50B+ on semiconductors, we can do the same for fusion.


Dump trillions of dollars into fusion energy today and it will still be decades before the first fusion power plant is connected to the grid. You'd be better off funding the construction of fission power plants. Those are very expensive and take years to build, but they're still a hell of a lot cheaper and faster than funding fusion to the degree you're suggesting.


Each dollar diverted to chase nuke wills-o'-th'-wisp brings climate catastrophe nearer.

Money is fungible. Dropping $billions on this means not dropping those $billions on something that works already, works fantastically well, and would work even better with more money. We already know how to prevent (more) climate catastrophe. We just need to do more of it.


So drop it on fission. There. Settled.


Fission means, in practice, paying enough for coal generation, over the decade, to have built enough solar to displace the nuke; and paying many times that, on top, to build the nuke.

So, no. Each dollar diverted from building out solar to mining coal or fooling with nukes brings existential catastrophe nearer.


Beginner question: why is it hard (if not impossible) to calculate the gain without actually building a reactor?

Surely if you have a non-satisfactory yield you shouldn't be building it. And that should be possible to estimate before we spend billions on it?


Simulations and estimations about processes in the physical world always leave room for surprises when one is doing things that haven't been done before.


More money attracts more eyes, hands, and grey matter that would otherwise focus on better-funded fields. But it is a long process.


I think a major bottleneck for fusion research from the lay public (including myself) is lack of interest.

Fission reactors work really well and have been around for 50+ years. If we are going to go nuclear instead of renewable, we need to address the elephant in the room.

The elephant in the room is: why not just fission?


Fukushima most recently


And somehow harnessing the power of the literal sun will have a better safety rate? It's all speculation at this point because fusion doesn't exist yet... but it does seem like a huge undertaking to get superior safety over fission.


It’s not speculation, because so much is known about the physics and elements involved. With fusion you use light nuclei, whose products tend to be far less radioactive, rather than the heavy uranium, etc. used for fission.

See https://www.iaea.org/bulletin/safety-in-fusion for more details


”Regulatory bodies have vast experience in the realm of safety and security for fission. We are working with them to ensure that all applicable knowledge is transferred to fusion."

Let me put the question another way: suppose we get fusion that's equally as safe as fission like IAEA is saying here. Why would we switch to it if we were unwilling to switch to fission?


You’re making false equivalences, and ignoring all of the fundamental differences between the technologies. The materials involved are different, the chain reactions are different, the kinds of radiation and half lives are different, the way you build the cores are different (and are still being worked on for fusion).

The entire reason why people are working on fusion IS the fact that it’s much much safer due to all of the key differences. That doesn’t mean there aren’t lessons from fission to transfer, just as lessons from ICE cars have gone to EVs.


Fair enough. What you said sounds promising.

My point stands though: we're not doing fusion at scale yet, so we'll see.


Money is always the bottleneck in some sense. There are diminishing returns, but in general more money will make it happen faster.


This is like saying thread count is always the bottleneck in computation. More money allows more parallelism, as you can pay for more people and more equipment for more research. As in computing, there are diminishing marginal returns, and surely a version of Amdahl's Law for human endeavors.


> thread count is always the bottle neck in computation

The softer the bed linen, the more rested the computer scientists will be and the more likely they are to come up with novel solutions that lead to faster computing.


I would have to agree. The "in general" though is carrying an enormous amount of weight in that statement.

I think what other commentors may be getting at is that in many cases the simple analogy of asking how 9 women can have a baby in 1 month is instructive here. You could throw trillions at that problem, a need to have a baby in 1 month. Sometimes there are hard limits that money has a hard time addressing.

A case could be made that with enough money put towards advanced technology, like gene therapy to force a fetus to maturity in 1 month vs 9, it could be done with horrendous side effects.

So, to your point, money does help solve most problems, but I think "diminishing returns" is putting it very lightly.


You might be right — we threw enough money at it and put humans on the Moon. The Manhattan Project was well funded and produced results as well.


I believe the Manhattan project (where we basically built an entire new city, and entire new manufacturing process from scratch: mining operations, refineries, enrichment, milling, etc.) cost less in constant dollars than the stealth fighter.


They also had a strong ideological element.


The talent + purpose (which drew the talent) was what defined Manhattan. The money came easy when all of the greatest minds in the country were pointing a giant flashing light in one obvious direction.

This is big news but fusion will largely be a product of the people it attracts. The people can do the job attracting money and other talent if it's justified.

I've seen the Canadian gov try to throw money at trying to build a local tech scene and it all went to hucksters, old school finance suits with megacorp resumes, and administrators. While all the tech talent just kept going to SF where the capital was going into high risk ventures... not expensive buildings, events, 'entrepreneur/small business programs', and propping up old school D-round investors. Money is easily wasted even when the pursuit sounds noble and valuable on the surface.


The laws of physics are also a limiting factor.


"Yesterday, everyone was complaining about the 2.2:2.0 ratio, but now we're working with 3.15:2.05."

What is the difference between today's announcement and yesterday's?

I thought we were just getting re-submitted headlines, but apparently this news is different from yesterday's?


Speculation.

People believed that X:Y was 2.2:2.0, but it's now 3.15:2.05.

The 10% EROI apparently wasn't impressive enough; it's now 54% EROI on paper (assume less because of capture inefficiencies).

I'm no expert, but theoretically my understanding is that this ratio should scale along this same pattern for higher values of Y.
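The two ratios work out as follows (a quick sketch; "gain" here is just fusion energy out over laser energy delivered to the target, not a whole-facility figure):

```python
# Reported target-gain figures: (fusion MJ out, laser MJ delivered).
leaked = (2.2, 2.0)      # yesterday's leaked numbers
official = (3.15, 2.05)  # today's official numbers

def gain(out_mj, in_mj):
    """Target gain Q = fusion energy out / laser energy delivered."""
    return out_mj / in_mj

print(f"leaked:   Q = {gain(*leaked):.2f}  (+{(gain(*leaked) - 1) * 100:.0f}%)")
print(f"official: Q = {gain(*official):.2f}  (+{(gain(*official) - 1) * 100:.0f}%)")
# leaked:   Q = 1.10  (+10%)
# official: Q = 1.54  (+54%)
```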


The X:Y ratio should increase for high values of Y in ICF. See https://suli.pppl.gov/2019/course/SULI2019_IntroICF_TMa_2019...


The official announcement (with the real info) was this morning.

Before that, the only official word was that an announcement was coming, and all the info was unofficial leaks and rumors.


Yesterday's news was that this result was leaked to the FT with apparently preliminary numbers. And today there was a press conference that had somewhat better-looking numbers.


> demonstrating the scaling laws at work

One would think that one look at the sky would be enough to say that "hey this fusion thing scales pretty well".


I wonder if it might be possible to gain not percentages but orders of magnitude more or less just by making the targets bigger. Is it conceivable that the same basic approach and a comparable amount of input energy could be used to ignite a 100 MJ or even 1 GJ target? Of course that would present some containment challenges, but perhaps not insurmountable ones.

I'm also a bit concerned that this type of research may encounter national security related obstacles. Obviously a pure fusion bomb would be a game changer for nuclear (non-)proliferation.


I don't think a pure fusion bomb would have any advantage over current hydrogen bombs. It wouldn't produce more energy, but would need more gear to reach ignition.


A pure fusion bomb would produce less (not zero) fallout. Neutron activation would still produce some fallout, but you wouldn't have the fission byproducts like caesium-137, iodine-129 or strontium-90.

This is probably a bad thing; politicians might decide the bombs are clean enough to use.


even without actual radioactivity, pure fusion bombs would still be politically radioactive. look at the fallout (so to speak!) from the Hafnium controversy. they nixed all the research and stopped looking, after realizing that nuclear isomers would do little for energy storage (due to emitting energy as gamma radiation) but lots for bypassing restrictions on fissile materials.


To be clear, pure fusion bombs would still emit massive amounts of radiation. Gamma rays, x-rays, thermal radiation: all of that EM radiation would be emitted just like from a regular fission bomb. Neutron radiation too. You'd have less (not zero) contamination of the earth itself afterwards, but everybody in the area would still be very badly irradiated.

I don't know enough about the Hafnium controversy to comment on it.


The advantage would be that you wouldn't need tightly controlled and hard to make materials like U-235 or Pu to make one.

I'm not in any way saying that using lasers would be a plausible route to such a weapon, since the NIF facility is huge. But the research would inevitably need to focus on getting more output per shot: a typical conventional or nuclear power plant generates on the order of 1 GW of thermal power, and to match that at a 1 Hz repetition rate (likely a stretch for an MJ-class laser) you would need to generate 1 GJ per shot, comparable to the energy in a ton of TNT. That work would probably touch on areas that are highly classified.


Shiiiit... here I was thinking how cool it would be if they could miniaturize this, having somehow forgotten that my pet solution to the Fermi paradox is that a nigh-inevitable rung on the ladder to interstellar presence involves discovering One Weird Trick to release a whole bunch of energy pretty easily, even on a DIY basis. Instant end of civilization. Even ant-like societies might have mutated members who'd go rogue and misuse the tech, and it wouldn't take many to ruin everything.

Basically it's a twist on the ice-9 solution to the paradox.


any sufficiently speedy spacecraft makes for a deadly kinetic kill vehicle, unfortunately.


One use for such 'laser-initiated fusion' is the LLNL (Teller) proposed X-ray laser satellite from the Star Wars program in the 80s. The proposed solution then was to take the X-rays from a bomb exploding at the heart of a satellite and amplify and focus them through a 'lasing' material. It turns out they never found a lasing material that would work, and it would have been fairly easy to confuse and defeat, so the project died. What also killed it was the end of the testing program.


I mean almost all of the power in a nuclear bomb comes from fusion. The fission part of it is just like the detonator for the real explosion.


This is actually backwards. Fusion weapons are substantially higher yield because they result in more fission, partly by preventing the fission primary from blowing itself up before it has finished.

Wikipedia: "Fast fission of the tamper and radiation case is the main contribution to the total yield and is the dominant process that produces radioactive fission product fallout."

https://en.m.wikipedia.org/wiki/Thermonuclear_weapon


Most of the fission energy in an H-bomb comes from the massive numbers of neutrons created by the fusion reaction initiating fission in the U-238 tamper of the secondary. The primary is used for its X-rays, which create the incredible pressures within the secondary by ablating the surface of the cylinder. When you look at this experiment and see that it uses x-rays to ablate the case containing the hydrogen, causing an implosion, the purpose of the experiment is clear.


It varies quite a bit by design; apparently the USSR's initial design was only 15-20% fusion, while US designs were closer to 50%, which is still apparently the most efficient option in terms of warhead size.

However it's possible to have higher fusion ratios at the expense of a larger device for the same yield, most notably in the case of the Tsar Bomba, which reduced the contribution of fission to massively reduce the amount of fallout produced.


Almost all nuclear weapons rely heavily on fission of the tamper for yield.

Suggest "Ripple: An Investigation of the World’s Most Advanced High-Yield Thermonuclear Weapon Design" from the Journal of Cold War studies to read about a predominantly fusion device family.


This seems to say to me that D-T reactions produce neutrons, and that the kinetic energy of the neutrons is smaller than what you get by hitting U with that neutron. You already have the energy from the neutron (which will land somewhere in the system eventually), and you might as well get a multiplier by putting a blanket of U-238 in front of it.

That could be carbon-copied to a fusion power plant, and indeed, there are many proposals of hybrid fusion-fission plants in the literature that only require Q values marginally greater than 1. But if you go that route, you have radiation just like a fission plant, and one starts to question why you don't just build a fission plant (indeed, why don't we?).

My personal pet theory of the future is that, one day, we'll progress so far in fusion research that we get economic energy. But at the same time, the line blurs between both fission and weapons technology, so people are unhappy with the result. This doesn't feel particularly contrarian but no one ever seems to bring it up.
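For scale, the multiplier from putting a U-238 blanket in front of the fusion neutrons can be estimated from approximate textbook reaction energies (the values below are rough, not from this experiment):

```python
# Approximate reaction energies in MeV (textbook ballpark figures).
DT_TOTAL = 17.6          # D + T -> He-4 + n
DT_NEUTRON = 14.1        # fraction of that carried by the neutron
U238_FAST_FISSION = 190  # released per fast fission of U-238 (approximate)

# If every fusion neutron induces one fast fission in a U-238 blanket:
multiplier = (DT_TOTAL + U238_FAST_FISSION) / DT_TOTAL
print(f"energy multiplier ≈ {multiplier:.1f}x")  # roughly an order of magnitude
```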


Since you asked: We don't build fission plants because they cost more than every other energy source. Fusion plants, if they could ever be made to work at all, would cost a lot more. So, there won't be any.


This particular research is literally built off of weapons research (National Ignition Facility).


Yes but, as far as I know, research into achieving laser induced fusion hasn't itself run into many classification hurdles. I suspect that may not be the case with the scaling phase.


It is interesting that there haven't been many classification hurdles on something that is pretty explicitly weapons research. My guess is that they achieved high Q values a long time ago in classified research using mechanisms that would give away some secrets.


I think it's mainly because using a laser the size of a football stadium was never seen as a practical way to make a bomb.


Well, it's not so much making the bomb as the particular ways of focusing the X-rays, and the particular efficiencies in creating the tamper for the most velocity, and thus the most compression, in the secondary.

The key thing here is the lasers aren't doing the immediate compression; the lasers are simulating X-ray radiation, which then ablates the casing around the tritium. Figuring out how to create and amplify an x-ray pulse was a major sticking point in the Star Wars program.


It's intended to confirm computer simulations. It doesn't have to be viable for the end article.


The bigger the target, the more energy is needed to compress it. So it would require more laser energy to get to 100 MJ or 1 GJ, I think. But maybe only a few times more powerful. (Pity they didn't build in some head room!)


The question is whether once ignition is achieved there could be a way to design the target/geometry in such a way that the fusion reaction becomes self-sustaining/self-propagating.

The history of thermonuclear weapons leads me to think that the answer may be yes. There does not seem to be any real upper limit to how large a thermonuclear weapon can be made. In the 50's and/or 60's there were proposals to build GT devices. I don't think the fission trigger for such a weapon could have scaled by nearly as much.

So by analogy if a relatively small fission trigger can cause a fusion explosion that is many orders of magnitudes larger maybe one or more tiny laser induced fusion reactions could be used to trigger a much larger one.

If that were the case the efficiency of the laser trigger would be of little importance.


3.15 MJ = 0.875 kWh

The unsolved issue with laser fusion (ICF) as an energy source is the fast degradation of high-powered lasers. A high-powered beam degrades the optics in its path, and those are expensive. ITER has a similar problem with its superconducting magnets.
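The conversion, for reference (one kWh is defined as exactly 3.6 MJ):

```python
MJ_PER_KWH = 3.6          # 1 kWh = 3.6 MJ by definition
print(3.15 / MJ_PER_KWH)  # 0.875 kWh
```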


I keep getting lost in the numbers here. What was the net gain/loss for the entire system? Without the “lasers are 1% efficient at 20% energy loss with 40% energy transfer loss” and all that.


The net gain of the entire system at NIF doesn't matter, because the system at NIF was never designed to make a net gain.

People are estimating how this result moves the equation for an overall system that is designed for power production. Most numbers I have seen still leave a theoretically optimal power plant producing around a 30% loss in power with this number.
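A sketch of why the plant-level balance is so much worse than the target gain. The efficiencies below are purely illustrative assumptions, not NIF's numbers; more optimistic laser and conversion efficiencies get closer to the ~30%-loss estimates, but every stage multiplies:

```python
# Plant-level energy balance sketch for inertial fusion.
# Both efficiencies below are illustrative assumptions.
target_gain = 3.15 / 2.05    # ~1.54, from this shot
laser_wallplug_eff = 0.20    # assumed efficiency of a modern diode-pumped driver
thermal_to_electric = 0.40   # assumed conversion of fusion heat to electricity

# Electricity out per unit of electricity fed to the laser:
plant_q = target_gain * laser_wallplug_eff * thermal_to_electric
print(f"plant-level Q = {plant_q:.2f}")  # well below 1, i.e. a net loss
```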


They needed roughly 500 MJ of energy to power the lasers and produce the 2.5 MJ of energy, so the net loss was... about 500 MJ.


AFAICT: There's a large net gain compared to the energy emitted by the lasers. There is still a considerable loss compared to the energy consumed by the lasers.

While at it: I don't think the NIF approach will ever be applicable to commercial power generation on Earth. But I hope it will be one day applicable to a fusion-based rocket engine.


> It's nice to see this milestone recognized, even if the funding it still rather small.

Just wait until the DoD figures out they can use this for some military application and it will get 100x funding overnight.


We've had thermonuclear weapons (h-bombs) for a long time already.

Harnessing the energy in a controlled and sustainable fashion is what's hard.


H-bombs use the hydrogen to produce more neutrons which boosts the fission process. It still is a fission bomb.


No, you can have a fusion booster, like in the "Sloika" [0] design, but in a Teller-Ulam design, that is, an H-bomb, you use a fission primary to ignite a fusion reaction, and by far the most energy comes from the fusion part. [1]

[0] https://www.atomicarchive.com/history/hydrogen-bomb/page-11....

[1] https://nuclearweaponarchive.org/Library/Teller.html


You are correct that in a Teller-Ulam design most of the energy can come from fusion, but as a note it's generally a fission-fusion-fission design in "modern" deployments with about 50% of the energy coming from fusion.

https://en.wikipedia.org/wiki/Thermonuclear_weapon

Only very large bombs, such as the Tsar Bomba (97% from fusion), were ever primarily fusion, and those are not in modern arsenals.


Half of the energy comes from the fissile "spark plug" that's inside the fusion material. Less than half of the energy comes from fusion.

It still very much is a fission weapon.


You have it backwards. The fission component is just the "trigger" for the fusion element, which produces the vast majority of the energy release.


Half of the energy comes from the fissile "spark plug" that's inside the fusion material. Less than half of the energy comes from fusion.

It still very much is a fission weapon.


Nope...

"thermonuclear bomb, also called hydrogen bomb, or H-bomb, weapon whose enormous explosive power results from an uncontrolled self-sustaining chain reaction in which isotopes of hydrogen combine under extremely high temperatures to form helium in a process known as nuclear fusion."


Yes, but if you get into the details... The second stage of the bomb is the fusion stage, in which a cylinder of hydrogen and/or lithium is encased in U-238. The primary stage, a Trinity-like A-bomb, compresses the cylinder with its x-rays. The fusion reaction creates a lot of energy, but more importantly a huge amount of neutrons. These neutrons cause the U-238 to undergo fission, which is responsible for the majority of the energy and pretty much all of the fallout.


The fissile "spark plug" inside the fusion material is responsible for the majority of the energy...


The appropriate analogy for this technology would be that it may be possible to initiate a thermonuclear weapon without relying on fission at all. Currently we use a fission nuclear bomb just to generate the temperature and pressure needed to start the fusion reaction, same as the one on today's announcement.

So far it hasn't proven to be viable, but time will tell.


This is already used for nuclear weapons research, which is why it's under Dept. of Energy.


Not really. I mean, yes, nuclear weapons are a thing, but Dept. of Energy supports many many directions not related to nuclear weapons. Physics research is mostly funded by DOE Office of Science or the National Science Foundation.


It's an explicit goal of the NIF to better understand the physics of fusion for weapons research. Not the main one but it's pretty important!


My point was: NIF would be funded via DOE whether or not it's relevant for weapon research. Sorry for not have been clearer.


I'm no expert, but I think you have it backwards. My understanding is that the NIF's raison d'être is weapons research, with power generation being a secondary concern. It got funded because of weapons research, regardless of whether it was relevant to fusion power generation.

It may surprise people, but the DOE is the government body that is responsible for nuclear weapons research in the US.


If it was already decided to be funded, yes, it would have been under the DoE. Though I believe the weapons aspect had a very major role in the decision to fund it at all. It was proposed shortly after the nuclear testing ban and has played a big part in fulfilling that role.

I'm not trying to correct you, but adding context for the weapons aspect.


Yes, you are right, the funding decision was probably influenced by the weapons research.


That doesn’t follow. The DOE covers a bunch of energy research that has no relationship to nuclear weapons (e.g. solar).


Cheap and abundant energy has a multitude of military applications.

Peaceful ones too, probably.


Uhhh... it's definitely being used for "stockpile stewardship." The fusion crowd went splitsies with the weapons folks to get this funded in the first place.


The targets are really the secret sauce right? If there were a civil ICF for power program, would NIF designs and data even be available to help, or is it all classified?


There are pictures of CAD and experimental setups for the targets. They’re also pretty open with the setup numbers, so in theory, you could make your own NIF setup and try to get their target designs working.

From what I understand, a lot of the work from the past years has been trying to piece together geometries, pulse timing, stability, and quality of targets.


The Q needs to be something like 500 to 1000, not because of energy breakeven, but to produce enough energy that the shot is financially positive. The amount of fusion energy produced in this shot is worth a penny or two.

(And even then, it's dubious a laser fusion scheme will be competitive with other energy sources.)
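A rough check on "a penny or two", assuming an illustrative wholesale electricity price of $0.03/kWh (a made-up round number for the estimate):

```python
# Market value of this shot's fusion output, before any conversion losses.
fusion_out_kwh = 3.15 / 3.6    # 3.15 MJ expressed in kWh (0.875)
wholesale_usd_per_kwh = 0.03   # assumed rough wholesale electricity price
print(f"${fusion_out_kwh * wholesale_usd_per_kwh:.3f}")  # about 3 cents
```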


It's worth pointing out that, per unit of energy, a lot of money is made producing economically unviable power: cargo ship power, submarine and other military naval power, spacecraft power sources, diesel-electric locomotives...

True, if you want to replace the base load of a civilization-scale network it needs to be economically viable, but we generate "a lot" of power at higher than market minima. Ironically, "good batteries" are the natural enemy of fusion research.

One fun thing about laser fusion is it theoretically can scale down very low and has a trivial "off" switch making it a good resource for engineering tokamak reactor materials or sensors or similar tasks.

The inner lining of a production fusion reactor is hard to make, so a laser facility would be ideal for research. Which is why we have one...


DT fusion reactors would be terrible for mobile applications, since they are so much larger than fission reactors of the same capacity. In space or mass-constrained applications they would be ruinously inferior to fission.


> The Q needs to be something like 500 to 1000

Is there any calculation behind this? What's the cost per shot? Is there a lifetime limit on the laser?


>The Q needs to be something like 500 to 1000

Why wouldn't a final Q of 50 be economically viable? Interest on capital costs? Other?


Inertial confinement fusion requires fairly expensive targets to collapse. To make it economically viable they have to produce a lot more energy per target destroyed.


> fairly expensive targets

The targets are only expensive because they aren't produced at scale yet.

They are exactly the kind of thing a machine could churn out millions of per day, and then use them at the same rate.

Even if the targets were made of expensive materials (eg. platinum), most of that platinum could later be recovered from the reactor wall, so it still wouldn't be very expensive.


"most of that platinum could later be recovered from the reactor wall, so it still wouldn't be very expensive. "

And recovering comes for free?

Every step costs energy (or money).

There is no working design yet. It is waay too early to make any predictions about how scaling could reduce costs. Scaling can even increase costs, if it depletes limited resources like tritium.


Capital cost, cost of individual targets.


If you've done the math to determine the threshold, you may as well show it already


I'm repeating what I've heard. Personally, I suspect even that wouldn't reach the goal of being competitive.


It will get there. It's just a matter of time and resources.

The destination will be a milestone for humanity, so we should not give up.


It will get there, but it won't be from NIF or via any technology developed for this experiment. What they are doing is not viable for a reactor, won't ever be viable for a reactor, and won't even be considered a starting point for any future reactor.

It's a fusion plasma research experiment. It's not a program that is being run with the goal of creating a usable fusion energy power plant.


Why should I agree with this article of faith? The obstacles appear quite grave to me. Moreover, even reaching that Q doesn't mean we're there. That's a necessary, not sufficient, condition.

A large, complex machine that explodes the equivalent of 500 lb. bombs to generate heat to drive a turbine sounds like an engineering nightmare.


Because sustainable positive energy out has never been achieved before in 60 years of research. This is gigantic. Its potential to decarbonize the world is massive, and it just became a whole lot less theoretical.

It’s an incredible milestone, not a solved problem.


It's a common joke that fusion power is always 50 years away. With this milestone, is it finally less than 50 years away?


That's a circular argument. It's big because the people doing it call it big. Why should I, an outsider, care about their internal goals, their egos, or their status in their field? What does it do or imply for me?


To achieve fusion for power production, you need more output than input. For 60+ years this hasn’t been achieved in a replicated fashion. Now it has, and it’s 50% more power rather than 0.1% more power as was sometimes shown for 2 nanoseconds before. So now we know fusion for power is possible. If it can be scaled successfully (now likely not an if anymore, but a function of time), then we have the ability to have clean and safe energy 24/7. That would help mitigate the worst of climate change, and if cheap, turbocharge the entire economy.

What’s the circle or not big milestone here?


The issue is, we've been able to get more energy out of a fusion reaction than we put in for 60-70 years now. The H-bomb is very good at doing that. You will say, yeah, but an H-bomb is a one-time-use thing, and it has a habit of destroying everything. If you look into how the second stage of an H-bomb is speculated to work, it's pretty much identical to this experiment, by design, not by accident.

From https://www.nature.com/articles/d41586-022-04440-7

Herrmann acknowledges as much, saying that there are many steps on the path to laser fusion energy. “NIF was not designed to be efficient,” he says. “It was designed to be the biggest laser we could possibly build to give us the data we need for the [nuclear] stockpile research programme.”


*contained fusion


I'm asking why this somewhat arbitrary line being crossed is something I should care about. It doesn't imply fusion will reach a state of practical application. Why is this more exciting than achieving a ratio of .1, or .5, or 2, or 10? It seems entirely arbitrary to me, and smells of an argument that somehow this has made the end goal significantly more attainable.


>why this somewhat arbitrary line being crossed is something I should care about.

That is something personal and unique to each individual. In 1903 when the Wright brothers flew a heavier-than-air machine for 59 seconds, 99.99999% of the people on the planet wouldn't have cared. The airplanes you've flown on are vastly far removed from that original one. Same story for the point contact transistor in 1947. None of that solid state physics is used for modern transistors. Some people like to be early adopters for new ideas and things. Some don't. And that is OK.


Because until now contained ignition has never produced anything meaningful. We've had failed experiment after failed experiment. Now we finally have an experiment with meaningfully more energy out than in.

Is this the right approach? Who knows. There are many fusion designs in the works, and those may ultimately be the right call. Or some yet-to-be-created design. That's even probable. The NIF is for simulating nuclear weapons, not creating energy. None of that takes away from this breakthrough - we've never had meaningfully more output than input on a repeatable basis. It's proof that contained fusion for energy isn't just hypothetical, which will also mean funding & interest will generally increase from this point on.

I think you're setting too high a bar. It's like saying no milestone should be celebrated until we have a working metropolitan-size plant running that's cheaper than anything else. Punch cards in the 1950s are insignificant compared to modern SSDs, yet they were an important step even though we don't use anything like it now. Breakthroughs are breakthroughs.


>It seems entirely arbitrary to me

I agree that the original press release flubbed the explanation. Here's how I think they should have presented it instead:

https://news.ycombinator.com/item?id=33971377#33977940


> It's big because the people doing it call it big.

How does "[It's big b]ecause sustainable positive energy out has never been achieved before in 60 years of research" translate to "because we say it's big" in your head?

You might not consider it big, but a specific reason was provided and it had zero similarity to your rephrasing.


An answer matched in tenor and tone to the question, but nonetheless entirely serious,

is that because while the obstacles are grave, the consequences of failing to overcome them are much graver still,

and to the best of our collective knowledge,

industrial scale fusion would be the least bad answer to our energy demands for the next epoch.

That is true but also does not obviate the need for other parallel efforts and other technologies whose challenges are also very grave, e.g. the need for very near term very large scale carbon sequestration, for a modern electrical grid with deep redundancy and resilience, the need for effective safe scalable stores for energy from whatever source, etc.


> the consequences of failing to overcome them are much graver still,

Why is that? Fusion is not needed, although if it turned out to be cheap that would be nice.


While I'm sure that you're correct, the obstacles are large and there is a lot of overcome still, I can't help but think of James Watt & (my ancestor) Richard Trevithick - the inventor/pioneer of the compact steam engine.

Watt went around telling everyone that Trevithick and his compact (ie high pressure) steam engines were too dangerous and would never work.

Yes, some exploded. But then we got steam trains and even today almost all power generation on the planet is high pressure steam-electric power plants.


> A large, complex machine that explodes the equivalent of 500 lb. bombs to generate heat to drive a turbine sounds like an engineering nightmare.

And using actual bombs and explosives to dig kilometers down and mine coal is not an engineering nightmare? Dying of gas in the mines, fires on oil wells, oil spills, these things are 'engineering simple'?


We don't place precision optics in those blast zones. We don't put structures there that are repeatedly exposed to blast. Over the life of an inertial DT fusion reactor there will be about a BILLION such explosions in the reactor core.


I wonder how much energy each stroke in the largest diesel ship engines has when compared to the energy released by individual bombs/high explosives.


Q is irrelevant, you need throughput. If your Q is one million, but you are processing one tiny capsule per second, you are producing too little money to pay for the facility.

If you can process a tanker worth of hydrogen per second, Q can be just above break even and you will still make money.
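One way to see the throughput point, with this shot's numbers and a hypothetical plant repetition rate (NIF's actual cadence is roughly a shot per day; the 10 Hz figure is an assumption, not a real design):

```python
# Gross fusion power = laser energy per shot * gain * repetition rate.
def gross_power_mw(laser_mj_per_shot: float, q: float, shots_per_s: float) -> float:
    return laser_mj_per_shot * q * shots_per_s  # MJ/s is the same as MW

# This shot's parameters at roughly NIF's real cadence (~1 shot/day):
print(gross_power_mw(2.05, 1.54, 1 / 86400))  # ~3.7e-05 MW, i.e. tens of watts
# The same shot at a hypothetical 10 Hz plant repetition rate:
print(gross_power_mw(2.05, 1.54, 10))         # ~32 MW thermal
```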


Irrelevant? Seems like Q is one of two factors in that calculation. If the throughput is tiny, you're useless, but if your Q is too small, the same is true.

The higher the Q, the lower throughput needed for feasibility.


Offset the financial viability with the cost of freedom from energy reliance and the need (more like craving) for military interventions


Solar panels (+ synthetic fossil fuel generation if you need unlimited energy storage) would be an orders-of-magnitude cheaper way to achieve that.


That would be a valid argument if the only alternative to fusion was petroleum.


Exactly. Beyond power generation, humanity still uses petroleum products in its chemical industry. Which is why the shutoff of Russian natural gas hurts Germany much more than other countries: they now have a starving chemical sector.


I'm confused by this. Does the US have the productive forces and resources to replace petroleum with solar panels (and the required energy storage)? Does it have the nuclear fuel to replace petroleum with fission reactors?

What alternatives to petroleum does the US have that it does not rely on others for?


What scheme do you imagine in which fusion could replace petroleum that would not also work when powered by solar? Production of synfuels using hydrogen, for example, would also deal with solar's intermittency, leaving the energy sources to compete on the basis of levelized cost. The levelized cost of solar has become quite low, and it's very difficult to see how any fusion scheme, and DT fusion in particular, will ever compete.


I specifically asked about the production of solar panels. Are you assuming that we already have all the panels we need to replace petroleum sitting in a warehouse? What good is solar in an energy independence plan if we can't build our own panels?


What? Production of solar panels is just a matter of building and running more factories. There's no significant limit to this.


Nuclear fuel actually isn't that expensive or rare

Those crazy sci-fi stories from the 30s and 50s where everyone used nuclear power (and it was so cheap they didn't bother to meter it) were all completely accurate from a non-political viewpoint


Oh, interesting. I read/heard a few places that most fuel used for fission was controlled by a few countries. I was unaware it was abundant.


The stuff is kind of all over the place, but it's not high purity. The engineering to refine the material is highly controlled.


Foreign energy reliance is finished and has been for some time. North America can produce more petroleum energy than it uses. In both 2020 and 2021 the US was a net petroleum exporter.


Military interventions to blow up other people’s under construction fusion reactors will still be a thing, worry not


Naive question: Once they get out more than they spend, can they just loop the generated energy back in and keep increasing this exponentially?


Here's your most exciting paragraph

> LLNL’s experiment surpassed the fusion threshold by delivering 2.05 megajoules (MJ) of energy to the target, resulting in 3.15 MJ of fusion energy output, demonstrating for the first time a most fundamental science basis for inertial fusion energy (IFE). Many advanced science and technology developments are still needed to achieve simple, affordable IFE to power homes and businesses, and DOE is currently restarting a broad-based, coordinated IFE program in the United States. Combined with private-sector investment, there is a lot of momentum to drive rapid progress toward fusion commercialization.

It's fusion Manhattan project time.


100% agree. We should be dumping as much as we can into getting fusion up and running ASAP. It alone could be a silver bullet to stop climate change, and by driving energy costs lower, it would enable huge innovations in AI/automation and increase material wealth.

$1T to move fusion forward just 5 years, e.g. from 2040 to 2035, could alone have a huge ROI in terms of climate mitigation and decarbonization


It's easy to write $1T, but with that you could basically install enough solar cells to power all of the USA's energy needs.


Here's some rough numbers, hopefully i haven't made any mistakes:

Total US electricity generation 2021: 4 trillion kWh

Average power generation: 457 GW

Cost of Topaz solar farm: $2.5bn. Nameplate capacity: 550 MW. Cost per MW: $4.5m.

Spend a trillion on similar capacity and you get 220 GW of nameplate output.

But that's assuming you get the full 550 MW around the clock. In fact, Topaz produces about 1,282 GWh a year, so the average is more like 145 MW (a capacity factor of roughly 26%).

So at a rough estimate, you need more like 10 trillion to meet US electricity demand. And that assumes you have unlimited battery storage for free.

And of course electricity is a small minority of total energy use.

Incidentally, I think it's amazing you get to about 10% of total electricity needs with only $1tr, i.e. around 4% of US national output.

https://en.wikipedia.org/wiki/Topaz_Solar_Farm https://www.eia.gov/energyexplained/electricity/electricity-...

```python
trillion_dollars = 1e12

# Topaz solar farm figures
farm_cost = 2.5e9                 # $
farm_capacity_mw = 550            # nameplate MW
cost_per_mw = farm_cost / farm_capacity_mw            # ~$4.5m per MW

# Nameplate capacity $1T buys at Topaz prices
capacity_gw = trillion_dollars / cost_per_mw / 1000   # ~220 GW

# Capacity factor: Topaz actually delivers ~1,282 GWh/year
avg_output_mw = 1_282_000 / 8760                      # ~146 MW average
capacity_factor = avg_output_mw / farm_capacity_mw    # ~0.27

# US average electricity demand: 4 trillion kWh/year
us_avg_demand_gw = 4e12 / 8760 / 1e6                  # ~457 GW
```


> Cost per MW: $4.5m

That's quite high. Here in Germany they assume 530-800€ per kW_peak for a utility-scale ground-mounted system (2021) [0]. You can add additional CAPEX for inflation and expansion of the grid, but <$1500 per kW_peak (aka $1.5m per MW) should be quite possible. Especially if you include scaling effects.

If you add batteries for night-time balancing, LCOE roughly doubles for now.

Economically, no other energy source will beat non-winter day-time PV in the foreseeable future. Imo, as long as other plants are running during those times, investing in PV is a no-brainer.

[0] https://www.ise.fraunhofer.de/en/publications/studies/cost-o...


Don’t forget that $1 trillion gets you a lot of leverage and cost benefits in installation and solar panel production. I’ll do it for $1 trillion.


No you can't, because solar isn't a continuous source and fusion is.

No amount of investment into solar power can make the sun magically shine 24 hours a day.


Of course you can invest enough and make solar equivalent to continuous power.

Solar + pumped-storage hydro, solar + synthetic fuel manufacture, solar + appropriately sized batteries, solar + cranes with weights, etc.

Solar is so cheap that all those are plausible. Fusion is still far from ready.


I suggest we not put all of our eggs in one basket. Climate change is best countered by using the sweet spot of all currently available and anticipated technologies, including photovoltaics, long- and short-term storage solutions, and fusion. There is no need to bet on one technology at this stage.


There is a market need.

Solar panels are cheap and the market appears to be fine with unreliable power[0] if it's cheap enough.

[0] Unreliable, but predictable that is. You can estimate with decent confidence what the weather will be like tomorrow and you can definitely tell what it's going to be a few hours from now, so energy auctions a day ahead are feasible.


Do you have week-long blackouts currently? In any given year you can count on a week where a panel only produces 10%; it just has to be cloudy or rainy. However much solar we add, even with storage to get through a night, we need some other source to get through the cloudy week, unless we are OK with blackouts.


A panel. Panels spread over hundreds of thousands of square kilometres produce power averaged over that area, so they're much more consistent and predictable.

HVDC lines help with this a lot. It has gotten to the point where there are serious plans to build a long, undersea HVDC line from the UK to Morocco which, get this, is poised to cost less than the equivalent (in GWh delivered annually) Hinkley Point C nuclear power plant:

https://xlinks.co/morocco-uk-power-project/

That's how cheap the combination of solar and HVDC is.


Glad it will work there. There has been local opposition to new transmission lines in the USA (see Maine recently) so when folks say “just build powerlines” all I can think of is the difficulty of the land assembly and permitting. Technically it is easy, it is all the humans in the way that make it hard.


I mean, you could literally build a ring of solar power collectors and microwave transmitters in orbit. It would always have 50% coverage.


Or just have a bunch of power lines running east-west.


Energy storage is not an unsolved issue.


Batteries


Solar can't do that without massive batteries that don't exist. Even then it's not viable in areas where there isn't sufficient sunlight.


The batteries do exist, but it’s better to use them in cars. If we start diverting our battery supply for grid storage it will drive up the price of EVs.


<during the day>. $1T doesn't buy nearly enough batteries for the storage we would need.

We should be installing more solar & wind, but that alone is not a solution to our problems.


Fusion can give us 10x our energy need. Imagine a world where energy is simply not a blocker for anything.


Will not be a silver bullet, electricity production contributes less than half of global CO2 emissions. Still need other solutions for transport, industrial processes, agriculture, etc.

Further, it's possible that fusion plants might be prohibitively expensive to build and maintain, even if their fuel is cheap.


People should understand that it is all an energy problem, regardless of how that energy is currently delivered. Transportation can be (and of course is being) electrified; most industrial processes and agriculture currently use fossil fuels as inputs because they're the cheapest source.

For example, nitrogen fixation for fertilizer currently uses natural gas almost entirely for the source of hydrogen. Vastly cheaper energy means it would make much more sense to switch this to electricity, https://www.frontiersin.org/articles/10.3389/fenrg.2021.5808....

But I agree with your sentiment, there are a lot of engineering details to consider before one can shout "free energy".


If there’s a glut of cheap electrical energy, there would be a rapid push to electrify things like iron and steel production (8% of emissions), heating (10% of emissions) and transportation (10%)… and so on.


I’m not thinking about just replacing grid energy, but carbon removal. If we can get fusion scaled up and efficient it should be no sweat to use it to just remove carbon from the air

In fact carbon removal might be a great way to subsidize fusion at the outset so that it can be overprovisioned/have a guaranteed minimum price


Carbon removal currently runs about $600 per ton of CO2. The industry is trying to get to $100 per ton of CO2.

The energy required is estimated at 1,200 kilowatt-hours per ton of CO2 removed.

At an electricity price of $0.40/kWh, that's $480 in energy costs alone (ignoring profit margins for removal, etc).

Why would bleeding-edge fusion that requires cutting-edge lasers, magnets, and various other containment equipment have a cheaper energy generation cost? I could be wrong, but I don't think nuclear waste (which is the primary difference with fusion) is the primary cost contributor. So why would you want to invest in more expensive fusion to remove all this carbon instead of cheaper fission?

Nuclear waste, by the way, isn't waste. It can be reused in breeder reactors, and it's used a lot in medicine (I could be wrong, but one reason nuclear imaging has gotten more expensive is that radioactive materials are harder to obtain due to the reduction in global fission energy).
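As a quick sanity check on the arithmetic above (a sketch using only the figures quoted in the comment; the $0.05/kWh comparison price is my own assumption, not sourced):

```python
# Figures from the comment above (not independently vetted).
kwh_per_ton = 1200            # energy to remove one ton of CO2

# The quoted "$480 in energy costs alone" implies this electricity price:
implied_price_per_kwh = 480 / kwh_per_ton   # 0.40 $/kWh

# For comparison: at a hypothetical cheap-power price of $0.05/kWh,
# the energy bill per ton removed would sit well under the $100/ton target.
cheap_energy_cost = kwh_per_ton * 0.05      # $60 per ton
```

In other words, whether removal pencils out depends almost entirely on the electricity price, which is exactly the point of comparing fission and fusion here.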


Those are figures for current electrical production schemes.

After decades of R&D, fusion should become significantly cheaper than fission per kWh, if only due to having less of a regulatory burden and a smoother permitting process.

It’s more expensive now, but in the long run it will be cheaper.


You hope. It's not "more expensive now"; it's infinitely more expensive, because we can't even build one. It's unclear how much a fusion reactor will end up costing, but I wouldn't use hope to blind myself into thinking it'll be 10x cheaper than fission. There's certainly reason to be hopeful, because solar and wind are cheaper, but construction for those is also relatively simple and most of it is easy to mass-manufacture. Fusion reactors will probably look more like fission than not (i.e. complex to mass-manufacture, still requiring complex structures, needing regular complicated maintenance due to neutron damage to the containment materials, etc.). Additionally, solar and wind don't pose real threats to established fossil fuel interests. Fusion would, and it's unclear how they might respond from a regulatory-hurdle perspective.


Fusion plants should be able to save a lot of money through simpler permitting processes, not needing the same level of containment structures, needing less security, and facing less regulatory scrutiny. The equipment itself might be more expensive at the start, but if we build these plants at scale those costs should come down over time.


The first commercial reactor designs are probably 50 years away, and it's another 50 years before costs come down to be reasonable / we can build enough to start replacing existing fission and coal. This is from the CEO of General Fusion, a leader in the space. I think if even the fusion people are saying "build fission today to solve global warming," that tells you something about the time scale this is happening on.

I suspect the regulatory environment results from regulatory capture by the fossil fuel industry. Otherwise, why would Gen IV reactors, which can’t melt down, be suffering so many regulatory delays? What kind of nuclear proliferation concerns exist for reactors built and deployed within the US?


Most of those other things you mention can be electrified with current technology. Once that happens we'll need the electricity to power them.


There are a lot of bigger picture items around energy. Part of Europe's challenge with the war in Ukraine is that a lot of countries buy Russian gas & oil.

>Russia accounted for about 55% of Germany’s natural gas imports and 35% of Germany’s oil imports last year, causing Germany to resist a blanket European Union ban on Russian energy.

If every country could assure its energy independence, geopolitics would look very different. Yes, there are still many problems, like natural resources. But if you can assure all your residents that they can keep the lights on, their houses warm, and the economy moving (EVs for transit, technology)... it's very reassuring.


Cheaper and cleaner energy has never in the history of energy led to a global reduction of pollution; actually, pretty much the opposite. Plus, nuclear power is already virtually free and has been available since the 1970s. If stopping climate change were an energy source problem, it would have already been solved then.


Basically the only energy that is cheaper and cleaner than fossil fuels is wind and solar, which have only gotten to that point in the last 1-2 decades, so it seems difficult to make claims about what has happened in "history".


I’m not anti fission. But the reality is that many are and the NIMBY effect is strong. You don’t necessarily have that with fusion.

Also, there should hopefully be much less pomp and process around fusion compared to fission, due to its much better safety/radiation profile.


“nuclear power is already virtually free”

What? Cost is THE problem with nuclear energy.

You must have some bizarre way of framing things to make that statement; you should at least tell us what it is, since you must know that you are making a controversial claim.


Way more than that. This would shift the entire geopolitical structure of the world. Humanity would never be the same.


Good riddance.


LMAO. Good luck to get funding passed through GOP-controlled congress with all these oil and gas companies rallied behind. There is zero chance that the USA will ever do that.


With abundant electricity, wouldn't we face another type of global warming? We would start to consume so much electricity that it would heat up the planet?


Relative scales of various factors matter here:

We aren't "heating up the planet" .. we're "insulating the atmosphere".

Increased CO2 (and the closely following methane and water vapor increases) serves to trap more of the heat radiated outwards that would otherwise escape.

The vast bulk of that heat comes from the visible light of the sun which passes easily through the atmosphere coming in, gets converted to IR energy as it warms the earth and oceans, and then escapes outwards.

Our changes to the atmosphere have disturbed that balance.

Your comment has some small merit, but you would need to work through the heat output of human generated power and then compare that to the daily heat energy originally from the sun that radiates outward.


> We aren't "heating up the planet" .. we're "insulating the atmosphere".

We are doing both. We are doing more insulating than heating, but we obviously generate more heat.


Did you actually read my comment to the end?

If so, how'd you do on the numbers?


Of course I did. It is rude to suggest otherwise. Your blanket assertion followed by your long explanation is confusing and less helpful to anyone asking the original question than it would be without it.


Good, so by what rough order of magnitude does the sun heat us up more than human power generation then?

Is it x10, x100, x1000, .. more?

This is after all the part that matters.


I have no idea. Obviously we get more heat from the sun than we generate on our own. That does not make your blanket assertion true.

The original question is a good one. We will use more energy as it gets cheaper (see monster trucks, Las Vegas, Dubai). We should be thinking of what we will do with the waste heat when everyone has a fusion reactor.


Your repeated assertion that I made a "blanket assertion" suggests you're either someone with English as a second language or someone who stubbornly digs in (unless there's another explanation here?).

To be clear I made a qualified assertion; the phrase "Relative scales of various factors matter here" does the work.

> Obviously we get more heat from the sun than we generate on our own.

Yes - but, again, How much more?

If it's barely twice as much (which seems unlikely) then heat from our activity is a major factor in all this.

If it's 100,000x times more then I stand firm, heat from our activity has effectively zero impact on the AGW issue (although other by-products from our daily routines are the crux of the problem).

If it's some other factor, then what relative significance does our human heat generation have?

> I have no idea.

It's a shame you didn't grab an envelope and make use of the back; it's a classic Fermi problem [1] of the kind my classmates and I were posed in high school, and the kind of thing many other HN commenters would delight in taking a run at.

If you're feeling game, you might like to start with the daily heating of the earth's surface by the sun, then look at the petajoules of energy generated and consumed per day and take a stab at guesstimating the waste heat (unused, released into the earth's lower surface layer) as a percentage (or google efficiencies, etc).

Have fun!

[1] https://en.wikipedia.org/wiki/Fermi_problem
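For anyone who wants to take a run at it, here's a minimal back-of-the-envelope version using round public figures (solar constant ~1361 W/m², Earth albedo ~0.3, world primary energy consumption ~580 EJ/yr; all of these are assumptions plugged in from memory, not measurements):

```python
import math

# Round public figures (assumptions, not measurements made here):
SOLAR_CONSTANT = 1361.0           # W/m^2 at the top of the atmosphere
EARTH_RADIUS = 6.371e6            # m
ALBEDO = 0.30                     # fraction of sunlight reflected to space
WORLD_PRIMARY_ENERGY_J = 580e18   # ~580 EJ consumed per year

# Sunlight absorbed by Earth: solar constant over the planet's
# cross-section, minus the reflected fraction.
cross_section_m2 = math.pi * EARTH_RADIUS ** 2
solar_absorbed_w = SOLAR_CONSTANT * cross_section_m2 * (1 - ALBEDO)

# Human energy use expressed as an average power.
human_power_w = WORLD_PRIMARY_ENERGY_J / (365.25 * 86400)

ratio = solar_absorbed_w / human_power_w
print(f"sun / human ≈ {ratio:,.0f}x")   # several thousand times
```

On these numbers, absorbed sunlight outweighs human power generation by a few thousand to one, which is the order-of-magnitude answer the question is fishing for.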


Stubbornly digs in, indeed.


I'm not going to set my thermostat to 78 degrees because electricity is practically free.


But some people would start driving 6-ton vehicles because fuel is free. The only limiting factor for crypto mining would be the amount of hardware you can get (provided crypto survives that long). Some people would probably do the math and decide that thermal insulation is too expensive, and run heating/cooling on max. I could probably come up with more examples.

EDIT: from a quick google query, it looks like per day the earth receives the equivalent of our yearly power consumption (not only electricity but also fuel). So we would need to x365 our energy consumption, which is not that unreasonable if you could have, for example, a heated outdoor pool for free anywhere in the world.


> silver bullet to stop climate change alone

nothing will stop climate change. The earth’s climate will continue to change regardless of any human interventions


Most of the funding for the Manhattan project went into the industrial infrastructure required to produce plutonium and enriched uranium.

It will be time to unleash resources once they have a working fusion reactor design in order to build fusion power plants and the industrial infrastructure required to supply them.

Until then they should of course get the resources they need but I don't think throwing money at them will necessarily speed things up.


A uranium-gun bomb has a very simple theoretical basis, but enriching uranium is very expensive (and was even more expensive in the 1940s). Producing enough enriched uranium was the only hard problem in making that bomb; in fact, they did not even test it before dropping it on Hiroshima, because they were fairly sure it would work and wouldn't have had enough enriched uranium for a second bomb on the timelines involved.

Plutonium was significantly easier to produce, but it required some novel engineering for the implosion lens. They weren't sure it would work and did, in fact, test that design before dropping it on Nagasaki.

I think the Manhattan Project is a great example of where more funds can help; if the funds had been more restricted, it's entirely possible they would have gone with the "sure thing" of the uranium bomb instead of spending resources on the less certain plutonium bomb. Trying out multiple ideas in parallel often "wastes" money, since if you tried ideas sequentially you would always try the highest-percentage idea first.


I would love to see "fusion Manhattan project", this planet is well overdue for new gigantic R&D projects such as Manhattan and Apollo.


Who are our modern J. Robert Oppenheimer, Enrico Fermi, Richard Feynman, Edward Teller, John Von Neumann, and Stanislaw Ulam?


> Who are our modern J. Robert Oppenheimer, Enrico Fermi, Richard Feynman, Edward Teller, John Von Neumann, and Stanislaw Ulam?

they're working on getting you to click on an ad


They can't exist any more, for structural reasons.

That generation was classically educated, without TV or social media in their childhood. They spent the time we're wasting on HN reading _books_ and following the discipline their elders learned in WWI. They had plenty of occasions to tinker.

I claim the brains of that generation were structurally different from ours, and we're talking about the best minds of this generation.

It's a trope to say that our "best minds are working on ads" - the reality is that, no, we webshits are not the "best minds".


> It's a trope to say that our "best minds are working on ads".

"The last generation was better because they read _books_ and had _discipline_ and didn't waste their time on frivolous garbage" is also a trope.


Doesn't mean it's wrong.


It probably is though. Sure, they didn't have as many distractions, but they also didn't have the sum total of the world's knowledge at their fingertips the way we do today.

People have been saying "this next generation is inferior to the last one" since the ancient Greeks. If that were consistently true, we would already be in an Idiocracy scenario.


> If that was consistently true we would already be in an Idiocracy scenario.

Been on Twitter, Facebook, or Hacker News lately ?

More seriously, I really fear this kind of stuff is, to some extent, new: https://sitn.hms.harvard.edu/flash/2018/dopamine-smartphones...

(Although, maybe it's comparable to the "opium epidemics" of the 1800s ?)

Ironically, I should stop having this kind of conversation... On social network.


This general idea that a previous generation was better because they led a less pampered life goes back to some of the earliest writing. Yet here we are.

World changing people seem to me to be very much the right people in the right place at the right time. The best way to find them is to try to create those places now.

Opportunities like the WWII and post-war era research don’t exist now. Everything with funding is very short-term, politicized, and narrowly focused within a micro-specialty.

Realistically, I don’t think that will change until it has to.


I mean, sure. But at the same time they were constrained by the tools of their time, had no internet for instant information access and spread, and scientific collaboration has never been at a higher level than it is now. There's no reason to believe that people who grow up with instant lookup and massive computational power will somehow be less capable than people whose only tools were pen and paper. What is possible now couldn't even be dreamed of back then.


Webshits are not working on ads


> they're working on getting you to click on an ad

They're not, and there's zero evidence to back that frequently floated premise up. That's a particularly laughable myth created by those same industry people to feel better about their terrible life choices. If you can't do something meaningful, at least you can pretend to be a genius doing nothing meaningful. It turns out that both things are false, they're not brilliant and they're wasting their lives.

No, the brilliant people are working at TSMC, Intel, AMD, nVidia, Applied Materials, ASML, Illumina, ARM, TI, et al.

They're working on CRISPR. They're working on mRNA vaccines. They're working on stem cells. They're trying to cure HIV just as the same type of people cured hepatitis C. They're working for Moderna, Pfizer, BioNTech, Roche, Novartis, Amgen, Regeneron, Sanofi, Gilead, Merck, Glaxo, et al. They're trying to figure out how to roll back or cure Alzheimer's. They're dedicating a lifetime of work into exploring the human genome, so that future generations have a much better, much more useful map.

They're working on robotics at Intuitive Surgical or Boston Dynamics. They're working on self-driving tech. They've been building out the massive, global cloud infrastructure. They're at NASA, or SpaceX, or ESA and they're doing the work to get us a base on the moon or to Mars. They just got done building rockets that can land upright. They're building a massive, extraordinary, global satellite system in Starlink.

They're working on fusion.

And so on and so forth.

Ad clicks? Yeah right. They're not even in the room.


A lot of wonderful people are doing that, but do those jobs pay anywhere close to the ad companies? Surely there's a lot of bright minds lost to the allure of money.


At this point, it's not even about "money" in the traditional sense (wealth, prestige, etc.); rather, it's about stability, the alleged "American dream." I live in Chicago, so I'll consider the local national laboratory, Argonne. They pay their software engineers $101,888 per year, according to Glassdoor ($71,640 after state and federal tax). Using the 28% rule most lenders apply nowadays, that's a maximum monthly mortgage payment of $1,671. However, the median house price in DuPage County is $335,000 [1], and a 30-year mortgage at 7% (with 10% down) has a monthly payment of just over $2,000. No dice, even for a highly skilled professional living in one of the most affordable parts of the country. Keep in mind that you still need to pay 2.3% per annum in property tax, besides owning a car and saving for retirement. It's just not nearly as feasible a path toward financial stability as taking a Big Tech job with west coast pay.

[1] This is up 1.1% year-over-year, and only up 6.5% p.a. over the last three years - not a pandemic-driven bubble. Source is https://www.redfin.com/county/733/IL/DuPage-County/housing-m...
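The affordability math above can be reproduced with the standard fixed-rate amortization formula (all inputs are the figures quoted in the comment: $71,640 net, the 28% rule, a $335,000 house, 10% down, 7% over 30 years):

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard fixed-rate mortgage amortization formula."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# 28% rule applied to the quoted after-tax income
net_annual = 71_640
budget = net_annual / 12 * 0.28              # ≈ $1,672/month

# Median DuPage County house with 10% down at 7% for 30 years
loan = 335_000 * 0.90
payment = monthly_payment(loan, 0.07, 30)    # ≈ $2,006/month

print(f"budget ${budget:,.0f} vs payment ${payment:,.0f}")
```

The gap between the two numbers is the "no dice" in the comment; note the 28% rule is applied here to after-tax income, as the comment does, where lenders typically use gross.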


Wow. That was long overdue. Thank you!


Thank you for this, sincerely.


yeah. But some are apparently working at Livermore Nat. Lab still. Also, I feel like there is a bunch at SpaceX, Tesla, NASA, DeepMind and OpenAI


Yeah, I think we have a lot of sleeper geniuses out there

I'm a pretty smart dude. I'm no big deal on HackerNews or in Silicon Valley, but I look easily 10x as smart as most of the normal people I come across in the real world. And I regularly come across people so much smarter than me, they have to explain things to me the same way I talk to a toddler

I'll bet a lot of geniuses are congregating in cool orgs like those where they can make a real difference in the world.


Hear hear!


Hiding behind the names of institutions that got smart enough to not give the peons fame or recognition.


off-topic, but related. Ulam's "Adventures of a Mathematician" is an excellent and very inspiring book.


Elon Musk ::ducks::


lmao, a brilliant scientist Elon Musk is NOT. A closer comparison would be General Groves, someone who can get the team and resources in place so the work can get done.


He's actually the complete opposite based on reports of what it is like to work with him at SpaceX and Tesla in recent years. Employees describe having to avoid him so he doesn't meddle in the projects and screw them up.

Maybe he was like you describe long ago but something...happened.


The question on my mind is: where do I sign up to join this effort?!

Edit: I'm Canadian, the question is rhetorical.


Assuming you have experience in software, then https://www.llnl.gov/join-our-team/careers/find-your-job/0d6...


BTW, they don't seem to have software roles at NIF: https://www.llnl.gov/join-our-team/careers/find-your-job/liv...


Aren't they required to post salary ranges by some Californian law?


These are federal jobs so all of the pay bands are public knowledge.


This is not quite correct. LLNL is a Federally Funded Research & Development Center (FFRDC) which is owned, as a facility, by the government, but managed and staffed by a non-profit contracting organization called Lawrence Livermore National Security, LLC (LLNS) under a contract funded by DOE/NNSA. The board of LLNS is made up of representatives from universities (California + TAMU), other scientific non-profits (Battelle Memorial Institute), and private nuclear ventures (e.g. Bechtel.) LLNS pays, with very few exceptions, staff salaries at LLNL, and they are not beholden to the government civilian pay schedule.

https://www.llnl.gov/about/management-sponsors


Law goes into effect next year.


Is it enforceable on the Feds?


Former LLNL employee here, they hire a LOT of foreign nationals. Several people that I worked with there were Canadian.


Be that as it may, a number of positions at LLNL, including many of those affiliated with NIF, require that the candidate be a US person eligible for a DOE security clearance. A security clearance is not necessarily binary on being a US person, but a number of national-security-related positions may require not only the clearance but also that the candidate is a US person (or outright forbid foreign nationals).


Google "national ignition facility careers" and this is the first link

https://lasers.llnl.gov/about/careers



There's always General Fusion.


There are already funded commercial fusion projects underway. No idea which will bring a product to market first or at all, but they suddenly seem a lot more plausible.

https://www.nytimes.com/2021/08/10/technology/commonwealth-f...


That's a very apt analogy, as both this and Manhattan are weapons research programs.

I'm not very excited to hear we'll get even more powerful thermonuclear bombs.


Fusion bombs have existed since the early 1950s. Technology rapidly developed to the point that they can essentially be built to be arbitrarily large, far beyond any practical war purpose. There is no need for any larger bomb than what was built many decades ago. None of this research is necessary for bombs. All of the difficult problems fusion power generation faces with long-term plasma confinement go away when you're just trying to squeeze as hard as you can and are willing to use fission bombs to do it in an otherwise uncontrolled manner.


> None of this research is necessary for bombs.

And yet that's exactly why the NIF was actually built. They do plenty of weapons research: https://wci.llnl.gov/facilities/nif I'm told the building was even built to switch over between civilian and classified use unusually quickly, but I'm having trouble turning up a citation for that right now with just my phone and 2022-Google.

> All of the difficult problems fusion power generation faces with long-term plasma confinement go away when you're just trying to squeeze as hard as you can and are willing to use fission bombs to do it in an otherwise uncontrolled manner.

Not if you want them to fit in a submarine warhead. This sort of work is not easy to do well.


> And yet that's exactly why the NIF was actually built.

You're both half-right.

The NIF is the replacement for nuclear tests. It's necessary to maintain the arsenal in a working fashion, as the warheads degrade over time and have to be replaced with new ones. https://www.npr.org/templates/story/story.php?storyId=655921...

The NIF is not for more powerful nuclear weapons, as that's entirely unnecessary. If anything, most interest these days is in less powerful weapons for potential battlefield use.


> And yet that's exactly why the NIF was actually built. They do plenty of weapons research

It's not the only thing they do.


It is necessary since they banned the testing of nuclear weapons. Before they would do this kind of research by imploding a cylinder of uranium encasing a hydrogen core with X-rays produced by a "Fat Man" style bomb. Now they implode a cylindrical casing full of hydrogen by x-rays caused by a laser vaporizing an outer layer.

“It’s a big milestone, but NIF is not a fusion-energy device,” says Dave Hammer, a nuclear engineer at Cornell University in Ithaca, New York.

Herrmann acknowledges as much, saying that there are many steps on the path to laser fusion energy. “NIF was not designed to be efficient,” he says. “It was designed to be the biggest laser we could possibly build to give us the data we need for the [nuclear] stockpile research programme.”

https://www.nature.com/articles/d41586-022-04440-7


Do more powerful bombs really make any difference? Seems a bit like worrying about the impact of climate-driven ocean rise on the pressure at the bottom of the Marianas Trench.


That isn’t what this would be used for. In fact, yields for the largest deployed H-bombs today I think are smaller than they once were (due to better targeting capabilities).


This is true. The issue is that already a relatively small nuclear weapon is perfectly sufficient to wipe out most to all civilian structures. However, it does so in a roughly circular area, and you need to increase the initial explosion a royal lot to increase the devastated area by a bit. And as you increase the overall spherical blast of the weapon in order to increase the circle of doom on the ground, more and more explosive power just vaporizes air.

That's why MIRV was introduced. One ICBM delivering 10 - 20 small warheads results in much greater devastation than an equally heavy single warhead, because less power is wasted on air and space.

It's morbid math, but it makes sense.
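
The morbid math can be sketched with the common cube-root rule of thumb (an assumption here, not a figure from the thread): blast radius scales roughly as yield^(1/3), so the devastated ground area scales as yield^(2/3).

```python
# Toy illustration of the MIRV argument, assuming the common
# cube-root rule of thumb: blast radius ~ yield**(1/3), so
# devastated ground area ~ yield**(2/3).

def area(yield_kt: float) -> float:
    """Relative destroyed area for a warhead of the given yield (kt)."""
    return yield_kt ** (2.0 / 3.0)

one_big = area(1000)        # a single 1 Mt warhead
ten_small = 10 * area(100)  # ten 100 kt warheads, same total yield

# ten_small / one_big == 10**(1/3) ~ 2.15: roughly twice the area
# destroyed for the same total yield.
```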


What do you think the US nuclear weapons research lab will use their research for?

You're right that increasing the yield was a bad example from my side, but the purpose is to improve the weapons, nothing else.


The purpose of NIF, and it’s not hidden, is to maintain the existing US nuclear stockpile since we can no longer rely on using underground nuclear weapon testing to ensure they still work. There’s a very big supercomputing capability funded under the same effort. Instead of testing the weapons by exploding them underground, we use computer modeling with the modeling validated (ie backed up) by experiment (at NIF) to make sure the stockpile works and can maintain its strategic deterrent. The euphemistic name for this is “stockpile stewardship.”


The B53 bomb was built in 1961 and it released 38 PJ or 10 BILLION times more energy than this experiment. Data gathered about plasma and fusion at NIF temperatures and pressures is not helpful for the insanely different environment of a nuclear bomb.
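
A quick sanity check on that ratio, using only the figures above (38 PJ for the B53, 3.15 MJ for this shot):

```python
# Sanity check: B53 yield vs. this NIF shot.
b53_joules = 38e15   # ~38 PJ released by the B53
nif_joules = 3.15e6  # 3.15 MJ fusion yield from this experiment

ratio = b53_joules / nif_joules  # ~1.2e10, i.e. roughly ten billion
```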

> What do you think the US nuclear weapons research lab will use their research for?

Why do you think that fusion is not enough? Complete strategic energy independence for the US, and dominance in the electricity sector? That's so, SO much more valuable than better nuclear weapons.


You should be very excited, because we live on a planet with independent competing countries, and well... you don't want to live in the US or Europe while China or other not-so-friendly countries build a bigger, more powerful nuke. If a weapon can be built, it will be built. How, when, and if it can be used are things you can control, not whether someone somewhere will develop it. Especially in wartime, all bets are off.

Although it would be interesting to see fusion reactors on planes and ships powering other types of weapons, like lasers, more powerful railguns, or faster ICBMs.


Good things often have dubious or downright evil origins (which would never be justifiable a priori).

A relatively recent example: development of cancer chemotherapy began with the incidental finding that the chemical warfare agent nitrogen mustard reduced the white cell count of affected soldiers.

We make progress building on the shoulders of giants, but those giants are often standing in dung.


The Soviet Union built this, and it wasn't really practical; it ended up leading to test ban treaties.

https://en.wikipedia.org/wiki/Tsar_Bomba


Imagine if Musk had spent $44 billion on fusion.


Given it's Musk and his stated primary life goal, the most ridiculous aspect of the Twitter debacle for him, is: not only of course did he overpay for Twitter by at least 2x; not only is his net worth going to contract as Tesla's stock compresses (such that the poor Twitter decision is going to be that much more painful in relation to his overall wealth); but the $40x billion could have probably paid for getting Starship to Mars. He's not going to be as rich in the future as he was in that moment, and he'll be relentlessly mocked for the context as his ship takes on water (eg when he's worth $60-$80 billion and spent $44 billion buying Twitter and SpaceX needs $10+ billion infused into it to keep pursuing Mars).


He didn’t pay 44B. 13B are bank loans (that need to be paid back by Twitter, not Elon). For the other 31B he had some co-investors as well.

But yes, it was clearly a mistake.


>the $40x billion could have probably paid for getting Starship to Mars.

Maybe he doesn't actually believe in Starship.


It appears to be working/progressing properly so far (and quite rapidly compared to norms in the industry), including the Raptor engines. I doubt that's it.

Musk has very obviously poor impulse control. Someone more contained, patient, less impulsive, would have waited and taken a more strategic approach to acquiring Twitter (which would have left an opening to let the stock implode with the rest of the tech market, after which one could have pounced and grabbed it for far cheaper). On the flip side, that less impulsive person probably wouldn't have started SpaceX in the first place (given the suicidal fiscal task involved and context at the time in the industry), or wouldn't have gone to the financial extremes required to make it succeed (betting essentially all of his wealth on Tesla and SpaceX).


Are you suggesting it's all an elaborate scheme?

I know it is popular/easy to hate on the man right now, but this is a really strange take.

Given that Musk has been talking about Mars since at least 2001, many years before he had the resources he has now, and almost went bankrupt funding SpaceX's first orbital rocket, it's hard to believe he's pretending.

People seem happy to believe all negative things they hear about him, but discount anything that doesn't gel with this negative image. It's like how the same people who lay all missteps of Tesla/SpaceX at Elon's feet will also discount any of the successes and say he has nothing to do with them.


The boring company was literally an elaborate scheme on the other hand. Fool me once, shame on you. Fool me, you can't get fooled again.


I dunno, if you look at the Boring Company through the lens of "this man really wants to go to Mars", it kinda makes sense. Probably a lot of Mars colony infrastructure should be tunneled underground, given the lack of atmosphere / magnetosphere.

Things like Hyperloop also make more sense in that context.


Or maybe he expected Twitter to actually make money, or at least not lose money.

And this could still be true.

People here act like he bought Twitter and then deleted the website. This isn't the case.

Did he overpay? Yes, but it's still a business that is worth something.


>People here act like he bought Twitter and then deleted the website. This isn't the case.

A month ago I was told that Twitter would collapse and die any day now.

It seems to be holding up well enough during the current mega world event known as the FIFA World Cup.


Imagine if we'd spent 10% of the current military budget ($600B+) on renewables & fusion. We wouldn't have to fight those wars for resources.


> We wouldn't have to fight those wars for resources.

You mean wars for oil? Fusion would not solve that; those wars were not about energy per se but about control and domination, keeping the USD as the world reserve currency and the US as world hegemon. Look at the Taiwan chips situation: no oil there, yet there will always be some "oil" out there that you will want to control instead of ceding that control to your rivals. It's game theory 101.


Okay, but oil is a fundamental input into everything, hence a much bigger threat than, say, TSMC's next-gen fab. Everybody understands that with some effort the outputs of a strong economy can be recreated, but not the inputs. Arguably lithium or water may become that, but we are currently far from it. Sure, there will always be something we'd want to control, but we can go about it in a more calm, clearheaded way rather than just use force because we have the biggest stick.

People love to invoke game theory but fail to explain why we alone spend more than the next 10 countries' military budgets combined. Plus, it's quite unclear what we get out of it because of all the secrecy; likely it's mostly inflated costs and kickbacks. The wars in Iraq/Afghanistan themselves have cost close to $3T over time, and that's outside of annual budgets. Personally, I'd have preferred Medicare for All for that amount of money.


> Sure, there will always be something we'd want to control, but we can go about it in a more calm, clearheaded way rather than just use force because we have the biggest stick.

Unfortunately no, you can't be more calm, because projection of force is what keeps the status quo and your rivals in place, not liberal values or clearheaded minds. There is no world police; the one with the biggest stick makes the rules. The US is not the first hegemon in the history of the world: we had the Roman Empire, the Dutch Empire, etc., and all of them were major military powers. There is always some challenger waiting in the shadows to take over your position; you can't just sit and be calm, because you will lose what you have. I assure you that others will not just sit calm but will claim what's yours if they see a sign of weakness. History proves that again and again.

> but fail to explain why we alone spend more than the next 10 countries' military budgets combined

Well, it's easy to explain: the US has it in its doctrine that it needs a military strong enough to fight two wars at a time, so it needs to spend more than at least a few countries behind it, though a lot of this budget is probably not spent well. The problem here is that when you stop being the world hegemon with the biggest stick, your currency stops being the world reserve currency, which means you can't finance your debt the same way as before. That has drastic consequences for your budget, and it would be really fatal for the US. Sometimes you just can't stop the music even when you don't like the melody, because silence will hurt you.

> Plus, it's quite unclear what we get out of it because of all the secrecy; likely it's mostly inflated costs and kickbacks.

Agreed, that's inefficient; you could probably achieve the same with lower costs, but how much lower I don't know, if anyone does.


TL;DR

Lasers are pretty inefficient. How much energy did it take to make 2.05 MJ of lased light?

Does this breakthrough have a reasonable path to making more electrical energy than it puts in?


Another good time to remind everyone that the inventor of the maser (which led to the laser), Charles Townes, was discouraged by his department chair (allegedly): "Look, you should stop the work you are doing. It isn't going to work. You know it's not going to work, we know it's not going to work. You're wasting money. Just stop!" A few months later, it worked. [1]

[1] https://www.theregister.com/2015/01/29/charles_townes_nobel_...


> He later explained that he was able to refuse to stop his research because he had tenure, so there was nothing they could really do to stop him.

Not having to chase your next meal can make a big difference


This is why tenure for fundamental research is very important. You have to be free to fail without consequences or you won't be able to take the risks to figure this stuff out.


Exactly, this is the biggest problem with modern universities.

I wish billionaires would fund things like this directly, with no patents, no strings attached, just from the goodness of their hearts, which they claim is there given how much they all pledge to charities... But the reality is different: Oxford pledged to donate the rights to their COVID vaccine for free, then was urged by the Bill & Melinda "foundation" to reverse course and sold it to pharma companies.


"If we'd known it was impossible, we never would have succeeded!"


I don't know about everyone else, but I'm taking this particular moment just to swell with pride and excitement for this achievement by science and forget about the details of how much more needs to be done to create the first power plant. I'm remembering when I first learned about fusion energy development, how distant and unfeasible it seemed, and regardless of how long the road ahead still is it's incredible how far we've come.

Happy Ignition Day everyone. I can hardly believe we really made it here.


When you consider that the laser they used consumed about 300 megajoules from the wall plug in order to deliver 2.05 megajoules to the target, the fact that they got 3.15 megajoules out looks puny in comparison. Even newer lasers only have 20% wall-plug efficiency, according to the press conference.

So the important point here is, there was no net energy gain. They spent about 300 megajoules to get 3.15 out. The scientists only talk about the 2.05 megajoules of laser energy sent to the target, not about the 300 megajoules of electricity needed to produce it.
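
Using the headline figures from the announcement (2.05 MJ delivered, 3.15 MJ out) together with the ~300 MJ wall-plug number, the two very different "gains" being argued about are easy to compute:

```python
# The two "gains" in this debate, from the publicly reported figures:
# ~300 MJ drawn from the grid, 2.05 MJ delivered to the target,
# 3.15 MJ of fusion energy released.
wall_plug_mj = 300.0
delivered_mj = 2.05
fusion_mj = 3.15

target_gain = fusion_mj / delivered_mj      # ~1.54: the "ignition" milestone
wall_plug_gain = fusion_mj / wall_plug_mj   # ~0.0105: the engineering reality
```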


The NIF is not intended to be a power plant, and inertial confinement in general is probably not a great design for producing power.

This is a scientific breakthrough. The best point of comparison is probably a fusion bomb, which requires an initial fission detonation to create enough pressure and free neutrons to force a net-positive fusion reaction. But at the NIF they do it using only lasers… incredible.


Energy is conserved in the universe so there is never any net energy gain.

See how pedantic and not helpful that is?


In nuclear fusion, mass is converted into energy according to the famous equation E = mc^2, where E is energy, m is mass, and c is the speed of light.
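
For a sense of scale, plugging the reported 3.15 MJ yield into that equation gives the mass actually converted (a back-of-the-envelope sketch):

```python
# Back-of-the-envelope: mass converted for the reported 3.15 MJ yield.
# From E = m * c**2, we get m = E / c**2.
c = 299_792_458.0  # speed of light, m/s
E = 3.15e6         # fusion yield, joules

m = E / c ** 2     # ~3.5e-11 kg, i.e. about 35 nanograms
```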


Mass is energy. Add energy (in any form, such as heat) to a system and you increase its mass. Thus, in the NIF reaction, the mass lost from the pellet is mass imparted on the surrounding environment. Immediately after the fusion reaction, before the energy can dissipate further as heat, etc, the reaction chamber system has the same mass as before the ignition.

There are some nuances regarding the distinction between rest mass vs relativistic mass, but they're not really relevant in this context.

I think what trips people up here is confusing mass with matter. Matter is also subject to mass-energy equivalence, of course, but AFAIU in most common types of nuclear reactions little if any matter, per se, is transformed.


Fair enough. To be extra pedantic, mass-energy is conserved in a fixed inertial frame of reference.


That's a non sequitur. The laser ignition facility is not a smaller version of the entire universe.


Power plants add energy to an electrical grid by converting external (chemical/nuclear/kinetic) energy into more electricity than they consume. There's no loss of energy/mass overall, but the amount of available electricity goes up. Since the laser would use electricity from the grid, that should be taken into account.


The point is to get some of it from somewhere cheaper/free - mass, or outside air as in heat pumps.

You can't run your laser on mass or air if you need a coal-fired power plant to run your fusion reactor, from which you get back less than the coal plant put in...

It's great progress, it's just not as close to viable as it might sound like - more breakthroughs needed.


I have yet to find someone saying it sounds like fusion power reactors are right around the corner, but I have found lots of people shadowboxing these people and attacking the scientists for misleading press releases.

Seems like an overcorrection to something I haven't even seen anyone here say.


I think to a lot of the technically minded but non-nuclear-physicist readers here, it initially sounded like less (paid-for/electricity) energy was put in than came out. That's extremely exciting, and the actual news is still fantastic; it's just that 'actually, we needed to pay for over 100x more energy than we counted as the "input" energy [and it's possible to do 10x but not 100x better than that]' is quite a massive caveat on a 3:2 or whatever yield.

I'm not saying they've claimed anything wrong or deliberately misleading, it's just a misunderstanding/misalignment and possibly made worse by the PR teams in the middle.

In other words, I don't think it's an angry 'well actually' type correction so much as it is disappointment - it initially sounded even greater.


Not necessarily; it depends on how the reaction scales. If the output does not scale linearly (as is claimed), you don't necessarily have to get more efficient: you just have to increase the power until the output curve rises past the input scaling. How big that has to be is determined by the efficiency of the input device itself, but it isn't a question of whether it will ever happen.


Yes, sure, it's just still a breakthrough or so (or at least work, I don't know how within grasp it is) away from what one may have (as did I) initially assumed.

Tangentially, it does seem fairly intuitive that it should be non-linear in that 'jump start' as it were: a fire can be grown arbitrarily large having started from a single match (or flint or whatever).


Right... look at some of the reactions:

"In terms of the physics, we are basically there, and the rest of it, at some level, is just engineering," he said.

[1] https://www.cbs58.com/news/wisconsin-reacts-to-breakthrough-...


You are correct: this is the important point. Until this is actually powering homes at scale and competitive price- which, clearly from these numbers is a fantasy- developments in this field should be dismissed. It's embarrassing to see HN so excited about what's really quite the failure.

(Sent from my ENIAC)


> ENIAC was the first programmable, electronic, general-purpose digital computer, completed in 1945.


And a huge waste of $400,000.


Also, they’re not counting all the meals the scientists ate -- this is total BS.


As was poin